Showing posts with label order. Show all posts

14 August 2021

M Mitchell Waldrop - Collected Quotes

"At the same time, Kaufmann discovered that in developing his genetic networks, he had reinvented some of the most avant-garde work in physics and applied mathematics - albeit in a totally new context. The dynamics of his genetic regulatory networks turned out to be a special case of what the physicists were calling 'nonlinear dynamics'. From the nonlinear point of view, in fact, it was easy to see why his sparsely connected networks could organize themselves into stable cycles so easily: mathematically, their behavior was equivalent to the way all the rain falling on the hillsides around a valley will flow into a lake at the bottom of the valley. In the space of all possible network behaviors, the stable cycles were like basins-or as the physicists put it, 'attractors'." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"In nonlinear systems - and the economy is most certainly nonlinear - chaos theory tells you that the slightest uncertainty in your knowledge of the initial conditions will often grow inexorably. After a while, your predictions are nonsense." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"In short, complex adaptive systems are characterized by perpetual novelty." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"In the everyday world of human affairs, no one is surprised to learn that a tiny event over here can have an enormous effect over there. For want of a nail, the shoe was lost, et cetera. But when the physicists started paying serious attention to nonlinear systems in their own domain, they began to realize just how profound a principle this really was. […] Tiny perturbations won't always remain tiny. Under the right circumstances, the slightest uncertainty can grow until the system's future becomes utterly unpredictable - or, in a word, chaotic." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"[...] it's essentially meaningless to talk about a complex adaptive system being in equilibrium: the system can never get there. It is always unfolding, always in transition. In fact, if the system ever does reach equilibrium, it isn't just stable. It's dead." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"Indeed, except for the very simplest physical systems, virtually everything and everybody in the world is caught up in a vast, nonlinear web of incentives and constraints and connections. The slightest change in one place causes tremors everywhere else. We can't help but disturb the universe, as T.S. Eliot almost said. The whole is almost always equal to a good deal more than the sum of its parts. And the mathematical expression of that property - to the extent that such systems can be described by mathematics at all - is a nonlinear equation: one whose graph is curvy." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

08 June 2021

On Patterns (2010-2019)

"Because the question for me was always whether that shape we see in our lives was there from the beginning or whether these random events are only called a pattern after the fact. Because otherwise we are nothing." (Cormac McCarthy, "All the Pretty Horses", 2010)

"The human mind delights in finding pattern - so much so that we often mistake coincidence or forced analogy for profound meaning. No other habit of thought lies so deeply within the soul of a small creature trying to make sense of a complex world not constructed for it." (Stephen J Gould, "The Flamingo's Smile: Reflections in Natural History", 2010)

"What advantages do diagrams have over verbal descriptions in promoting system understanding? First, by providing a diagram, massive amounts of information can be presented more efficiently. A diagram can strip down informational complexity to its core - in this sense, it can result in a parsimonious, minimalist description of a system. Second, a diagram can help us see patterns in information and data that may appear disordered otherwise. For example, a diagram can help us see mechanisms of cause and effect or can illustrate sequence and flow in a complex system. Third, a diagram can result in a less ambiguous description than a verbal description because it forces one to come up with a more structured description." (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)

"A surprising proportion of mathematicians are accomplished musicians. Is it because music and mathematics share patterns that are beautiful?" (Martin Gardner, "The Dover Math and Science Newsletter", 2011)

"It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern." (Daniel Kahneman, "Thinking, Fast and Slow", 2011)

"Knowing the importance of luck, you should be particularly suspicious when highly consistent patterns emerge from the comparison of successful and less successful firms. In the presence of randomness, regular patterns can only be mirages." (Daniel Kahneman, "Thinking, Fast and Slow", 2011)

"Once a myth becomes established, it forms part of our mental model of the world and alters our perception, the way our brains interpret the fleeting patterns our eyes pick up." (Jeremy Wade, "River Monsters: True Stories of the Ones that Didn't Get Away", 2011)

"Randomness might be defined in terms of order - its absence, that is. […] Everything we care about lies somewhere in the middle, where pattern and randomness interlace." (James Gleick, "The Information: A History, a Theory, a Flood", 2011)

"Equations have hidden powers. They reveal the innermost secrets of nature. […] The power of equations lies in the philosophically difficult correspondence between mathematics, a collective creation of human minds, and an external physical reality. Equations model deep patterns in the outside world. By learning to value equations, and to read the stories they tell, we can uncover vital features of the world around us." (Ian Stewart, "In Pursuit of the Unknown", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Mathematical intuition is the mind’s ability to sense form and structure, to detect patterns that we cannot consciously perceive. Intuition lacks the crystal clarity of conscious logic, but it makes up for that by drawing attention to things we would never have consciously considered." (Ian Stewart, "Visions of Infinity", 2013)

"Proof, in fact, is the requirement that makes great problems problematic. Anyone moderately competent can carry out a few calculations, spot an apparent pattern, and distil its essence into a pithy statement. Mathematicians demand more evidence than that: they insist on a complete, logically impeccable proof. Or, if the answer turns out to be negative, a disproof. It isn’t really possible to appreciate the seductive allure of a great problem without appreciating the vital role of proof in the mathematical enterprise. Anyone can make an educated guess. What’s hard is to prove it’s right. Or wrong." (Ian Stewart, "Visions of Infinity", 2013)

"Swarm intelligence illustrates the complex and holistic way in which the world operates. Order is created from chaos; patterns are revealed; and systems are free to work out their errors and problems at their own level. What natural systems can teach humanity is truly amazing." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"To put it simply, we communicate when we display a convincing pattern, and we discover when we observe deviations from our expectations. These may be explicit in terms of a mathematical model or implicit in terms of a conceptual model. How a reader interprets a graphic will depend on their expectations. If they have a lot of background knowledge, they will view the graphic differently than if they rely only on the graphic and its surrounding text." (Andrew Gelman & Antony Unwin, "Infovis and Statistical Graphics: Different Goals, Different Looks", Journal of Computational and Graphical Statistics Vol. 22(1), 2013)

"Another way to secure statistical significance is to use the data to discover a theory. Statistical tests assume that the researcher starts with a theory, collects data to test the theory, and reports the results - whether statistically significant or not. Many people work in the other direction, scrutinizing the data until they find a pattern and then making up a theory that fits the pattern." (Gary Smith, "Standard Deviations", 2014)

"Intersections of lines, for example, remain intersections, and the hole in a torus (doughnut) cannot be transformed away. Thus a doughnut may be transformed topologically into a coffee cup (the hole turning into a handle) but never into a pancake. Topology, then, is really a mathematics of relationships, of unchangeable, or 'invariant', patterns." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"One of the remarkable features of these complex systems created by replicator dynamics is that infinitesimal differences in starting positions create vastly different patterns. This sensitive dependence on initial conditions is often called the butterfly-effect aspect of complex systems - small changes in the replicator dynamics or in the starting point can lead to enormous differences in outcome, and they change one’s view of how robust the current reality is. If it is complex, one small change could have led to a reality that is quite different." (David Colander & Roland Kupers, "Complexity and the art of public policy : solving society’s problems from the bottom up", 2014)

"[…] regard it in fact as the great advantage of the mathematical technique that it allows us to describe, by means of algebraic equations, the general character of a pattern even where we are ignorant of the numerical values which will determine its particular manifestation." (Friedrich A von Hayek, "The Market and Other Orders", 2014)

"We are genetically predisposed to look for patterns and to believe that the patterns we observe are meaningful. […] Don’t be fooled into thinking that a pattern is proof. We need a logical, persuasive explanation and we need to test the explanation with fresh data." (Gary Smith, "Standard Deviations", 2014)

"We are hardwired to make sense of the world around us - to notice patterns and invent theories to explain these patterns. We underestimate how easily pat - terns can be created by inexplicable random events - by good luck and bad luck." (Gary Smith, "Standard Deviations", 2014)

"A pattern is a design or model that helps grasp something. Patterns help connect things that may not appear to be connected. Patterns help cut through complexity and reveal simpler understandable trends. […] Patterns can be temporal, which is something that regularly occurs over time. Patterns can also be spatial, such as things being organized in a certain way. Patterns can be functional, in that doing certain things leads to certain effects. Good patterns are often symmetric. They echo basic structures and patterns that we are already aware of." (Anil K Maheshwari, "Business Intelligence and Data Mining", 2015)

"The human mind builds up theories by recognising familiar patterns and glossing over details that are well understood, so that it can concentrate on the new material. In fact it is limited by the amount of new information it can hold at any one time, and the suppression of familiar detail is often essential for a grasp of the total picture. In a written proof, the step-by-step logical deduction is therefore foreshortened where it is already a part of the reader's basic technique, so that they can comprehend the overall structure more easily." (Ian Stewart & David Tall, "The Foundations of Mathematics" 2nd Ed., 2015)

"Why do mathematicians care so much about pi? Is it some kind of weird circle fixation? Hardly. The beauty of pi, in part, is that it puts infinity within reach. Even young children get this. The digits of pi never end and never show a pattern. They go on forever, seemingly at random - except that they can’t possibly be random, because they embody the order inherent in a perfect circle. This tension between order and randomness is one of the most tantalizing aspects of pi." (Steven Strogatz, "Why PI Matters" 2015)

"Without chaos there would be no creation, no structure and no existence. After all, order is merely the repetition of patterns; chaos is the process that establishes those patterns. Without this creative self-organizing force, the universe would be devoid of biological life, the birth of stars and galaxies - everything we have come to know. (Lawrence K Samuels, "Chaos Gets a Bad Rap: Importance of Chaology to Liberty", 2015)

"A mental representation is a mental structure that corresponds to an object, an idea, a collection of information, or anything else, concrete or abstract, that the brain is thinking about. […] Because the details of mental representations can differ dramatically from field to field, it’s hard to offer an overarching definition that is not too vague, but in essence these representations are preexisting patterns of information - facts, images, rules, relationships, and so on - that are held in long-term memory and that can be used to respond quickly and effectively in certain types of situations." (Anders Ericsson & Robert Pool," Peak: Secrets from  the  New  Science  of  Expertise", 2016)

"String theory today looks almost fractal. The more closely people explore any one corner, the more structure they find. Some dig deep into particular crevices; others zoom out to try to make sense of grander patterns. The upshot is that string theory today includes much that no longer seems stringy. Those tiny loops of string whose harmonics were thought to breathe form into every particle and force known to nature (including elusive gravity) hardly even appear anymore on chalkboards at conferences." (K C Cole, "The Strange Second Life of String Theory", Quanta Magazine", 2016)

"The relationship of math to the real world has been a conundrum for philosophers for centuries, but it is also an inspiration for poets. The patterns of mathematics inhabit a liminal space - they were initially derived from the natural world and yet seem to exist in a separate, self-contained system standing apart from that world. This makes them a source of potential metaphor: mapping back and forth between the world of personal experience and the world of mathematical patterns opens the door to novel connections." (Alice Major, "Mapping from e to Metaphor", 2018)

"Apart from the technical challenge of working with the data itself, visualization in big data is different because showing the individual observations is just not an option. But visualization is essential here: for analysis to work well, we have to be assured that patterns and errors in the data have been spotted and understood. That is only possible by visualization with big data, because nobody can look over the data in a table or spreadsheet." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)

26 May 2021

On Randomness XXI (Statistical Tools I)

"If you take a pack of cards as it comes from the maker and shuffle it for a few minutes, all trace of the original systematic order disappears. The order will never come back however long you shuffle. Something has been done which cannot be undone, namely, the introduction of a random element in place of the arrangement." (Sir Arthur S Eddington, "The Nature of the Physical World", 1928)

"We must emphasize that such terms as 'select at random', 'choose at random', and the like, always mean that some mechanical device, such as coins, cards, dice, or tables of random numbers, is used." (Frederick Mosteller et al, "Principles of Sampling", Journal of the American Statistical Association Vol. 49 (265), 1954)

"It is seen that continued shuffling may reasonably be expected to produce perfect 'randomness' and to eliminate all traces of the original order. It should be noted, however, that the number of operations required for this purpose is extremely large."  (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)

"The urn model is to be the expression of three postulates: (1) the constancy of a probability distribution, ensured by the solidity of the vessel, (2) the random-character of the choice, ensured by the narrowness of the mouth, which is to prevent visibility of the contents and any consciously selective choice, (3) the independence of successive choices, whenever the drawn balls are put back into the urn. Of course in abstract probability and statistics the word 'choice' can be avoided and all can be done without any reference to such a model. But as soon as the abstract theory is to be applied, random choice plays an essential role." (Hans Freudenthal, "The Concept and the Role of the Model in Mathematics and Natural and Social Sciences", 1961)

"Sequences of random numbers also inevitably display certain regularities. […] The trouble is, just as no real die, coin, or roulette wheel is ever likely to be perfectly fair, no numerical recipe produces truly random numbers. The mere existence of a formula suggests some sort of predictability or pattern." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Indeed a deterministic die behaves very much as if it has six attractors, the steady states corresponding to its six faces, all of whose basins are intertwined. For technical reasons that can't quite be true, but it is true that deterministic systems with intertwined basins are wonderful substitutes for dice; in fact they're super-dice, behaving even more ‘randomly’ - apparently - than ordinary dice. Super-dice are so chaotic that they are uncomputable. Even if you know the equations for the system perfectly, then given an initial state, you cannot calculate which attractor it will end up on. The tiniest error of approximation – and there will always be such an error - will change the answer completely." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"It's a bit like having a theory about coins that move in space, but only being able to measure their state by interrupting them with a table. We hypothesize that the coin may be able to revolve in space, a state that is neither ‘heads’ nor ‘tails’ but a kind of mixture. Our experimental proof is that when you stick a table in, you get heads half the time and tails the other half - randomly. This is by no means a perfect analogy with standard quantum theory - a revolving coin is not exactly in a superposition of heads and tails - but it captures some of the flavour." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns - which frequently occur in random data - could equally well be derived from tossing a coin. So it is in the stock market as well." (Didier Sornette, "Why Stock Markets Crash: Critical events in complex financial systems", 2003)

"[…] we would like to observe that the butterfly effect lies at the root of many events which we call random. The final result of throwing a dice depends on the position of the hand throwing it, on the air resistance, on the base that the die falls on, and on many other factors. The result appears random because we are not able to take into account all of these factors with sufficient accuracy. Even the tiniest bump on the table and the most imperceptible move of the wrist affect the position in which the die finally lands. It would be reasonable to assume that chaos lies at the root of all random phenomena." (Iwo Bialynicki-Birula & Iwona Bialynicka-Birula, "Modeling Reality: How Computers Mirror Life", 2004)

"There is no such thing as randomness. No one who could detect every force operating on a pair of dice would ever play dice games, because there would never be any doubt about the outcome. The randomness, such as it is, applies to our ignorance of the possible outcomes. It doesn’t apply to the outcomes themselves. They are 100% determined and are not random in the slightest. Scientists have become so confused by this that they now imagine that things really do happen randomly, i.e. for no reason at all." (Thomas Stark, "God Is Mathematics: The Proofs of the Eternal Existence of Mathematics", 2018)

On Randomness XXVII (Patterns)

"To the untrained eye, randomness appears as regularity or tendency to cluster." (William Feller, "An Introduction to Probability Theory and its Applications", 1950) 

"Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns - which frequently occur in random data - could equally well be derived from tossing a coin. So it is in the stock market as well." (Burton G Malkiel, "A Random Walk Down Wall Street", 1989)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Sequences of random numbers also inevitably display certain regularities. […] The trouble is, just as no real die, coin, or roulette wheel is ever likely to be perfectly fair, no numerical recipe produces truly random numbers. The mere existence of a formula suggests some sort of predictability or pattern." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"There are only patterns, patterns on top of patterns, patterns that affect other patterns. Patterns hidden by patterns. Patterns within patterns. If you watch close, history does nothing but repeat itself. What we call chaos is just patterns we haven't recognized. What we call random is just patterns we can't decipher. what we can't understand we call nonsense. What we can't read we call gibberish. There is no free will. There are no variables." (Chuck Palahniuk, "Survivor", 1999)

"Why is the human need to be in control relevant to a discussion of random patterns? Because if events are random, we are not in control, and if we are in control of events, they are not random. There is therefore a fundamental clash between our need to feel we are in control and our ability to recognize randomness. That clash is one of the principal reasons we misinterpret random events."  (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"Randomness might be defined in terms of order - its absence, that is. […] Everything we care about lies somewhere in the middle, where pattern and randomness interlace." (James Gleick, "The Information: A History, a Theory, a Flood", 2011)

"Remember that even random coin flips can yield striking, even stunning, patterns that mean nothing at all. When someone shows you a pattern, no matter how impressive the person’s credentials, consider the possibility that the pattern is just a coincidence. Ask why, not what. No matter what the pattern, the question is: Why should we expect to find this pattern?" (Gary Smith, "Standard Deviations", 2014)

"We are hardwired to make sense of the world around us - to notice patterns and invent theories to explain these patterns. We underestimate how easily pat - terns can be created by inexplicable random events - by good luck and bad luck." (Gary Smith, "Standard Deviations", 2014)

09 May 2021

On Randomness XXVI (Universe)

"Random chance was not a sufficient explanation of the Universe - in fact, random chance was not sufficient to explain random chance; the pot could not hold itself." (Robert A Heinlein, "Stranger in a Strange Land", 1961)

"The line between inner and outer landscapes is breaking down. Earthquakes can result from seismic upheavals within the human mind. The whole random universe of the industrial age is breaking down into cryptic fragments." (William S Burroughs, [preface] 1972)

"There is no reason to assume that the universe has the slightest interest in intelligence -  or even in life. Both may be random accidental by-products of its operations like the beautiful patterns on a butterfly's wings. The insect would fly just as well without them […]" (Arthur C Clarke, "The Lost Worlds of 2001", 1972)

"It is tempting to wonder if our present universe, large as it is and complex though it seems, might not be merely the result of a very slight random increase in order over a very small portion of an unbelievably colossal universe which is virtually entirely in heat-death." (Isaac Asimov, 1976)

"Perhaps randomness is not merely an adequate description for complex causes that we cannot specify. Perhaps the world really works this way, and many events are uncaused in any conventional sense of the word." (Stephen J Gould, "Hen's Teeth and Horse's Toes", 1983)

"The world of science lives fairly comfortably with paradox. We know that light is a wave and also that light is a particle. The discoveries made in the infinitely small world of particle physics indicate randomness and chance, and I do not find it any more difficult to live with the paradox of a universe of randomness and chance and a universe of pattern and purpose than I do with light as a wave and light as a particle. Living with contradiction is nothing new to the human being." (Madeline L'Engle, "Two-Part Invention: The Story of a Marriage", 1988)

"Intriguingly, the mathematics of randomness, chaos, and order also furnishes what may be a vital escape from absolute certainty - an opportunity to exercise free will in a deterministic universe. Indeed, in the interplay of order and disorder that makes life interesting, we appear perpetually poised in a state of enticingly precarious perplexity. The universe is neither so crazy that we can’t understand it at all nor so predictable that there’s nothing left for us to discover." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1997)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"The first view of randomness is of clutter bred by complicated entanglements. Even though we know there are rules, the outcome is uncertain. Lotteries and card games are generally perceived to belong to this category. More troublesome is that nature's design itself is known imperfectly, and worse, the rules may be hidden from us, and therefore we cannot specify a cause or discern any pattern of order. When, for instance, an outcome takes place as the confluence of totally unrelated events, it may appear to be so surprising and bizarre that we say that it is due to blind chance." (Edward Beltrami. "What is Random?: Chance and Order in Mathematics and Life", 1999)

"The tissue of the world is built from necessities and randomness; the intellect of men places itself between both and can control them; it considers the necessity and the reason of its existence; it knows how randomness can be managed, controlled, and used." (Johann Wolfgang von Goethe)

20 April 2021

On Coincidence II

"People are entirely too disbelieving of coincidence. They are far too ready to dismiss it and to build arcane structures of extremely rickety substance in order to avoid it. I, on the other hand, see coincidence everywhere as an inevitable consequence of the laws of probability, according to which having no unusual coincidence is far more unusual than any coincidence could possibly be." (Isaac Asimov, "The Planet That Wasn't", 1976)

"Our form of life depends, in delicate and subtle ways, on several apparent ‘coincidences’ in the fundamental laws of nature which make the Universe tick. Without those coincidences, we would not be here to puzzle over the problem of their existence […] What does this mean? One possibility is that the Universe we know is a highly improbable accident, ‘just one of those things’." (John R Gribbin, "Genesis: The Origins of Man and the Universe", 1981)

"[…] a mathematician's ultimate concern is that his or her inventions be logical, not realistic. This is not to say, however, that mathematical inventions do not correspond to real things. They do, in most, and possibly all, cases. The coincidence between mathematical ideas and natural reality is so extensive and well documented, in fact, that it requires an explanation. Keep in mind that the coincidence is not the outcome of mathematicians trying to be realistic - quite to the contrary, their ideas are often very abstract and do not initially appear to have any correspondence to the real world. Typically, however, mathematical ideas are eventually successfully applied to describe real phenomena […]"(Michael Guillen, "Bridges to Infinity: The Human Side of Mathematics", 1983)

"Moreover, joint occurrences tend to be better recalled than instances when the effect does not occur. The proneness to remember confirming instances, but to overlook disconfirming ones, further serves to convert, in thought, coincidences into causalities." (Albert Bandura, "Social Foundations of Thought and Action: A social cognitive theory", 1986)

"There is no coherent knowledge, i.e. no uniform comprehensive account of the world and the events in it. There is no comprehensive truth that goes beyond an enumeration of details, but there are many pieces of information, obtained in different ways from different sources and collected for the benefit of the curious. The best way of presenting such knowledge is the list - and the oldest scientific works were indeed lists of facts, parts, coincidences, problems in several specialized domains." (Paul K Feyerabend, "Farewell to Reason", 1987)

"A tendency to drastically underestimate the frequency of coincidences is a prime characteristic of innumerates, who generally accord great significance to correspondences of all sorts while attributing too little significance to quite conclusive but less flashy statistical evidence." (John A Paulos, "Innumeracy: Mathematical Illiteracy and its Consequences", 1988)

"The law of truly large numbers states: With a large enough sample, any outrageous thing is likely to happen." (Frederick Mosteller, "Methods for Studying Coincidences", Journal of the American Statistical Association Vol. 84, 1989)

"Most coincidences are simply chance events that turn out to be far more probable than many people imagine." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1997)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Randomness is the very stuff of life, looming large in our everyday experience. […] The fascination of randomness is that it is pervasive, providing the surprising coincidences, bizarre luck, and unexpected twists that color our perception of everyday events." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

18 March 2021

On Chaos IV

"One of the central problems studied by mankind is the problem of the succession of form. Whatever is the ultimate nature of reality (assuming that this expression has meaning), it is indisputable that our universe is not chaos. We perceive beings, objects, things to which we give names. These beings or things are forms or structures endowed with a degree of stability; they take up some part of space and last for some period of time." (René Thom, "Structural Stability and Morphogenesis", 1972)

"'Disorder' is not mere chaos; it implies defective order." (John M Ziman, "Models of Disorder", 1979)

"Chaos and catastrophe theories are among the most interesting recent developments in nonlinear modeling, and both have captured the interests of scientists in many disciplines. It is only natural that social scientists should be concerned with these theories. Linear statistical models have proven very useful in a great deal of social scientific empirical analyses, as is evidenced by how widely these models have been used for a number of decades. However, there is no apparent reason, intuitive or otherwise, as to why human behavior should be more linear than the behavior of other things, living and nonliving. Thus an intellectual movement toward nonlinear models is an appropriate evolutionary movement in social scientific thinking, if for no other reason than to expand our paradigmatic boundaries by encouraging greater flexibility in our algebraic specifications of all aspects of human life." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"[...] chaos and catastrophe theories per se address behavioral phenomena that are consequences of two general types of nonlinear dynamic behavior. In the most elementary of behavioral terms, chaotic phenomena are a class of deterministic processes that seem to mimic random or stochastic dynamics. Catastrophe phenomena, on the other hand, are a class of dynamic processes that exhibit a sudden and large scale change in at least one variable in correspondence with relatively small changes in other variables or, in some cases, parameters." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"Nature normally hates power laws. In ordinary systems all quantities follow bell curves, and correlations decay rapidly, obeying exponential laws. But all that changes if the system is forced to undergo a phase transition. Then power laws emerge-nature's unmistakable sign that chaos is departing in favor of order. The theory of phase transitions told us loud and clear that the road from disorder to order is maintained by the powerful forces of self-organization and is paved by power laws. It told us that power laws are not just another way of characterizing a system's behavior. They are the patent signatures of self-organization in complex systems." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"Chaos is not pure disorder, it carries within itself the indistinctness between the potentialities of order, of disorder, and of organization from which a cosmos will be born, which is an ordered universe." (Edgar Morin, "Restricted Complexity, General Complexity" [in (Carlos Gershenson et al [Eds.], "Worldviews, Science and Us: Philosophy and Complexity", 2007)])

"Chaos can be understood as a dynamical process in which microscopic information hidden in the details of a system’s state is dug out and expanded to a macroscopically visible scale (stretching), while the macroscopic information visible in the current system’s state is continuously discarded (folding)." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"God has put a secret art into the forces of Nature so as to enable it to fashion itself out of chaos into a perfect world system." (Immanuel Kant)

"Science, like art, music and poetry, tries to reduce chaos to the clarity and order of pure beauty." (Detlev W Bronk)

22 February 2021

Steven H Strogatz - Collected Quotes

"An equilibrium is defined to be stable if all sufficiently small disturbances away from it damp out in time. Thus stable equilibria are represented geometrically by stable fixed points. Conversely, unstable equilibria, in which disturbances grow in time, are represented by unstable fixed points." (Steven H Strogatz, "Non-Linear Dynamics and Chaos, 1994)

"[…] chaos and fractals are part of an even grander subject known as dynamics. This is the subject that deals with change, with systems that evolve in time. Whether the system in question settles down to equilibrium, keeps repeating in cycles, or does something more complicated, it is dynamics that we use to analyze the behavior." (Steven H Strogatz, "Non-Linear Dynamics and Chaos, 1994)

"The qualitative structure of the flow can change as parameters are varied. In particular, fixed points can be created or destroyed, or their stability can change. These qualitative changes in the dynamics are called bifurcations , and the parameter values at which they occur are called bifurcation points." (Steven H Strogatz, "Non-Linear Dynamics and Chaos, 1994)

"Why are nonlinear systems so much harder to analyze than linear ones? The essential difference is that linear systems can be broken down into parts. Then each part can be solved separately and finally recombined to get the answer. This idea allows a fantastic simplification of complex problems, and underlies such methods as normal modes, Laplace transforms, superposition arguments, and Fourier analysis. In this sense, a linear system is precisely equal to the sum of its parts." (Steven H Strogatz, "Non-Linear Dynamics and Chaos, 1994)

"A depressing corollary of the butterfly effect (or so it was widely believed) was that two chaotic systems could never synchronize with each other. Even if you took great pains to start them the same way, there would always be some infinitesimal difference in their initial states. Normally that small discrepancy would remain small for a long time, but in a chaotic system, the error cascades and feeds on itself so swiftly that the systems diverge almost immediately, destroying the synchronization. Unfortunately, it seemed, two of the most vibrant branches of nonlinear science - chaos and sync - could never be married. They were fundamentally incompatible." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"[…] all human beings - professional mathematicians included - are easily muddled when it comes to estimating the probabilities of rare events. Even figuring out the right question to ask can be confusing." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Although the shape of chaos is nightmarish, its voice is oddly soothing. When played through a loudspeaker, chaos sounds like white noise, like the soft static that helps insomniacs fall asleep." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"At an anatomical level - the level of pure, abstract connectivity - we seem to have stumbled upon a universal pattern of complexity. Disparate networks show the same three tendencies: short chains, high clustering, and scale-free link distributions. The coincidences are eerie, and baffling to interpret." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Average path length reflects the global structure; it depends on the way the entire network is connected, and cannot be inferred from any local measurement. Clustering reflects the local structure; it depends only on the interconnectedness of a typical neighborhood, the inbreeding among nodes tied to a common center. Roughly speaking, path length measures how big the network is. Clustering measures how incestuous it is." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"But linearity is often an approximation to a more complicated reality. Most systems behave linearly only when they are close to equilibrium, and only when we don't push them too hard." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"By its very nature, the mathematical study of networks transcends the usual boundaries between disciplines. Network theory is concerned with the relationships between individuals, the patterns of interactions. The precise nature of the individuals is downplayed, or even suppressed, in hopes of uncovering deeper laws. A network theorist will look at any system of interlinked components and see an abstract pattern of dots connected by lines. It's the pattern that matters, the architecture of relationships, not the identities of the dots themselves. Viewed from these lofty heights, many networks, seemingly unrelated, begin to look the same." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Chaos theory revealed that simple nonlinear systems could behave in extremely complicated ways, and showed us how to understand them with pictures instead of equations. Complexity theory taught us that many simple units interacting according to simple rules could generate unexpected order. But where complexity theory has largely failed is in explaining where the order comes from, in a deep mathematical sense, and in tying the theory to real phenomena in a convincing way. For these reasons, it has had little impact on the thinking of most mathematicians and scientists." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"[…] equilibrium means nothing changes; stability means slight disturbances die out." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"From a purely mathematical perspective, a power law signifies nothing in particular - it's just one of many possible kinds of algebraic relationship. But when a physicist sees a power law, his eyes light up. For power laws hint that a system may be organizing itself. They arise at phase transitions, when a system is poised at the brink, teetering between order and chaos. They arise in fractals, when an arbitrarily small piece of a complex shape is a microcosm of the whole. They arise in the statistics of natural hazards - avalanches and earthquakes, floods and forest fires - whose sizes fluctuate so erratically from one event to the next that the average cannot adequately stand in for the distribution as a whole." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"In colloquial usage, chaos means a state of total disorder. In its technical sense, however, chaos refers to a state that only appears random, but is actually generated by nonrandom laws. As such, it occupies an unfamiliar middle ground between order and disorder. It looks erratic superficially, yet it contains cryptic patterns and is governed by rigid rules. It's predictable in the short run but unpredictable in the long run. And it never repeats itself: Its behavior is nonperiodic." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Just as a circle is the shape of periodicity, a strange attractor is the shape of chaos. It lives in an abstract mathematical space called state space, whose axes represent all the different variables in a physical system." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Like regular networks, random ones are seductive idealizations. Theorists find them beguiling, not because of their verisimilitude, but because they're the easiest ones to analyze. [...] Random networks are small and poorly clustered; regular ones are big and highly clustered." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"One of the most wonderful things about curiosity-driven research - aside from the pleasure it brings - is that it often has unexpected spin-offs." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Scientists have long been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite, that nature should inexorably degenerate to - ward a state of greater disorder, greater entropy." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Structure always affects function. The structure of social networks affects the spread of information and disease; the structure of the power grid affects the stability of power transmission. The same must be true for species in an ecosystem, companies in the global marketplace, cascades of enzyme reactions in living cells. The layout of the web must profoundly shape its dynamics." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The best case that can be made for human sync to the environment (outside of circadian entrainment) has to do with the possibility that electrical rhythms in our brains can be influenced by external signals." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The butterfly effect came to be the most familiar icon of the new science, and appropriately so, for it is the signature of chaos. […] The idea is that in a chaotic system, small disturbances grow exponentially fast, rendering long-term prediction impossible." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The nonlinear dynamics of systems with that many variables is still beyond us. Even with the help of supercomputers, the collective behavior of gigantic systems of oscillators remains a forbidding terra incognita." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"[...] the transition to a small world is essentially undetectable at a local level. If you were living through the morph, nothing about your immediate neighborhood would tell you that the world had become small." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The uncertainty principle expresses a seesaw relationship between the fluctuations of certain pairs of variables, such as an electron's position and its speed. Anything that lowers the uncertainty of one must necessarily raise the uncertainty of the other; you can't push both down at the same time. For example, the more tightly you confine an electron, the more wildly it thrashes. By lowering the position end of the seesaw, you force the velocity end to lift up. On the other hand, if you try to constrain the electron's velocity instead, its position becomes fuzzier and fuzzier; the electron can turn up almost anywhere.(Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"These, then, are the defining features of chaos: erratic, seemingly random behavior in an otherwise deterministic system; predictability in the short run, because of the deterministic laws; and unpredictability in the long run, because of the butterfly effect." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"This synergistic character of nonlinear systems is precisely what makes them so difficult to analyze. They can't be taken apart. The whole system has to be examined all at once, as a coherent entity. As we've seen earlier, this necessity for global thinking is the greatest challenge in understanding how large systems of oscillators can spontaneously synchronize themselves. More generally, all problems about self-organization are fundamentally nonlinear. So the study of sync has always been entwined with the study of nonlinearity." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"[…] topology, the study of continuous shape, a kind of generalized geometry where rigidity is replaced by elasticity. It's as if everything is made of rubber. Shapes can be continuously deformed, bent, or twisted, but not cut - that's never allowed. A square is topologically equivalent to a circle, because you can round off the corners. On the other hand, a circle is different from a figure eight, because there's no way to get rid of the crossing point without resorting to scissors. In that sense, topology is ideal for sorting shapes into broad classes, based on their pure connectivity." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Unanticipated forms of collective behavior emerge that are not obvious from the properties of the individuals themselves. All the models are extremely simplified, of course, but that's the point. If even their idealized behavior can surprise us, we may find clues about what to expect in the real thing. […] the collective dynamics of a crowd can be exquisitely sensitive to its composition, which may be one reason why mobs are so unpredictable, which may be one reason why mobs are so unpredictable." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"We’re accustomed to  in terms of centralized control, clear chains of command, the straightforward logic of cause and effect. But in huge, interconnected systems, where every player ultimately affects every other, our standard ways of thinking fall apart. Simple pictures and verbal arguments are too feeble, too myopic." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"When you’re trying to prove something, it helps to know it’s true. That gives you the confidence you need to keep searching for a rigorous proof." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Change is most sluggish at the extremes precisely because the derivative is zero there." (Steven Strogatz, "The Joy of X: A Guided Tour of Mathematics, from One to Infinity", 2012)

"In mathematics, our freedom lies in the questions we ask - and in how we pursue them - but not in the answers awaiting us." (Steven Strogatz, "The Joy of X: A Guided Tour of Mathematics, from One to Infinity", 2012)

"Proofs can cause dizziness or excessive drowsiness. Side effects of prolonged exposure may include night sweats, panic attacks, and, in rare cases, euphoria. Ask your doctor if proofs are right for you." (Steven Strogatz, "The Joy of X: A Guided Tour of Mathematics, from One to Infinity", 2012)

"[...] things that seem hopelessly random and unpredictable when viewed in isolation often turn out to be lawful and predictable when viewed in aggregate." (Steven Strogatz, "The Joy of X: A Guided Tour of Mathematics, from One to Infinity", 2012)

"A limit cycle is an isolated closed trajectory. Isolated means that neighboring trajectories are not closed; they spiral either toward or away from the limit cycle. If all neighboring trajectories approach the limit cycle, we say the limit cycle is stable or attracting. Otherwise the limit cycle is unstable, or in exceptional cases, half-stable. Stable limit cycles are very important scientifically - they model systems that exhibit self-sustained oscillations. In other words, these systems oscillate even in the absence of external periodic forcing." (Steven H Strogatz, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", 2015)

"An equilibrium is defined to be stable if all sufficiently small disturbances away from it damp out in time. Thus stable equilibria are represented geometrically by stable fixed points. Conversely, unstable equilibria, in which disturbances grow in time, are represented by unstable fixed points." (Steven H Strogatz, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", 2015)

"[…] chaos and fractals are part of an even grander subject known as dynamics. This is the subject that deals with change, with systems that evolve in time. Whether the system in question settles down to equilibrium, keeps repeating in cycles, or does something more complicated, it is dynamics that we use to analyze the behavior." (Steven H Strogatz, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", 2015)

"The qualitative structure of the flow can change as parameters are varied. In particular, fixed points can be created or destroyed, or their stability can change. These qualitative changes in the dynamics are called bifurcations, and the parameter values at which they occur are called bifurcation points. Bifurcations are important scientifically - they provide models of transitions and instabilities as some control parameter is varied." (Steven H Strogatz, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", 2015)

"[…] what exactly do we mean by a bifurcation? The usual definition involves the concept of 'topological equivalence': if the phase portrait changes its topological structure as a parameter is varied, we say that a bifurcation has occurred. Examples include changes in the number or stability of fixed points, closed orbits, or saddle connections as a parameter is varied." (Steven H Strogatz, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", 2015)

"Why do mathematicians care so much about π? Is it some kind of weird circle fixation? Hardly. The beauty of π, in part, is that it puts infinity within reach. Even young children get this. The digits of π never end and never show a pattern. They go on forever, seemingly at random - except that they can’t possibly be random, because they embody the order inherent in a perfect circle. This tension between order and randomness is one of the most tantalizing aspects of π." (Steven Strogatz, "Why π Matters" 2015)

"Although base e is uniquely distinguished, other exponential functions obey a similar principle of growth. The only difference is that the rate of exponential growth is proportional to the function’s current level, not strictly equal to it. Still, that proportionality is sufficient to generate the explosiveness we associate with exponential growth." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"An infinitesimal is a hazy thing. It is supposed to be the tiniest number you can possibly imagine that isn’t actually zero. More succinctly, an infinitesimal is smaller than everything but greater than nothing. Even more paradoxically, infinitesimals come in different sizes. An infinitesimal part of an infinitesimal is incomparably smaller still. We could call it a second-order infinitesimal." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Because of its intimate connection to the backward problem, the area problem is not just about area. It’s not just about shape or the relationship between distance and speed or anything that narrow. It’s completely general. From a modern perspective, the area problem is about predicting the relationship between anything that changes at a changing rate and how much that thing builds up over time." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Because of the geometry of a circle, there’s always a quarter-cycle off set between any sine wave and the wave derived from it as its derivative, its rate of change. In this analogy, the point’s direction of travel is like its rate of change. It determines where the point will go next and hence how it changes its location. Moreover, this compass heading of the arrow itself rotates in a circular fashion at a constant speed as the point goes around the circle, so the compass heading of the arrow follows a sine-wave pattern in time. And since the compass heading is like the rate of change, voilà! The rate of change follows a sine-wave pattern too." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Calculus succeeds by breaking complicated problems down into simpler parts. That strategy, of course, is not unique to calculus. All good problem-solvers know that hard problems become easier when they’re split into chunks. The truly radical and distinctive move of calculus is that it takes this divide-and-conquer strategy to its utmost extreme - all the way out to infinity." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Chaotic systems are finicky. A little change in how they’re started can make a big difference in where they end up. That’s because small changes in their initial conditions get magnified exponentially fast. Any tiny error or disturbance snowballs so rapidly that in the long term, the system becomes unpredictable. Chaotic systems are not random - they’re deterministic and hence predictable in the short run - but in the long run, they’re so sensitive to tiny disturbances that they look effectively random in many respects." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Generally speaking, things can change in one of three ways: they can go up, they can go down, or they can go up and down. In other words, they can grow, decay, or fluctuate. Different functions are suitable for different occasions." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"In analysis, one solves a problem by starting at the end, as if the answer had already been obtained, and then works back wishfully toward the beginning, hoping to find a path to the given assumptions. [….] Synthesis goes in the other direction. It starts with the givens, and then, by stabbing in the dark, trying things, you are somehow supposed to move forward to a solution, step by logical step, and eventually arrive at the desired result. Synthesis tends to be much harder than analysis because you don’t ever know how you’re going to get to the solution until you do." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"In mathematical modeling, as in all of science, we always have to make choices about what to stress and what to ignore. The art of abstraction lies in knowing what is essential and what is minutia, what is signal and what is noise, what is trend and what is wiggle. It’s an art because such choices always involve an element of danger; they come close to wishful thinking and intellectual dishonesty." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"In mathematics, pendulums stimulated the development of calculus through the riddles they posed. In physics and engineering, pendulums became paradigms of oscillation. […] In some cases, the connections between pendulums and other phenomena are so exact that the same equations can be recycled without change. Only the symbols need to be reinterpreted; the syntax stays the same. It’s as if nature keeps returning to the same motif again and again, a pendular repetition of a pendular theme. For example, the equations for the swinging of a pendulum carry over without change to those for the spinning of generators that produce alternating current and send it to our homes and offices. In honor of that pedigree, electrical engineers refer to their generator equations as swing equations." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"If real numbers are not real, why do mathematicians love them so much? And why are schoolchildren forced to learn about them? Because calculus needs them. From the beginning, calculus has stubbornly insisted that everything - space and time, matter and energy, all objects that ever have been or will be - should be regarded as continuous. Accordingly, everything can and should be quantified by real numbers. In this idealized, imaginary world, we pretend that everything can be split finer and finer without end. The whole theory of calculus is built on that assumption. Without it, we couldn’t compute limits, and without limits, calculus would come to a clanking halt." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Mathematically, circles embody change without change. A point moving around the circumference of a circle changes direction without ever changing its distance from a center. It’s a minimal form of change, a way to change and curve in the slightest way possible. And, of course, circles are symmetrical. If you rotate a circle about its center, it looks unchanged. That rotational symmetry may be why circles are so ubiquitous. Whenever some aspect of nature doesn’t care about direction, circles are bound to appear. Consider what happens when a raindrop hits a puddle: tiny ripples expand outward from the point of impact. Because they spread equally fast in all directions and because they started at a single point, the ripples have to be circles. Symmetry demands it." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Mathematicians don’t come up with the proofs first. First comes intuition. Rigor comes later. This essential role of in- tuition and imagination is often left out of high-school geometry courses, but it is essential to all creative mathematics." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Nonlinearity is responsible for the richness in the world, for its beauty and complexity and, often, its inscrutability. […] When a system is nonlinear, its behavior can be impossible to forecast with formulas, even though that behavior is completely determined. In other words, determinism does not imply predictability. […] Chaotic systems can be predicted perfectly well up to a time known as the predictability horizon. Before that, the determinism of the system makes it predictable." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"On a linear system like a scale, the whole is equal to the sum of the parts. That’s the first key property of linearity. The second is that causes are proportional to effects. […] These two properties - the proportionality between cause and effect, and the equality of the whole to the sum of the parts - are the essence of what it means to be linear. […] The great advantage of linearity is that it allows for reductionist thinking. To solve a linear problem, we can break it down to its simplest parts, solve each part separately, and put the parts back together to get the answer." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Pi is fundamentally a child of calculus. It is defined as the unattainable limit of a never-ending process. But unlike a sequence of polygons steadfastly approaching a circle or a hapless walker stepping halfway to a wall, there is no end in sight for pi, no limit we can ever know. And yet pi exists. There it is, defined so crisply as the ratio of two lengths we can see right before us, the circumference of a circle and its diameter. That ratio defines pi, pinpoints it as clearly as can be, and yet the number itself slips through our fingers." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"So there is a lot to be said for following one’s curiosity in mathematics. It often has scientific and practical payoff s that can’t be foreseen. It also gives mathematicians great pleasure for its own sake and reveals hidden connections between different parts of mathematics." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Somewhere in the dark recesses of prehistory, somebody realized that numbers never end. And with that thought, infinity was born. It’s the numerical counterpart of something deep in our psyches, in our nightmares of bottomless pits, and in our hopes for eternal life. Infinity lies at the heart of so many of our dreams and fears and unanswerable questions: How big is the universe? How long is forever? How powerful is God? In every branch of human thought, from religion and philosophy to science and mathematics, infinity has befuddled the world’s finest minds for thousands of years."(Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"[…] the derivative of a sine wave is another sine wave, shifted by a quarter cycle. That’s a remarkable property. It’s not true of other kinds of waves. Typically, when we take the derivative of a curve of any kind, that curve will become distorted by being differentiated. It won’t have the same shape before and after. Being differentiated is a traumatic experience for most curves. But not for a sine wave. After its derivative is taken, it dusts itself of f and appears unfazed, as sinusoidal as ever. The only injury it suffers - and it isn’t even an injury, really - is that the sine wave shifts in time. It peaks a quarter of a cycle earlier than it used to." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"The great advantage of infinitesimals in general and differentials in particular is that they make calculations easier. They provide shortcuts. They free the mind for more imaginative thought, just as algebra did for geometry in an earlier era. […] The only thing wrong with infinitesimals is that they don’t exist, at least not within the system of real numbers." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"The important point about e is that an exponential function with this base grows at a rate precisely equal to the function itself. Let me say that again. The rate of growth of ex is ex itself. This marvelous property simplifies all calculations about exponential functions when they are expressed in base e. No other base enjoys this simplicity. Whether we are working with derivatives, integrals, differential equations, or any of the other tools of calculus, exponential functions expressed in base e are always the cleanest, most elegant, and most beautiful." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"The reason why integration is so much harder than differentiation has to do with the distinction between local and global. Local problems are easy. Global problems are hard. Differentiation is a local operation. [...] when we are calculating a derivative, it’s like we’re looking under a microscope. We zoom in on a curve or a function, repeatedly magnifying the field of view. As we zoom in on that little local patch, the curve appears to become less and less curved. […] Integration is a global operation. Instead of a microscope, we are now using a telescope. We are trying to peer far of f into the distance - or far ahead into the future, although in that case we need a crystal ball. Naturally, these problems are a lot harder. All the intervening events matter and cannot be discarded. Or so it would seem." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"There’s something so paradoxical about pi. On the one hand, it represents order, as embodied by the shape of a circle, long held to be a symbol of perfection and eternity. On the other hand, pi is unruly, disheveled in appearance, its digits obeying no obvious rule, or at least none that we can perceive. Pi is elusive and mysterious, forever beyond reach. Its mix of order and disorder is what makes it so bewitching." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"Thus, calculus proceeds in two phases: cutting and rebuilding. In mathematical terms, the cutting process always involves infinitely fine subtraction, which is used to quantify the differences between the parts. Accordingly, this half of the subject is called differential calculus. The reassembly process always involves infinite addition, which integrates the parts back into the original whole. This half of the subject is called integral calculus." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"To shed light on any continuous shape, object, motion, process, or phenomenon - no matter how wild and complicated it may appear - reimagine it as an infinite series of simpler parts, analyze those, and then add the results back together to make sense of the original whole." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"We feel we are discovering mathematics. The results are there, waiting for us. They have been inherent in the figures all along. We are not inventing them. […] we are discovering facts that already exist, that are inherent in the objects we study. Although we have creative freedom to invent the objects themselves - to create idealizations like perfect spheres and circles and cylinders - once we do, they take on lives of their own." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"[…] when a curve does look increasingly straight when we zoom in on it sufficiently at any point, that curve is said to be smooth. […] In modern calculus, however, we have learned how to cope with curves that are not smooth. The inconveniences and pathologies of non-smooth curves sometimes arise in applications due to sudden jumps or other discontinuities in the behavior of a physical system." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"With a linear growth of errors, improving the measurements could always keep pace with the desire for longer prediction. But when errors grow exponentially fast, a system is said to have sensitive dependence on its initial conditions. Then long-term prediction becomes impossible. This is the philosophically disturbing message of chaos." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"With its yin and yang binaries, pi is like all of calculus in miniature. Pi is a portal between the round and the straight, a single number yet infinitely complex, a balance of order and chaos. Calculus, for its part, uses the infinite to study the finite, the unlimited to study the limited, and the straight to study the curved. The Infinity Principle is the key to unlocking the mystery of curves, and it arose here first, in the mystery of pi." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

21 January 2021

On Synergy I

 "Synergy is the only word in our language that means behavior of whole systems unpredicted by the separately observed behaviors of any of the system's separate parts or any subassembly of the system's parts." (R Buckminster Fuller, "Operating Manual for Spaceship Earth", 1963)

"Synergy means behavior of whole systems unpredicted by the behavior of their parts taken separately." (R Buckminster Fuller, "Synergetics: Explorations in the Geometry of Thinking", 1975)

"There is a multilayering of global networks in the key strategic activities that structure and destructure the planet. When these multilayered networks overlap in some node, when there is a node that belongs to different networks, two major consequences follow. First, economies of synergy between these different networks take place in that node: between financial markets and media businesses; or between academic research and technology development and innovation; between politics and media." (Manuel Castells, "The Rise of the Network Society", 1996)

"With the growing interest in complex adaptive systems, artificial life, swarms and simulated societies, the concept of “collective intelligence” is coming more and more to the fore. The basic idea is that a group of individuals (e. g. people, insects, robots, or software agents) can be smart in a way that none of its members is. Complex, apparently intelligent behavior may emerge from the synergy created by simple interactions between individuals that follow simple rules." (Francis Heylighen, "Collective Intelligence and its Implementation on the Web", 1999)

"Systems thinking means the ability to see the synergy of the whole rather than just the separate elements of a system and to learn to reinforce or change whole system patterns. Many people have been trained to solve problems by breaking a complex system, such as an organization, into discrete parts and working to make each part perform as well as possible. However, the success of each piece does not add up to the success of the whole. to the success of the whole. In fact, sometimes changing one part to make it better actually makes the whole system function less effectively." (Richard L Daft, "The Leadership Experience", 2002)

"Self-organization can be seen as a spontaneous coordination of the interactions between the components of the system, so as to maximize their synergy. This requires the propagation and processing of information, as different components perceive different aspects of the situation, while their shared goal requires this information to be integrated. The resulting process is characterized by distributed cognition: different components participate in different ways to the overall gathering and processing of information, thus collectively solving the problems posed by any perceived deviation between the present situation and the desired situation." (Carlos Gershenson & Francis Heylighen, "How can we think the complex?", 2004)

"Synergy is the combined action that occurs when people work together to create new alternatives and solutions. In addition, the greatest opportunity for synergy occurs when people have different viewpoints, because the differences present new opportunities. The essence of synergy is to value and respect differences and take advantage of them to build on strengths and compensate for weaknesses." (Richard L Daft, "The Leadership Experience" 4th Ed., 2008)

"Synergy occurs when organizational parts interact to produce a joint effect that is greater than the sum of the parts acting alone. As a result the organization may attain a special advantage with respect to cost, market power, technology, or employee." (Richard L Daft, "The Leadership Experience" 4th Ed., 2008)

"In short, synergy is the consequence of the energy expended in creating order. It is locked up in the viable system created, be it an organism or a social system. It is at the level of the system. It is not discernible at the level of the system. It is not discernible at the level of the system's components. Whenever the system is dismembered to examine its components, this binding energy dissipates." (J-C Spender, "Organizational Knowledge, Collective Practice and Penrose Rents", 2009)

"Synergy is defined as the surplus gained by working together. A task which couldn’t be fulfilled by one individual, can be completed by the work of different individuals together. To maximize synergy, first, the initial task is divided into different sub-tasks. Different agents perform different tasks, which is called division of labor. An end product of one work is used for another work, which is called workflow. Finally, everything needs to be put together. We call this aggregation. This isn’t as linear as it looks. At every step in the process it can happen that a task is divided into sub tasks or aggregated with other tasks." (Evo Busseniers, "Self-organization versus hierarchical organization", [thesis] 2018)

Complex Systems III

"Complexity must be grown from simple systems that already work." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Even though these complex systems differ in detail, the question of coherence under change is the central enigma for each." (John H Holland," Hidden Order: How Adaptation Builds Complexity", 1995)

"By irreducibly complex I mean a single system composed of several well-matched, interacting parts that contribute to the basic function, wherein the removal of any one of the parts causes the system to effectively cease functioning. An irreducibly complex system cannot be produced directly (that is, by continuously improving the initial function, which continues to work by the same mechanism) by slight, successive modification of a precursor, system, because any precursors to an irreducibly complex system that is missing a part is by definition nonfunctional." (Michael Behe, "Darwin’s Black Box", 1996)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"When the behavior of the system depends on the behavior of the parts, the complexity of the whole must involve a description of the parts, thus it is large. The smaller the parts that must be described to describe the behavior of the whole, the larger the complexity of the entire system. […] A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"There is no over-arching theory of complexity that allows us to ignore the contingent aspects of complex systems. If something really is complex, it cannot by adequately described by means of a simple theory. Engaging with complexity entails engaging with specific complex systems. Despite this we can, at a very basic level, make general remarks concerning the conditions for complex behaviour and the dynamics of complex systems. Furthermore, I suggest that complex systems can be modelled." (Paul Cilliers," Complexity and Postmodernism", 1998)

"The self-similarity of fractal structures implies that there is some redundancy because of the repetition of details at all scales. Even though some of these structures may appear to teeter on the edge of randomness, they actually represent complex systems at the interface of order and disorder."  (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

21 December 2020

On Nonlinearity V (Chaos I)

"When one combines the new insights gained from studying far-from-equilibrium states and nonlinear processes, along with these complicated feedback systems, a whole new approach is opened that makes it possible to relate the so-called hard sciences to the softer sciences of life - and perhaps even to social processes as well. […] It is these panoramic vistas that are opened to us by Order Out of Chaos." (Ilya Prigogine, "Order Out of Chaos: Man's New Dialogue with Nature", 1984)

"Algorithmic complexity theory and nonlinear dynamics together establish the fact that determinism reigns only over a quite finite domain; outside this small haven of order lies a largely uncharted, vast wasteland of chaos." (Joseph Ford, "Progress in Chaotic Dynamics: Essays in Honor of Joseph Ford's 60th Birthday", 1988)

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"In the everyday world of human affairs, no one is surprised to learn that a tiny event over here can have an enormous effect over there. For want of a nail, the shoe was lost, et cetera. But when the physicists started paying serious attention to nonlinear systems in their own domain, they began to realize just how profound a principle this really was. […] Tiny perturbations won't always remain tiny. Under the right circumstances, the slightest uncertainty can grow until the system's future becomes utterly unpredictable - or, in a word, chaotic." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"There is a new science of complexity which says that the link between cause and effect is increasingly difficult to trace; that change (planned or otherwise) unfolds in non-linear ways; that paradoxes and contradictions abound; and that creative solutions arise out of diversity, uncertainty and chaos." (Andy P Hargreaves & Michael Fullan, "What’s Worth Fighting for Out There?", 1998)

"Let's face it, the universe is messy. It is nonlinear, turbulent, and chaotic. It is dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity, not uniformity. That's what makes the world interesting, that's what makes it beautiful, and that's what makes it work." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"Complexity theory can be defined broadly as the study of how order, structure, pattern, and novelty arise from extremely complicated, apparently chaotic systems and conversely, how complex behavior and structure emerges from simple underlying rules. As such, it includes those other areas of study that are collectively known as chaos theory, and nonlinear dynamical theory." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system. This property of a non-linear system such as the weather is known as the butterfly effect where it is purported that a butterfly flapping its wings in Japan can give rise to a tornado in Kansas. This unpredictable behaviour of nonlinear dynamical systems, i.e. its extreme sensitivity to initial conditions, seems to be random and is therefore referred to as chaos. This chaotic and seemingly random behaviour occurs for non-linear deterministic system in which effects can be linked to causes but cannot be predicted ahead of time." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)

"Even more important is the way complex systems seem to strike a balance between the need for order and the imperative for change. Complex systems tend to locate themselves at a place we call 'the edge of chaos'. We imagine the edge of chaos as a place where there is enough innovation to keep a living system vibrant, and enough stability to keep it from collapsing into anarchy. It is a zone of conflict and upheaval, where the old and new are constantly at war. Finding the balance point must be a delicate matter - if a living system drifts too close, it risks falling over into incoherence and dissolution; but if the system moves too far away from the edge, it becomes rigid, frozen, totalitarian. Both conditions lead to extinction. […] Only at the edge of chaos can complex systems flourish. This threshold line, that edge between anarchy and frozen rigidity, is not a like a fence line, it is a fractal line; it possesses nonlinearity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"To remedy chaotic situations requires a chaotic approach, one that is non-linear, constantly morphing, and continually sharpening its competitive edge with recurring feedback loops that build upon past experiences and lessons learned. Improvement cannot be sustained without reflection. Chaos arises from myriad sources that stem from two origins: internal chaos rising within you, and external chaos being imposed upon you by the environment. The result of this push/pull effect is the disequilibrium [...]." (Jeff Boss, "Navigating Chaos: How to Find Certainty in Uncertain Situations", 2015)

20 December 2020

On Linearity I

"Today it is no longer questioned that the principles of the analysts are the more far-reaching. Indeed, the synthesists lack two things in order to engage in a general theory of algebraic configurations: these are on the one hand a definition of imaginary elements, on the other an interpretation of general algebraic concepts. Both of these have subsequently been developed in synthetic form, but to do this the essential principle of synthetic geometry had to be set aside. This principle which manifests itself so brilliantly in the theory of linear forms and the forms of the second degree, is the possibility of immediate proof by means of visualized constructions." (Felix Klein, "Riemannsche Flächen", 1906)

"The conception of tensors is possible owing to the circumstance that the transition from one co-ordinate system to another expresses itself as a linear transformation in the differentials. One here uses the exceedingly fruitful mathematical device of making a problem 'linear' by reverting to infinitely small quantities." (Hermann Weyl, "Space - Time - Matter", 1922)

"Any organism must be treated as-a-whole; in other words, that an organism is not an algebraic sum, a linear function of its elements, but always more than that. It is seemingly little realized, at present, that this simple and innocent-looking statement involves a full structural revision of our language […]" (Alfred Korzybski, "Science and Sanity", 1933)

"Beauty had been born, not, as we so often conceive it nowadays, as an ideal of humanity, but as measure, as the reduction of the chaos of appearances to the precision of linear symbols. Symmetry, balance, harmonic division, mated and mensurated intervals - such were its abstract characteristics." (Herbert E Read, "Icon and Idea", 1955)

"We've seen that even in the simplest situations nonlinearities can interfere with a linear approach to aggregates. That point holds in general: nonlinear interactions almost always make the behavior of the aggregate more complicated than would be predicted by summing or averaging." (Lewis Mumford, "The Myth of the Machine" Vol 1, 1967)

"It is sometimes said that the great discovery of the nineteenth century was that the equations of nature were linear, and the great discovery of the twentieth century is that they are not." (Thomas W Körner, "Fourier Analysis", 1988)

"A major clash between economics and ecology derives from the fact that nature is cyclical, whereas our industrial systems are linear. Our businesses take resources, transform them into products plus waste, and sell the products to consumers, who discard more waste […]" (Fritjof Capra, "The Web of Life", 1996)

"The first idea is that human progress is exponential (that is, it expands by repeatedly multiplying by a constant) rather than linear (that is, expanding by repeatedly adding a constant). Linear versus exponential: Linear growth is steady; exponential growth becomes explosive." (Ray Kurzweil, "The Singularity is Near", 2005)

"Without precise predictability, control is impotent and almost meaningless. In other words, the lesser the predictability, the harder the entity or system is to control, and vice versa. If our universe actually operated on linear causality, with no surprises, uncertainty, or abrupt changes, all future events would be absolutely predictable in a sort of waveless orderliness." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"There is no linear additive process that, if all the parts are taken together, can be understood to create the total system that occurs at the moment of self-organization; it is not a quantity that comes into being. It is not predictable in its shape or subsequent behavior or its subsequent qualities. There is a nonlinear quality that comes into being at the moment of synchronicity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)
