21 December 2020

On Nonlinearity IV (Organizations)

"In strategic thinking, one first seeks a clear understanding of the particular character of each element of a situation and then makes the fullest possible use of human brainpower to restructure the elements in the most advantageous way. Phenomena and events in the real world do not always fit a linear model. Hence the most reliable means of dissecting a situation into its constituent parts and reassembling them in the desired pattern is not a step-by-step methodology such as systems analysis. Rather, it is that ultimate nonlinear thinking tool, the human brain. True strategic thinking thus contrasts sharply with the conventional mechanical systems approach based on linear thinking. But it also contrasts with the approach that stakes everything on intuition, reaching conclusions without any real breakdown or analysis. [...] No matter how difficult or unprecedented the problem, a breakthrough to the best possible solution can come only from a combination of rational analysis, based on the real nature of things, and imaginative reintegration of all the different items into a new pattern, using nonlinear brainpower. This is always the most effective approach to devising strategies for dealing successfully with challenges and opportunities, in the market arena as on the battlefield." (Kenichi Ohmae, "The Mind Of The Strategist", 1982)

"So we pour in data from the past to fuel the decision-making mechanisms created by our models, be they linear or nonlinear. But therein lies the logician's trap: past data from real life constitute a sequence of events rather than a set of independent observations, which is what the laws of probability demand. [...] It is in those outliers and imperfections that the wildness lurks." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"In a linear world of equilibrium and predictability, the sparse research into an evidence base for management prescriptions and the confused findings it produces would be a sign of incompetence; it would not make much sense. Nevertheless, if organizations are actually patterns of nonlinear interaction between people; if small changes could produce widespread major consequences; if local interaction produces emergent global pattern; then it will not be possible to provide a reliable evidence base. In such a world, it makes no sense to conduct studies looking for simple causal relationships between an action and an outcome. I suggest that the story of the last few years strongly indicates that human action is nonlinear, that time and place matter a great deal, and that since this precludes simple evidence bases we do need to rethink the nature of organizations and the roles of managers and leaders in them." (Ralph D Stacey, "Complexity and Organizational Reality", 2000)

"The world is nonlinear. Trying to make it linear for our mathematical or administrative convenience is not usually a good idea even when feasible, and it is rarely feasible." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"Complexity theory shows that great changes can emerge from small actions. Change involves a belief in the possible, even the 'impossible'. Moreover, social innovators don’t follow a linear pathway of change; there are ups and downs, roller-coaster rides along cascades of dynamic interactions, unexpected and unanticipated divergences, tipping points and critical mass momentum shifts. Indeed, things often get worse before they get better as systems change creates resistance to and pushback against the new. Traditional evaluation approaches are not well suited for such turbulence. Traditional evaluation aims to control and predict, to bring order to chaos. Developmental evaluation accepts such turbulence as the way the world of social innovation unfolds in the face of complexity. Developmental evaluation adapts to the realities of complex nonlinear dynamics rather than trying to impose order and certainty on a disorderly and uncertain world." (Michael Q Patton, "Developmental Evaluation", 2010)

"Internal friction is exacerbated by the fact that in business as in war, we are operating in a nonlinear, semi-chaotic environment in which our endeavors will collide and possibly clash with the actions of other independent wills (customers, suppliers, competitors, regulators, lobbyists, and so on). The internal and external worlds are in constant contact and the effects of our actions are the result of their reciprocal interaction. Friction gives rise to three gaps: the knowledge gap, the alignment gap, and the effects gap. To execute effectively, we must address all three. Our instinctive reaction to the three gaps is to demand more detail. We gather more data in order to craft more detailed plans, issue more detailed instructions, and exercise more detailed control. This not only fails to solve the problem, it usually makes it worse. We need to think about the problem differently and adopt a systemic approach to solving it." (Stephen Bungay, "The Art of Action: How Leaders Close the Gaps between Plans, Actions, and Results", 2010)

"Motivation is a fine example of social complexity. It is nonlinear and sometimes unpredictable. It cannot be defined or modeled with a single diagram." (Jurgen Appelo, "Management 3.0: Leading Agile Developers, Developing Agile Leaders", 2010)

"We have minds that are equipped for certainty, linearity and short-term decisions, that must instead make long-term decisions in a non-linear, probabilistic world." (Paul Gibbons, "The Science of Successful Organizational Change", 2015)

On Nonlinearity V (Chaos I)

"When one combines the new insights gained from studying far-from-equilibrium states and nonlinear processes, along with these complicated feedback systems, a whole new approach is opened that makes it possible to relate the so-called hard sciences to the softer sciences of life - and perhaps even to social processes as well. […] It is these panoramic vistas that are opened to us by Order Out of Chaos." (Ilya Prigogine, "Order Out of Chaos: Man's New Dialogue with Nature", 1984)

"Algorithmic complexity theory and nonlinear dynamics together establish the fact that determinism reigns only over a quite finite domain; outside this small haven of order lies a largely uncharted, vast wasteland of chaos." (Joseph Ford, "Progress in Chaotic Dynamics: Essays in Honor of Joseph Ford's 60th Birthday", 1988)

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"In the everyday world of human affairs, no one is surprised to learn that a tiny event over here can have an enormous effect over there. For want of a nail, the shoe was lost, et cetera. But when the physicists started paying serious attention to nonlinear systems in their own domain, they began to realize just how profound a principle this really was. […] Tiny perturbations won't always remain tiny. Under the right circumstances, the slightest uncertainty can grow until the system's future becomes utterly unpredictable - or, in a word, chaotic." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"There is a new science of complexity which says that the link between cause and effect is increasingly difficult to trace; that change (planned or otherwise) unfolds in non-linear ways; that paradoxes and contradictions abound; and that creative solutions arise out of diversity, uncertainty and chaos." (Andy P Hargreaves & Michael Fullan, "What’s Worth Fighting for Out There?", 1998)

"Let's face it, the universe is messy. It is nonlinear, turbulent, and chaotic. It is dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity, not uniformity. That's what makes the world interesting, that's what makes it beautiful, and that's what makes it work." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"Complexity theory can be defined broadly as the study of how order, structure, pattern, and novelty arise from extremely complicated, apparently chaotic systems and conversely, how complex behavior and structure emerges from simple underlying rules. As such, it includes those other areas of study that are collectively known as chaos theory, and nonlinear dynamical theory." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system. This property of a non-linear system such as the weather is known as the butterfly effect where it is purported that a butterfly flapping its wings in Japan can give rise to a tornado in Kansas. This unpredictable behaviour of nonlinear dynamical systems, i.e. its extreme sensitivity to initial conditions, seems to be random and is therefore referred to as chaos. This chaotic and seemingly random behaviour occurs for non-linear deterministic system in which effects can be linked to causes but cannot be predicted ahead of time." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)
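Logan's "small change in the initial conditions" can be made concrete with the simplest nonlinear system there is, the logistic map x → r·x·(1−x) in its chaotic regime. This is a toy sketch, not the weather; the function name and parameter choices are mine:

```python
# Sensitive dependence on initial conditions, sketched with the logistic
# map x -> r*x*(1-x) at r = 4.0 (a standard chaotic regime).

def trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole orbit."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = trajectory(0.2)
b = trajectory(0.2 + 1e-10)   # perturb the start by one part in ten billion

# The two orbits track each other at first, then decorrelate completely.
gap = [abs(x - y) for x, y in zip(a, b)]
print(f"gap after 10 steps: {gap[10]:.2e}")
print(f"gap after 50 steps: {gap[50]:.2e}")
```

The rule is fully deterministic, effects are linked to causes at every step, yet after a few dozen iterations the two orbits bear no resemblance to each other.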

"Even more important is the way complex systems seem to strike a balance between the need for order and the imperative for change. Complex systems tend to locate themselves at a place we call 'the edge of chaos'. We imagine the edge of chaos as a place where there is enough innovation to keep a living system vibrant, and enough stability to keep it from collapsing into anarchy. It is a zone of conflict and upheaval, where the old and new are constantly at war. Finding the balance point must be a delicate matter - if a living system drifts too close, it risks falling over into incoherence and dissolution; but if the system moves too far away from the edge, it becomes rigid, frozen, totalitarian. Both conditions lead to extinction. […] Only at the edge of chaos can complex systems flourish. This threshold line, that edge between anarchy and frozen rigidity, is not like a fence line, it is a fractal line; it possesses nonlinearity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"To remedy chaotic situations requires a chaotic approach, one that is non-linear, constantly morphing, and continually sharpening its competitive edge with recurring feedback loops that build upon past experiences and lessons learned. Improvement cannot be sustained without reflection. Chaos arises from myriad sources that stem from two origins: internal chaos rising within you, and external chaos being imposed upon you by the environment. The result of this push/pull effect is the disequilibrium [...]." (Jeff Boss, "Navigating Chaos: How to Find Certainty in Uncertain Situations", 2015)

20 December 2020

On Nonlinearity III

"Finite systems of deterministic ordinary nonlinear differential equations may be designed to represent forced dissipative hydrodynamic flow. Solutions of these equations can be identified with trajectories in phase space. For those systems with bounded solutions, it is found that nonperiodic solutions are ordinarily unstable with respect to small modifications, so that slightly differing initial states can evolve into considerably different states. Systems with bounded solutions are shown to possess bounded numerical solutions." (Edward N Lorenz, "Deterministic Nonperiodic Flow", Journal of the Atmospheric Sciences 20, 1963)
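The system Lorenz studied can be reproduced in a few lines: three coupled nonlinear ODEs with his classic parameters, here integrated with a crude fixed-step Euler scheme (adequate for illustration only; the step size and function names are my choices):

```python
# The Lorenz equations with sigma=10, rho=28, beta=8/3, integrated by
# forward Euler. Two "slightly differing initial states" are evolved
# under identical deterministic rules.

def lorenz_step(state, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = state
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return (x + dt * dx, y + dt * dy, z + dt * dz)

def run(state, steps=4000):        # 4000 steps of dt=0.005: 20 time units
    for _ in range(steps):
        state = lorenz_step(state)
    return state

a = run((1.0, 1.0, 1.0))
b = run((1.0, 1.0, 1.0 + 1e-6))    # differ by one part in a million in z

separation = sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
print(f"separation of the two states: {separation:.3f}")
```

Both trajectories remain bounded on the attractor, yet the microscopic initial difference grows until the two states are macroscopically unrelated, which is exactly the instability of nonperiodic solutions the abstract describes.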

"We've seen that even in the simplest situations nonlinearities can interfere with a linear approach to aggregates. That point holds in general: nonlinear interactions almost always make the behavior of the aggregate more complicated than would be predicted by summing or averaging." (John H Holland, "Hidden Order: How Adaptation Builds Complexity", 1995)

"The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by non‐linear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive‐feedback loops describing growth processes as well as negative, goal‐seeking loops." (Jay W Forrester, "Urban Dynamics", 1969)

"I would therefore urge that people be introduced to [the logistic equation] early in their mathematical education. This equation can be studied phenomenologically by iterating it on a calculator, or even by hand. Its study does not involve as much conceptual sophistication as does elementary calculus. Such study would greatly enrich the student’s intuition about nonlinear systems. Not only in research but also in the everyday world of politics and economics, we would all be better off if more people realized that simple nonlinear systems do not necessarily possess simple dynamical properties." (Robert M May, "Simple Mathematical Models with Very Complicated Dynamics", Nature Vol. 261 (5560), 1976)
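May's invitation can be taken literally: the logistic equation can be iterated on a calculator, by hand, or in a few lines of code. A minimal sketch (the helper name and parameter values are mine) showing how the same simple rule yields simple or complicated dynamics depending only on the growth parameter r:

```python
# Iterating the logistic map x -> r*x*(1-x), as May suggests. A long
# transient is discarded so only the long-run behavior remains visible.

def iterate(r, x=0.5, skip=500, keep=8):
    """Iterate the logistic map, discard the transient, return the tail
    rounded for readability."""
    for _ in range(skip):
        x = r * x * (1.0 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        tail.append(round(x, 4))
    return tail

print(iterate(2.8))   # settles to a single fixed point
print(iterate(3.2))   # settles to a period-2 cycle
print(iterate(3.9))   # never repeats: chaotic
```

Nothing in the rule itself hints at which behavior will appear, which is May's point: simple nonlinear systems need not possess simple dynamical properties.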

"Most physical systems, particularly those complex ones, are extremely difficult to model by an accurate and precise mathematical formula or equation due to the complexity of the system structure, nonlinearity, uncertainty, randomness, etc. Therefore, approximate modeling is often necessary and practical in real-world applications. Intuitively, approximate modeling is always possible. However, the key questions are what kind of approximation is good, where the sense of 'goodness' has to be first defined, of course, and how to formulate such a good approximation in modeling a system such that it is mathematically rigorous and can produce satisfactory results in both theory and applications." (Guanrong Chen & Trung Tat Pham, "Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems", 2001) 

"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex system, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly. A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)
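Gershenson's loop can be sketched in miniature: two components where A feeds into B and B feeds back into A. With a loop gain above one (positive feedback), a microscopic initial difference is amplified exponentially. The gain and step count below are arbitrary illustrative choices:

```python
# A two-component causal loop: each step, A's output feeds B and B's
# output feeds back into A. Positive feedback (gain > 1) amplifies any
# variation.

def run_loop(a, b, gain=1.1, steps=60):
    for _ in range(steps):
        a, b = a + gain * b, b + gain * a   # mutual feedback
    return a

base = run_loop(0.0, 1.0)
nudged = run_loop(1e-9, 1.0)   # microscopic difference in component A

print(f"macroscopic gap after 60 steps: {abs(nudged - base):.3e}")
```

The tiniest difference between the two initial states becomes a macroscopically observable distinction, exactly as the quote describes.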

"Where simplifications fail, causing the most damage, is when something nonlinear is simplified with the linear as a substitute. That is the most common Procrustean bed." (Nassim N Taleb, "Antifragile: Things that Gain from Disorder", 2012)

"Complex systems defy intuitive solutions. Even a third-order, linear differential equation is unsolvable by inspection. Yet, important situations in management, economics, medicine, and social behavior usually lose reality if simplified to less than fifth-order nonlinear dynamic systems. Attempts to deal with nonlinear dynamic systems using ordinary processes of description and debate lead to internal inconsistencies. Underlying assumptions may have been left unclear and contradictory, and mental models are often logically incomplete. Resulting behavior is likely to be contrary to that implied by the assumptions being made about underlying system structure and governing policies." (Jay W Forrester, "Modeling for What Purpose?", The Systems Thinker Vol. 24 (2), 2013)

"There is no linear additive process that, if all the parts are taken together, can be understood to create the total system that occurs at the moment of self-organization; it is not a quantity that comes into being. It is not predictable in its shape or subsequent behavior or its subsequent qualities. There is a nonlinear quality that comes into being at the moment of synchronicity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"Exponentially growing systems are prevalent in nature, spanning all scales from biochemical reaction networks in single cells to food webs of ecosystems. How exponential growth emerges in nonlinear systems is mathematically unclear. […] The emergence of exponential growth from a multivariable nonlinear network is not mathematically intuitive. This indicates that the network structure and the flux functions of the modeled system must be subjected to constraints to result in long-term exponential dynamics." (Wei-Hsiang Lin et al, "Origin of exponential growth in nonlinear reaction networks", PNAS 117 (45), 2020)

On Linearity I

"Today it is no longer questioned that the principles of the analysts are the more far-reaching. Indeed, the synthesists lack two things in order to engage in a general theory of algebraic configurations: these are on the one hand a definition of imaginary elements, on the other an interpretation of general algebraic concepts. Both of these have subsequently been developed in synthetic form, but to do this the essential principle of synthetic geometry had to be set aside. This principle which manifests itself so brilliantly in the theory of linear forms and the forms of the second degree, is the possibility of immediate proof by means of visualized constructions." (Felix Klein, "Riemannsche Flächen", 1906)

"The conception of tensors is possible owing to the circumstance that the transition from one co-ordinate system to another expresses itself as a linear transformation in the differentials. One here uses the exceedingly fruitful mathematical device of making a problem 'linear' by reverting to infinitely small quantities." (Hermann Weyl, "Space - Time - Matter", 1922)

"Any organism must be treated as-a-whole; in other words, that an organism is not an algebraic sum, a linear function of its elements, but always more than that. It is seemingly little realized, at present, that this simple and innocent-looking statement involves a full structural revision of our language […]" (Alfred Korzybski, "Science and Sanity", 1933)

"Beauty had been born, not, as we so often conceive it nowadays, as an ideal of humanity, but as measure, as the reduction of the chaos of appearances to the precision of linear symbols. Symmetry, balance, harmonic division, mated and mensurated intervals - such were its abstract characteristics." (Herbert E Read, "Icon and Idea", 1955)

"We've seen that even in the simplest situations nonlinearities can interfere with a linear approach to aggregates. That point holds in general: nonlinear interactions almost always make the behavior of the aggregate more complicated than would be predicted by summing or averaging." (John H Holland, "Hidden Order: How Adaptation Builds Complexity", 1995)

"It is sometimes said that the great discovery of the nineteenth century was that the equations of nature were linear, and the great discovery of the twentieth century is that they are not." (Thomas W Körner, "Fourier Analysis", 1988)

"A major clash between economics and ecology derives from the fact that nature is cyclical, whereas our industrial systems are linear. Our businesses take resources, transform them into products plus waste, and sell the products to consumers, who discard more waste […]" (Fritjof Capra, "The Web of Life", 1996)

"The first idea is that human progress is exponential (that is, it expands by repeatedly multiplying by a constant) rather than linear (that is, expanding by repeatedly adding a constant). Linear versus exponential: Linear growth is steady; exponential growth becomes explosive." (Ray Kurzweil, "The Singularity is Near", 2005)
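Kurzweil's distinction fits in two lines of code: linear growth repeatedly adds a constant, exponential growth repeatedly multiplies by one. A trivial sketch with constants of my choosing:

```python
# Linear vs exponential growth over the same number of steps.

steps = 30
linear = [1 + 2 * n for n in range(steps + 1)]        # add 2 each step
exponential = [2 ** n for n in range(steps + 1)]      # multiply by 2 each step

print(linear[10], exponential[10])    # 21 vs 1024
print(linear[30], exponential[30])    # 61 vs 1,073,741,824
```

After ten steps the two are merely different; after thirty, the exponential series is explosive while the linear one is still counting by twos.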

"Without precise predictability, control is impotent and almost meaningless. In other words, the lesser the predictability, the harder the entity or system is to control, and vice versa. If our universe actually operated on linear causality, with no surprises, uncertainty, or abrupt changes, all future events would be absolutely predictable in a sort of waveless orderliness." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"There is no linear additive process that, if all the parts are taken together, can be understood to create the total system that occurs at the moment of self-organization; it is not a quantity that comes into being. It is not predictable in its shape or subsequent behavior or its subsequent qualities. There is a nonlinear quality that comes into being at the moment of synchronicity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

On Nonlinearity II

"Indeed, except for the very simplest physical systems, virtually everything and everybody in the world is caught up in a vast, nonlinear web of incentives and constraints and connections. The slightest change in one place causes tremors everywhere else. We can't help but disturb the universe, as T.S. Eliot almost said. The whole is almost always equal to a good deal more than the sum of its parts. And the mathematical expression of that property - to the extent that such systems can be described by mathematics at all - is a nonlinear equation: one whose graph is curvy." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"Today the network of relationships linking the human race to itself and to the rest of the biosphere is so complex that all aspects affect all others to an extraordinary degree. Someone should be studying the whole system, however crudely that has to be done, because no gluing together of partial studies of a complex nonlinear system can give a good idea of the behaviour of the whole." (Murray Gell-Mann, 1997)

"Much of the art of system dynamics modeling is discovering and representing the feedback processes, which, along with stock and flow structures, time delays, and nonlinearities, determine the dynamics of a system. […] the most complex behaviors usually arise from the interactions (feedbacks) among the components of the system, not from the complexity of the components themselves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The mental models people use to guide their decisions are dynamically deficient. […] people generally adopt an event-based, open-loop view of causality, ignore feedback processes, fail to appreciate time delays between action and response and in the reporting of information, do not understand stocks and flows and are insensitive to nonlinearities that may alter the strengths of different feedback loops as a system evolves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Most physical processes in the real world are nonlinear. It is our abstraction of the real world that leads us to the use of linear systems in modeling these processes. These linear systems are simple, understandable, and, in many situations, provide acceptable simulations of the actual processes. Unfortunately, only the simplest of linear processes and only a very small fraction of the nonlinear having verifiable solutions can be modeled with linear systems theory. The bulk of the physical processes that we must address are, unfortunately, too complex to reduce to algorithmic form - linear or nonlinear. Most observable processes have only a small amount of information available with which to develop an algorithmic understanding. The vast majority of information that we have on most processes tends to be nonnumeric and nonalgorithmic. Most of the information is fuzzy and linguistic in form." (Timothy J Ross & W Jerry Parkinson, "Fuzzy Set Theory, Fuzzy Logic, and Fuzzy Systems", 2002)

"Swarm intelligence can be effective when applied to highly complicated problems with many nonlinear factors, although it is often less effective than the genetic algorithm approach [...]. Swarm intelligence is related to swarm optimization […]. As with swarm intelligence, there is some evidence that at least some of the time swarm optimization can produce solutions that are more robust than genetic algorithms. Robustness here is defined as a solution’s resistance to performance degradation when the underlying variables are changed." (Michael J North & Charles M Macal, "Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation", 2007)

"[…] our mental models fail to take into account the complications of the real world - at least those ways that one can see from a systems perspective. It is a warning list. Here is where hidden snags lie. You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays." (Donella H Meadows, "Thinking in Systems: A Primer", 2008)

"A network of many simple processors ('units' or 'neurons') that imitates a biological neural network. The units are connected by unidirectional communication channels, which carry numeric data. Neural networks can be trained to find nonlinear relationships in data, and are used in various applications such as robotics, speech recognition, signal processing, medical diagnosis, or power systems." (Adnan Khashman et al, "Voltage Instability Detection Using Neural Networks", 2009)

"Linearity is a reductionist’s dream, and nonlinearity can sometimes be a reductionist’s nightmare. Understanding the distinction between linearity and nonlinearity is very important and worthwhile." (Melanie Mitchell, "Complexity: A Guided Tour", 2009)


On Nonlinearity I

"In complex systems cause and effect are often not closely related in either time or space. The structure of a complex system is not a simple feedback loop where one system state dominates the behavior. The complex system has a multiplicity of interacting feedback loops. Its internal rates of flow are controlled by nonlinear relationships. The complex system is of high order, meaning that there are many system states (or levels). It usually contains positive-feedback loops describing growth processes as well as negative, goal-seeking loops. In the complex system the cause of a difficulty may lie far back in time from the symptoms, or in a completely different and remote part of the system. In fact, causes are usually found, not in prior events, but in the structure and policies of the system." (Jay Wright Forrester, "Urban Dynamics", 1969)

"Self-organization can be defined as the spontaneous creation of a globally coherent pattern out of local interactions. Because of its distributed character, this organization tends to be robust, resisting perturbations. The dynamics of a self-organizing system is typically non-linear, because of circular or feedback relations between the components. Positive feedback leads to an explosive growth, which ends when all components have been absorbed into the new configuration, leaving the system in a stable, negative feedback state. Non-linear systems have in general several stable states, and this number tends to increase (bifurcate) as an increasing input of energy pushes the system farther from its thermodynamic equilibrium." (Francis Heylighen, "The Science of Self-Organization and Adaptivity", 2001)

"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime becomes unstable and the system evolves into a new configuration." (Ilya Prigogine, Grégoire Nicolis & Agnès Babloyantz, "Thermodynamics of Evolution", Physics Today 25 (11), 1972)

"An artificial neural network is an information-processing system that has certain performance characteristics in common with biological neural networks. Artificial neural networks have been developed as generalizations of mathematical models of human cognition or neural biology, based on the assumptions that: 1. Information processing occurs at many simple elements called neurons. 2. Signals are passed between neurons over connection links. 3. Each connection link has an associated weight, which, in a typical neural net, multiplies the signal transmitted. 4. Each neuron applies an activation function (usually nonlinear) to its net input (sum of weighted input signals) to determine its output signal." (Laurene Fausett, "Fundamentals of Neural Networks", 1994)
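For a single neuron, Fausett's four assumptions reduce to a weighted sum of input signals passed through a nonlinear activation. A minimal sketch, using a logistic (sigmoid) activation as one common choice; the weights and inputs are arbitrary illustrative values:

```python
import math

def neuron(inputs, weights, bias=0.0):
    """Signals arrive over weighted connection links (assumptions 2-3);
    the neuron applies a nonlinear activation to its net input, the sum
    of weighted input signals (assumption 4)."""
    net = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-net))   # sigmoid squashes to (0, 1)

out = neuron([1.0, 0.5, -1.0], [0.8, -0.4, 0.3])
print(f"output signal: {out:.4f}")
```

It is the nonlinearity of the activation that lets networks of such units find nonlinear relationships in data; a network of purely linear units collapses to a single linear map, however many layers it has.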

"Symmetry breaking in psychology is governed by the nonlinear causality of complex systems (the 'butterfly effect'), which roughly means that a small cause can have a big effect. Tiny details of initial individual perspectives, but also cognitive prejudices, may 'enslave' the other modes and lead to one dominant view." (Klaus Mainzer, "Thinking in Complexity", 1994)

"[…] nonlinear interactions almost always make the behavior of the aggregate more complicated than would be predicted by summing or averaging."  (John H Holland," Hidden Order: How Adaptation Builds Complexity", 1995)

"[…] self-organization is the spontaneous emergence of new structures and new forms of behavior in open systems far from equilibrium, characterized by internal feedback loops and described mathematically by nonlinear equations." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"Bounded rationality simultaneously constrains the complexity of our cognitive maps and our ability to use them to anticipate the system dynamics. Mental models in which the world is seen as a sequence of events and in which feedback, nonlinearity, time delays, and multiple consequences are lacking lead to poor performance when these elements of dynamic complexity are present." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Even if our cognitive maps of causal structure were perfect, learning, especially double-loop learning, would still be difficult. To use a mental model to design a new strategy or organization we must make inferences about the consequences of decision rules that have never been tried and for which we have no data. To do so requires intuitive solution of high-order nonlinear differential equations, a task far exceeding human cognitive capabilities in all but the simplest systems."  (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"All forms of complex causation, and especially nonlinear transformations, admittedly stack the deck against prediction. Linear describes an outcome produced by one or more variables where the effect is additive. Any other interaction is nonlinear. This would include outcomes that involve step functions or phase transitions. The hard sciences routinely describe nonlinear phenomena. Making predictions about them becomes increasingly problematic when multiple variables are involved that have complex interactions. Some simple nonlinear systems can quickly become unpredictable when small variations in their inputs are introduced." (Richard N Lebow, "Forbidden Fruit: Counterfactuals and International Relations", 2010)


On Randomness XII (Chaos I)

"Chaos is but unperceived order; it is a word indicating the limitations of the human mind and the paucity of observational facts. The words ‘chaos’, ‘accidental’, ‘chance’, ‘unpredictable’ are conveniences behind which we hide our ignorance." (Harlow Shapley, "Of Stars and Men: Human Response to an Expanding Universe", 1958)

"The term ‘chaos’ currently has a variety of accepted meanings, but here we shall use it to mean deterministically, or nearly deterministically, governed behavior that nevertheless looks rather random. Upon closer inspection, chaotic behavior will generally appear more systematic, but not so much so that it will repeat itself at regular intervals, as do, for example, the oceanic tides." (Edward N Lorenz, "Chaos, spontaneous climatic variations and detection of the greenhouse effect", 1991)

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"In nonlinear systems - and the economy is most certainly nonlinear - chaos theory tells you that the slightest uncertainty in your knowledge of the initial conditions will often grow inexorably. After a while, your predictions are nonsense." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"Intriguingly, the mathematics of randomness, chaos, and order also furnishes what may be a vital escape from absolute certainty - an opportunity to exercise free will in a deterministic universe. Indeed, in the interplay of order and disorder that makes life interesting, we appear perpetually poised in a state of enticingly precarious perplexity. The universe is neither so crazy that we can’t understand it at all nor so predictable that there’s nothing left for us to discover." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1997)

"When we look at the world around us, we find that we are not thrown into chaos and randomness but are part of a great order, a grand symphony of life. Every molecule in our body was once a part of previous bodies-living or nonliving-and will be a part of future bodies. In this sense, our body will not die but will live on, again and again, because life lives on. We share not only life's molecules but also its basic principles of organization with the rest of the living world. Arid since our mind, too, is embodied, our concepts and metaphors are embedded in the web of life together with our bodies and brains. We belong to the universe, we are at home in it, and this experience of belonging can make our lives profoundly meaningful." (Fritjof Capra, "The Hidden Connections", 2002)

"[…] we would like to observe that the butterfly effect lies at the root of many events which we call random. The final result of throwing a dice depends on the position of the hand throwing it, on the air resistance, on the base that the die falls on, and on many other factors. The result appears random because we are not able to take into account all of these factors with sufficient accuracy. Even the tiniest bump on the table and the most imperceptible move of the wrist affect the position in which the die finally lands. It would be reasonable to assume that chaos lies at the root of all random phenomena." (Iwo Białynicki-Birula & Iwona Białynicka-Birula, "Modeling Reality: How Computers Mirror Life", 2004) 

"Although the potential for chaos resides in every system, chaos, when it emerges, frequently stays within the bounds of its attractor(s): No point or pattern of points is ever repeated, but some form of patterning emerges, rather than randomness. Life scientists in different areas have noticed that life seems able to balance order and chaos at a place of balance known as the edge of chaos. Observations from both nature and artificial life suggest that the edge of chaos favors evolutionary adaptation." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)

"A system in which a few things interacting produce tremendously divergent behavior; deterministic chaos; it looks random but its not." (Christopher Langton) 

On Randomness X (From Fiction to Science-Fiction)

"Our lives today are not conducted in linear terms. They are much more quantified; a stream of random events is taking place." (James G Ballard, [Conversation with George MacBeth on Third Programme - BBC], 1967)

"The line between inner and outer landscapes is breaking down. Earthquakes can result from seismic upheavals within the human mind. The whole random universe of the industrial age is breaking down into cryptic fragments." (William S Burroughs, [preface] 1972)

"There is no reason to assume that the universe has the slightest interest in intelligence -  or even in life. Both may be random accidental by-products of its operations like the beautiful patterns on a butterfly's wings. The insect would fly just as well without them […]" (Arthur C Clarke, "The Lost Worlds of 2001", 1972)

"It is tempting to wonder if our present universe, large as it is and complex though it seems, might not be merely the result of a very slight random increase in order over a very small portion of an unbelievably colossal universe which is virtually entirely in heat-death." (Isaac Asimov, 1976)

"In the end, each life is no more than the sum of contingent facts, a chronicle of chance intersections, of flukes, of random events that divulge nothing but their own lack of purpose."
(Paul Auster, "The Locked Room", 1988)

"The natural world is full of irregularity and random alteration, but in the antiseptic, dust-free, shadowless, brightly lit, abstract realm of the mathematicians they like their cabbages spherical, please". (William A M Boyd, Brazzaville Beach, 1990)

"There are only patterns, patterns on top of patterns, patterns that affect other patterns. Patterns hidden by patterns. Patterns within patterns. If you watch close, history does nothing but repeat itself. What we call chaos is just patterns we haven't recognized. What we call random is just patterns we can't decipher. what we can't understand we call nonsense. What we can't read we call gibberish. There is no free will. There are no variables." (Chuck Palahniuk, "Survivor", 1999)

"Because the question for me was always whether that shape we see in our lives was there from the beginning or whether these random events are only called a pattern after the fact. Because otherwise we are nothing." (Cormac McCarthy, "All the Pretty Horses", 2010)

"All the ideas in the universe can be described by words. Therefore, if you simply take all the words and rearrange them randomly enough times, you’re bound to hit upon at least a few great ideas eventually."  (Jarod Kintz, "The Days of Yay are Here! Wake Me Up When They're Over", 2011)

"Chaos is impatient. It's random. And above all it's selfish. It tears down everything just for the sake of change, feeding on itself in constant hunger. But Chaos can also be appealing. It tempts you to believe that nothing matters except what you want." (Rick Riordan, "The Throne of Fire", 2011)

On Noise III

"Economists should study financial markets as they actually operate, not as they assume them to operate - observing the way in which information is actually processed, observing the serial correlations, bonanzas, and sudden stops, not assuming these away as noise around the edges of efficient and rational markets." (Adair Turner, "Economics after the Crisis: Objectives and means", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The signal is the truth. The noise is what distracts us from the truth." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"When some systems are stuck in a dangerous impasse, randomness and only randomness can unlock them and set them free. You can see here that absence of randomness equals guaranteed death. The idea of injecting random noise into a system to improve its functioning has been applied across fields. By a mechanism called stochastic resonance, adding random noise to the background makes you hear the sounds (say, music) with more accuracy." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"A signal is a useful message that resides in data. Data that isn’t useful is noise. […] When data is expressed visually, noise can exist not only as data that doesn’t inform but also as meaningless non-data elements of the display (e.g. irrelevant attributes, such as a third dimension of depth in bars, color variation that has no significance, and artificial light and shadow effects)." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)

"Data contain descriptions. Some are true, some are not. Some are useful, most are not. Skillful use of data requires that we learn to pick out the pieces that are true and useful. [...] To find signals in data, we must learn to reduce the noise - not just the noise that resides in the data, but also the noise that resides in us. It is nearly impossible for noisy minds to perceive anything but noise in data." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)

"When we find data quality issues due to valid data during data exploration, we should note these issues in a data quality plan for potential handling later in the project. The most common issues in this regard are missing values and outliers, which are both examples of noise in the data." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, worked examples, and case studies", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"Repeated observations of the same phenomenon do not always produce the same results, due to random noise or error. Sampling errors result when our observations capture unrepresentative circumstances, like measuring rush hour traffic on weekends as well as during the work week. Measurement errors reflect the limits of precision inherent in any sensing device. The notion of signal to noise ratio captures the degree to which a series of observations reflects a quantity of interest as opposed to data variance. As data scientists, we care about changes in the signal instead of the noise, and such variance often makes this problem surprisingly difficult." (Steven S Skiena, "The Data Science Design Manual", 2017)

"Using noise (the uncorrelated variables) to fit noise (the residual left from a simple model on the genuinely correlated variables) is asking for trouble." (Steven S Skiena, "The Data Science Design Manual", 2017)

On Noise II

"Noise signals are unwanted signals that are always present in a transmission system." (John R Pierce, "Signals: The Telephone and Beyond", 1981)

"Neither noise nor information is predictable." (Ray Kurzweil, "The Age of Spiritual Machines: When Computers Exceed Human Intelligence", 1999)

"No matter what the data, and no matter how the values are arranged and presented, you must always use some method of analysis to come up with an interpretation of the data. While every data set contains noise, some data sets may contain signals. Therefore, before you can detect a signal within any given data set, you must first filter out the noise." (Donald J Wheeler," Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"We analyze numbers in order to know when a change has occurred in our processes or systems. We want to know about such changes in a timely manner so that we can respond appropriately. While this sounds rather straightforward, there is a complication - the numbers can change even when our process does not. So, in our analysis of numbers, we need to have a way to distinguish those changes in the numbers that represent changes in our process from those that are essentially noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"While all data contain noise, some data contain signals. Before you can detect a signal, you must filter out the noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re- coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"In fact, an information theory that leaves out the issue of noise turns out to have no content." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"This phenomenon, common to chaos theory, is also known as sensitive dependence on initial conditions. Just a small change in the initial conditions can drastically change the long-term behavior of a system. Such a small amount of difference in a measurement might be considered experimental noise, background noise, or an inaccuracy of the equipment." (Greg Rae, Chaos Theory: A Brief Introduction, 2006)

"Data analysis is not generally thought of as being simple or easy, but it can be. The first step is to understand that the purpose of data analysis is to separate any signals that may be contained within the data from the noise in the data. Once you have filtered out the noise, anything left over will be your potential signals. The rest is just details." (Donald J Wheeler," Myths About Data Analysis", International Lean & Six Sigma Conference, 2012)

"Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

On Noise I

"Noise is the most impertinent of all forms of interruption. It is not only an interruption, but also a disruption of thought." (Arthur Schopenhauer, "Parerga and Paralipomena", 1851)

"Mathematics is the predominant science of our time; its conquests grow daily, though without noise; he who does not employ it for himself, will some day find it employed against himself." (Johann F Herbart, Werke, 1890)

"Life pushes its way through this fatalistically determined world like a river flowing upstream. It is a system of utterly improbable order, a message in a world of noise." (Joseph H Rush, "The Dawn of Life", 1957)

"Higher, directed forms of energy (e.g., mechanical, electric, chemical) are dissipated, that is, progressively converted into the lowest form of energy, i.e., undirected heat movement of molecules; chemical systems tend toward equilibria with maximum entropy; machines wear out owing to friction; in communication channels, information can only be lost by conversion of messages into noise but not vice versa, and so forth." (Ludwig von Bertalanffy, "Robots, Men and Minds", 1967)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of 'nois' is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"An essential element of dynamics systems is a positive feedback that self-enhances the initial deviation from the mean. The avalanche is proverbial. Cities grow since they attract more people, and in the universe, a local accumulation of dust may attract more dust, eventually leading to the birth of a star. Earlier or later, self-enhancing processes evoke an antagonistic reaction. A collapsing stock market stimulates the purchase of shares at a low price, thereby stabilizing the market. The increasing noise, dirt, crime and traffic jams may discourage people from moving into a big city." (Hans Meinhardt, "The Algorithmic Beauty of Sea Shells", 1995)

"Rather mathematicians like to look for patterns, and the primes probably offer the ultimate challenge. When you look at a list of them stretching off to infinity, they look chaotic, like weeds growing through an expanse of grass representing all numbers. For centuries mathematicians have striven to find rhyme and reason amongst this jumble. Is there any music that we can hear in this random noise? Is there a fast way to spot that a particular number is prime? Once you have one prime, how much further must you count before you find the next one on the list? These are the sort of questions that have tantalized generations." (Marcus du Sautoy, "The Music of the Primes", 1998)

"Data are collected as a basis for action. Yet before anyone can use data as a basis for action the data have to be interpreted. The proper interpretation of data will require that the data be presented in context, and that the analysis technique used will filter out the noise."  (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"Data are generally collected as a basis for action. However, unless potential signals are separated from probable noise, the actions taken may be totally inconsistent with the data. Thus, the proper use of data requires that you have simple and effective methods of analysis which will properly separate potential signals from probable noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

19 December 2020

On Randomness IX (Probabilities)

"The most important application of the theory of probability is to what we may call 'chance-like' or 'random' events, or occurrences. These seem to be characterized by a peculiar kind of incalculability which makes one disposed to believe - after many unsuccessful attempts - that all known rational methods of prediction must fail in their case. We have, as it were, the feeling that not a scientist but only a prophet could predict them. And yet, it is just this incalculability that makes us conclude that the calculus of probability can be applied to these events." (Karl R Popper, "The Logic of Scientific Discovery", 1934)

"The classical theory of probability was devoted mainly to a study of the gamble's gain, which is again a random variable; in fact, every random variable can be interpreted as the gain of a real or imaginary gambler in a suitable game." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)

"To every event defined for the original random walk there corresponds an event of equal probability in the dual random walk, and in this way almost every probability relation has its dual." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)

"The epistemological value of probability theory is based on the fact that chance phenomena, considered collectively and on a grand scale, create non-random regularity." (Andrey Kolmogorov, "Limit Distributions for Sums of Independent Random Variables", 1954)

"The urn model is to be the expression of three postulates: (1) the constancy of a probability distribution, ensured by the solidity of the vessel, (2) the random-character of the choice, ensured by the narrowness of the mouth, which is to prevent visibility of the contents and any consciously selective choice, (3) the independence of successive choices, whenever the drawn balls are put back into the urn. Of course in abstract probability and statistics the word 'choice' can be avoided and all can be done without any reference to such a model. But as soon as the abstract theory is to be applied, random choice plays an essential role."(Hans Freudenthal, "The Concept and the Role of the Model in Mathematics and Natural and Social Sciences", 1961)

"Probability theory is an ideal tool for formalizing uncertainty in situations where class frequencies are known or where evidence is based on outcomes of a sufficiently long series of independent random experiments. Possibility theory, on the other hand, is ideal for formalizing incomplete information expressed in terms of fuzzy propositions." (George Klir, "Fuzzy sets and fuzzy logic", 1995)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"The subject of probability begins by assuming that some mechanism of uncertainty is at work giving rise to what is called randomness, but it is not necessary to distinguish between chance that occurs because of some hidden order that may exist and chance that is the result of blind lawlessness. This mechanism, figuratively speaking, churns out a succession of events, each individually unpredictable, or it conspires to produce an unforeseeable outcome each time a large ensemble of possibilities is sampled."  (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"Chance is just as real as causation; both are modes of becoming.  The way to model a random process is to enrich the mathematical theory of probability with a model of a random mechanism. In the sciences, probabilities are never made up or 'elicited' by observing the choices people make, or the bets they are willing to place.  The reason is that, in science and technology, interpreted probability exactifies objective chance, not gut feeling or intuition. No randomness, no probability." (Mario Bunge, "Chasing Reality: Strife over Realism", 2006) 

"[...] according to the quantum theory, randomness is a basic trait of reality, whereas in classical physics it is a derivative property, though an equally objective one. Note, however, that this conclusion follows only under the realist interpretation of probability as the measure of possibility. If, by contrast, one adopts the subjectivist or Bayesian conception of probability as the measure of subjective uncertainty, then randomness is only in the eye of the beholder." (Mario Bunge, "Matter and Mind: A Philosophical Inquiry", 2010)

On Randomness VII (Events I)

"The very events which in their own nature appear most capricious and uncertain, and which in any individual case no attainable degree of knowledge would enable us to foresee, occur, when considerable numbers are taken into account, with a degree of regularity approaching to mathematical." (John S Mills, "A System of Logic", 1862)

"To every event defined for the original random walk there corresponds an event of equal probability in the dual random walk, and in this way almost every probability relation has its dual." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)

"Perhaps randomness is not merely an adequate description for complex causes that we cannot specify. Perhaps the world really works this way, and many events are uncaused in any conventional sense of the word." (Stephen J Gould, "Hen's Teeth and Horse's Toes", 1983).

"If you perceive the world as some place where things happen at random - random events over which you have sometimes very little control, sometimes fairly good control, but still random events - well, one has to be able to have some idea of how these things behave. […] People who are not used to statistics tend to see things in data - there are random fluctuations which can sometimes delude them - so you have to understand what can happen randomly and try to control whatever can be controlled. You have to expect that you are not going to get a clean-cut answer. So how do you interpret what you get? You do it by statistics." (Lucien LeCam, [interview] 1988)

"Randomness is the very stuff of life, looming large in our everyday experience. […] The fascination of randomness is that it is pervasive, providing the surprising coincidences, bizarre luck, and unexpected twists that color our perception of everyday events." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"The subject of probability begins by assuming that some mechanism of uncertainty is at work giving rise to what is called randomness, but it is not necessary to distinguish between chance that occurs because of some hidden order that may exist and chance that is the result of blind lawlessness. This mechanism, figuratively speaking, churns out a succession of events, each individually unpredictable, or it conspires to produce an unforeseeable outcome each time a large ensemble of possibilities is sampled."  (Edward Beltrami, "Chaos and Order in Mathematics and Life", 1999)

"Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns - which frequently occur in random data - could equally well be derived from tossing a coin. So it is in the stock market as well." (Didier Sornette, "Why Stock Markets Crash: Critical events in complex financial systems", 2003)

"[…] we would like to observe that the butterfly effect lies at the root of many events which we call random. The final result of throwing a dice depends on the position of the hand throwing it, on the air resistance, on the base that the die falls on, and on many other factors. The result appears random because we are not able to take into account all of these factors with sufficient accuracy. Even the tiniest bump on the table and the most imperceptible move of the wrist affect the position in which the die finally lands. It would be reasonable to assume that chaos lies at the root of all random phenomena." (Iwo Białynicki-Birula & Iwona Białynicka-Birula, "Modeling Reality: How Computers Mirror Life", 2004)

"A Black Swan is a highly improbable event with three principal characteristics: It is unpredictable; it carries a massive impact; and, after the fact, we concoct an explanation that makes it appear less random, and more predictable, than it was. […] The Black Swan idea is based on the structure of randomness in empirical reality. [...] the Black Swan is what we leave out of simplification." (Nassim N Taleb, "The Black Swan", 2007)

"Regression toward the mean. That is, in any series of random events an extraordinary event is most likely to be followed, due purely to chance, by a more ordinary one." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

On Randomness V (Systems I)

"Is a random outcome completely determined, and random only by virtue of our ignorance of the most minute contributing factors? Or are the contributing factors unknowable, and therefore render as random an outcome that can never be determined? Are seemingly random events merely the result of fluctuations superimposed on a determinate system, masking its predictability, or is there some disorderliness built into the system itself?” (Deborah J Bennett, "Randomness", 1998)

"The self-similarity of fractal structures implies that there is some redundancy because of the repetition of details at all scales. Even though some of these structures may appear to teeter on the edge of randomness, they actually represent complex systems at the interface of order and disorder."  (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"If a network is solely composed of neighborhood connections, information must traverse a large number of connections to get from place to place. In a small-world network, however, information can be transmitted between any two nodes using, typically, only a small number of connections. In fact, just a small percentage of random, long-distance connections is required to induce such connectivity. This type of network behavior allows the generation of 'six degrees of separation' type results, whereby any agent can connect to any other agent in the system via a path consisting of only a few intermediate nodes." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"Although the potential for chaos resides in every system, chaos, when it emerges, frequently stays within the bounds of its attractor(s): No point or pattern of points is ever repeated, but some form of patterning emerges, rather than randomness. Life scientists in different areas have noticed that life seems able to balance order and chaos at a place of balance known as the edge of chaos. Observations from both nature and artificial life suggest that the edge of chaos favors evolutionary adaptation." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system. This property of a non-linear system such as the weather is known as the butterfly effect where it is purported that a butterfly flapping its wings in Japan can give rise to a tornado in Kansas. This unpredictable behaviour of nonlinear dynamical systems, i.e. its extreme sensitivity to initial conditions, seems to be random and is therefore referred to as chaos. This chaotic and seemingly random behaviour occurs for non-linear deterministic system in which effects can be linked to causes but cannot be predicted ahead of time." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010)

"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini, "Chaos: From Simple Models to Complex Systems", 2010)

"Although cascading failures may appear random and unpredictable, they follow reproducible laws that can be quantified and even predicted using the tools of network science. First, to avoid damaging cascades, we must understand the structure of the network on which the cascade propagates. Second, we must be able to model the dynamical processes taking place on these networks, like the flow of electricity. Finally, we need to uncover how the interplay between the network structure and dynamics affects the robustness of the whole system." (Albert-László Barabási, "Network Science", 2016)

Émile Boutroux - Collected Quotes

"In spite of their relations, science and religion remain, and must remain, distinct. If there were no other way of establishing a rational order between things than that of reducing the many to the one, either by assimilation or by elimination, the destiny of religion would appear doubtful." (Émile Boutroux, "Science and Religion in Contemporary Philosophy", 1908)

"There can be nothing clearer or more convenient for the purpose of setting one's ideas in order and for conducting an abstract discussion, than precise definitions and inviolable lines of demarcation." (Émile Boutroux, "Science and Religion in Contemporary Philosophy", 1908)

"The senses afford a primary conception of the world, which they show to be a mass of facts, endless in their variety. Man may observe, analyse, and describe them with ever-increasing exactness: this very description constitutes science." (Émile Boutroux, "The Contingency of the Laws of Nature", 1911)

"Science is reduction. Mathematics is its ideal, its form par excellence, for it is in mathematics that assimilation, identification, is most perfectly realized. The universe, scientifically explained, would be a certain formula, one and eternal, regarded as the equivalent of the entire diversity and movement of things." (Émile Boutroux, "Natural law in Science and Philosophy", 1914)

"The mathematical laws presuppose a very complex elaboration. They are not known exclusively either a priori or a posteriori, but are a creation of the mind; and this creation is not an arbitrary one, but, owing to the mind’s resources, takes place with reference to experience and in view of it. Sometimes the mind starts with intuitions which it freely creates; sometimes, by a process of elimination, it gathers up the axioms it regards as most suitable for producing a harmonious development, one that is both simple and fertile. The mathematics is a voluntary and intelligent adaptation of thought to things, it represents the forms that will allow of qualitative diversity being surmounted, the moulds into which reality must enter in order to become as intelligible as possible." (Émile Boutroux, "Natural Law in Science and Philosophy", 1914)

"The philosopher asks himself whether natural law as assumed by science, wholly coincides with law as really existing in nature; whether science and reality are so alike that science may be regarded as exhausting everything intelligible and true that the real contains." (Émile Boutroux, "Natural Law in Science and Philosophy", 1914)

"The world is an endless variety of facts, linked together by necessary and immutable bonds." (Émile Boutroux, "Natural law in Science and Philosophy", 1914)

Burton G Malkiel - Collected Quotes

"Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns - which frequently occur in random data - could equally well be derived from tossing a coin. So it is in the stock market as well." (Burton G Malkiel, "A Random Walk Down Wall Street", 1989)

"A random walk is one in which future steps or directions cannot be predicted on the basis of past history. When the term is applied to the stock market, it means that short-run changes in stock prices are unpredictable. Investment advisory services, earnings forecasts, and chart patterns are useless. [...] What are often called 'persistent patterns' in the stock market occur no more frequently than the runs of luck in the fortunes of any gambler playing a game of chance. This is what economists mean when they say that stock prices behave very much like a random walk." (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"[...] an accurate statement of the 'weak' form of the random-walk hypothesis goes as follows: The history of stock price movements contains no useful information that will enable an investor consistently to outperform a buy-and-hold strategy in managing a portfolio. [...] Moreover, new fundamental information about a company [...] is also unpredictable. It will occur randomly over time. Indeed, successive appearances of news items must be random. If an item of news were not random, that is, if it were dependent on an earlier item of news, then it wouldn't be news at all. The weak form of the random-walk theory says only that stock prices cannot be predicted on the basis of past stock prices. [...] the weak form of the efficient-market hypothesis (the random-walk notion) says simply that the technical analysis of past price patterns to forecast the future is useless because any information from such an analysis will already have been incorporated in current market prices." (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"Informational cascades occur when individuals choose to ignore or downplay their private information and instead jump on the bandwagon by mimicking the actions of individuals who acted previously. Informational cascades occur when the existing aggregate information becomes so overwhelming that an individual’s single piece of private information is not strong enough to reverse the decision of the crowd. Therefore, the individual chooses to mimic the action of the crowd, rather than act on his private information. If this scenario holds for one individual, then it likely also holds for anyone acting after this person. This domino-like effect is often referred to as a cascade. The two crucial ingredients for an informational cascade to develop are: (i) sequential decisions with subsequent actors observing decisions (not information) of previous actors; and (ii) a limited action space." (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"Knowledge is encoded in models. Models are synthetic sets of rules, pictures, and algorithms providing us with useful representations of the world of our perceptions and of their patterns. As argued by philosophers and shown by scientists, we do not have access to 'reality', only to some of its manifestations, whose regularities are used to determine rules, which when widely applicable become 'laws of nature'. These laws are constantly tested in the scientific march, and they evolve, develop and transmute as the frontier of knowledge recedes further away."  (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"Perhaps the most common complaint about the weakness of the random-walk theory is based on a distrust of mathematics and a misconception of what the theory means. 'The market isn't random', the complaint goes, 'and no mathematician is going to convince me it is'. [...] But, even if markets were dominated during certain periods by irrational crowd behavior, the stock market might still well be approximated by a random walk. The original illustrative analogy of a random walk concerned a drunken man staggering around an empty field. He is not rational, but he's not predictable either." (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"Reputational herding, like cascades, takes place when an agent chooses to ignore his or her private information and mimic the action of another agent who has acted previously. However, reputational herding models have an additional layer of mimicking, resulting from positive reputational properties that can be obtained by acting as part of a group or choosing a certain project."  (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"The random-walk theory does not, as some critics have proclaimed, state that stock prices move aimlessly and erratically and are insensitive to changes in fundamental information. On the contrary, the point of the random-walk theory is just the opposite: The market is so efficient - prices move so quickly when new information does arise, that no one can consistently buy or sell quickly enough to benefit.(Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

On Randomness III (Random-Walks)

"To every event defined for the original random walk there corresponds an event of equal probability in the dual random walk, and in this way almost every probability relation has its dual." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)

"I suspect that even if the random walkers announced a perfect mathematic proof of randomness I would go on believing that in the long run future earnings influence present value, and that in the short run the dominant factor is the elusive Australopithecus, the temper of the crowd." (Adam Smith, "The Money Game", 1968)

"A weakness of the random-walk model lies in its assumption of instantaneous adjustment, whereas the information impelling a stock market toward its 'intrinsic value' gradually becomes disseminated throughout the market place." (Richard A Epstein, The Theory of Gambling and Statistical Logic, 1977)

"However, random walk theory also tells us that the chance that the balance never returns to zero - that is, that H stays in the lead for ever - is 0. This is the sense in which the 'law of averages' is true. If you wait long enough, then almost surely the numbers of heads and tails will even out. But this fact carries no implications about improving your chances of winning, if you're betting on whether H or T turns up. The probabilities are unchanged, and you don't know how long the 'long run' is going to be. Usually it is very long indeed." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"In everyday language, a fair coin is called random, but not a coin that shows head more often than tail. A coin that keeps a memory of its own record of heads and tails is viewed as even less random. This mental picture is present in the term random walk, especially as used in finance." (Benoit B Mandelbrot, "Fractals and Scaling in Finance: Discontinuity, concentration, risk", 1997)

"The 'law of averages' asserts itself not by removing imbalances, but by swamping them. Random walk theory tells us that if you wait long enough - on average, infinitely long - then eventually the numbers will balance out. If you stop at that very instant, then you may imagine that your intuition about a 'law of averages' is justified. But you're cheating: you stopped when you got the answer you wanted. Random walk theory also tells us that if you carry on for long enough, you will reach a situation where the number of H's is a billion more than the number of T's." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"A random walk is one in which future steps or directions cannot be predicted on the basis of past history. When the term is applied to the stock market, it means that short-run changes in stock prices are unpredictable. Investment advisory services, earnings forecasts, and chart patterns are useless. [...] What are often called 'persistent patterns' in the stock market occur no more frequently than the runs of luck in the fortunes of any gambler playing a game of chance. This is what economists mean when they say that stock prices behave very much like a random walk." (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"[...] an accurate statement of the 'weak' form of the random-walk hypothesis goes as follows: The history of stock price movements contains no useful information that will enable an investor consistently to outperform a buy-and-hold strategy in managing a portfolio. [...] Moreover, new fundamental information about a company [...] is also unpredictable. It will occur randomly over time. Indeed, successive appearances of news items must be random. If an item of news were not random, that is, if it were dependent on an earlier item of news, then it wouldn't be news at all. The weak form of the random-walk theory says only that stock prices cannot be predicted on the basis of past stock prices. [...] the weak form of the efficient-market hypothesis (the random-walk notion) says simply that the technical analysis of past price patterns to forecast the future is useless because any information from such an analysis will already have been incorporated in current market prices." (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"Perhaps the most common complaint about the weakness of the random-walk theory is based on a distrust of mathematics and a misconception of what the theory means. 'The market isn't random', the complaint goes, 'and no mathematician is going to convince me it is'. [...] But, even if markets were dominated during certain periods by irrational crowd behavior, the stock market might still well be approximated by a random walk. The original illustrative analogy of a random walk concerned a drunken man staggering around an empty field. He is not rational, but he's not predictable either." (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"The random-walk theory does not, as some critics have proclaimed, state that stock prices move aimlessly and erratically and are insensitive to changes in fundamental information. On the contrary, the point of the random-walk theory is just the opposite: The market is so efficient - prices move so quickly when new information does arise, that no one can consistently buy or sell quickly enough to benefit.(Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"The concept of a random walk is simple but rich for its many applications, not only in finance but also in physics and the description of natural phenomena. It is arguably one of the most founding concepts in modern physics as well as in finance, as it underlies the theories of elementary particles, which are the building blocks of our universe, as well as those describing the complex organization of matter around us." (Didier Sornette, "Why Stock Markets Crash: Critical Events in Complex Systems", 2003)

"The most important prediction of the random walk model is that the square of the fluctuations of its position should increase in proportion to the time scale. This is equivalent to saying that the typical amplitude of its position is proportional to the square root of the time scale. (Didier Sornette, "Why Stock Markets Crash: Critical events in complex financial systems", 2003)

"Just by looking at accelerating complexification of the Universe of which we are an integral part, we can conclude that we are not subjected to a random walk of evolution, nor are we subjected to a deterministic script of Nature, the truth lies somewhere in between – we are part of teleological evolution." (Alex M Vikoulov, "The Syntellect Hypothesis: Five Paradigms of the Mind's Evolution", 2019)

17 December 2020

Lars Skyttner - Collected Quotes

"A mathematical model uses mathematical symbols to describe and explain the represented system. Normally used to predict and control, these models provide a high degree of abstraction but also of precision in their application." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"A symbol is a mental representation regarding the internal reality referring to its object by a convention and produced by the conscious interpretation of a sign. In contrast to signals, symbols may be used every time if the receiver has the corresponding representation. Symbols also relate to feelings and thus give access not only to information but also to the communicator’s motivational and emotional state. The use of symbols makes it possible for the organism using it to evoke in the receiver the same response it evokes in himself. To communicate with symbols is to use a language." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"As a meta-discipline, systems science will transfer its content from discipline to discipline and address problems beyond conventional reductionist boundaries. Generalists, qualified to manage today’s problem better than the specialist, could be fostered. With these intentions, systems thinking and systems science should not replace but add, complement and integrate those aspects that seem not to be adequately treated by traditional science." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Defined from a societal standpoint, information may be seen as an entity which reduces maladjustment between system and environment. In order to survive as a thermodynamic entity, all social systems are dependent upon an information flow. This explanation is derived from the parallel between entropy and information where the latter is regarded as negative entropy (negentropy). In more common terms information is a form of processed data or facts about objects, events or persons, which are meaningful for the receiver, inasmuch as an increase in knowledge reduces uncertainty." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Expressed in terms of entropy, open systems are negentropic, that is, tend toward a more elaborate structure. As open systems, organisms which are in equilibrium are capable of working for a long time by use of the constant input of matter and energy. Closed systems, however, increase their entropy, tend to run down and can therefore be called ’dying systems’. When reaching a steady state the closed system is not capable of performing any work." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Heisenberg’s principle must be considered a special case of the complementarity principle […]. This states that an experiment on one aspect of a system (of atomic dimensions) destroys the possibility of learning about a complementarity aspect of the same system. Together these principles have shocking consequences for the comprehension of entropy and determinism." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"In the definition of meaning, it is assumed that both the source and receiver have previously coded (and stored) signals of the same or similar referents, such that the messages may have meaning and relate to behaviour. That is, the used symbols must have the same signification for both sender and receiver. If not, the receiver will create a different mental picture than intended by the transmitter. Meaning is generated by individuals in a process of social interaction with a more or less common environment. It is a relation subsisting within a field of experience and appears as an emergent property of a symbolic representation when used in culturally accepted interaction. The relation between the symbolic representation and its meaning is random. Of this, however, the mathematical theory has nothing to say. If human links in the chain of communication are missing, of course no questions of meaning will arise." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Information is neither matter nor energy, it is rather an abstract concept of the same kind as entropy, which must be considered a conceptual relative. 'Amount of information' is a metaphorical term and has in fact no numerical properties."  (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Information entropy has its own special interpretation and is defined as the degree of unexpectedness in a message. The more unexpected words or phrases, the higher the entropy. It may be calculated with the regular binary logarithm on the number of existing alternatives in a given repertoire. A repertoire of 16 alternatives therefore gives a maximum entropy of 4 bits. Maximum entropy presupposes that all probabilities are equal and independent of each other. Minimum entropy exists when only one possibility is expected to be chosen. When uncertainty, variety or entropy decreases it is thus reasonable to speak of a corresponding increase in information." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Living systems in general are energy transducers which use information to perform more efficiently, converting one form of energy into another, and converting energy into information. Living species have developed a genius system to overcome entropy by their procreative faculty. […] Storing the surplus energy in order to survive is to reverse the entropic process or to create negentropy. A living being can only resist the degradation of its own structure. The entropic process influencing the structure and environment of the whole system is beyond individual control." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Potential energy is organized energy, heat is disorganized energy and entropy therefore results in dissolution and disorder. The sum of all the quantities of heat lost in the course of all the activities that have taken place in the universe equals the total accumulation of entropy. A popular analogy of entropy is that it is not possible to warm oneself on something which is colder than oneself. […] Note also that maximun entropy is maximum randomization." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Reductionism argues that from scientific theories which explain phenomena on one level, explanations for a higher level can be deduced. Reality and our experience can be reduced to a number of indivisible basic elements. Also qualitative properties are possible to reduce to quantitative ones." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Systems science, with such an ambition and with its basic Systems Theory, provides a general language with which to tie together various areas in interdisciplinary communication. As such it automatically strives towards a universal science, i.e. to join together the many splintered disciplines with a 'law of laws', applicable to them all and integrating all scientific knowledge." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Systems thinking expands the focus of the observer, whereas analytical thinking reduces it. In other words, analysis looks into things, synthesis looks out of them. This attitude of systems thinking is often called expansionism, an alternative to classic reductionism. Whereas analytical thinking concentrates on static and structural properties, systems thinking concentrates on the function and behaviour of whole systems. Analysis gives description and knowledge; systems thinking gives explanation and understanding." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re- coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The function of living matter is apparently to expand the organization of the universe. Here, locally decreased entropy as a result of biological order in existing life is invalidating the effects of the second law of thermodynamics, although at the expense of increased entropy in the whole system. It is the running down of the universe that made the sun and the earth possible. It is the running down of the sun that made life and us possible." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The organizing principle of purpose can generally have two directions: one towards the system itself and one towards the environment. Directed towards the system, the aim is to maintain homeostasis. Directed towards the environment, the aim is often to modify it to resemble a desired state or, if this is not possible, to bypass or override the disturbances." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The second law of thermodynamics states that all energy in the universe degrades irreversibly. Thus, differences between energy forms must decrease over time. Everything is spread! (The principle of degradation of energy with regard to quality.) Translated to the area of systems the law tells us that the entropy of an isolated system always increases. Another consequence is that when two systems are joined together, the entropy of the united system is greater than the sum of the entropies of the individual systems." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)


On Hypothesis Testing III

  "A little thought reveals a fact widely understood among statisticians: The null hypothesis, taken literally (and that’s the only way...