
06 September 2025

On Connectedness (1975-1999)

"We have reversed the usual classical notion that the independent 'elementary parts' of the world are the fundamental reality, and that the various systems are merely particular contingent forms and arrangements of these parts. Rather, we say that inseparable quantum interconnectedness of the whole universe is the fundamental reality, and that relatively independent behaving parts are merely particular and contingent forms within this whole." (David Bohm, "On the Intuitive Understanding of Nonlocality as Implied by Quantum Theory", Foundations of Physics Vol 5 (1), 1975)

"Information is recorded in vast interconnecting networks. Each idea or image has hundreds, perhaps thousands, of associations and is connected to numerous other points in the mental network." (Peter Russell, "The Brain Book: Know Your Own Mind and How to Use it", 1979)

"The world is a complex, interconnected, finite, ecological–social–psychological–economic system. We treat it as if it were not, as if it were divisible, separable, simple, and infinite. Our persistent, intractable global problems arise directly from this mismatch." (Donella Meadows,"Whole Earth Models and Systems", 1982)

"All certainty in our relationships with the world rests on acknowledgement of causality. Causality is a genetic connection of phenomena through which one thing (the cause) under certain conditions gives rise to, causes something else (the effect). The essence of causality is the generation and determination of one phenomenon by another." (Alexander Spirkin, "Dialectical Materialism", 1983)

"When loops are present, the network is no longer singly connected and local propagation schemes will invariably run into trouble. [...] If we ignore the existence of loops and permit the nodes to continue communicating with each other as if the network were singly connected, messages may circulate indefinitely around the loops and process may not converges to a stable equilibrium. […] Such oscillations do not normally occur in probabilistic networks […] which tend to bring all messages to some stable equilibrium as time goes on. However, this asymptotic equilibrium is not coherent, in the sense that it does not represent the posterior probabilities of all nodes of the network." (Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference", 1988)

"Systems thinking is a discipline for seeing wholes. It is a framework for seeing interrelationships rather than things, for seeing patterns of change rather than static 'snapshots'. It is a set of general principles- distilled over the course of the twentieth century, spanning fields as diverse as the physical and social sciences, engineering, and management. [...] During the last thirty years, these tools have been applied to understand a wide range of corporate, urban, regional, economic, political, ecological, and even psychological systems. And systems thinking is a sensibility for the subtle interconnectedness that gives living systems their unique character." (Peter Senge, "The Fifth Discipline", 1990)

"In sharp contrast (with the traditional social planning) the systems design approach seeks to understand a problem situation as a system of interconnected, interdependent, and interacting issues and to create a design as a system of interconnected, interdependent, interacting, and internally consistent solution ideas." (Béla H Bánáthy, "Designing Social Systems in a Changing World", 1996)

"In the new systems thinking, the metaphor of knowledge as a building is being replaced by that of the network. As we perceive reality as a network of relationships, our descriptions, too, form an interconnected network of concepts and models in which there are no foundations. For most scientists such a view of knowledge as a network with no firm foundations is extremely unsettling, and today it is by no means generally accepted. But as the network approach expands throughout the scientific community, the idea of knowledge as a network will undoubtedly find increasing acceptance." (Fritjof Capra," The Web of Life: a new scientific understanding of living systems", 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

On Connectedness (2000-2009)

"A self-organizing system not only regulates or adapts its behavior, it creates its own organization. In that respect it differs fundamentally from our present systems, which are created by their designer. We define organization as structure with function. Structure means that the components of a system are arranged in a particular order. It requires both connections, that integrate the parts into a whole, and separations that differentiate subsystems, so as to avoid interference. Function means that this structure fulfils a purpose." (Francis Heylighen & Carlos Gershenson, "The Meaning of Self-organization in Computing", IEEE Intelligent Systems, 2003)

"At an anatomical level - the level of pure, abstract connectivity - we seem to have stumbled upon a universal pattern of complexity. Disparate networks show the same three tendencies: short chains, high clustering, and scale-free link distributions. The coincidences are eerie, and baffling to interpret." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Average path length reflects the global structure; it depends on the way the entire network is connected, and cannot be inferred from any local measurement. Clustering reflects the local structure; it depends only on the interconnectedness of a typical neighborhood, the inbreeding among nodes tied to a common center. Roughly speaking, path length measures how big the network is. Clustering measures how incestuous it is." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"By its very nature, the mathematical study of networks transcends the usual boundaries between disciplines. Network theory is concerned with the relationships between individuals, the patterns of interactions. The precise nature of the individuals is downplayed, or even suppressed, in hopes of uncovering deeper laws. A network theorist will look at any system of interlinked components and see an abstract pattern of dots connected by lines. It's the pattern that matters, the architecture of relationships, not the identities of the dots themselves. Viewed from these lofty heights, many networks, seemingly unrelated, begin to look the same." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"By 'network' I mean a set of dynamical systems that are 'coupled together', with some influencing the behavior of others. The systems themselves are the nodes of the network- think of them as blobs - and two nodes are joined by an arrow if one of them (at the tail end) influences the other (at the head end). For example, each node might be a nerve cell in some organism, and the arrows might be connections along which signals pass from one cell to another." (Ian Stewart, "Letters to a Young Mathematician", 2006)

"Connectivity harbors other risks too. As we create more links among the nodes of our technological and social networks, these networks sometimes developed unexpected patterns of connections that make breakdown more likely. They can, for instance, develop harmful feedback loops - what people commonly call vicious circles - that reinforce instabilities and even lead to collapse." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"Initially, increasing connectedness and diversity helps, but as the connections become increasingly dense, the system gets very strongly coupled so that a failure in one part reverberates throughout the entire network." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"Nodes and connectors comprise the structure of a network. In contrast, an ecology is a living organism. It influences the formation of the network itself." (George Siemens, "Knowing Knowledge", 2006)

"Scale-free networks are particularly vulnerable to intentional attack: if someone wants to wreck the whole network, he simply needs to identify and destroy some of its hubs. And here we see how our world’s increasing connectivity really matters. Scientists have found that as a scale-free network like the Internet or our food-distribution system grows- as it adds more nodes - the new nodes tend to hook up with already highly connected hubs." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"If a network is solely composed of neighborhood connections, information must traverse a large number of connections to get from place to place. In a small-world network, however, information can be transmitted between any two nodes using, typically, only a small number of connections. In fact, just a small percentage of random, long-distance connections is required to induce such connectivity. This type of network behavior allows the generation of 'six degrees of separation' type results, whereby any agent can connect to any other agent in the system via a path consisting of only a few intermediate nodes." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"Networks may also be important in terms of view. Many models assume that agents are bunched together on the head of a pin, whereas the reality is that most agents exist within a topology of connections to other agents, and such connections may have an important influence on behavior. […] Models that ignore networks, that is, that assume all activity takes place on the head of a pin, can easily suppress some of the most interesting aspects of the world around us. In a pinhead world, there is no segregation, and majority rule leads to complete conformity - outcomes that, while easy to derive, are of little use." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"A graph enables us to visualize a relation over a set, which makes the characteristics of relations such as transitivity and symmetry easier to understand. […] Notions such as paths and cycles are key to understanding the more complex and powerful concepts of graph theory. There are many degrees of connectedness that apply to a graph; understanding these types of connectedness enables the engineer to understand the basic properties that can be defined for the graph representing some aspect of his or her system. The concepts of adjacency and reachability are the first steps to understanding the ability of an allocated architecture of a system to execute properly." (Dennis M Buede, "The Engineering Design of Systems: Models and methods", 2009)

"Complexity theory embraces things that are complicated, involve many elements and many interactions, are not deterministic, and are given to unexpected outcomes. […] A fundamental aspect of complexity theory is the overall or aggregate behavior of a large number of items, parts, or units that are entangled, connected, or networked together. […] In contrast to classical scientific methods that directly link theory and outcome, complexity theory does not typically provide simple cause-and-effect explanations." (Robert E Gunther et al, "The Network Challenge: Strategy, Profit, and Risk in an Interlinked World", 2009)

"The simplest basic architecture of an artificial neural network is composed of three layers of neurons - input, output, and intermediary (historically called perceptron). When the input layer is stimulated, each node responds in a particular way by sending information to the intermediary level nodes, which in turn distribute it to the output layer nodes and thereby generate a response. The key to artificial neural networks is in the ways that the nodes are connected and how each node reacts to the stimuli coming from the nodes it is connected to. Just as with the architecture of the brain, the nodes allow information to pass only if a specific stimulus threshold is passed. This threshold is governed by a mathematical equation that can take different forms. The response depends on the sum of the stimuli coming from the input node connections and is 'all or nothing'." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

On Connectedness (-1974)

"The first attempts to consider the behavior of so-called 'random neural nets' in a systematic way have led to a series of problems concerned with relations between the 'structure' and the 'function' of such nets. The 'structure' of a random net is not a clearly defined topological manifold such as could be used to describe a circuit with explicitly given connections. In a random neural net, one does not speak of 'this' neuron synapsing on 'that' one, but rather in terms of tendencies and probabilities associated with points or regions in the net." (Anatol Rapoport, "Cycle distributions in random nets", The Bulletin of Mathematical Biophysics 10(3), 1948)

"Equilibrium requires that the whole of the structure, the form of its elements, and the means of interconnection be so combined that at the supports there will automatically be produced passive forces or reactions that are able to balance the forces acting upon the structures, including the force of its own weight."  (Eduardo Torroja, "Philosophy of Structure", 1951)

"The principle of complementarity states that no single model is possible which could provide a precise and rational analysis of the connections between these phenomena [before and after measurement]. In such a case, we are not supposed, for example, to attempt to describe in detail how future phenomena arise out of past phenomena. Instead, we should simply accept without further analysis the fact that future phenomena do in fact somehow manage to be produced, in a way that is, however, necessarily beyond the possibility of a detailed description. The only aim of a mathematical theory is then to predict the statistical relations, if any, connecting the phenomena." (David Bohm, "A Suggested Interpretation of the Quantum Theory in Terms of ‘Hidden’ Variables", 1952)

"[…] there are three different but interconnected conceptions to be considered in every structure, and in every structural element involved: equilibrium, resistance, and stability." (Eduardo Torroja, "Philosophy of Structure", 1951)

"General Systems Theory is a name which has come into use to describe a level of theoretical model-building which lies somewhere between the highly generalized constructions of pure mathematics and the specific theories of the specialized disciplines. Mathematics attempts to organize highly general relationships into a coherent system, a system however which does not have any necessary connections with the 'real' world around us. It studies all thinkable relationships abstracted from any concrete situation or body of empirical knowledge." (Kenneth E Boulding, "General Systems Theory - The Skeleton of Science", Management Science Vol. 2 (3), 1956)

"The essential vision of reality presents us not with fugitive appearances but with felt patterns of order which have coherence and meaning for the eye and for the mind. Symmetry, balance and rhythmic sequences express characteristics of natural phenomena: the connectedness of nature - the order, the logic, the living process. Here art and science meet on common ground." (Gyorgy Kepes, "The New Landscape: In Art and Science", 1956)

"In fact, it is empirically ascertainable that every event is actually produced by a number of factors, or is at least accompanied by numerous other events that are somehow connected with it, so that the singling out involved in the picture of the causal chain is an extreme abstraction. Just as ideal objects cannot be isolated from their proper context, material existents exhibit multiple interconnections; therefore the universe is not a heap of things but a system of interacting systems." (Mario Bunge, "Causality: The place of the casual principles in modern science", 1959)

"The general models, even of the most elaborate kind, serve the simple purpose of demonstrating the interconnectedness of all economic phenomena, and show how, under certain conditions, price may act as a guiding link between them. Looked at in another way such models show how a complex set of interrelations can hang together consistently without any central administrative direction." (Ely Devons, "Essays in Economics", 1961)

"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […]  'Organizing' […] may also mean 'changing from a bad organization to a good one' […] The system would be 'self-organizing' if a change were automatically made to the feedback, changing it from positive to negative; then the whole would have changed from a bad organization to a good." (W Ross Ashby, "Principles of the self-organizing system", 1962)

"A NETWORK is a collection of connected lines, each of which indicates the movement of some quantity between two locations. Generally, entrance to a network is via a source (the starting point) and exit from a network is via a sink (the finishing point); the lines which form the network are called links (or arcs), and the points at which two or more links meet are called nodes." (Cecil W Lowe, "Critical Path Analysis by Bar Chart", 1966)

21 January 2021

Complex Systems V

"Complexity is the characteristic property of complicated systems we don’t understand immediately. It is the amount of difficulties we face while trying to understand it. In this sense, complexity resides largely in the eye of the beholder - someone who is familiar with s.th. often sees less complexity than someone who is less familiar with it. [...] A complex system is created by evolutionary processes. There are multiple pathways by which a system can evolve. Many complex systems are similar, but each instance of a system is unique." (Jochen Fromm, The Emergence of Complexity, 2004)

"In complexity thinking the darkness principle is covered by the concept of incompressibility [...] The concept of incompressibility suggests that the best representation of a complex system is the system itself and that any representation other than the system itself will necessarily misrepresent certain aspects of the original system." (Kurt Richardson, "Systems theory and complexity: Part 1", Emergence: Complexity & Organization Vol.6 (3), 2004)

"The basic concept of complexity theory is that systems show patterns of organization without organizer (autonomous or self-organization). Simple local interactions of many mutually interacting parts can lead to emergence of complex global structures. […] Complexity originates from the tendency of large dynamical systems to organize themselves into a critical state, with avalanches or 'punctuations' of all sizes. In the critical state, events which would otherwise be uncoupled became correlated." (Jochen Fromm, "The Emergence of Complexity", 2004)

"Complexity arises when emergent system-level phenomena are characterized by patterns in time or a given state space that have neither too much nor too little form. Neither in stasis nor changing randomly, these emergent phenomena are interesting, due to the coupling of individual and global behaviours as well as the difficulties they pose for prediction. Broad patterns of system behaviour may be predictable, but the system's specific path through a space of possible states is not." (Steve Maguire et al, "Complexity Science and Organization Studies", 2006)

"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly.  A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"The butterfly effect demonstrates that complex dynamical systems are highly responsive and interconnected webs of feedback loops. It reminds us that we live in a highly interconnected world. Thus our actions within an organization can lead to a range of unpredicted responses and unexpected outcomes. This seriously calls into doubt the wisdom of believing that a major organizational change intervention will necessarily achieve its pre-planned and highly desired outcomes. Small changes in the social, technological, political, ecological or economic conditions can have major implications over time for organizations, communities, societies and even nations." (Elizabeth McMillan, "Complexity, Management and the Dynamics of Change: Challenges for practice", 2008)

"[a complex system is] a system in which large networks of components with no central control and simple rules of operation give rise to complex collective behavior, sophisticated information processing, and adaptation via learning or evolution." (Melanie Mitchell, "Complexity: A Guided Tour", 2009)

"A typical complex system consists of a vast number of identical copies of several generic processes, which are operating and interacting only locally or with a limited number of not necessary close neighbours. There is no global leader or controller associated to such systems and the resulting behaviour is usually very complex." (Jirí Kroc & Peter M A Sloot, "Complex Systems Modeling by Cellular Automata", Encyclopedia of Artificial Intelligence, 2009)

Complex Systems IV

"With the growing interest in complex adaptive systems, artificial life, swarms and simulated societies, the concept of 'collective intelligence' is coming more and more to the fore. The basic idea is that a group of individuals (e. g. people, insects, robots, or software agents) can be smart in a way that none of its members is. Complex, apparently intelligent behavior may emerge from the synergy created by simple interactions between individuals that follow simple rules." (Francis Heylighen, "Collective Intelligence and its Implementation on the Web", 1999)

"A system may be called complex here if its dimension (order) is too high and its model (if available) is nonlinear, interconnected, and information on the system is uncertain such that classical techniques can not easily handle the problem." (M Jamshidi, "Autonomous Control on Complex Systems: Robotic Applications", Current Advances in Mechanical Design and Production VII, 2000)

"Bounded rationality simultaneously constrains the complexity of our cognitive maps and our ability to use them to anticipate the system dynamics. Mental models in which the world is seen as a sequence of events and in which feedback, nonlinearity, time delays, and multiple consequences are lacking lead to poor performance when these elements of dynamic complexity are present. Dysfunction in complex systems can arise from the misperception of the feedback structure of the environment. But rich mental models that capture these sources of complexity cannot be used reliably to understand the dynamics. Dysfunction in complex systems can arise from faulty mental simulation-the misperception of feedback dynamics. These two different bounds on rationality must both be overcome for effective learning to occur. Perfect mental models without a simulation capability yield little insight; a calculus for reliable inferences about dynamics yields systematically erroneous results when applied to simplistic models." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"Much of the art of system dynamics modeling is discovering and representing the feedback processes, which, along with stock and flow structures, time delays, and nonlinearities, determine the dynamics of a system. […] the most complex behaviors usually arise from the interactions (feedbacks) among the components of the system, not from the complexity of the components themselves." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"To avoid policy resistance and find high leverage policies requires us to expand the boundaries of our mental models so that we become aware of and understand the implications of the feedbacks created by the decisions we make. That is, we must learn about the structure and dynamics of the increasingly complex systems in which we are embedded." (John D Sterman, "Business dynamics: Systems thinking and modeling for a complex world", 2000) 

"Falling between order and chaos, the moment of complexity is the point at which self-organizing systems emerge to create new patterns of coherence and structures of behaviour." (Mark C Taylor, "The Moment of Complexity: Emerging Network Culture", 2001)

"[…] most earlier attempts to construct a theory of complexity have overlooked the deep link between it and networks. In most systems, complexity starts where networks turn nontrivial." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"[…] networks are the prerequisite for describing any complex system, indicating that complexity theory must inevitably stand on the shoulders of network theory. It is tempting to step in the footsteps of some of my predecessors and predict whether and when we will tame complexity. If nothing else, such a prediction could serve as a benchmark to be disproven. Looking back at the speed with which we disentangled the networks around us after the discovery of scale-free networks, one thing is sure: Once we stumble across the right vision of complexity, it will take little to bring it to fruition. When that will happen is one of the mysteries that keeps many of us going." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"A sudden change in the evolutive dynamics of a system (a ‘surprise’) can emerge, apparently violating a symmetrical law that was formulated by making a reduction on some (or many) finite sequences of numerical data. This is the crucial point. As we have said on a number of occasions, complexity emerges as a breakdown of symmetry (a system that, by evolving with continuity, suddenly passes from one attractor to another) in laws which, expressed in mathematical form, are symmetrical. Nonetheless, this breakdown happens. It is the surprise, the paradox, a sort of butterfly effect that can highlight small differences between numbers that are very close to one another in the continuum of real numbers; differences that may evade the experimental interpretation of data, but that may increasingly amplify in the system’s dynamics." (Cristoforo S Bertuglia & Franco Vaio, "Nonlinearity, Chaos, and Complexity: The Dynamics of Natural and Social Systems", 2003) 

19 December 2020

On Randomness V (Systems I)

"Is a random outcome completely determined, and random only by virtue of our ignorance of the most minute contributing factors? Or are the contributing factors unknowable, and therefore render as random an outcome that can never be determined? Are seemingly random events merely the result of fluctuations superimposed on a determinate system, masking its predictability, or is there some disorderliness built into the system itself?” (Deborah J Bennett, "Randomness", 1998)

"The self-similarity of fractal structures implies that there is some redundancy because of the repetition of details at all scales. Even though some of these structures may appear to teeter on the edge of randomness, they actually represent complex systems at the interface of order and disorder."  (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"If a network is solely composed of neighborhood connections, information must traverse a large number of connections to get from place to place. In a small-world network, however, information can be transmitted between any two nodes using, typically, only a small number of connections. In fact, just a small percentage of random, long-distance connections is required to induce such connectivity. This type of network behavior allows the generation of 'six degrees of separation' type results, whereby any agent can connect to any other agent in the system via a path consisting of only a few intermediate nodes." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"Although the potential for chaos resides in every system, chaos, when it emerges, frequently stays within the bounds of its attractor(s): No point or pattern of points is ever repeated, but some form of patterning emerges, rather than randomness. Life scientists in different areas have noticed that life seems able to balance order and chaos at a place of balance known as the edge of chaos. Observations from both nature and artificial life suggest that the edge of chaos favors evolutionary adaptation." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system. This property of a non-linear system such as the weather is known as the butterfly effect where it is purported that a butterfly flapping its wings in Japan can give rise to a tornado in Kansas. This unpredictable behaviour of nonlinear dynamical systems, i.e. its extreme sensitivity to initial conditions, seems to be random and is therefore referred to as chaos. This chaotic and seemingly random behaviour occurs for non-linear deterministic system in which effects can be linked to causes but cannot be predicted ahead of time." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010)

"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini, "Chaos: From Simple Models to Complex Systems", 2010)

"Although cascading failures may appear random and unpredictable, they follow reproducible laws that can be quantified and even predicted using the tools of network science. First, to avoid damaging cascades, we must understand the structure of the network on which the cascade propagates. Second, we must be able to model the dynamical processes taking place on these networks, like the flow of electricity. Finally, we need to uncover how the interplay between the network structure and dynamics affects the robustness of the whole system." (Albert-László Barabási, "Network Science", 2016)

15 November 2020

On Networks (1990-1999)

"A neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects: 1. Knowledge is acquired by the network through a learning process. 2. Interneuron connection strengths known as synaptic weights are used to store the knowledge." (Igor Aleksander, "An introduction to neural computing", 1990) 

"Neural Computing is the study of networks of adaptable nodes which through a process of learning from task examples, store experiential knowledge and make it available for use." (Igor Aleksander, "An introduction to neural computing", 1990) 

"Metaphor plays an essential role in establishing a link between scientific language and the world. Those links are not, however, given once and for all. Theory change, in particular, is accompanied by a change in some of the relevant metaphors and in the corresponding parts of the network of similarities through which terms attach to nature." (Thomas S Kuhn, "Metaphor in science", 1993)

"What is a system? A system is a network of interdependent components that work together to try to accomplish the aim of the system. A system must have an aim. Without an aim, there is no system. The aim of the system must be clear to everyone in the system. The aim must include plans for the future. The aim is a value judgment." (William E Deming, "The New Economics for Industry, Government, Education”, 1993)

"Mathematics says the sum value of a network increases as the square of the number of members. In other words, as the number of nodes in a network increases arithmetically, the value of the network increases exponentially. Adding a few more members can dramatically increase the value of the network." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The basic principle of an autocatalytic network is that even though nothing can make itself, everything in the pot has at least one reaction that makes it, involving only other things in the pot. It's a symbiotic system in which everything cooperates to make the metabolism work - the whole is greater than the sum of the parts." (J Doyne Farmer, "The Second Law of Organization" [in The Third Culture: Beyond the Scientific Revolution], 1995)

"The only organization capable of unprejudiced growth, or unguided learning, is a network. All other topologies limit what can happen." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The multiplier effect is a major feature of networks and flows. It arises regardless of the particular nature of the resource, be it goods, money, or messages." (John H Holland, "Hidden Order - How Adaptation Builds Complexity", 1995)

"There are a variety of swarm topologies, but the only organization that holds a genuine plurality of shapes is the grand mesh. In fact, a plurality of truly divergent components can only remain coherent in a network. No other arrangement-chain, pyramid, tree, circle, hub-can contain true diversity working as a whole. This is why the network is nearly synonymous with democracy or the market." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"In the new systems thinking, the metaphor of knowledge as a building is being replaced by that of the network. As we perceive reality as a network of relationships, our descriptions, too, form an interconnected network of concepts and models in which there are no foundations. For most scientists such a view of knowledge as a network with no firm foundations is extremely unsettling, and today it is by no means generally accepted. But as the network approach expands throughout the scientific community, the idea of knowledge as a network will undoubtedly find increasing acceptance." (Fritjof Capra, "The Web of Life: A new scientific understanding of living systems", 1996)

"Networks constitute the new social morphology of our societies, and the diffusion of networking logic substantially modifies the operation and outcomes in processes of production, experience, power, and culture. While the networking form of social organization has existed in other times and spaces, the new information technology paradigm provides the material basis for its pervasive expansion throughout the entire social structure." (Manuel Castells, "The Rise of the Network Society", 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"There is a multilayering of global networks in the key strategic activities that structure and destructure the planet. When these multilayered networks overlap in some node, when there is a node that belongs to different networks, two major consequences follow. First, economies of synergy between these different networks take place in that node: between financial markets and media businesses; or between academic research and technology development and innovation; between politics and media." (Manuel Castells, "The Rise of the Network Society", 1996) 

"When the knowledge base of an industry is both complex and expanding and the sources of expertise are widely dispersed, the locus of innovation will be found in networks of learning, rather than in individual firms." (Walter W. Powell et al, "Interorganizational collaboration and the locus of innovation: Networks of learning in biotechnology", Administrative science quarterly, 1996) 

"Mathematics says the sum value of a network increases as the square of the number of members. In other words, as the number of nodes in a network increases arithmetically, the value of the network increases exponentially. Adding a few more members can dramatically increase the value for all members." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"Networks have existed in every economy. What’s different now is that networks, enhanced and multiplied by technology, penetrate our lives so deeply that 'network' has become the central metaphor around which our thinking and our economy are organized. Unless we can understand the distinctive logic of networks, we can’t profit from the economic transformation now under way." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"The dynamic of our society, and particularly our new economy, will increasingly obey the logic of networks. Understanding how networks work will be the key to understanding how the economy works." (Kevin Kelly, "New Rules for the New Economy: 10 radical strategies for a connected world", 1998)

"The notion of system we are interested in may be described generally as a complex of elements or components directly or indirectly related in a network of interrelationships of various kinds, such that it constitutes a dynamic whole with emergent properties." (Walter F Buckley, "Society: A Complex Adaptive System - Essays in Social Theory", 1998)

On Networks (2010-2019)

"We are beginning to see the entire universe as a holographically interlinked network of energy and information, organically whole and self referential at all scales of its existence. We, and all things in the universe, are non-locally connected with each other and with all other things in ways that are unfettered by the hitherto known limitations of space and time." (Ervin László,"Cosmos: A Co-creator's Guide to the Whole-World", 2010)

"The people we get along with, trust, feel simpatico with, are the strongest links in our networks." (Daniel Goleman, "Working With Emotional Intelligence", 2011) 

"Cybernetics is the study of systems which can be mapped using loops (or more complicated looping structures) in the network defining the flow of information. Systems of automatic control will of necessity use at least one loop of information flow providing feedback." (Alan Scrivener, "A Curriculum for Cybernetics and Systems Theory", 2012)

"If we create networks with the sole intention of getting something, we won't succeed. We can't pursue the benefits of networks; the benefits ensue from investments in meaningful activities and relationships." (Adam Grant, "Give and Take: A Revolutionary Approach to Success", 2013) 

"Information is recorded in vast interconnecting networks. Each idea or image has hundreds, perhaps thousands, of associations and is connected to numerous other points in the mental network." (Peter Russell, "The Brain Book: Know Your Own Mind and How to Use it", 2013) 

"All living systems are networks of smaller components, and the web of life as a whole is a multilayered structure of living systems nesting within other living systems - networks within networks." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"All the variables we can observe in an ecosystem-population densities, availability of nutrients, weather patterns, and so forth-always fluctuate. This is how ecosystems maintain themselves in a flexible state, ready to adapt to changing conditions. The web of life is a flexible, ever-fluctuating network. The more variables are kept fluctuating, the more dynamic is the system; the greater is its flexibility; and the greater is its ability to adapt to changing conditions." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Deep ecology does not separate humans - or anything else-from the natural environment. It sees the world not as a collection of isolated objects, but as a network of phenomena that are fundamentally interconnected and interdependent. Deep ecology recognizes the intrinsic value of all living beings and views humans as just one particular strand in the web of life." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"In other words, the web of life consists of networks within networks. At each scale, under closer scrutiny, the nodes of the network reveal themselves as smaller networks. We tend to arange these systems, all nesting within larger systems, in a hierarchical scheme by placing the larger systems above the smaller ones in pyramid fashion. But this is a human projection. In nature there is no 'above' or 'below', and there are no hierarchies. There are only networks nesting within other networks." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"The first and most obvious property of any network is its nonlinearity – it goes in all directions. Thus the relationships in a network pattern are nonlinear relationships. In particular, an influence, or message, may travel along a cyclical path, which may become a feedback loop. In living networks, the concept of feedback is intimately connected with the network pattern." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Whenever we encounter living systems – organisms, parts of organisms, or communities of organisms – we can observe that their components are arranged in network fashion. Whenever we look at life, we look at networks." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"A network (or graph) consists of a set of nodes (or vertices, actors) and a set of edges (or links, ties) that connect those nodes. [...] The size of a network is characterized by the numbers of nodes and edges in it." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"A worldview consists of observations of the individual and other people with respect to the self, time and space, the natural and the supernatural and the sacred and profane. […] Beliefs about the world do not reside in the human mind in chaotic disorder; rather they form a latent system. A worldview cannot, however, be viewed as a well-organised network of cognitive models or a static collection of values; instead it should be regarded as the product of a process shaped by historical, cultural and social perspectives and contexts." (Helena Helve, "A longitudinal perspective on worldviews, values and identities", 2016)

"Although cascading failures may appear random and unpredictable, they follow reproducible laws that can be quantified and even predicted using the tools of network science. First, to avoid damaging cascades, we must understand the structure of the network on which the cascade propagates. Second, we must be able to model the dynamical processes taking place on these networks, like the flow of electricity. Finally, we need to uncover how the interplay between the network structure and dynamics affects the robustness of the whole system." (Albert-László Barabási, "Network Science", 2016)

"The exploding interest in network science during the first decade of the 21st century is rooted in the discovery that despite the obvious diversity of complex systems, the structure and the evolution of the networks behind each system is driven by a common set of fundamental laws and principles. Therefore, notwithstanding the amazing differences in form, size, nature, age, and scope of real networks, most networks are driven by common organizing principles. Once we disregard the nature of the components and the precise nature of the interactions between them, the obtained networks are more similar than different from each other." (Albert-László Barabási, "Network Science", 2016)

"Network theory confirms the view that information can take on 'a life of its own'. In the yeast network my colleagues found that 40 per cent of node pairs that are correlated via information transfer are not in fact physically connected; there is no direct chemical interaction. Conversely, about 35 per cent of node pairs transfer no information between them even though they are causally connected via a 'chemical wire' (edge). Patterns of information traversing the system may appear to be flowing down the 'wires' (along the edges of the graph) even when they are not. For some reason, 'correlation without causation' seems to be amplified in the biological case relative to random networks." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019)

"The concept of integrated information is clearest when applied to networks. Imagine a black box with input and output terminals. Inside are some electronics, such as a network with logic elements (AND, OR, and so on) wired together. Viewed from the outside, it will usually not be possible to deduce the circuit layout simply by examining the cause–effect relationship between inputs and outputs, because functionally equivalent black boxes can be built from very different circuits. But if the box is opened, it’s a different story. Suppose you use a pair of cutters to sever some wires in the network. Now rerun the system with all manner of inputs. If a few snips dramatically alter the outputs, the circuit can be described as highly integrated, whereas in a circuit with low integration the effect of some snips may make no difference at all." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019)

"[...] the Game of Life, in which a few simple rules executed repeatedly can generate a surprising degree of complexity. Recall that the game treats squares, or pixels, as simply on or off (filled or blank) and the update rules are given in terms of the state of the nearest neighbours. The theory of networks is closely analogous. An electrical network, for example, consists of a collection of switches with wires connecting them. Switches can be on or off, and simple rules determine whether a given switch is flipped, according to the signals coming down the wires from the neighbouring switches. The whole network, which is easy to model on a computer, can be put in a specific starting state and then updated step by step, just like a cellular automaton. The ensuing patterns of activity depend both on the wiring diagram (the topology of the network) and the starting state. The theory of networks can be developed quite generally as a mathematical exercise: the switches are called ‘nodes’ and the wires are called ‘edges’. From very simple network rules, rich and complex activity can follow." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019)

"[...] the same network may exhibit fundamentally different patterns of information flow under different dynamics: epidemic spread, ecological interactions, or genetic regulation." (Uzi Harush & Baruch Barzel, "Dynamic patterns of information flow in complex networks", Nature Communications, 2017)

"And that’s what good networkers do. No matter the field, discipline, or industry, if we want to succeed, we must master the networks. Because as the First Law of Success reminds us, the harder it is to measure performance, the less performance matters." (Albert-László Barabási, "The Formula: The Universal Laws of Success", 2018)

16 February 2020

From Parts to Wholes (1800-1849)

"In every moment of her duration Nature is one connected whole; in every moment each individual part must be what it is, because all the others are what they are; and you could not remove a single grain of sand from its place, without thereby, although perhaps imperceptibly to you, altering something throughout all parts of the immeasurable whole." (Johann G Fichte, "The Vocation of Man”, 1800)

"It is the destiny of our race to become united into one great body, thoroughly connected in all its parts, and possessed of similar culture. Nature, and even the passions and vices of Man, have from the beginning tended towards this end. A great part of the way towards it is already passed, and we may surely calculate that it will in time be reached." (Johann G Fichte, "The Vocation of Man", 1800)

"It is probable that what we call thought is not an actual being, but no more than the relation between certain parts of that infinitely varied mass, of which the rest of the universe is composed, and which ceases to exist as soon as those parts change their position with regard to each other." (Percy B Shelley, "On a Future State", 1815)

"Each of the parts of philosophy is a philosophical whole, a circle rounded and complete in itself. In each of these parts, however, the philosophical Idea is found in a particular specificality or medium. The single circle, because it is a real totality, bursts through the limits imposed by its special medium, and gives rise to a wider circle. The whole of philosophy in this way resembles a circle of circles. The Idea appears in each single circle, but, at the same time, the whole Idea is constituted by the system of these peculiar phases, and each is a necessary member of the organisation." (Georg W F Hegel, "Encyclopedia of the Philosophical Sciences", 1816)

"When the whole and the parts are seen at once, as mutually producing and explaining each other, as unity in multeity, there results shapeliness." (Samuel T Coleridge, "Letters", 1836)

"Science is nothing but the finding of analogy, identity, in the most remote parts." (Ralph W Emerson, 1837)

"So far we have studies how, for each commodity by itself, the law of demand in connection with the conditions of production of that commodity, determines the price of it and regulates the incomes of its producers. We considered as given and invariable the prices of other commodities and the incomes of other producers; but, in reality the economic system is a whole of which the parts are connected and react on each other. An increase in the incomes of the producers of commodity A will affect the demand for commodities Band C, etc., and the incomes of their producers, and, by its reaction will involve a change in the demand for A. It seems, therefore, as if, for a complete and rigorous solution of the problems relative to some parts of the economic system, it were indispensable to take the entire system into consideration. But this would surpass the powers of mathematical analysis and of our practical methods of calculation, even if the values of all the constants could be assigned to them numerically." (Antoine A Cournot, "Researches into the Mathematical Principles of the Theory of Wealth", 1838)

"The component parts of a vegetable or animal substance do not lose their mechanical and chemical properties as separate agents, when, by a peculiar mode of juxtaposition, they, as an aggregate whole, acquire physiological or vital properties in addition. Those bodies continue, as before, to obey mechanical and chemical laws, in so far as the operation of those laws is not counteracted by the new laws which govern them as organized beings. […] Though there are laws which, like those of chemistry and physiology, owe their existence to a breach of the principle of Composition of Causes, it does not follow that these peculiar, or, as they might be termed, heteropathic laws, are not capable of composition with one another." (John S Mill, "A System of Logic: Ratioconative and Inductive", 1843) [the heteropathic laws is synonymous with emergence]

From Parts to Wholes (1850-1899)

"The world of ideas which it [mathematics] discloses or illuminates, the contemplation of divine beauty and order which it induces, the harmonious connection of its parts, the infinite hierarchy and absolute evidence of the truths with which mathematical science is concerned, these, and such like, are the surest groimds of its title of human regard, and would remain unimpaired were the plan of the universe unrolled like a map at our feet, and the mind of man qualified to take in the whole scheme of creation at a glance.” (James J Sylvester, "A Plea for the Mathematician", Nature 1, 1870)

"Nature creates unity even in the parts of a whole." (Eugène Delacroix, 1857)

"Analysis and synthesis, though commonly treated as two different methods, are, if properly understood, only the two necessary parts of the same method. Each is the relative and correlative of the other. Analysis, without a subsequent synthesis, is incomplete; it is a mean cut of from its end. Synthesis, without a previous analysis, is baseless; for synthesis receives from analysis the elements which it recomposes." (Sir William Hamilton, "Lectures on Metaphysics and Logic: 6th Lecture on Metaphysics", 1858)

"[…] the besetting danger is not so much of embracing falsehood for truth, as of mistaking a part of the truth for the whole." (John S Mill, "Dissertations and Discussions: Political, Philosophical, and Historical”, 1859)

"We have repeatedly observed that while any whole is evolving, there is always going on an evolution of the parts into which it divides itself; but we have not observed that this equally holds of the totality of things, which is made up of parts within parts from the greatest down to the smallest." (Herbert Spencer, "First Principles", 1862)

"The adaptation observed in men, animals and plants [...] one part of this adaptation is explained from a thought-process in the interior of these bodies [...] another part, however, the adaptation of the organism, by a thought-process in a greater whole." (Bernhard Riemann, Gesammelte Mathematische Werke, 1876)

"All things, man included, are parts of one great whole." (Richard M Bucke, "Man's Moral Nature", 1879)

"The old and oft-repeated proposition ‘Totum est majus sua parte’ [the whole is larger than the part] may be applied without proof only in the case of entities that are based upon whole and part; then and only then is it an undeniable consequence of the concepts ‘totum’ and ‘pars’. Unfortunately, however, this ‘axiom’ is used innumerably often without any basis and in neglect of the necessary distinction between ‘reality’ and ‘quantity’, on the one hand, and ‘nnumbe’ and ‘set’, on the other, precisely in the sense in which it is generally false." (Georg Cantor, "Über unendliche, lineare Punktmannigfaltigkeiten", Mathematische Annalen 20, 1882)

"The part always has a tendency to reunite with its whole in order to escape from its imperfection." (Leonardo Da Vinci, "The Notebooks of Leonardo da Vinci", 1888)

From Parts to Wholes (1900-1909)

"And as the ideal in the whole of Nature moves in an infinite process toward an Absolute Perfection, we may say that art is in strict truth the apotheosis of Nature. Art is thus at once the exaltation of the natural toward its destined supernatural perfection, and the investiture of the Absolute Beauty with the reality of natural existence. Its work is consequently not a means to some higher end, but is itself a final aim; or, as we may otherwise say, art is its own end. It is not a mere recreation for man, a piece of by-play in human life, but is an essential mode of spiritual activity, the lack of which would be a falling short of the destination of man. It is itself part and parcel of man's eternal vocation." (George H Howison, "The Limits of Evolution, and Other Essays, Illustrating the Metaphysical Theory of Personal Idealism", 1901) 

"Mathematical science is in my opinion an indivisible whole, an organism whose vitality is conditioned upon the connection of its parts. For with all the variety of mathematical knowledge, we are still clearly conscious of the similarity of the logical devices, the relationship of the ideas in mathematics as a whole and the numerous analogies in its different departments." (David Hilbert, "Mathematical Problems", Bulletin American Mathematical Society Vol. 8, 1901-1902)

"For if society lacks the unity that derives from the fact that the relationships between its parts are exactly regulated, that unity resulting from the harmonious articulation of its various functions assured by effective discipline and if, in addition, society lacks the unity based upon the commitment of men's wills to a common objective, then it is no more than a pile of sand that the least jolt or the slightest puff will suffice to scatter.“ (Émile Durkheim, 1903)

"From that time, the universe has steadily become more complex and less reducible to a central control. With as much obstinacy as though it were human, it has insisted on expanding its parts; with as much elusiveness as though it were feminine, it has evaded the attempt to impose on it a single will. Modern science, like modern art, tends, in practice, to drop the dogma of organic unity. Some of the mediaeval habit of mind survives, but even that is said to be yielding before the daily evidence of increasing and extending complexity. The fault, then, was not in man, if he no longer looked at science or art as an organic whole or as the expression of unity. Unity turned itself into complexity, multiplicity, variety, and even contradiction." (Henry Adams, "Mont Saint Michel and Chartres", 1904)

"Reduced to their most pregnant difference, empiricism means the habit of explaining wholes by parts, and rationalism means the habit of explaining parts by wholes. Rationalism thus preserves affinities with monism, since wholeness goes with union, while empiricism inclines to pluralistic views. No philosophy can ever be anything but a summary sketch, a picture of the world in abridgment, a foreshortened bird's-eye view of the perspective of events. And the first thing to notice is this, that the only material we have at our disposal for making a picture of the whole world is supplied by the various portions of that world of which we have already had experience. We can invent no new forms of conception, applicable to the whole exclusively, and not suggested originally by the parts." (William James, "A Pluralistic Universe", 1908)

"A system is a whole which is composed of various parts. But it is not the same thing as an aggregate or heap. In an aggregate or heap, no essential relation exists between the units of which it is composed. In a heap of grain, or pile of stones, one may take away part without the other part being at all affected thereby. But in a system, each part has a fixed and necessary relation to the whole and to all the other parts. For this reason we may say that a building, or a peace of mechanisme, is a system. Each stone in the building, each wheel in the watch, plays a part, and is essential to the whole." (James E Creighton, "An Introductory Logic"‎, 1909)

From Parts to Wholes (1970-1979)

"In self-organizing systems, on the other hand, ‘control’ of the organization is typically distributed over the whole of the system. All parts contribute evenly to the resulting arrangement." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"The systems approach to problems focuses on systems taken as a whole, not on their parts taken separately. Such an approach is concerned with total - system performance even when a change in only one or a few of its parts is contemplated because there are some properties of systems that can only be treated adequately from a holistic point of view. These properties derive from the relationship between parts of systems: how the parts interact and fit together." (Russell L Ackoff, "Towards a System of Systems Concepts", 1971)

"A system in one perspective is a subsystem in another. But the systems view always treats systems as integrated wholes of their subsidiary components and never as the mechanistic aggregate of parts in isolable causal relations." (Ervin László, "Introduction to Systems Philosophy", 1972)

"In no system which shows mental characteristics can any part have unilateral control over the whole. In other words, the mental characteristics of the system are imminent, not in some part, but in the system as a whole." (Gregory Bateson, "Steps to an Ecology of Mind", 1972)

"Holists are distinguished from serialists in terms of the number of inferential statements they produce.[...] It is possible to distinguish the serialist from the holist by a tendency, on the part of a serialist, to preserve the order of the programme presentation format which is absent in the holist. Presented with a holist programme the serialist is unable to preserve the complete order but he does manage to preserve sequentially arranged fragments." (Gordon Pask, "Learning Strategies and Individual Competence", 1972)

"Yet while they exist, regardless of how long, each system has a specific structure made up of certain maintained relationships among its parts, and manifests irreducible characteristics of its own." (Ervin László, "Introduction to Systems Philosophy", 1972)

"In the Systems Age we tend to look at things as part of larger wholes rather than as wholes to be taken apart. This is the doctrine of expansionism. Expansionism brings with it the synthetic mode of thought much as reductionism brought with it." (Russell L Ackoff, "Redesigning the future", 1974)

"Science gets most of its information by the process of reductionism, exploring the details, then the details of the details, until all the smallest bits of the structure, or the smallest parts of the mechanism, are laid out for counting and scrutiny. Only when this is done can the investigation be extended to encompass the whole organism or the entire system. So we say. Sometimes it seems that we take a loss, working this way." (Lewis Thomas, "The Medusa and the Snail: More Notes of a Biology Watcher", 1974)


"When you are confronted by any complex social system […] with things about it that you’re dissatisfied with and anxious to fix, you cannot just step in and set about fixing with much hope of helping. This realization is one of the sore discouragements of our century […] You cannot meddle with one part of a complex system from the outside without the almost certain risk of setting off disastrous events that you hadn’t counted on in other, remote parts. If you want to fix something you are first obliged to understand […] the whole system. […] Intervening is a way of causing trouble." (Lewis Thomas, "The Medusa and the Snail: More Notes of a Biology Watcher", 1974)

"Synergy means behavior of whole systems unpredicted by the behavior of their parts taken separately." (R Buckminster Fuller, "Synergetics: Explorations in the Geometry of Thinking", 1975)

"We have reversed the usual classical notion that the independent 'elementary parts' of the world are the fundamental reality, and that the various systems are merely particular contingent forms and arrangements of these parts. Rather, we say that inseparable quantum interconnectedness of the whole universe is the fundamental reality, and that relatively independent behaving parts are merely particular and contingent forms within this whole." (David Bohm, "On the Intuitive Understanding of Nonlocality as Implied by Quantum Theory", Foundations of Physics Vol 5 (1), 1975)

"If all of the elements in a large system are loosely coupled to one another, then any one element can adjust to and modify a local a local unique contingency without affecting the whole system. These local adaptations can be swift, relatively economical, and substantial." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"In a loosely coupled system there is more room available for self-determination by the actors. If it is argued that a sense of efficacy is crucial for human beings. when a sense of efficacy might be greater in a loosely coupled system with autonomous units than it would be in a tightly coupled system where discretion is limited." (Karl E Weick, "Educational organizations as loosely coupled systems", 1976)

"There is a strong current in contemporary culture advocating ‘holistic’ views as some sort of cure-all […] Reductionism implies attention to a lower level while holistic implies attention to higher level. These are intertwined in any satisfactory description: and each entails some loss relative to our cognitive preferences, as well as some gain [...] there is no whole system without an interconnection of its parts and there is no whole system without an environment." (Francisco Varela, "On being autonomous: The lessons of natural history for systems theory", 1977)

"A threat to any part of the environment is a threat to the whole environment, but we must have a basis of assessment of these threats, not so that we can establish a priority of fears, but so that we can make a positive contribution to improvement and ultimate survival." (Prince Philip, "The Environmental Revolution: Speeches on Conservation, 1962–77", 1978)

"When a mess, which is a system of problems, is taken apart, it loses its essential properties and so does each of its parts. The behavior of a mess depends more on how the treatment of its parts interact than how they act independently of each other. A partial solution to a whole system of problems is better than whole solutions of each of its parts taken separately." (Russell L Ackoff, "The future of operational research is past", The Journal of the Operational Research Society Vol. 30 (2), 1979)

"Given the five parts of the organization - operating core, strategic apex, middle line, technostructure, and support staff - we may now ask how they all function together. In fact, we cannot describe the one way they function together, for research suggests that the linkages are varied and complex. The parts of the organization are joined together by different flows - of authority, of work material, of information, and of decision processes." (Henry Mintzberg, "The structuring of organizations", 1979)

03 August 2019

Fritjof Capra - Collected Quotes

"If physics leads us today to a world view which is essentially mystical, it returns, in a way, to its beginning, 2,500 years ago. […] Eastern thought and, more generally, mystical thought provide a consistent and relevant philosophical background to the theories of contemporary science; a conception of the world in which scientific discoveries can be in perfect harmony with spiritual aims and religious beliefs. The two basic themes of this conception are the unity and interrelation of all phenomena and the intrinsically dynamic nature of the universe. The further we penetrate into the submicroscopic world, the more we shall realize how the modern physicist, like the Eastern mystic, has come to see the world as a system of inseparable, interacting and ever-moving components with the observer being an integral part of this system." (Fritjof Capra, "The Tao of Physics: An Exploration of the Parallels Between Modern Physics and Eastern Mysticism", 1975)

"The influence of modern physics goes beyond technology. It extends to the realm of thought and culture where it has led to a deep revision in man's conception of the universe and his relation to it." (Fritjof Capra, "The Tao of Physics: An Exploration of the Parallels Between Modern Physics and Eastern Mysticism", 1975)

"Whenever the Eastern mystics express their knowledge in words - be it with the help of myths, symbols, poetic images or paradoxical statements-they are well aware of the limitations imposed by language and 'linear' thinking. Modern physics has come to take exactly the same attitude with regard to its verbal models and theories. They, too, are only approximate and necessarily inaccurate. They are the counterparts of the Eastern myths, symbols and poetic images, and it is at this level that I shall draw the parallels. The same idea about matter is conveyed, for example, to the Hindu by the cosmic dance of the god Shiva as to the physicist by certain aspects of quantum field theory. Both the dancing god and the physical theory are creations of the mind: models to describe their authors' intuition of reality."   (Fritjof Capra, "The Tao of Physics: An Exploration of the Parallels Between Modern Physics and Eastern Mysticism", 1975)

"In a biological or social system each holon must assert its individuality in order to maintain the system's stratified order, but it must also submit to the demands of the whole in order to make the system viable. These two tendencies are opposite but complementary. In a healthy system - an individual, a society, or an ecosystem - there is a balance between integration and self-assertion. This balance is not static but consists of a dynamic interplay between the two complementary tendencies, which makes the whole system flexible and open to change." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"In microscopic systems, consisting of only a few molecules, the second law is violated regularly, but in macroscopic systems, which consist of vast numbers of molecules, the probability that the total entropy of the system will increase becomes virtual certainty. Thus in any isolated system, made up of a large number of molecules, the entropy - or disorder -will keep increasing until, eventually, the system reaches a state of maximum entropy, also known as 'heat death'; in this state all activity has ceased, all material being evenly distributed and at the same temperature. According to classical physics, the universe as a whole is going toward such a state of maximum entropy; it is running down and will eventually grind to a halt." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"Living systems are organized in such a way that they form multileveled structures, each level consisting of subsystems which are wholes in regard to their parts, and parts with respect to the larger wholes." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"Systems theory looks at the world in terms of the interrelatedness and interdependence of all phenomena, and in this framework an integrated whole whose properties cannot be reduced to those of its parts is called a system. Living organisms, societies, and ecosystems are all systems." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The phenomena of the subatomic world are so complex that it is by no means certain whether a complete, self-consistent theory will ever be constructed, but one can envisage a series of partly successful models of smaller scope. Each of them would be intended to cover only a part of the observed phenomena and would contain some unexplained aspects, or parameters, but the parameters of one model might be explained by another. Thus more and more phenomena could gradually be covered with ever increasing accuracy by a mosaic of interlocking models whose net number of unexplained parameters keeps decreasing." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The phenomenon of self-organization is not limited to living matter but occurs also in certain chemical systems […] [Ilya] Prigogine has called these systems 'dissipative structures' to express the fact that they maintain and develop structure by breaking down other structures in the process of metabolism, thus creating entropy­ disorder - which is subsequently dissipated in the form of degraded waste products. Dissipative chemical structures display the dynamics of self-organization in its simplest form, exhibiting most of the phenomena characteristic of life self-renewal, adaptation, evolution, and even primitive forms of 'mental' processes." (Fritjof Capra, "The Turning Point: Science, Society, and the Turning Culture", 1982)

"The new paradigm may be called a holistic world view, seeing the world as an integrated whole rather than a dissociated collection of parts. It may also be called an ecological view, if the term 'ecological' is used in a much broader and deeper sense than usual. Deep ecological awareness recognizes the fundamental interdependence of all phenomena and the fact that, as individuals and societies we are all embedded in (and ultimately dependent on) the cyclical process of nature." (Fritjof Capra & Gunter A Pauli, "Steering Business Toward Sustainability", 1995)

"[…] self-organization is the spontaneous emergence of new structures and new forms of behavior in open systems far from equilibrium, characterized by internal feedback loops and described mathematically by nonlinear equations." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"The more we study the major problems of our time, the more we come to realise that they cannot be understood in isolation. They are systemic problems, which means that they are interconnected and interdependent." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"The shift of paradigms requires an expansion not only of our perceptions and ways of thinking, but also of our values. […] scientific facts emerge out of an entire constellation of human perceptions, values, and actions-in one word, out of a paradigm-from which they cannot be separated. […] Today the paradigm shift in science, at its deepest level, implies a shift from physics to the life sciences." (
Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"Understanding ecological interdependence means understanding relationships. It requires the shifts of perception that are characteristic of systems thinking - from the parts to the whole, from objects to relationships, from contents to patterns. [...]  Nourishing the community means nourishing those relationships." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"What is sustained in a sustainable community is not economic growth, development, market share, or competitive advantage, but the entire web of life on which our long-term survival depends. In other words, a sustainable community is designed in such a way that its ways of life, businesses, economy, physical structures, and technologies do not interfere with nature’s inherent ability to sustain life." (Fritjof Capra, "Ecoliteracy: The Challenge for Education in the Next Century", 1999)

"One of the key insights of the systems approach has been the realization that the network is a pattern that is common to all life. Wherever we see life, we see networks." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)

"The phenomenon of emergence takes place at critical points of instability that arise from fluctuations in the environment, amplified by feedback loops. Emergence results in the creation of novelty, and this novelty is often qualitatively different from the phenomenon out of which it emerged." (Fritjof Capra, "The Hidden Connections: A Science for Sustainable Living", 2002)

"All living systems are networks of smaller components, and the web of life as a whole is a multilayered structure of living systems nesting within other living systems - networks within networks." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"All the variables we can observe in an ecosystem-population densities, availability of nutrients, weather patterns, and so forth-always fluctuate. This is how ecosystems maintain themselves in a flexible state, ready to adapt to changing conditions. The web of life is a flexible, ever-fluctuating network. The more variables are kept fluctuating, the more dynamic is the system; the greater is its flexibility; and the greater is its ability to adapt to changing conditions." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Deep ecology does not separate humans - or anything else-from the natural environment. It sees the world not as a collection of isolated objects, but as a network of phenomena that are fundamentally interconnected and interdependent. Deep ecology recognizes the intrinsic value of all living beings and views humans as just one particular strand in the web of life." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"In linear systems, small changes produce small effects, and large effects are due either to large changes or to a sum of many small changes. In nonlinear systems, by contrast, small changes may have dramatic effects because they may be amplified repeatedly by self-reinforcing feedback. Such nonlinear feedback processes are the basis of the instabilities and the sudden emergence of new forms of order that are so characteristic of self-organization." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"In many nonlinear systems, however, small changes of certain parameters may produce Dramatic changes in the basic characteristics of the phase portrait. Attractors may disappear, or change into one another, or new attractors may suddenly appear. Such systems are said to be structurally unstable, and the critical points of instability are called 'bifurcation points', because they are points in the system’s evolution where a fork suddenly appears and the system branches off in a new direction. Mathematically, bifurcation points mark sudden changes in the system’s phase portrait. Physically, they correspond to points of instability at which the system changes abruptly and new forms of order suddenly appear." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"In other words, the web of life consists of networks within networks. At each scale, under closer scrutiny, the nodes of the network reveal themselves as smaller networks. We tend to arange these systems, all nesting within larger systems, in a hierarchical scheme by placing the larger systems above the smaller ones in pyramid fashion. But this is a human projection. In nature there is no 'above' or 'below', and there are no hierarchies. There are only networks nesting within other networks." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"In the nonlinear world – which includes most of the real world, as we have discovered – simple deterministic equations may produce an unsuspected richness and variety of behavior. On the other hand, complex and seemingly chaotic behavior can give rise to ordered structures, to subtle and beautiful patterns. In fact, in chaos theory the term 'chaos' has acquired a new technical meaning. The behavior of chaotic systems only appears to be random but in reality shows a deeper level of patterned order." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Intersections of lines, for example, remain intersections, and the hole in a torus (doughnut) cannot be transformed away. Thus a doughnut may be transformed topologically into a coffee cup (the hole turning into a handle) but never into a pancake. Topology, then, is really a mathematics of relationships, of unchangeable, or 'invariant', patterns." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"It is evident that chaotic behavior, in the new scientific sense of the term, is very different from random, erratic motion. With the help of strange attractors a distinction can be made between mere randomness, or 'noise', and chaos. Chaotic behavior is deterministic and patterned, and strange attractors allow us to transform the seemingly random data into distinct visible shapes." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"The first and most obvious property of any network is its nonlinearity – it goes in all directions. Thus the relationships in a network pattern are nonlinear relationships. In particular, an influence, or message, may travel along a cyclical path, which may become a feedback loop. In living networks, the concept of feedback is intimately connected with the network pattern." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"The impossibility of predicting which point in phase space the trajectory of the Lorenz attractor will pass through at a certain time, even though the system is governed by deterministic equations, is a common feature of all chaotic systems. However, this does not mean that chaos theory is not capable of any predictions. We can still make very accurate predictions, but they concern the qualitative features of the system’s behavior rather than the precise values of its variables at a particular time. The new mathematics thus represents the shift from quantity to quality that is characteristic of systems thinking in general. Whereas conventional mathematics deals with quantities and formulas, nonlinear dynamics deals with qualities and patterns." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"The study of pattern is crucial to the understanding of living systems because systemic properties […] arise from a configuration of ordered relationships. Systemic properties are properties of a pattern. What is destroyed when a living organism is dissected is its pattern. The components are still there, but the configuration of relationships between them – the pattern – is destroyed, and thus the organism dies." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"[…] the term 'information', as used in information theory, has nothing to do with meaning. It is a measure of the order, or nonrandomness, of a signal; and the main concern of information theory is the problem of how to get a message, coded as a signal, through a noisy channel." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"This spontaneous emergence of order at critical points of instability, which is often referred to simply as 'emergence', is one of the hallmarks of life. It has been recognized as the dynamic origin of development, learning, and evolution. In other words, creativity-the generation of new forms-is a key property of all living systems." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"To understand the phenomenon of self-organization, we first need to understand the importance of pattern. The idea of a pattern of organization – a configuration of relationships characteristic of a particular system – became the explicit focus of systems thinking in cybernetics and has been a crucial concept ever since. From the systems point of view, the understanding of life begins with the understanding of pattern." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Topology is a geometry in which all lengths, angles, and areas can be distorted at will. Thus a triangle can be continuously transformed into a rectangle, the rectangle into a square, the square into a circle, and so on. Similarly, a cube can be transformed into a cylinder, the cylinder into a cone, the cone into a sphere. Because of these continuous transformations, topology is known popularly as 'rubber sheet geometry'. All figures that can be transformed into each other by continuous bending, stretching, and twisting are called 'topologically equivalent'." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"[…] topology is concerned precisely with those properties of geometric figures that do not change when the figures are transformed. Intersections of lines, for example, remain intersections, and the hole in a torus (doughnut) cannot be transformed away. Thus a doughnut may be transformed topologically into a coffee cup (the hole turning into a handle) but never into a pancake. Topology, then, is really a mathematics of relationships, of unchangeable, or 'invariant', patterns." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Whenever we encounter living systems – organisms, parts of organisms, or communities of organisms – we can observe that their components are arranged in network fashion. Whenever we look at life, we look at networks." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

23 April 2019

Chaos Theory: The Butterfly Effect - A Retrospective

"Parvus error in principiis, magnus in conclusionibus" 
"Parvus error in principio, magnus est in fine"
“A small error in the beginning (or in principles) leads to a big error in the end (or in conclusions).” (ancient axiom)

"The wise tell us that a nail keeps a shoe, a shoe a horse, a horse a man, a man a castle, that can fight." (Freidank, Bescheidenheit, cca. 1230)

"[…] the least initial deviation from the truth is multiplied later a thousand-fold. Admit, for instance, the existence of a minimum magnitude, and you will find that the minimum which you have introduced, small as it is, causes the greatest truths of mathematics to totter. The reason is that a principle is great rather in power than in extent; hence that which was small at the start turns out a giant at the end." (St. Thomas Aquinas, “De Ente et Essentia”, cca. 1252)

"A little neglect may breed mischief [...] 
for want of a nail, the shoe was lost;
for want of a shoe the horse was lost;
and for want of a horse the rider was lost." (Benjamin Franklin, Poor Richard's Almanac, 1758) 

"In every moment of her duration Nature is one connected whole; in every moment each individual part must be what it is, because all the others are what they are; and you could not remove a single grain of sand from its place, without thereby, although perhaps imperceptibly to you, altering something throughout all parts of the immeasurable whole." (Johann G Fichte, "The Vocation of Man", 1800)

"Every existence above a certain rank has its singular points; the higher the rank the more of them. At these points, influences whose physical magnitude is too small to be taken account of by a finite being may produce results of the greatest importance." (James C Maxwell, [letter] 1865) 

"What we call little things are merely the causes of great things; they are the beginning, the embryo, and it is the point of departure which, generally speaking, decides the whole future of an existence. One single black speck may be the beginning of gangrene, of a storm, of a revolution." (Henri-Frédéric Amiel, [journal entry] 1868)

"There is a maxim which is often quoted, that ‘The same causes will always produce the same effects.’ To make this maxim intelligible we must define what we mean by the same causes and the same effects, since it is manifest that no event ever happens more that once, so that the causes and effects cannot be the same in all respects. [...] There is another maxim which must not be confounded with that quoted at the beginning of this article, which asserts ‘That like causes produce like effects’. This is only true when small variations in the initial circumstances produce only small variations in the final state of the system. In a great many physical phenomena this condition is satisfied; but there are other cases in which a small initial variation may produce a great change in the final state of the system, as when the displacement of the ‘points’ causes a railway train to run into another instead of keeping its proper course." (James C Maxwell,"Matter and Motion", 1876)

"A tenth of a degree more or less at any given point, and the cyclone will burst here and not there." (Henri Poincaré, "Sur le probleme des trios corps et les equations de la dynamique", Acta Mathematica Vol. 113, 1890)

"Certainly, if a system moves under the action of given forces and its initial conditions have given values in the mathematical sense, its future motion and behavior are exactly known. But, in astronomical problems, the situation is quite different: the constants defining the motion are only physically known, that is with some errors; their sizes get reduced along the progresses of our observing devices, but these errors can never completely vanish." (Jacques Hadamard, "Les surfaces à courbures opposées et leurs lignes géodésiques", Journal de mathématiques pures et appliquées 5e (4), 1898)

"An exceedingly small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say the effect is due to chance. If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of that same universe at a succeeding moment. But even if it were the case that the natural laws had no longer any secret for us, we could still only know the initial situation 'approximately'. If that enabled us to predict the succeeding situation with 'the same approximation', that is all we require, and we should say that the phenomenon had been predicted, that it is governed by laws. But it is not always so; it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon. (Jules H Poincaré, "Science and Method", 1908) 
 
"Throwing a small stone may have some influence on the movement of the sun." (Grigore C Moisil, "Determinism si inlantuire", 1940)

"The predictions of physical theories for the most part concern situations where initial conditions can be precisely specified. If such initial conditions are not found in nature, they can be arranged." (Anatol Rapoport, "The Search for Simplicity", 1956)

“One meteorologist remarked that if the theory were correct, one flap of a sea gull's wings would be enough to alter the course of the weather forever. The controversy has not yet been settled, but the most recent evidence seems to favor the sea gulls.” (Edward N Lorenz, "The Predictability of Hydrodynamic Flow", Transactions of the New York Academy of Sciences 25 (4), 1963)

"Does the flap of a butterfly’s wings in Brazil set off a tornado in Texas?" (Edward N Lorenz, [talk] 1972)

"If a single flap of a butterfly's wing can be instrumental in generating a tornado, so all the previous and subsequent flaps of its wings, as can the flaps of the wings of the millions of other butterflies, not to mention the activities of innumerable more powerful creatures, including our own species." (Edward N Lorenz, [talk] 1972)

"If the flap of a butterfly’s wings can be instrumental in generating a tornado, it can equally well be instrumental in preventing a tornado. More generally, I am proposing that over the years minuscule disturbances neither increase nor decrease the frequency of occurrence of various weather events such as tornadoes; the most that they may do is to modify the sequence in which these events occur." (Edward N Lorenz, [talk] 1972) 

"[...] the influence of a single butterfly is not only a fine detail-it is confined to a small volume. Some of the numerical methods which seem to be well adapted for examining the intensification of errors are not suitable for studying the dispersion of errors from restricted to unrestricted regions. One hypothesis, unconfirmed, is that the influence of a butterfly's wings will spread in turbulent air, but not in calm air." (Edward N Lorenz, [talk] 1972)

"Given an approximate knowledge of a system's initial conditions and an understanding of natural law, one can calculate the approximate behavior of the system. This assumption lay at the philosophical heart of science." (James Gleick, Chaos: Making a New Science, 1987)

"A slight variation in the axioms at the foundation of a theory can result in huge changes at the frontier." (Stanley P Gudder, "Quantum Probability", 1988)

"The principle of maximum diversity operates both at the physical and at the mental level. It says that the laws of nature and the initial conditions are such as to make the universe as interesting as possible.  As a result, life is possible but not too easy. Always when things are dull, something new turns up to challenge us and to stop us from settling into a rut. Examples of things which make life difficult are all around us: comet impacts, ice ages, weapons, plagues, nuclear fission, computers, sex, sin and death.  Not all challenges can be overcome, and so we have tragedy. Maximum diversity often leads to maximum stress. In the end we survive, but only by the skin of our teeth." (Freeman J Dyson, "Infinite in All Directions", 1988)

"Due to this sensitivity any uncertainty about seemingly insignificant digits in the sequence of numbers which defines an initial condition, spreads with time towards the significant digits, leading to chaotic behavior. Therefore there is a change in the information we have about the state of the system. This change can be thought of as a creation of information if we consider that two initial conditions that are different but indistinguishable (within a certain precision), evolve into distinguishable states after a finite time." (David Ruelle, "Chaotic Evolution and Strange Attractors: The statistical analysis of time series for deterministic nonlinear systems", 1989)

"Now, the main problem with a quasiperiodic theory of turbulence (putting several oscillators together) is the following: when there is a nonlinear coupling between the oscillators, it very often happens that the time evolution does not remain quasiperiodic. As a matter of fact, in this latter situation, one can observe the appearance of a feature which makes the motion completely different from a quasiperiodic one. This feature is called sensitive dependence on initial conditions and turns out to be the conceptual key to reformulating the problem of turbulence." (David Ruelle, "Chaotic Evolution and Strange Attractors: The statistical analysis of time series for deterministic nonlinear systems", 1989)

"The flapping of a single butterfly’s wing today produces a tiny change in the state of the atmosphere. Over a period of time, what the atmosphere actually does diverges from what it would have done." (Ian Stewart, "Does God Play Dice?", 1989)

"Although a system may exhibit sensitive dependence on initial condition, this does not mean that everything is unpredictable about it. In fact, finding what is predictable in a background of chaos is a deep and important problem. (Which means that, regrettably, it is unsolved.) In dealing with this deep and important problem, and for want of a better approach, we shall use common sense." (David Ruelle, "Chance and Chaos", 1991)

"Everywhere […] in the Universe, we discern that closed physical systems evolve in the same sense from ordered states towards a state of complete disorder called thermal equilibrium. This cannot be a consequence of known laws of change, since […] these laws are time symmetric- they permit […] time-reverse. […] The initial conditions play a decisive role in endowing the world with its sense of temporal direction. […] some prescription for initial conditions is crucial if we are to understand […]" (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"First, strange attractors look strange: they are not smooth curves or surfaces but have 'non-integer dimension' - or, as Benoit Mandelbrot puts it, they are fractal objects. Next, and more importantly, the motion on a strange attractor has sensitive dependence on initial condition. Finally, while strange attractors have only finite dimension, the time-frequency analysis reveals a continuum of frequencies." (David Ruelle, "Chance and Chaos", 1991)

"If we have several modes, oscillating independently, the motion is, as we saw, not chaotic. Suppose now that we put a coupling, or interaction, between the different modes. This means that the evolution of each mode, or oscillator, at a certain moment is determined not just by the state of this oscillator at that moment, but by the states of the other oscillators as well. When do we have chaos then? Well, for sensitive dependence on initial condition to occur, at least three oscillators are necessary. In addition, the more oscillators there are, and the more coupling there is between them, the more likely you are to see chaos." (David Ruelle, "Chance and Chaos", 1991)

"[…] the standard theory of chaos deals with time evolutions that come back again and again close to where they were earlier. Systems that exhibit this "eternal return" are in general only moderately complex. The historical evolution of very complex systems, by contrast, is typically one way: history does not repeat itself. For these very complex systems with one-way evolution it is usually clear that sensitive dependence on initial condition is present. The question is then whether it is restricted by regulation mechanisms, or whether it leads to long-term important consequences." (David Ruelle, "Chance and Chaos", 1991)

"What we now call chaos is a time evolution with sensitive dependence on initial condition. The motion on a strange attractor is thus chaotic. One also speaks of deterministic noise when the irregular oscillations that are observed appear noisy, but the mechanism that produces them is deterministic." (David Ruelle, "Chance and Chaos", 1991)

"Chaos is a parody of any metaphysics of destiny. It is not even an avatar of such a metaphysics. The poetry of initial conditions fascinates us today, now that we no longer possess a vision of final conditions, and Chaos stands in for us as a negative destiny. [...] Destiny is the ecstatic figure of necessity. Chaos is merely the metastatic figure of Chance. Chaotic processes are random and statistical in nature and, even if they culminate in the hidden order of strange attractors, that still has nothing to do with the fulgurating notion of destiny, the absence of which is cruelly felt." (Jean Baudrillard, "The Illusion of the End", 1992)

"In nonlinear systems - and the economy is most certainly nonlinear - chaos theory tells you that the slightest uncertainty in your knowledge of the initial conditions will often grow inexorably. After a while, your predictions are nonsense." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"In the everyday world of human affairs, no one is surprised to learn that a tiny event over here can have an enormous effect over there. For want of a nail, the shoe was lost, et cetera. But when the physicists started paying serious attention to nonlinear systems in their own domain, they began to realize just how profound a principle this really was. […] Tiny perturbations won't always remain tiny. Under the right circumstances, the slightest uncertainty can grow until the system's future becomes utterly unpredictable - or, in a word, chaotic." (M Mitchell Waldrop, "Complexity: The Emerging Science at the Edge of Order and Chaos", 1992)

"How can deterministic behavior look random? If truly identical states do occur on two or more occasions, it is unlikely that the identical states that will necessarily follow will be perceived as being appreciably different. What can readily happen instead is that almost, but not quite, identical states occurring on two occasions will appear to be just alike, while the states that follow, which need not be even nearly alike, will be observably different. In fact, in some dynamical systems it is normal for two almost identical states to be followed, after a sufficient time lapse, by two states bearing no more resemblance than two states chosen at random from a long sequence. Systems in which this is the case are said to be sensitively dependent on initial conditions. With a few more qualifications, to be considered presently, sensitive dependence can serve as an acceptable definition of chaos [...]" (Edward N Lorenz, "The Essence of Chaos", 1993)

"Symmetry breaking in psychology is governed by the nonlinear causality of complex systems (the 'butterfly effect'), which roughly means that a small cause can have a big effect. Tiny details of initial individual perspectives, but also cognitive prejudices, may 'enslave' the other modes and lead to one dominant view." (Klaus Mainzer, "Thinking in Complexity", 1994)

"How surprising it is that the laws of nature and the initial conditions of the universe should allow for the existence of beings who could observe it. Life as we know it would be impossible if any one of several physical quantities had slightly different values." (Steven Weinberg, "Life in the Quantum Universe", Scientific American, 1995)

"Chaos appears in both dissipative and conservative systems, but there is a difference in its structure in the two types of systems. Conservative systems have no attractors. Initial conditions can give rise to periodic, quasiperiodic, or chaotic motion, but the chaotic motion, unlike that associated with dissipative systems, is not self-similar. In other words, if you magnify it, it does not give smaller copies of itself. A system that does exhibit self-similarity is called fractal. [...] The chaotic orbits in conservative systems are not fractal; they visit all regions of certain small sections of the phase space, and completely avoid other regions. If you magnify a region of the space, it is not self-similar." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"What is chaos? Everyone has an impression of what the word means, but scientifically chaos is more than random behavior, lack of control, or complete disorder. [...] Scientifically, chaos is defined as extreme sensitivity to initial conditions. If a system is chaotic, when you change the initial state of the system by a tiny amount you change its future significantly." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"Small changes in the initial conditions in a chaotic system produce dramatically different evolutionary histories. It is because of this sensitivity to initial conditions that chaotic systems are inherently unpredictable. To predict a future state of a system, one has to be able to rely on numerical calculations and initial measurements of the state variables. Yet slight errors in measurement combined with extremely small computational errors (from roundoff or truncation) make prediction impossible from a practical perspective. Moreover, small initial errors in prediction grow exponentially in chaotic systems as the trajectories evolve. Thus, theoretically, prediction may be possible with some chaotic processes if one is interested only in the movement between two relatively close points on a trajectory. When longer time intervals are involved, the situation becomes hopeless."(Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"Swarm systems generate novelty for three reasons: (1) They are 'sensitive to initial conditions' - a scientific shorthand for saying that the size of the effect is not proportional to the size of the cause - so they can make a surprising mountain out of a molehill. (2) They hide countless novel possibilities in the exponential combinations of many interlinked individuals. (3) They don’t reckon individuals, so therefore individual variation and imperfection can be allowed. In swarm systems with heritability, individual variation and imperfection will lead to perpetual novelty, or what we call evolution." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Unlike classical mathematics, net math exhibits nonintuitive traits. In general, small variations in input in an interacting swarm can produce huge variations in output. Effects are disproportional to causes - the butterfly effect." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Chaos appears in both dissipative and conservative systems, but there is a difference in its structure in the two types of systems. Conservative systems have no attractors. Initial conditions can give rise to periodic, quasiperiodic, or chaotic motion, but the chaotic motion, unlike that associated with dissipative systems, is not self-similar. In other words, if you magnify it, it does not give smaller copies of itself. A system that does exhibit self-similarity is called fractal. [...] The chaotic orbits in conservative systems are not fractal; they visit all regions of certain small sections of the phase space, and completely avoid other regions. If you magnify a region of the space, it is not self-similar." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"What is chaos? Everyone has an impression of what the word means, but scientifically chaos is more than random behavior, lack of control, or complete disorder. [...] Scientifically, chaos is defined as extreme sensitivity to initial conditions. If a system is chaotic, when you change the initial state of the system by a tiny amount you change its future significantly." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"Surveying the bewildering damage from some historical hurricanes, an outside observer might wonder whether builders were suffering under the illusion of a chaos-free environment: one governed by underestimated deterministic forces. Of course, in reality, dynamical chaos is intrinsic to the atmosphere, and contributes significantly to the aleatory uncertainty in wind loading. It may take more than the flap of a butterfly’s wings to change a hurricane forecast, but chaos imposes a fundamental practical limit to windstorm prediction capability." (Gordon Woo, "The mathematics of natural catastrophes", 1999)

"The classic example of chaos at work is in the weather. If you could measure the positions and motions of all the atoms in the air at once, you could predict the weather perfectly. But computer simulations show that tiny differences in starting conditions build up over about a week to give wildly different forecasts. So weather predicting will never be any good for forecasts more than a few days ahead, no matter how big (in terms of memory) and fast computers get to be in the future. The only computer that can simulate the weather is the weather; and the only computer that can simulate the Universe is the Universe." (John Gribbin, "The Little Book of Science", 1999)

"Chaos theory reconciles our intuitive sense of free will with the deterministic laws of nature. However, it has an even deeper philosophical ramification. Not only do we have freedom to control our actions, but also the sensitivity to initial conditions implies that even our smallest act can drastically alter the course of history, for better or for worse. Like the butterfly flapping its wings, the results of our behavior are amplified with each day that passes, eventually producing a completely different world than would have existed in our absence!" (Julien C Sprott, "Strange Attractors: Creating Patterns in Chaos", 2000)

"In chaology, the initial conditions are likely to be out of all proportion to the consequences; indeed, origins are much more random, unpredictable, and unknowable and seemingly much less directly causal than in orderly systems. The sensitive dependence upon initial conditions means that similar phenomena or systems will never be wholly identical and that the results of those small initial changes may be radically different. These unpredictable initial conditions may, for instance, lead to the so-called butterfly effect, in which an extremely minor and remote act causes disruptions of a huge magnitude."
(Gordon E Slethaug, "Beautiful Chaos: Chaos theory and metachaotics in recent American fiction", 2000)

"Scientists tell us that the world of nature is so small and interdependent that a butterfly flapping its wings in the Amazon rainforest can generate a violent storm on the other side of the earth. This principle is known as the 'Butterfly Effect'. Today, we realize, perhaps more than ever, that the world of human activity also has its own 'Butterfly Effect' - for better or for worse." (Kofi Annan, [Nobel lecture] 2001)

"In chaos theory this 'butterfly effect' highlights the extreme sensitivity of nonlinear systems at their bifurcation points. There the slightest perturbation can push them into chaos, or into some quite different form of ordered behavior. Because we can never have total information or work to an infinite number of decimal places, there will always be a tiny level of uncertainty that can magnify to the point where it begins to dominate the system. It is for this reason that chaos theory reminds us that uncertainty can always subvert our attempts to encompass the cosmos with our schemes and mathematical reasoning." (F David Peat, "From Certainty to Uncertainty", 2002)

"Incidentally, the butterfly effect also has a good side to it. Since a butterfly in Brazil can disturb the serene weather in Florida, the same butterfly could calm a hurricane in Texas by simply flapping its wings in a certain fashion. This process is called 'controlling chaos' and has been put to use with some success in dealing with heart fibrillation. By applying small shocks at precisely the right moment, an erratic heartbeat can be regularized and a heart attack avoided." (George Szpiro, "Kepler’s Conjecture", 2002)

"A depressing corollary of the butterfly effect (or so it was widely believed) was that two chaotic systems could never synchronize with each other. Even if you took great pains to start them the same way, there would always be some infinitesimal difference in their initial states. Normally that small discrepancy would remain small for a long time, but in a chaotic system, the error cascades and feeds on itself so swiftly that the systems diverge almost immediately, destroying the synchronization. Unfortunately, it seemed, two of the most vibrant branches of nonlinear science - chaos and sync - could never be married. They were fundamentally incompatible." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"A sudden change in the evolutive dynamics of a system (a ‘surprise’) can emerge, apparently violating a symmetrical law that was formulated by making a reduction on some (or many) finite sequences of numerical data. This is the crucial point. As we have said on a number of occasions, complexity emerges as a breakdown of symmetry (a system that, by evolving with continuity, suddenly passes from one attractor to another) in laws which, expressed in mathematical form, are symmetrical. Nonetheless, this breakdown happens. It is the surprise, the paradox, a sort of butterfly effect that can highlight small differences between numbers that are very close to one another in the continuum of real numbers; differences that may evade the experimental interpretation of data, but that may increasingly amplify in the system’s dynamics." (Cristoforo S Bertuglia & Franco Vaio, "Nonlinearity, Chaos, and Complexity: The Dynamics of Natural and Social Systems", 2003)

"At the basis of the impossibility of making reliable predictions for systems such as the atmosphere, there is a phenomenon known today as the butterfly effect. This deals with the progressive limitless magnification of the slightest imprecision (error) present in the measurement of the initial data (the incomplete knowledge of the current state of each molecule of air), which, although in principle negligible, will increasingly expand during the course of the model’s evolution, until it renders any prediction on future states (atmospheric weather conditions when the forecast refers to more than a few days ahead) completely insignificant, as these states appear completely different from the calculated ones." (Cristoforo S Bertuglia & Franco Vaio, "Nonlinearity, Chaos, and Complexity: The Dynamics of Natural and Social Systems", 2003)

"The butterfly effect came to be the most familiar icon of the new science, and appropriately so, for it is the signature of chaos. […] The idea is that in a chaotic system, small disturbances grow exponentially fast, rendering long-term prediction impossible." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"These, then, are the defining features of chaos: erratic, seemingly random behavior in an otherwise deterministic system; predictability in the short run, because of the deterministic laws; and unpredictability in the long run, because of the butterfly effect." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"An apparent paradox is that chaos is deterministic, generated by fixed rules which do not themselves involve any elements of change. We even speak of deterministic chaos. In principle, the future is completely determined by the past; but in practice small uncertainties, much like minute errors of measurement which enter into calculations, are amplified, with the effect that even though the behavior is predictable in the short term, it is unpredictable over the long term." (Heinz-Otto Peitgen et al, "Chaos and Fractals: New Frontiers of Science" 2nd Ed., 2004)

"Chaos theory, for example, uses the metaphor of the ‘butterfly effect’. At critical times in the formation of Earth’s weather, even the fluttering of the wings of a butterfly sends ripples that can tip the balance of forces and set off a powerful storm. Even the smallest inanimate objects sent back into the past will inevitably change the past in unpredictable ways, resulting in a time paradox." (Michio Kaku, "Parallel Worlds", 2004)

"Natural laws, and for that matter determinism, do not exclude the possibility of chaos. In other words, determinism and predictability are not equivalent. And what is an even more surprising rinding of recent chaos theory has been the discovery that these effects are observable in many systems which are much simpler than the weather. [...] Moreover, chaos and order (i.e., the causality principle) can be observed in juxtaposition within the same system. There may be a linear progression of errors characterizing a deterministic system which is governed by the causality principle, while (in the same system) there can also be an exponential progression of errors (i.e., the butterfly effect) indicating that the causality principle breaks down." (Heinz-Otto Peitgen et al, "Chaos and Fractals: New Frontiers of Science" 2nd Ed., 2004)

"[…] some systems (system is just a jargon for anything, like the swinging pendulum or the Solar System, or water dripping from a tap)  are very sensitive to their starting conditions, so that a tiny difference in the initial ‘push’ you give them causes a big difference in where they end up, and there is feedback, so that what a system does affects its own behavior."(John Gribbin, "Deep Simplicity", 2004)

"We often wonder why the more complex systems seem to indicate a preferred direction of time, or an arrow of time, whereas their elementary counterparts do not. […] This has to do with the if-then nature of physics questions. Anything we observe involves laws of motion but also particular initial conditions. […] The initial conditions are what make a situation look peculiar when we time reverse it." (Leon M Lederman & Christopher T Hill, "Symmetry and the Beautiful Universe", 2004)

"[…] we would like to observe that the butterfly effect lies at the root of many events which we call random. The final result of throwing a dice depends on the position of the hand throwing it, on the air resistance, on the base that the die falls on, and on many other factors. The result appears random because we are not able to take into account all of these factors with sufficient accuracy. Even the tiniest bump on the table and the most imperceptible move of the wrist affect the position in which the die finally lands. It would be reasonable to assume that chaos lies at the root of all random phenomena." (Iwo Białynicki-Birula & Iwona Białynicka-Birula, "Modeling Reality: How Computers Mirror Life", 2004)

"Much of chaos as a science is connected with the notion of ‘sensitive dependence on initial conditions.’ Technically, scientists term as ‘chaotic’ those nonrandom complicated motions that exhibit a very rapid growth of errors that, despite perfect determinism, inhibits any pragmatic ability to render accurate long-term prediction. […] The most important fact is that there is a discernibly precise ‘moment’, with a corresponding behavior, which is neither chaotic nor nonchaotic, at which this transition occurs. Yes, errors do grow, but only in a marginally predictable, rather than in an unpredictable, fashion. In this state of marginal predictability inheres embryonically all the seeds of the chaotic behavior to come. That is, this transitional point, the legitimate child of universality, without full-fledged sensitive dependence upon initial conditions, knows fully how to dictate to its progeny in turn how this latter phenomenon must unfold. For a certain range of possible behaviors of strongly nonlinear systems - specifically, this range surrounding the transition to chaos - the information obtained just at the transition point fully organizes the spectrum of behaviors that these chaotic systems can exhibit." (Ray Kurzweil, "The Singularity is Near", 2005)

"Of course, the existence of an unknown butterfly flapping its wings has no direct bearing on weather forecasts, since it will take far too long for such a small perturbation to grow to a significant size, and we have many more immediate uncertainties to worry about. So, the direct impact of this phenomenon on weather prediction is often somewhat overstated." (James Annan & William Connolley, “Chaos and Climate”, 2005)

"Chaos can leave statistical footprints that look like noise. This can arise from simple systems that are deterministic and not random. [...] The surprising mathematical fact is that most systems are chaotic. Change the starting value ever so slightly and soon the system wanders off on a new chaotic path no matter how close the starting point of the new path was to the starting point of the old path. Mathematicians call this sensitivity to initial conditions but many scientists just call it the butterfly effect. And what holds in math seems to hold in the real world - more and more systems appear to be chaotic." (Bart Kosko, "Noise", 2006)

"'Chaos' refers to systems that are very sensitive to small changes in their inputs. A minuscule change in a chaotic communication system can flip a 0 to a 1 or vice versa. This is the so-called butterfly effect: Small changes in the input of a chaotic system can produce large changes in the output. Suppose a butterfly flaps its wings in a slightly different way. can change its flight path. The change in flight path can in time change how a swarm of butterflies migrates." (Bart Kosko, "Noise", 2006)

"Linearity means that the rule that determines what a piece of a system is going to do next is not influenced by what it is doing now. The mathematics of linear systems exhibits a simple geometry. The simplicity allows us to capture the essence of the problem. Nonlinear dynamics is concerned with the study of systems whose time evolution equations are nonlinear. If a parameter that describes a linear system is changed, the qualitative nature of the behavior remains the same. But for nonlinear systems, a small change in a parameter can lead to sudden and dramatic changes in both the quantitative and qualitative behavior of the system." (Wei-Bin Zhang, "Discrete Dynamical Systems, Bifurcations and Chaos in Economics", 2006)

"Physically, the stability of the dynamics is characterized by the sensitivity to initial conditions. This sensitivity can be determined for statistically stationary states, e.g. for the motion on an attractor. If this motion demonstrates sensitive dependence on initial conditions, then it is chaotic. In the popular literature this is often called the 'Butterfly Effect', after the famous 'gedankenexperiment' of Edward Lorenz: if a perturbation of the atmosphere due to a butterfly in Brazil induces a thunderstorm in Texas, then the dynamics of the atmosphere should be considered as an unpredictable and chaotic one. By contrast, stable dependence on initial conditions means that the dynamics is regular." (Ulrike Feudel et al, "Strange Nonchaotic Attractors", 2006)

"This phenomenon, common to chaos theory, is also known as sensitive dependence on initial conditions. Just a small change in the initial conditions can drastically change the long-term behavior of a system. Such a small amount of difference in a measurement might be considered experimental noise, background noise, or an inaccuracy of the equipment." (Greg Rae, Chaos Theory: A Brief Introduction, 2006)

"Global stability of an equilibrium removes the restrictions on the initial conditions. In global asymptotic stability, solutions approach the equilibrium for all initial conditions. [...] In a study of local stability, first equilibrium solutions are identified, then linearization techniques are applied to determine the behavior of solutions near the equilibrium. If the equilibrium is stable for any set of initial conditions, then this type of stability is referred to as global stability." (Linda J S Allen, "An Introduction to Mathematical Biology", 2007)

"Sensitive dependence on initial conditions is one of the criteria necessary for showing a solution to a difference equation exhibits chaotic behavior." (Linda J S Allen, "An Introduction to Mathematical Biology", 2007)

"The system is highly sensitive to some small changes and blows them up into major alterations in weather patterns. This is popularly known as the butterfly effect in that it is possible for a butterfly to flap its wings in São Paolo, so making a tiny change to air pressure there, and for this tiny change to escalate up into a hurricane over Miami. You would have to measure the flapping of every butterfly’s wings around the earth with infinite precision in order to be able to make long-term forecasts. The tiniest error made in these measurements could produce spurious forecasts. However, short-term forecasts are possible because it takes time for tiny differences to escalate." (Ralph D Stacey, "Strategic Management and Organisational Dynamics: The Challenge of Complexity" 5th Ed., 2007)

"Thus, nonlinearity can be understood as the effect of a causal loop, where effects or outputs are fed back into the causes or inputs of the process. Complex systems are characterized by networks of such causal loops. In a complex, the interdependencies are such that a component A will affect a component B, but B will in general also affect A, directly or indirectly. A single feedback loop can be positive or negative. A positive feedback will amplify any variation in A, making it grow exponentially. The result is that the tiniest, microscopic difference between initial states can grow into macroscopically observable distinctions." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Yet, with the discovery of the butterfly effect in chaos theory, it is now understood that there is some emergent order over time even in weather occurrence, so that weather prediction is not next to being impossible as was once thought, although the science of meteorology is far from the state of perfection." (Peter Baofu, "The Future of Complexity: Conceiving a Better Way to Understand Order and Chaos", 2007)

"The ‘butterfly effect’ is at most a hypothesis, and it was certainly not Lorenz’s intention to change it to a metaphor for the importance of small event.” (Péter Érdi, "Complexity Explained", 2008)

"A characteristic of such chaotic dynamics is an extreme sensitivity to initial conditions (exponential separation of neighboring trajectories), which puts severe limitations on any forecast of the future fate of a particular trajectory. This sensitivity is known as the ‘butterfly effect’: the state of the system at time t can be entirely different even if the initial conditions are only slightly changed, i.e., by a butterfly flapping its wings." (Hans J Korsch et al, "Chaos: A Program Collection for the PC", 2008)

"Prior to the discovery of the butterfly effect it was generally believed that small differences averaged out and were of no real significance. The butterfly effect showed that small things do matter. This has major implications for our notions of predictability, as over time these small differences can lead to quite unpredictable outcomes. For example, first of all, can we be sure that we are aware of all the small things that affect any given system or situation? Second, how do we know how these will affect the long-term outcome of the system or situation under study? The butterfly effect demonstrates the near impossibility of determining with any real degree of accuracy the long term outcomes of a series of events." (Elizabeth McMillan, Complexity, "Management and the Dynamics of Change: Challenges for practice", 2008)

"The butterfly effect demonstrates that complex dynamical systems are highly responsive and interconnected webs of feedback loops. It reminds us that we live in a highly interconnected world. Thus our actions within an organization can lead to a range of unpredicted responses and unexpected outcomes. This seriously calls into doubt the wisdom of believing that a major organizational change intervention will necessarily achieve its pre-planned and highly desired outcomes. Small changes in the social, technological, political, ecological or economic conditions can have major implications over time for organizations, communities, societies and even nations." (Elizabeth McMillan, "Complexity, Management and the Dynamics of Change: Challenges for practice", 2008)

"The 'butterfly effect' is at most a hypothesis, and it was certainly not Lorenz’s intention to change it to a metaphor for the importance of small event. […] Dynamical systems that exhibit sensitive dependence on initial conditions produce remarkably different solutions for two initial values that are close to each other. Sensitive dependence on initial conditions is one of the properties to exhibit chaotic behavior. In addition, at least one further implicit assumption is that the system is bounded in some finite region, i.e., the system cannot blow up. When one uses expanding dynamics, a way of pull-back of too much expanded phase volume to some finite domain is necessary to get chaos." (Péter Érdi, "Complexity Explained", 2008)

"Chaos has three primary features: unpredictability, boundedness, and sensitivity to initial conditions. Unpredictability means that a sequence of numbers that is generated from a chaotic function does not repeat. This principle is perhaps a matter of degree, because some of the numbers could look as though they are recurring only because they are rounded to a convenient number of decimal points. [...] Boundedness means that, for all the unpredictability of motion, all points remain within certain boundaries. The principle of sensitivity to initial conditions means that two points that start off as arbitrarily close together become exponentially farther away from each other as the iteration process proceeds. This is a clear case of small differences producing a huge effect." (Stephen J Guastello & Larry S Liebovitch, "Introduction to Nonlinear Dynamics and Complexity" [in "Chaos and Complexity in Psychology"], 2009)

"A system of equations is deemed most elegant if it contains no un- necessary terms or parameters and if the parameters that remain have a minimum of digits. [...] Just as one can find the most elegant set of parameters for a given system, it is possible to find the most elegant set of initial conditions within the basin of attraction or chaotic sea. However, it is usually more useful to have initial conditions that are close to the attractor to reduce the transients that would otherwise occur."  (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"Another property of bounded systems is that, unless the trajectory attracts to an equilibrium point where it stalls and remains forever, the points must continue moving forever with the flow. However, if we consider two initial conditions separated by a small distance along the direction of the flow, they will maintain their average separation forever since they are subject to the exact same flow but only delayed slightly in time. This fact implies that one of the Lyapunov exponents for a bounded continuous flow must be zero unless the flow attracts to a stable equilibrium." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"In a chaotic system, there must be stretching to cause the exponential separation of initial conditions but also folding to keep the trajectories from moving off to infinity. The folding requires that the equations of motion contain at least one nonlinearity, leading to the important principle that chaos is a property unique to nonlinear dynamical systems. If a system of equations has only linear terms, it cannot exhibit chaos no matter how complicated or high-dimensional it may be." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system. This property of a non-linear system such as the weather is known as the butterfly effect where it is purported that a butterfly flapping its wings in Japan can give rise to a tornado in Kansas. This unpredictable behaviour of nonlinear dynamical systems, i.e. its extreme sensitivity to initial conditions, seems to be random and is therefore referred to as chaos. This chaotic and seemingly random behaviour occurs for non-linear deterministic system in which effects can be linked to causes but cannot be predicted ahead of time." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)

"The main defining feature of chaos is the sensitive dependence on initial conditions. Two nearby initial conditions on the attractor or in the chaotic sea separate by a distance that grows exponentially in time when averaged along the trajectory, leading to long-term unpredictability. The Lyapunov exponent is the average rate of growth of this distance, with a positive value signifying sensitive dependence (chaos), a zero value signifying periodicity (or quasiperiodicity), and a negative value signifying a stable equilibrium." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"Complexity carries with it a lack of predictability different to that of chaotic systems, i.e. sensitivity to initial conditions. In the case of complexity, the lack of predictability is due to relevant interactions and novel information created by them." (Carlos Gershenson, "Understanding Complex Systems", 2011)

"The things that really change the world, according to Chaos theory, are the tiny things. A butterfly flaps its wings in the Amazonian jungle, and subsequently a storm ravages half of Europe." (Neil Gaiman, "Good Omens", 2011)

"The key characteristic of 'chaotic solutions' is their sensitivity to initial conditions: two sets of initial conditions close together can generate very different solution trajectories, which after a long time has elapsed will bear very little relation to each other. Twins growing up in the same household will have a similar life for the childhood years but their lives may diverge completely in the fullness of time. Another image used in conjunction with chaos is the so-called 'butterfly effect' – the metaphor that the difference between a butterfly flapping its wings in the southern hemisphere (or not) is the difference between fine weather and hurricanes in Europe." (Tony Crilly, "Fractals Meet Chaos" [in "Mathematics of Complexity and Dynamical Systems"], 2012)

"The most basic tenet of chaos theory is that a small change in initial conditions - a butterfly flapping its wings in Brazil - can produce a large and unexpected divergence in outcomes - a tornado in Texas. This does not mean that the behavior of the system is random, as the term 'chaos' might seem to imply. Nor is chaos theory some modern recitation of Murphy’s Law ('whatever can go wrong will go wrong'). It just means that certain types of systems are very hard to predict." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"History is often the tale of small moments - chance encounters or casual decisions or sheer coincidence - that seem of little consequence at the time, but somehow fuse with other small moments to produce something momentous, the proverbial flapping of a butterfly's wings that triggers a hurricane." (Scott Anderson, "Lawrence in Arabia: War, Deceit, Imperial Folly and the Making of the Modern Middle East", 2013)

"One of the remarkable features of these complex systems created by replicator dynamics is that infinitesimal differences in starting positions create vastly different patterns. This sensitive dependence on initial conditions is often called the butterfly-effect aspect of complex systems - small changes in the replicator dynamics or in the starting point can lead to enormous differences in outcome, and they change one’s view of how robust the current reality is. If it is complex, one small change could have led to a reality that is quite different." (David Colander & Roland Kupers, "Complexity and the art of public policy : solving society’s problems from the bottom up", 2014)

"The sensitivity of chaotic systems to initial conditions is particularly well known under the moniker of the 'butterfly effect', which is a metaphorical illustration of the chaotic nature of the weather system in which 'a flap of a butterfly’s wings in Brazil could set off a tornado in Texas'. The meaning of this expression is that, in a chaotic system, a small perturbation could eventually cause very large-scale difference in the long run." (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"A system governed by a deterministic theory can only evolve along a single trajectory - namely, that dictated by its laws and initial conditions; all other trajectories are excluded. Symmetry principles, on the other hand, fit the freedom-inducing model. Rather than distinguishing what is excluded from what is bound to happen, these principles distinguish what is excluded from what is possible. In other words, although they place restrictions on what is possible, they do not usually determine a single trajectory." (Yemima Ben-Menahem, "Causation in Science", 2018)

"Chaos theory is a branch of mathematics focusing on the study of chaos - dynamical systems whose random states of disorder and irregularities are governed by underlying patterns and deterministic laws that are highly sensitive to initial conditions. Chaos theory is an interdisciplinary theory stating that, within the apparent randomness of complex, chaotic systems, there are underlying patterns, interconnectedness, constant feedback loops, repetition, self-similarity, fractals, and self-organization. The butterfly effect, an underlying principle of chaos, describes how a small change in one state of a deterministic nonlinear system can result in large differences in a later state (meaning that there is a sensitive dependence on initial conditions)." (Nima Norouzi, "Criminal Policy, Security, and Justice in the Time of COVID-19", 2022)
