07 September 2025

On Leonhard Euler

"I have been able to solve a few problems of mathematical physics on which the greatest mathematicians since Euler have struggled in vain. [...] But the pride I might have held in my conclusions was perceptibly lessened by the fact that I knew that the solution of these problems had almost always come to me as the gradual generalization of favorable examples, by a series of fortunate conjectures, after many errors. I am fain to compare myself with a wanderer on the mountains who, not knowing the path, climbs slowly and painfully upwards and often has to retrace his steps because he can go no further—then, whether by taking thought or from luck, discovers a new track that leads him on a little till at length when he reaches the summit he finds to his shame that there is a royal road by which he might have ascended, had he only the wits to find the right approach to it. In my works, I naturally said nothing about my mistake to the reader, but only described the made track by which he may now reach the same heights without difficulty." (Hermann von Helmholtz, 1891)

"There is a famous formula, perhaps the most compact and famous of all formulas developed by Euler from a discovery of de Moivre: It appeals equally to the mystic, the scientist, the philosopher, the mathematician." (Edward Kasner & James R Newman, "Mathematics and the Imagination", 1940)

"The difference is that energy is a property of the microstates, and so all observers, whatever macroscopic variables they may choose to define their thermodynamic states, must ascribe the same energy to a system in a given microstate. But they will ascribe different entropies to that microstate, because entropy is not a property of the microstate, but rather of the reference class in which it is embedded. As we learned from Boltzmann, Planck, and Einstein, the entropy of a thermodynamic state is a measure of the number of microstates compatible with the macroscopic quantities that you or I use to define the thermodynamic state." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

"The acceptance of complex numbers into the realm of algebra had an impact on analysis as well. The great success of the differential and integral calculus raised the possibility of extending it to functions of complex variables. Formally, we can extend Euler's definition of a function to complex variables without changing a single word; we merely allow the constants and variables to assume complex values. But from a geometric point of view, such a function cannot be plotted as a graph in a two-dimensional coordinate system because each of the variables now requires for its representation a two-dimensional coordinate system, that is, a plane. To interpret such a function geometrically, we must think of it as a mapping, or transformation, from one plane to another." (Eli Maor, "e: The Story of a Number", 1994)

"Why did he [Euler] choose the letter e? There is no consensus. According to one view, Euler chose it because it is the first letter of the word exponential. More likely, the choice came to him naturally, since the letters a, b, c, and d frequently appeared elsewhere in mathematics. It seems unlikely that Euler chose the letter because it is the initial of his own name, as occasionally has been suggested. He was an extremely modest man and often delayed publication of his own work so that a colleague or student of his would get due credit. In any event, his choice of the symbol e, like so many other symbols of his, became universally accepted." (Eli Maor, "e: The Story of a Number", 1994)

"I see some parallels between the shifts of fashion in mathematics and in music. In music, the popular new styles of jazz and rock became fashionable a little earlier than the new mathematical styles of chaos and complexity theory. Jazz and rock were long despised by classical musicians, but have emerged as art-forms more accessible than classical music to a wide section of the public. Jazz and rock are no longer to be despised as passing fads. Neither are chaos and complexity theory. But still, classical music and classical mathematics are not dead. Mozart lives, and so does Euler. When the wheel of fashion turns once more, quantum mechanics and hard analysis will once again be in style." (Freeman J Dyson, "Book Review of ‘Nature’s Numbers’", The American Mathematical Monthly, Vol. 103 (7), 1996)

"[…] and unlike the physics or chemistry or engineering of today, which will almost surely appear archaic to technicians of the far future, Euler’s formula will still appear, to the arbitrarily advanced mathematicians ten thousand years hence, to be beautiful and stunning and untarnished by time." (Paul J Nahin, "Dr. Euler's Fabulous Formula: Cures Many Mathematical Ills", 2006)

"I think e^iπ+1=0 is beautiful because it is true even in the face of enormous potential constraint. The equality is precise; the left-hand side is not 'almost' or 'pretty near' or 'just about' zero, but exactly zero. That five numbers, each with vastly different origins, and each with roles in mathematics that cannot be exaggerated, should be connected by such a simple relationship, is just stunning. It is beautiful. And unlike the physics or chemistry or engineering of today, which will almost surely appear archaic to technicians of the far future, Euler's formula will still appear, to the arbitrarily advanced mathematicians ten thousand years hence, to be beautiful and stunning and untarnished by time." (Paul J Nahin, "Dr. Euler's Fabulous Formula: Cures Many Mathematical Ills", 2006)

"There are many ways to use unique prime factorization, and it is rightly regarded as a powerful idea in number theory. In fact, it is more powerful than Euclid could have imagined. There are complex numbers that behave like 'integers' and 'primes', and unique prime factorization holds for them as well. Complex integers were first used around 1770 by Euler, who found they have almost magical powers to unlock secrets of ordinary integers. For example, by using numbers of the form a + b -2. where a and b are integers, he was able to prove a claim of Fermat that 27 is the only cube that exceeds a square by 2. Euler's results were correct, but partly by good luck. He did not really understand complex 'primes' and their behavior." (John Stillwell, "Yearning for the Impossible: The Surprising Truths of Mathematics", 2006)

"At first glance, the number e, known in mathematics as Euler’s number, doesn’t seem like much. It’s about 2.7, a quantity of such modest size that it invites contempt in our age of wretched excess and relentless hype." (David Stipp, "A Most Elegant Equation: Euler's Formula and the Beauty of Mathematics", 2017)

"This equation is considered by some mathematicians and physicists to be the most important equation ever devised. In Euler’s relation, both sides of the equation are expressions for a complex number on the unit circle. The left side emphasizes the magnitude (the 1 multiplying e^iθ ) and direction in the complex plane (θ), while the right side emphasizes the real (cos θ) and imaginary (sin θ) components. Another approach to demonstrating the equivalence of the two sides of Euler’s relation is to write out the power-series representation of each side; [...]" (Daniel Fleisch & Laura Kinnaman, "A Student’s Guide to Waves", 2015)

"Euler’s formula - although deceptively simple - is actually staggeringly conceptually difficult to apprehend in its full glory, which is why so many mathematicians and scientists have failed to see its extraordinary scope, range, and ontology, so powerful and extensive as to render it the master equation of existence, from which the whole of mathematics and science can be derived, including general relativity, quantum mechanics, thermodynamics, electromagnetism and the strong and weak nuclear forces! It’s not called the God Equation for nothing. It is much more mysterious than any theistic God ever proposed." (Thomas Stark, "God Is Mathematics: The Proofs of the Eternal Existence of Mathematics", 2018)

"Like a Shakespearean sonnet that captures the very essence of love, or a painting that brings out the beauty of the human form that is far more than just skin deep, Euler's equation reaches down into the very depths of existence." (Keith J Devlin)

On Albert Einstein

"I believe that, as regards the development of physics, we can be very happy to have such an original young thinker, a 'Boltzmann redivivus'; the same certainty and speed of thought; great boldness in theory, which however cannot harm, since the most intimate contact with experiment is preserved. Einstein’s 'quantum hypothesis' is probably among the most remarkable thought [constructions] ever; if it is correct, then it indicates completely new paths [for the ether and molecular theories;] if it false, well, then it will remain for all times ’a beautiful memory'." (Arthur Schuster, 1910)

"Einstein's relativity work is a magnificent mathematical garb which fascinates, dazzles and makes people blind to the underlying errors. The theory is like a beggar clothed in purple whom ignorant people take for a king [...] its exponents are brilliant men but they are metaphysicists rather than scientists." (Nikola Tesla, New York Times, 1935)

"The difference is that energy is a property of the microstates, and so all observers, whatever macroscopic variables they may choose to define their thermodynamic states, must ascribe the same energy to a system in a given microstate. But they will ascribe different entropies to that microstate, because entropy is not a property of the microstate, but rather of the reference class in which it is embedded. As we learned from Boltzmann, Planck, and Einstein, the entropy of a thermodynamic state is a measure of the number of microstates compatible with the macroscopic quantities that you or I use to define the thermodynamic state." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

"String theory promises to take a further step beyond that taken by Einstein's picture of force subsumed within curved space and time geometry. Indeed, string theory contains Einstein's theory of gravitation within itself. Loops of string behave like the exchange particles of the gravitational forces, or 'gravitons' as they are called in the point-particle picture of things. But it has been argued that it must be possible to extract even the geometry of space and time from the characteristics of the strings and their topological properties. At present, it is not known how to do this and we merely content ourselves with understanding how strings behave when they sit in a background universe of space and time."  (John D Barrow, "New Theories of Everything: The Quest for Ultimate Explanation", 1991)

"In string theory one studies strings moving in a fixed classical spacetime. […] what we call a background-dependent approach. […] One of the fundamental discoveries of Einstein is that there is no fixed background. The very geometry of space and time is a dynamical system that evolves in time. The experimental observations that energy leaks from binary pulsars in the form of gravitational waves - at the rate predicted by general relativity to the […] accuracy of eleven decimal place - tell us that there is no more a fixed background of spacetime geometry than there are fixed crystal spheres holding the planets up." (Lee Smolin, "Loop Quantum Gravity", The New Humanists: Science at the Edge, 2003)

"Using matrices, Dirac was able to write an equation relating the total energy of a body to a sum of its energy at rest and its energy in motion, all consistent with Einstein’s theory of relativity. The fact that matrices keep account of what happens when things rotate was a bonus, as the maths was apparently saying that an electron can itself rotate: can spin! Furthermore, the fact that he had been able to solve the mathematics by using the simplest matrices, where a single number was replaced by two columns of pairs, implied a ‘two-ness’ to the spin, precisely what the Zeeman effect had implied. The missing ingredi ent in Schrodinger’s theory had miraculously emerged from the mathematics of matrices, which had been forced on Dirac by the requirements of Einstein’s theory of relativity." (Frank Close, "Antimatter", 2009)

"Ironically, conventional quantum mechanics itself involves a vast expansion of physical reality, which may be enough to avoid Einstein Insanity. The equations of quantum dynamics allow physicists to predict the future values of the wave function, given its present value. According to the Schrödinger equation, the wave function evolves in a completely predictable way. But in practice we never have access to the full wave function, either at present or in the future, so this 'predictability' is unattainable. If the wave function provides the ultimate description of reality - a controversial issue!" (Frank Wilczek, "Einstein’s Parable of Quantum Insanity", 2015) 

"[…] Einstein showed, for 'stuff' like space and time, seemingly stable, unchangeable aspects of nature; in truth, it’s the relationship between space and time that always stays the same, even as space contracts and time dilates. Like energy and matter, space and time are mutable manifestations of deeper, unshakable foundations: the things that never vary no matter what." (K C Cole, "The Simple Idea Behind Einstein’s Greatest Discoveries", Quanta Magazine, 2019) 

On Thresholds (-1999)

"We who stand on the threshold of a new century can look back on an era of unparalleled progress. Looking into the future an equally bright prospect greets our eyes; on all sides fruitful fi elds of research invite our labor and promise easy and rich returns. Surely this is the golden age of mathematics!" (Pierpont, James Pierpont, "The History of Mathematics in the Nineteenth Century", Bulletin of the American Mathematical Society, 2nd Series, Vol. 11, 1904–1905) 

"Those terrible logarithms, when I happened to open a table of them, made my head swim, with their columns of figures; actual fright, not unmixed with respect, overwhelmed me on the very threshold of that arithmetical cave." (Jean-Henri Fabre, "The Life of the Fly", 1913)

"When we are thrilled with the wonder of the world, the heights and depths of things, the beauty of it all, we approach the door of natural religion. And when the Nature-feeling is not superfi cial but informed with knowledge, with no gain of the hard-won analysis unused, we may reach the threshold. And when we feel that our scientifi c cosmology leaves Isis still veiled, and when our attempts at philosophical interpretation give us a reasoned conviction of a meaning behind the process, we may perhaps enter in." (J Arthur Thomson, "The System of Animate Nature" Vol. 1, 1920) 

"The scientific spirit brings about a particular attitude towards worldly matters; before religious matters it pauses for a little, hesitates, and fi nally there too crosses the threshold. In this process there is no stopping; the greater the number of men to whom the treasures of knowledge become accessible, the more widespread is the falling-away from religious belief…" (Sigmund Freud, "The Future of an Illusion", 1927) 

"There are scientists who make their chief discovery at the threshold of their scientific career, and spend the rest of their lives substantiating and elaborating it, mapping out the details of their discovery, as it were. There are other scientists who have to tread a long, diffi cult and often tortuous path to its end before they succeed in crowning their efforts with a discovery." (V Safonov, "Courage",  1953) 

"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime become unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Micolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today 25 (11), 1972)

"As the complexity of a system increases, our ability to make precise and yet significant statements about its behavior diminishes until a threshold is reached beyond which precision and significance (or relevance) become almost mutually exclusive characteristics." (Lotfi A Zadeh, 1973)

"Fuzziness, then, is a concomitant of complexity. This implies that as the complexity of a task, or of a system for performing that task, exceeds a certain threshold, the system must necessarily become fuzzy in nature. Thus, with the rapid increase in the complexity of the information processing tasks which the computers are called upon to perform, we are reaching a point where computers will have to be designed for processing of information in fuzzy form. In fact, it is the capability to manipulate fuzzy concepts that distinguishes human intelligence from the machine intelligence of current generation computers. Without such capability we cannot build machines that can summarize written text, translate well from one natural language to another, or perform many other tasks that humans can do with ease because of their ability to manipulate fuzzy concepts." (Lotfi A Zadeh, "The Birth and Evolution of Fuzzy Logic", 1989)

"Threshold functions (are described) which facilitate the careful study of the structure of a graph as it grows and specifically reveal the mysterious circumstances surrounding the abrupt appearance of the Unique Giant Component which systematically absorbs its neighbours, devouring the larger first and ruthlessly continuing until the last Isolated Nodes have been swallowed up, whereupon the Giant is suddenly brought under control by a Spanning Cycle." (Edgar Palmer, "Graphical Evolution", 1985)

"[…] an epidemic does not always percolate through an entire population. There is a percolation threshold below which the epidemic has died out before most of the people have." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"In the realms of nature it is impossible to predict which way a bifurcation will cut. The outcome of a bifurcation is determined neither by the past history of a system nor by its environment, but only by the interplay of more or less random fluctuations in the chaos of critical destabilization. One or another of the fluctuations that rock such a system will suddenly 'nucleate'. The nucleating fluctuation will amplify with great rapidity and spread to the rest of the system. In a surprisingly short time, it dominates the system’s dynamics. The new order that is then born from the womb of chaos reflects the structural and functional characteristics of the nucleated fluctuation. [...] Bifurcations are more visible, more frequent, and more dramatic when the systems that exhibit them are close to their thresholds of stability - when they are all but choked out of existence." (Ervin László, "Vision 2020: Reordering Chaos for Global Survival", 1994)

"When a system is 'stressed' beyond certain threshold limits as, for example, when it is heated up, or its pressure is increased, it shifts from one set of attractors to another and then behaves differently. To use the language of the theory, the system 'settles into a new dynamic regime'. It is at the point of transition that a bifurcation takes place. The system no longer follows the trajectory of its initial attractors, but responds to new attractors that make the system appear to be behaving randomly. It is not behaving randomly, however, and this is the big shift in our understanding caused by dynamical systems theory. It is merely responding to a new set of attractors that give it a more complex trajectory. The term bifurcation, in its most significant sense, refers to the transition of a system from the dynamic regime of one set of attractors, generally more stable and simpler ones, to the dynamic regime of a set of more complex and 'chaotic' attractors." (Ervin László, "Vision 2020: Reordering Chaos for Global Survival", 1994)

"Once we overcome our fear of being tiny, we find ourselves on the threshold of a vast and awesome Universe that utterly dwarfs - in time, in space, and in potential - the tidy anthropocentric proscenium of our ancestors." (Carl Sagan, "Pale Blue Dot: A Vision of the Human Future in Space", 1994)

"The resolution of how to divide the stakes in an uncompleted game marked the beginning of a systematic analysis of probability - the measure of our confidence that something is going to happen. It brings us to the threshold of the quantification of risk." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

On Thresholds (2000-)

"For any given population of susceptibles, there is some critical combination of contact frequency, infectivity, and disease duration just great enough for the positive loop to dominate the negative loops. That threshold is known as the tipping point. Below the tipping point, the system is stable: if the disease is introduced into the community, there may be a few new cases, but on average, people will recover faster than new cases are generated. Negative feedback dominates and the population is resistant to an epidemic. Past the tipping point, the positive loop dominates .The system is unstable and once a disease arrives, it can spread like wildfire that is, by positive feedback-limited only by the depletion of the susceptible population." (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

"The tipping point is that magic moment when an idea, trend, or social behavior crosses a threshold, tips, and spreads like wildfire." (Malcolm T Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"This possibility of sudden change is at the center of the idea of the Tipping Point and might well be the hardest of all to accept. [...] The Tipping Point is the moment of critical mass, the threshold, the boiling point." (Malcolm T Gladwell, "The Tipping Point: How Little Things Can Make a Big Difference", 2000)

"[…] real networks not only are connected but are well beyond the threshold of one. Random network theory tells us that as the average number of links per node increases beyond the critical one, the number of nodes left out of the giant cluster decreases exponentially. That is, the more links we add, the harder it is to find a node that remains isolated. Nature does not take risks by staying close to the threshold. It well surpasses it."  (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"The arrow of time, through the defi ning role it plays in everyday life and its intimate link with the origin of the universe, lies at a singular threshold between the reality we experience and the more refi ned reality cutting-edge science seeks to uncover." (Brian Greene, "The Fabric of the Cosmos: Space, Time, and the Texture of Reality", 2004)

"In the case of a complex system, nonlinear behavior can happen as disturbances or changes in the system, each one relatively small by itself, accumulate. Outwardly, everything seems to be normal: the system doesn’t generate any surprises. At some point, though, the behavior of the whole system suddenly shifts to a radically new mode. This kind of behavior is often called a threshold effect, because the shift occurs when a critical threshold - usually unseen and often unexpected - is crossed." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"But in mathematics there is a kind of threshold effect, an intellectual tipping point. If a student can just get over the first few humps, negotiate the notational peculiarities of the subject, and grasp that the best way to make progress is to understand the ideas, not just learn them by rote, he or she can sail off merrily down the highway, heading for ever more abstruse and challenging ideas, while an only slightly duller student gets stuck at the geometry of isosceles triangles." (Ian Stewart, "Why Beauty is Truth: A history of symmetry", 2007)

"The simplest basic architecture of an artificial neural network is composed of three layers of neurons - input, output, and intermediary (historically called perceptron). When the input layer is stimulated, each node responds in a particular way by sending information to the intermediary level nodes, which in turn distribute it to the output layer nodes and thereby generate a response. The key to artificial neural networks is in the ways that the nodes are connected and how each node reacts to the stimuli coming from the nodes it is connected to. Just as with the architecture of the brain, the nodes allow information to pass only if a specific stimulus threshold is passed. This threshold is governed by a mathematical equation that can take different forms. The response depends on the sum of the stimuli coming from the input node connections and is 'all or nothing'." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"The existence of dark matter particles can never be disproven by direct experiment because ever lighter particles and/or ever smaller cross sections just below the current detection threshold may be postulated for every non-detection. There exists no falsifiable prediction concerning the DM particles." (Pavel Kroupa, "The dark matter crisis: falsification of the current standard model of cosmology", 2012)

"Even more important is the way complex systems seem to strike a balance between the need for order and the imperative for change. Complex systems tend to locate themselves at a place we call 'the edge of chaos'. We imagine the edge of chaos as a place where there is enough innovation to keep a living system vibrant, and enough stability to keep it from collapsing into anarchy. It is a zone of conflict and upheaval, where the old and new are constantly at war. Finding the balance point must be a delicate matter - if a living system drifts too close, it risks falling over into incoherence and dissolution; but if the system moves too far away from the edge, it becomes rigid, frozen, totalitarian. Both conditions lead to extinction. […] Only at the edge of chaos can complex systems flourish. This threshold line, that edge between anarchy and frozen rigidity, is not a like a fence line, it is a fractal line; it possesses nonlinearity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"Flaws can be found in any research design if you look hard enough. […] In our experience, it is good scientific practice to refine one's research hypotheses in light of the data. Working scientists are also keenly aware of the risks of data dredging, and they use confidence intervals and p-values as a tool to avoid getting fooled by noise. Unfortunately, a by-product of all this struggle and care is that when a statistically significant pattern does show up, it is natural to get excited and believe it. The very fact that scientists generally don't cheat, generally don't go fishing for statistical significance, makes them vulnerable to drawing strong conclusions when they encounter a pattern that is robust enough to cross the p < 0.05 threshold." (Andrew Gelman & Eric Loken, "The Statistical Crisis in Science", American Scientist Vol. 102(6), 2014)

"Only at the edge of chaos can complex systems flourish. This threshold line, that edge between anarchy and frozen rigidity, is not a like a fence line, it is a fractal line; it possesses nonlinearity." (Stephen H Buhner, "Plant Intelligence and the Imaginal Realm: Beyond the Doors of Perception into the Dreaming of Earth", 2014)

"Bifurcation is a qualitative, topological change of a system’s phase space that occurs when some parameters are slightly varied across their critical thresholds. Bifurcations play important roles in many real-world systems as a switching mechanism. […] There are two categories of bifurcations. One is called a local bifurcation, which can be characterized by a change in the stability of equilibrium points. It is called local because it can be detected and analyzed only by using localized information around the equilibrium point. The other category is called a global bifurcation, which occurs when non-local features of the phase space, such as limit cycles (to be discussed later), collide with equilibrium points in a phase space. This type of bifurcation can’t be characterized just by using localized information around the equilibrium point."  (Hiroki Sayama, "Introduction to the Modeling and Analysis of Complex Systems", 2015)

"[...] living organisms manifest deep new physical principles, and that we are on the threshold of uncovering and harnessing those principles. What is different this time, and why it has taken so many decades to discover the real secret of life, is that the new physics is not simply a matter of an additional type of force - a 'life force' - but something altogether more subtle, something that interweaves matter and information, wholes and parts, simplicity and complexity." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019) 

06 September 2025

On Connectedness (2010-)

"First, what are the 'graphs' studied in graph theory? They are not graphs of functions as studied in calculus and analytic geometry. They are (usually finite) structures consisting of vertices and edges. As in geometry, we can think of vertices as points (but they are denoted by thick dots in diagrams) and of edges as arcs connecting pairs of distinct vertices. The positions of the vertices and the shapes of the edges are irrelevant: the graph is completely specified by saying which vertices are connected by edges. A common convention is that at most one edge connects a given pair of vertices, so a graph is essentially just a pair of sets: a set of objects." (John Stillwell, "Mathematics and Its History", 2010)

"In the most basic sense, a network is any collection of objects in which some pairs of these objects are connected by links. This definition is very flexible: depending on the setting, many different forms of relationships or connections can be used to define links." (David Easley & Jon Kleinberg, "Networks, Crowds, and Markets: Reasoning about a Highly Connected World", 2010)

"System dynamics is an approach to understanding the behaviour of over time. It deals with internal feedback loops and time delays that affect the behaviour of the entire system. It also helps the decision maker untangle the complexity of the connections between various policy variables by providing a new language and set of tools to describe. Then it does this by modeling the cause and effect relationships among these variables." (Raed M Al-Qirem & Saad G Yaseen, "Modelling a Small Firm in Jordan Using System Dynamics", 2010)

"We are beginning to see the entire universe as a holographically interlinked network of energy and information, organically whole and self-referential at all scales of its existence. We, and all things in the universe, are non-locally connected with each other and with all other things in ways that are unfettered by the hitherto known limitations of space and time." (Ervin László, "Cosmos: A Co-creator's Guide to the Whole-World", 2010)

"When people talk about the 'connectedness' of a complex system, in general they are really talking about two related issues. One is connectedness at the level of structure – who is linked to whom – and the other is connectedness at the level of behavior – the fact that each individual’s actions have implicit consequences for the outcomes of everyone in the system."(David Easley & Jon Kleinberg, "Networks, Crowds, and Markets: Reasoning about a Highly Connected World", 2010)

"Information is recorded in vast interconnecting networks. Each idea or image has hundreds, perhaps thousands, of associations and is connected to numerous other points in the mental network." (Peter Russell, "The Brain Book: Know Your Own Mind and How to Use it", 2013)

"Chaos theory is a branch of mathematics focusing on the study of chaos - dynamical systems whose random states of disorder and irregularities are governed by underlying patterns and deterministic laws that are highly sensitive to initial conditions. Chaos theory is an interdisciplinary theory stating that, within the apparent randomness of complex, chaotic systems, there are underlying patterns, interconnectedness, constant feedback loops, repetition, self-similarity, fractals, and self-organization. The butterfly effect, an underlying principle of chaos, describes how a small change in one state of a deterministic nonlinear system can result in large differences in a later state (meaning that there is a sensitive dependence on initial conditions)." (Nima Norouzi, "Criminal Policy, Security, and Justice in the Time of COVID-19", 2022)

On Connectedness (1975-1999)

"We have reversed the usual classical notion that the independent 'elementary parts' of the world are the fundamental reality, and that the various systems are merely particular contingent forms and arrangements of these parts. Rather, we say that inseparable quantum interconnectedness of the whole universe is the fundamental reality, and that relatively independent behaving parts are merely particular and contingent forms within this whole." (David Bohm, "On the Intuitive Understanding of Nonlocality as Implied by Quantum Theory", Foundations of Physics Vol 5 (1), 1975)

"Information is recorded in vast interconnecting networks. Each idea or image has hundreds, perhaps thousands, of associations and is connected to numerous other points in the mental network." (Peter Russell, "The Brain Book: Know Your Own Mind and How to Use it", 1979)

"The world is a complex, interconnected, finite, ecological–social–psychological–economic system. We treat it as if it were not, as if it were divisible, separable, simple, and infinite. Our persistent, intractable global problems arise directly from this mismatch." (Donella Meadows,"Whole Earth Models and Systems", 1982)

"All certainty in our relationships with the world rests on acknowledgement of causality. Causality is a genetic connection of phenomena through which one thing (the cause) under certain conditions gives rise to, causes something else (the effect). The essence of causality is the generation and determination of one phenomenon by another." (Alexander Spirkin, "Dialectical Materialism", 1983)

"When loops are present, the network is no longer singly connected and local propagation schemes will invariably run into trouble. [...] If we ignore the existence of loops and permit the nodes to continue communicating with each other as if the network were singly connected, messages may circulate indefinitely around the loops and process may not converges to a stable equilibrium. […] Such oscillations do not normally occur in probabilistic networks […] which tend to bring all messages to some stable equilibrium as time goes on. However, this asymptotic equilibrium is not coherent, in the sense that it does not represent the posterior probabilities of all nodes of the network." (Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Networks of Plausible Inference", 1988)

"Systems thinking is a discipline for seeing wholes. It is a framework for seeing interrelationships rather than things, for seeing patterns of change rather than static 'snapshots'. It is a set of general principles- distilled over the course of the twentieth century, spanning fields as diverse as the physical and social sciences, engineering, and management. [...] During the last thirty years, these tools have been applied to understand a wide range of corporate, urban, regional, economic, political, ecological, and even psychological systems. And systems thinking is a sensibility for the subtle interconnectedness that gives living systems their unique character." (Peter Senge, "The Fifth Discipline", 1990)

"In sharp contrast (with the traditional social planning) the systems design approach seeks to understand a problem situation as a system of interconnected, interdependent, and interacting issues and to create a design as a system of interconnected, interdependent, interacting, and internally consistent solution ideas." (Béla H Bánáthy, "Designing Social Systems in a Changing World", 1996)

"In the new systems thinking, the metaphor of knowledge as a building is being replaced by that of the network. As we perceive reality as a network of relationships, our descriptions, too, form an interconnected network of concepts and models in which there are no foundations. For most scientists such a view of knowledge as a network with no firm foundations is extremely unsettling, and today it is by no means generally accepted. But as the network approach expands throughout the scientific community, the idea of knowledge as a network will undoubtedly find increasing acceptance." (Fritjof Capra," The Web of Life: a new scientific understanding of living systems", 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

On Connectedness (2000-2009)

"A self-organizing system not only regulates or adapts its behavior, it creates its own organization. In that respect it differs fundamentally from our present systems, which are created by their designer. We define organization as structure with function. Structure means that the components of a system are arranged in a particular order. It requires both connections, that integrate the parts into a whole, and separations that differentiate subsystems, so as to avoid interference. Function means that this structure fulfils a purpose." (Francis Heylighen & Carlos Gershenson, "The Meaning of Self-organization in Computing", IEEE Intelligent Systems, 2003)

"At an anatomical level - the level of pure, abstract connectivity - we seem to have stumbled upon a universal pattern of complexity. Disparate networks show the same three tendencies: short chains, high clustering, and scale-free link distributions. The coincidences are eerie, and baffling to interpret." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Average path length reflects the global structure; it depends on the way the entire network is connected, and cannot be inferred from any local measurement. Clustering reflects the local structure; it depends only on the interconnectedness of a typical neighborhood, the inbreeding among nodes tied to a common center. Roughly speaking, path length measures how big the network is. Clustering measures how incestuous it is." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"By its very nature, the mathematical study of networks transcends the usual boundaries between disciplines. Network theory is concerned with the relationships between individuals, the patterns of interactions. The precise nature of the individuals is downplayed, or even suppressed, in hopes of uncovering deeper laws. A network theorist will look at any system of interlinked components and see an abstract pattern of dots connected by lines. It's the pattern that matters, the architecture of relationships, not the identities of the dots themselves. Viewed from these lofty heights, many networks, seemingly unrelated, begin to look the same." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"By 'network' I mean a set of dynamical systems that are 'coupled together', with some influencing the behavior of others. The systems themselves are the nodes of the network- think of them as blobs - and two nodes are joined by an arrow if one of them (at the tail end) influences the other (at the head end). For example, each node might be a nerve cell in some organism, and the arrows might be connections along which signals pass from one cell to another." (Ian Stewart, "Letters to a Young Mathematician", 2006)

"Connectivity harbors other risks too. As we create more links among the nodes of our technological and social networks, these networks sometimes developed unexpected patterns of connections that make breakdown more likely. They can, for instance, develop harmful feedback loops - what people commonly call vicious circles - that reinforce instabilities and even lead to collapse." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"Initially, increasing connectedness and diversity helps, but as the connections become increasingly dense, the system gets very strongly coupled so that a failure in one part reverberates throughout the entire network." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"Nodes and connectors comprise the structure of a network. In contrast, an ecology is a living organism. It influences the formation of the network itself." (George Siemens, "Knowing Knowledge", 2006)

"Scale-free networks are particularly vulnerable to intentional attack: if someone wants to wreck the whole network, he simply needs to identify and destroy some of its hubs. And here we see how our world’s increasing connectivity really matters. Scientists have found that as a scale-free network like the Internet or our food-distribution system grows- as it adds more nodes - the new nodes tend to hook up with already highly connected hubs." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"If a network is solely composed of neighborhood connections, information must traverse a large number of connections to get from place to place. In a small-world network, however, information can be transmitted between any two nodes using, typically, only a small number of connections. In fact, just a small percentage of random, long-distance connections is required to induce such connectivity. This type of network behavior allows the generation of 'six degrees of separation' type results, whereby any agent can connect to any other agent in the system via a path consisting of only a few intermediate nodes." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"Networks may also be important in terms of view. Many models assume that agents are bunched together on the head of a pin, whereas the reality is that most agents exist within a topology of connections to other agents, and such connections may have an important influence on behavior. […] Models that ignore networks, that is, that assume all activity takes place on the head of a pin, can easily suppress some of the most interesting aspects of the world around us. In a pinhead world, there is no segregation, and majority rule leads to complete conformity - outcomes that, while easy to derive, are of little use." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"A graph enables us to visualize a relation over a set, which makes the characteristics of relations such as transitivity and symmetry easier to understand. […] Notions such as paths and cycles are key to understanding the more complex and powerful concepts of graph theory. There are many degrees of connectedness that apply to a graph; understanding these types of connectedness enables the engineer to understand the basic properties that can be defined for the graph representing some aspect of his or her system. The concepts of adjacency and reachability are the first steps to understanding the ability of an allocated architecture of a system to execute properly." (Dennis M Buede, "The Engineering Design of Systems: Models and methods", 2009)

"Complexity theory embraces things that are complicated, involve many elements and many interactions, are not deterministic, and are given to unexpected outcomes. […] A fundamental aspect of complexity theory is the overall or aggregate behavior of a large number of items, parts, or units that are entangled, connected, or networked together. […] In contrast to classical scientific methods that directly link theory and outcome, complexity theory does not typically provide simple cause-and-effect explanations." (Robert E Gunther et al, "The Network Challenge: Strategy, Profit, and Risk in an Interlinked World", 2009)

"The simplest basic architecture of an artificial neural network is composed of three layers of neurons - input, output, and intermediary (historically called perceptron). When the input layer is stimulated, each node responds in a particular way by sending information to the intermediary level nodes, which in turn distribute it to the output layer nodes and thereby generate a response. The key to artificial neural networks is in the ways that the nodes are connected and how each node reacts to the stimuli coming from the nodes it is connected to. Just as with the architecture of the brain, the nodes allow information to pass only if a specific stimulus threshold is passed. This threshold is governed by a mathematical equation that can take different forms. The response depends on the sum of the stimuli coming from the input node connections and is 'all or nothing'." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

On Connectedness (-1974)

"The first attempts to consider the behavior of so-called 'random neural nets' in a systematic way have led to a series of problems concerned with relations between the 'structure' and the 'function' of such nets. The 'structure' of a random net is not a clearly defined topological manifold such as could be used to describe a circuit with explicitly given connections. In a random neural net, one does not speak of 'this' neuron synapsing on 'that' one, but rather in terms of tendencies and probabilities associated with points or regions in the net." (Anatol Rapoport, "Cycle distributions in random nets", The Bulletin of Mathematical Biophysics 10(3), 1948)

"Equilibrium requires that the whole of the structure, the form of its elements, and the means of interconnection be so combined that at the supports there will automatically be produced passive forces or reactions that are able to balance the forces acting upon the structures, including the force of its own weight."  (Eduardo Torroja, "Philosophy of Structure", 1951)

"The principle of complementarity states that no single model is possible which could provide a precise and rational analysis of the connections between these phenomena [before and after measurement]. In such a case, we are not supposed, for example, to attempt to describe in detail how future phenomena arise out of past phenomena. Instead, we should simply accept without further analysis the fact that future phenomena do in fact somehow manage to be produced, in a way that is, however, necessarily beyond the possibility of a detailed description. The only aim of a mathematical theory is then to predict the statistical relations, if any, connecting the phenomena." (David Bohm, "A Suggested Interpretation of the Quantum Theory in Terms of ‘Hidden’ Variables", 1952)

"[…] there are three different but interconnected conceptions to be considered in every structure, and in every structural element involved: equilibrium, resistance, and stability." (Eduardo Torroja, "Philosophy of Structure", 1951)

"General Systems Theory is a name which has come into use to describe a level of theoretical model-building which lies somewhere between the highly generalized constructions of pure mathematics and the specific theories of the specialized disciplines. Mathematics attempts to organize highly general relationships into a coherent system, a system however which does not have any necessary connections with the 'real' world around us. It studies all thinkable relationships abstracted from any concrete situation or body of empirical knowledge." (Kenneth E Boulding, "General Systems Theory - The Skeleton of Science", Management Science Vol. 2 (3), 1956)

"The essential vision of reality presents us not with fugitive appearances but with felt patterns of order which have coherence and meaning for the eye and for the mind. Symmetry, balance and rhythmic sequences express characteristics of natural phenomena: the connectedness of nature - the order, the logic, the living process. Here art and science meet on common ground." (Gyorgy Kepes, "The New Landscape: In Art and Science", 1956)

"In fact, it is empirically ascertainable that every event is actually produced by a number of factors, or is at least accompanied by numerous other events that are somehow connected with it, so that the singling out involved in the picture of the causal chain is an extreme abstraction. Just as ideal objects cannot be isolated from their proper context, material existents exhibit multiple interconnections; therefore the universe is not a heap of things but a system of interacting systems." (Mario Bunge, "Causality: The place of the casual principles in modern science", 1959)

"The general models, even of the most elaborate kind, serve the simple purpose of demonstrating the interconnectedness of all economic phenomena, and show how, under certain conditions, price may act as a guiding link between them. Looked at in another way such models show how a complex set of interrelations can hang together consistently without any central administrative direction." (Ely Devons, "Essays in Economics", 1961)

"To say a system is 'self-organizing' leaves open two quite different meanings. There is a first meaning that is simple and unobjectionable. This refers to the system that starts with its parts separate (so that the behavior of each is independent of the others' states) and whose parts then act so that they change towards forming connections of some type. Such a system is 'self-organizing' in the sense that it changes from 'parts separated' to 'parts joined'. […] In general such systems can be more simply characterized as 'self-connecting', for the change from independence between the parts to conditionality can always be seen as some form of 'connection', even if it is as purely functional […]  'Organizing' […] may also mean 'changing from a bad organization to a good one' […] The system would be 'self-organizing' if a change were automatically made to the feedback, changing it from positive to negative; then the whole would have changed from a bad organization to a good." (W Ross Ashby, "Principles of the self-organizing system", 1962)

"A NETWORK is a collection of connected lines, each of which indicates the movement of some quantity between two locations. Generally, entrance to a network is via a source (the starting point) and exit from a network is via a sink (the finishing point); the lines which form the network are called links (or arcs), and the points at which two or more links meet are called nodes." (Cecil W Lowe, "Critical Path Analysis by Bar Chart", 1966)

On Graphs in Mathematics

"Most of us have some idea of what the word statistics means. We should probably say that it has something to do with tables of figures, diagrams and graphs in economic and scientific publications, with the cost of living [...]  and with a host of other seemingly unrelated matters of concern or unconcern [...] Our answer would be on the right lines. Nor should we be unduly upset if, to start with, we seem a little vague. Statisticians themselves disagree about the definition of the word: over a hundred definitions have been listed." (Walter F  Willcox, "An Improved Method of Measuring Public Health in the United States", Revue de l’lnstitut InternutionuIe de Stutistique  vol. 3 (1), 1935)

"To function in today's society, mathematical literacy - what the British call ‘numeracy' - is as essential as verbal literacy […] Numeracy requires more than just familiarity with numbers. To cope confidently with the demands of today's society, one must be able to grasp the implications of many mathematical concepts - for example, change, logic, and graphs - that permeate daily news and routine decisions - mathematical, scientific, and cultural - provide a common fabric of communication indispensable for modern civilized society. Mathematical literacy is especially crucial because mathematics is the language of science and technology." (National Research Council, "Everybody counts: A report to the nation on the future of mathematics education", 1989)

"Continuous functions can move freely. Graphs of continuous functions can freely branch off at any place, whereas analytic functions coinciding in some neighborhood of a point P cannot branch outside of this neighborhood. Because of this property, continuous functions can mathematically represent wildly changing wind inside a typhoon or a gentle breeze." (Kenji Ueno & Toshikazu Sunada, "A Mathematical Gift, III: The Interplay Between Topology, Functions, Geometry, and Algebra", Mathematical World Vol. 23, 1996)

"Similarly to the graphs of continuous functions, graphs of differentiable (smooth) functions which coincide in a neighborhood of a point P can branch off outside of the neighborhood. Because of this property, differentiable functions can represent smoothly changing natural phenomena." (Kenji Ueno & Toshikazu Sunada, "A Mathematical Gift, III: The Interplay Between Topology, Functions, Geometry, and Algebra", Mathematical World Vol. 23, 1996)

"The role of graphs in probabilistic and statistical modeling is threefold: (1) to provide convenient means of expressing substantive assumptions; (2) to facilitate economical representation of joint probability functions; and (3) to facilitate efficient inferences from observations." (Judea Pearl, "Causality: Models, Reasoning, and Inference", 2000)

"Replacing particles by strings is a naive-sounding step, from which many other things follow. In fact, replacing Feynman graphs by Riemann surfaces has numerous consequences: 1. It eliminates the infinities from the theory. [...] 2. It greatly reduces the number of possible theories. [...] 3. It gives the first hint that string theory will change our notions of spacetime." (Edward Witten, "The Past and Future of String Theory", 2003)

"As geometers study shape, the student of calculus examines change: the mathematics of how an object transforms from one state into another, as when describing the motion of a ball or bullet through space, is rendered pictorial in its graphs’ curves." (Daniel Tammet, "Thinking in Numbers" , 2012)

05 September 2025

On Graphs (-1969)

"There is a magic in graphs. The profile of a curve reveals in a flash a whole situation - the life history of an epidemic, a panic, or an era of prosperity. The curve informs the mind, awakens the imagination, convinces." (Henry D Hubbard [in William Brinton's "Graphic Presentation", 1939])

"Graphic methods are very commonly used in business correlation problems. On the whole, carefully handled and skillfully interpreted graphs have certain advantages over mathematical methods of determining correlation in the usual business problems. The elements of judgment and special knowledge of conditions can be more easily introduced in studying correlation graphically. Mathematical correlation is often much too rigid for the data at hand." (John R Riggleman & Ira N Frisbee, "Business Statistics", 1938)

"Graphs are all inclusive. No fact is too slight or too great to plot to a scale suited to the eye. Graphs may record the path of an ion or the orbit of the sun, the rise of a civilization, or the acceleration of a bullet, the climate of a century or the varying pressure of a heart beat, the growth of a business, or the nerve reactions of a child." (Henry D Hubbard [foreword to Willard C Brinton, "Graphic Presentation", 1939)])

"The graphic art depicts magnitudes to the eye. It does more. It compels the seeing of relations. We may portray by simple graphic methods whole masses of intricate routine, the organization of an enterprise, or the plan of a campaign. Graphs serve as storm signals for the manager, statesman, engineer; as potent narratives for the actuary, statist, naturalist; and as forceful engines of research for science, technology and industry. They display results. They disclose new facts and laws. They reveal discoveries as the bud unfolds the flower." (Henry D Hubbard [foreword to Willard C Brinton, "Graphic Presentation", 1939)])

"The graphic language is modern. We are learning its alphabet. That it will develop a lexicon and a literature marvelous for its vividness and the variety of application is inevitable. Graphs are dynamic, dramatic. They may epitomize an epoch, each dot a fact, each slope an event, each curve a history. Wherever there are data to record, inferences to draw, or facts to tell, graphs furnish the unrivalled means whose power we are just beginning to realize and to apply." (Henry D Hubbard [foreword to Willard C Brinton, "Graphic Presentation", 1939)])

"Charts and graphs represent an extremely useful and flexible medium for explaining, interpreting, and analyzing numerical facts largely by means of points, lines, areas, and other geometric forms and symbols. They make possible the presentation of quantitative data in a simple, clear, and effective manner and facilitate comparison of values, trends, and relationships. Moreover, charts and graphs possess certain qualities and values lacking in textual and tabular forms of presentation." (Calvin F Schmid, "Handbook of Graphic Presentation", 1954)

"If one technique of data analysis were to be exalted above all others for its ability to be revealing to the mind in connection with each of many different models, there is little doubt which one would be chosen. The simple graph has brought more information to the data analyst’s mind than any other device. It specializes in providing indications of unexpected phenomena." (John W Tukey, "The Future of Data Analysis", Annals of Mathematical Statistics Vol. 33 (1), 1962)

"The histogram, with its columns of area proportional to number, like the bar graph, is one of the most classical of statistical graphs. Its combination with a fitted bell-shaped curve has been common since the days when the Gaussian curve entered statistics. Yet as a graphical technique it really performs quite poorly. Who is there among us who can look at a histogram-fitted Gaussian combination and tell us, reliably, whether the fit is excellent, neutral, or poor? Who can tell us, when the fit is poor, of what the poorness consists? Yet these are just the sort of questions that a good graphical technique should answer at least approximately." (John W Tukey, "The Future of Processes of Data Analysis", 1965)

"Every graph is at least an indication, by contrast with some common instances of numbers." (John W Tukey, "Data Analysis, Including Statistics", 1968)

"One of the methods making the data intelligible is to represent it by means of graphs and diagrams. The graphic & diagrammatic representation of the data is always appealing to the eye as well as to the mind of the observer." (S P Singh & R P S Verma, "Agricultural Statistics", cca. 1969)
