
07 August 2022

On Principles VI: Uncertainty Principle

"The uncertainty principle refers to the degree of indeterminateness in the possible present knowledge of the simultaneous values of various quantities with which the quantum theory deals; it does not restrict, for example, the exactness of a position measurement alone or a velocity measurement alone." (Werner Heisenberg, "The Uncertainty Principle", [in James R Newman, "The World of Mathematics" Vol. II], 1956)

"Both the uncertainty principle and the negentropy principle of information make Laplace's scheme [of exact determinism] completely unrealistic. The problem is an artificial one; it belongs to imaginative poetry, not to experimental science." (Léon Brillouin, "Science and Information Theory" 2nd Ed., 1962)

"No branch of number theory is more saturated with mystery than the study of prime numbers: those exasperating, unruly integers that refuse to be divided evenly by any integers except themselves and 1. Some problems concerning primes are so simple that a child can understand them and yet so deep and far from solved that many mathematicians now suspect they have no solution. Perhaps they are 'undecideable'. Perhaps number theory, like quantum mechanics, has its own uncertainty principle that makes it necessary, in certain areas, to abandon exactness for probabilistic formulations." (Martin Gardner, "The remarkable lore of the prime numbers", Scientific American, 1964)

"In particular, the uncertainty principle has stood for a generation, barring the way to more detailed descriptions of nature; and yet, with the lesson of parity still fresh in our minds, how can anyone be quite so sure of its universal validity when we note that, to this day, it has never been subjected to even one direct experimental test?" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"Because of mathematical indeterminancy and the uncertainty principle, it may be a law of nature that no nervous system is capable of acquiring enough knowledge to significantly predict the future of any other intelligent system in detail. Nor can intelligent minds gain enough self-knowledge to know their own future, capture fate, and in this sense eliminate free will." (Edward O Wilson, "On Human Nature", 1978)

"In physics, there are numerous phenomena that are said to be 'true on all scales', such as the Heisenberg uncertainty relation, to which no exception has been found over vast ranges of the variables involved (such as energy versus time, or momentum versus position). But even when the size ranges are limited, as in galaxy clusters (by the size of the universe) or the magnetic domains in a piece of iron near the transition point to ferromagnetism (by the size of the magnet), the concept true on all scales is an important postulate in analyzing otherwise often obscure observations." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"A bell curve shows the 'spread' or variance in our knowledge or certainty. The wider the bell the less we know. An infinitely wide bell is a flat line. Then we know nothing. The value of the quantity, position, or speed could lie anywhere on the axis. An infinitely narrow bell is a spike that is infinitely tall. Then we have complete knowledge of the value of the quantity. The uncertainty principle says that as one bell curve gets wider the other gets thinner. As one curve peaks the other spreads. So if the position bell curve becomes a spike and we have total knowledge of position, then the speed bell curve goes flat and we have total uncertainty (infinite variance) of speed." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"According to quantum theory, the ground state, or lowest energy state, of a pendulum is not just sitting at the lowest energy point, pointing straight down. That would have both a definite position and a definite velocity, zero. This would be a violation of the uncertainty principle, which forbids the precise measurement of both position and velocity at the same time. The uncertainty in the position multiplied by the uncertainty in the momentum must be greater than a certain quantity, known as Planck's constant - a number that is too long to keep writing down, so we use a symbol for it: ħ." (Stephen W Hawking, "The Universe in a Nutshell", 2001)

"The uncertainty principle expresses a seesaw relationship between the fluctuations of certain pairs of variables, such as an electron's position and its speed. Anything that lowers the uncertainty of one must necessarily raise the uncertainty of the other; you can't push both down at the same time. For example, the more tightly you confine an electron, the more wildly it thrashes. By lowering the position end of the seesaw, you force the velocity end to lift up. On the other hand, if you try to constrain the electron's velocity instead, its position becomes fuzzier and fuzzier; the electron can turn up almost anywhere.(Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The inherent nature of complexity is to doubt certainty and any pretense to finite and flawless data. Put another way, under uncertainty principles, any attempt by political systems to 'impose order' has an equal chance to instead 'impose disorder'." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

05 August 2022

On Certainty (2000-)

"Information entropy has its own special interpretation and is defined as the degree of unexpectedness in a message. The more unexpected words or phrases, the higher the entropy. It may be calculated with the regular binary logarithm on the number of existing alternatives in a given repertoire. A repertoire of 16 alternatives therefore gives a maximum entropy of 4 bits. Maximum entropy presupposes that all probabilities are equal and independent of each other. Minimum entropy exists when only one possibility is expected to be chosen. When uncertainty, variety or entropy decreases it is thus reasonable to speak of a corresponding increase in information." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Most physical systems, particularly those complex ones, are extremely difficult to model by an accurate and precise mathematical formula or equation due to the complexity of the system structure, nonlinearity, uncertainty, randomness, etc. Therefore, approximate modeling is often necessary and practical in real-world applications. Intuitively, approximate modeling is always possible. However, the key questions are what kind of approximation is good, where the sense of 'goodness' has to be first defined, of course, and how to formulate such a good approximation in modeling a system such that it is mathematically rigorous and can produce satisfactory results in both theory and applications." (Guanrong Chen & Trung Tat Pham, "Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems", 2001) 

"In fact, H [entropy] measures the amount of uncertainty that exists in the phenomenon. If there were only one event, its probability would be equal to 1, and H would be equal to 0 - that is, there is no uncertainty about what will happen in a phenomenon with a single event because we always know what is going to occur. The more events that a phenomenon possesses, the more uncertainty there is about the state of the phenomenon. In other words, the more entropy, the more information." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"The storytelling mind is allergic to uncertainty, randomness, and coincidence. It is addicted to meaning. If the storytelling mind cannot find meaningful patterns in the world, it will try to impose them. In short, the storytelling mind is a factory that churns out true stories when it can, but will manufacture lies when it can't." (Jonathan Gottschall, "The Storytelling Animal: How Stories Make Us Human", 2012)

"The data is a simplification - an abstraction - of the real world. So when you visualize data, you visualize an abstraction of the world, or at least some tiny facet of it. Visualization is an abstraction of data, so in the end, you end up with an abstraction of an abstraction, which creates an interesting challenge. […] Just like what it represents, data can be complex with variability and uncertainty, but consider it all in the right context, and it starts to make sense." (Nathan Yau, "Data Points: Visualization That Means Something", 2013)

"Without precise predictability, control is impotent and almost meaningless. In other words, the lesser the predictability, the harder the entity or system is to control, and vice versa. If our universe actually operated on linear causality, with no surprises, uncertainty, or abrupt changes, all future events would be absolutely predictable in a sort of waveless orderliness." (Lawrence K Samuels, "Defense of Chaos", 2013)

"We have minds that are equipped for certainty, linearity and short-term decisions, that must instead make long-term decisions in a non-linear, probabilistic world." (Paul Gibbons, "The Science of Successful Organizational Change", 2015)

"The greater the uncertainty, the bigger the gap between what you can measure and what matters, the more you should watch out for overfitting - that is, the more you should prefer simplicity." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

"A notable difference between many fields and data science is that in data science, if a customer has a wish, even an experienced data scientist may not know whether it’s possible. Whereas a software engineer usually knows what tasks software tools are capable of performing, and a biologist knows more or less what the laboratory can do, a data scientist who has not yet seen or worked with the relevant data is faced with a large amount of uncertainty, principally about what specific data is available and about how much evidence it can provide to answer any given question. Uncertainty is, again, a major factor in the data scientific process and should be kept at the forefront of your mind when talking with customers about their wishes."  (Brian Godsey, "Think Like a Data Scientist", 2017)

"The elements of this cloud of uncertainty (the set of all possible errors) can be described in terms of probability. The center of the cloud is the number zero, and elements of the cloud that are close to zero are more probable than elements that are far away from that center. We can be more precise in this definition by defining the cloud of uncertainty in terms of a mathematical function, called the probability distribution." (David S Salsburg, "Errors, Blunders, and Lies: How to Tell the Difference", 2017)

"Uncertainty is an adversary of coldly logical algorithms, and being aware of how those algorithms might break down in unusual circumstances expedites the process of fixing problems when they occur - and they will occur. A data scientist’s main responsibility is to try to imagine all of the possibilities, address the ones that matter, and reevaluate them all as successes and failures happen." (Brian Godsey, "Think Like a Data Scientist", 2017)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"Estimates based on data are often uncertain. If the data were intended to tell us something about a wider population (like a poll of voting intentions before an election), or about the future, then we need to acknowledge that uncertainty. This is a double challenge for data visualization: it has to be calculated in some meaningful way and then shown on top of the data or statistics without making it all too cluttered." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)

02 April 2022

Frank H Knight - Collected Quotes

"Knowledge is more a matter of learning than of the exercise of absolute judgment. Learning requires time, and in time the situation dealt with, as well as the learner, undergoes change." (Frank H Knight, "Risk, Uncertainty, and Profit", 1921)

"We live in a world full of contradiction and paradox, a fact of which perhaps the most fundamental illustration is this: that the existence of a problem of knowledge depends on the future being different than the past, while the possibility of the solution of the problem depends on the future being like the past." (Frank H Knight, "Risk, Uncertainty, and Profit", 1921)

"We must infer what the future situation would have been without our interference, and what change will be wrought in it by our action. Fortunately or unfortunately, none of these processes is infallible, or indeed ever accurate and complete." (Frank H Knight, "Risk, Uncertainty, and Profit", 1921)

"All science is static in the sense that it describes the unchanging aspects of things." (Frank Knight, "The Ethics of Competition", 1923)

"It is true practically if not altogether without exception that the changes studied by any science tend to equilibrate or neutralize the forces which bring them about, and finally to come to rest." (Frank Knight, "The Ethics of Competition", 1923)

"The possibility of saying anything about a thing rests on the assumption that it preserves its identity, or continues to be the same thing in the respect described, that it will behave in future situations as it has in past." (Frank Knight, "The Ethics of Competition", 1923) 

"There is no sense in making statements that will not continue to be true after they are made." (Frank Knight, "The Ethics of Competition", 1923)

"Market competition is the only form of organization which can afford a large measure of freedom to the individual." (Frank Knight, "Freedom and Reform", 1947)

13 October 2021

Ludwig Wittgenstein - Collected Quotes

"If a fact is to be a picture, it must have something in common with what it depicts. […] What a picture must have in common with reality, in order to be able to depict it correctly or incorrectly - in the way it does, is its pictorial form. […] What any picture, of whatever form, must have in common with reality, in order to be able to depict it - correctly or incorrectly in any way at all, is logical form, i.e., the form of reality. […] Logical pictures can depict the world." (Ludwig Wittgenstein, "Tractatus Logico-Philosophicus", 1922)

"The logical picture of the facts is the thought. […] A picture is a model of reality. In a picture objects have the elements of the picture corresponding to them. The fact that the elements of a picture are related to one another in a determinate way represents that things are related to one another in the same way." (Ludwig Wittgenstein, "Tractatus Logico-Philosophicus", 1922)

"The process of induction is the process of assuming the simplest law that can be made to harmonize with our experience. This process, however, has no logical foundation but only a psychological one. It is clear that there are no grounds for believing that the  simplest course of events will really happen." (Ludwig Wittgenstein, "Tractatus Logico-Philosophicus", 1922)

"The so-called law of induction cannot possibly be a law of logic, since it is obviously a proposition with a sense. - Nor, therefore, can it be an a priori law." (Ludwig Wittgenstein, "Tractatus Logico Philosophicus", 1922)

"For a large class of cases - though not for all - in which we employ the word 'meaning' it can be defined thus: the meaning of a word is its use in language." (Ludwig Wittgenstein, "Philosophical investigations", 1953)

"Like everything metaphysical the harmony between thought and reality is to be found in the grammar of the language." (Ludwig Wittgenstein, "Philosophical Investigations", 1953)

"The problems are solved, not by giving new information, but by arranging what we have known since long." (Ludwig Wittgenstein, "Philosophical Investigations", 1953)

"To convince someone of the truth, it is not enough to state it, but rather one must find the path from error to truth." (Ludwig Wittgenstein, "Philosophical Occasions", 1953)

"Our craving for generality has [as one] source […] our preoccupation with the method of science. I mean the method the method of reducing the explanation of natural phenomena to the smallest possible number of primitive natural laws; and, in mathematics, of unifying the treatment of different topics by using a generalization. Philosophers constantly see the method of science before their eyes, and are irresistibly tempted to ask and answer in the way science does. This tendency is the real source of metaphysics, and leads the philosopher into complete darkness. I want to say here that it can never be our job to reduce anything to anything, or to explain anything. Philosophy really is ‘purely descriptive’." (Ludwig Wittgenstein, "The Blue and Brown Books", 1958)

"Images tell us nothing, either right or wrong, about the external world. […] It is just because forming images is a voluntary activity that it does not instruct us about the external world. […] When we form an image of something we are not observing. The coming and going of the pictures is not something that happens to us. We are not surprised by these pictures, saying ‘Look!’"  (Ludwig Wittgenstein, "Zettel", 1967)

"All testing, all confirmation and disconfirmation of a hypothesis takes place already within a system. And this system is not a more or less arbitrary and doubtful point of departure for all our arguments; no it belongs to the essence of what we call an argument. The system is not so much the point of departure, as the element in which our arguments have their life." (Ludwig Wittgenstein, "On Certainty", 1969)

"People are deeply imbedded in philosophical, i.e., grammatical confusions. And to free them presupposes pulling them out of the immensely manifold connections they are caught up in." (Ludwig Wittgenstein, "Philosophical Occasions 1912-1951", 1993)

08 August 2021

Fred C Schweppe - Collected Quotes

"A bias can be considered a limiting case of a nonwhite disturbance as a constant is the most time-correlated process possible." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Changes of variables can be helpful for iterative and parametric solutions even if they do not linearize the problem. For example, a change of variables may change the 'shape' of J(x) into a more suitable form. Unfortunately there seems to be no· general way to choose the 'right' change of variables. Success depends on the particular problem and the engineer's insight. However, the possibility of a change of variables should always be considered."(Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Decision-making problems (hypothesis testing) involve situations where it is desired to make a choice among various alternative decisions (hypotheses). Such problems can be viewed as generalized state estimation problems where the definition of state has simply been expanded." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Hypothesis testing can introduce the need for multiple models for the multiple hypotheses and,' if appropriate, a priori probabilities. The one modeling aspect of hypothesis testing that has no estimation counterpart is the problem of specifying the hypotheses to be considered. Often this is a critical step which influences both performance arid the difficulty of implementation." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Modeling is definitely the most important and critical problem. If the mathematical model is not valid, any subsequent analysis, estimation, or control study is meaningless. The development of the model in a convenient form can greatly reduce the complexity of the actual studies." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"Pattern recognition can be viewed as a special case of hypothesis testing. In pattern recognition, an observation z is to be used to decide what pattern caused it. Each possible pattern can be viewed as one hypothesis. The main problem in pattern recognition is the development of models for the z corresponding to each pattern (hypothesis)." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"System theory is a tool which engineers use to help them design the 'best' system to do the job that must be done. A dominant characteristic of system theory is the interest in the analysis and design (synthesis) of systems from an input-output point of view. System theory uses mathematical manipulation of a mathematical model to help design the actual system." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The biggest (and sometimes insurmountable) problem is usually to use the available data (information, measurements, etc.) to find out what the system is actually doing (i.e., to estimate its state). If the system's state can be estimated to some reasonable accuracy, the desired control is often obvious (or can be obtained by the use of deterministic control theory)." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The choice of model is often the most critical aspect of a design and development engineering job, but it is impossible to give explicit rules or techniques." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The power and beauty of stochastic approximation theory is that it provides simple, easy to implement gain sequences which guarantee convergence without depending (explicitly) on knowledge of the function to be minimized or the noise properties. Unfortunately, convergence is usually extremely slow. This is to be expected, as 'good performance' cannot be expected if no (or very little) knowledge of the nature of the problem is built into the algorithm. In other words, the strength of stochastic approximation (simplicity, little a priori knowledge) is also its weakness." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The pseudo approach to uncertainty modeling refers to the use of an uncertainty model instead of using a deterministic model which is actually (or at least theoretically) available. The uncertainty model may be desired because it results in a simpler analysis, because it is too difficult (expensive) to gather all the data necessary for an exact model, or because the exact model is too complex to be included in the computer." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"[A] system is represented by a mathematical model which may take many forms, such as algebraic equations, finite state machines, difference equations, ordinary differential equations, partial differential equations, and functional equations. The system model may be uncertain, as the mathematical model may not be known completely." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"The term hypothesis testing arises because the choice as to which process is observed is based on hypothesized models. Thus hypothesis testing could also be called model testing. Hypothesis testing is sometimes called decision theory. The detection theory of communication theory is a special case." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

18 July 2021

Out of Context: On Probability (Definitions)

"Probability is the very guide of life." (Marcus Tullius Cicero, "De Natura Deorum" ["On the Nature of the Gods"], 45 BC) [attributed

"Probability is a degree of possibility." (Gottfried W Leibniz, "On estimating the uncertain", 1676)

"Probability is the appearance of agreement upon fallible proofs." (John Locke, "An Essay Concerning Human Understanding", Book IV, 1689)

"Probability is likeliness to be true, the very notation of the word signifying such a proposition, for which there be arguments or proofs to make it pass, or be received for true." (John Locke, "An Essay Concerning Human Understanding", Book IV, 1689)

"Probability is a degree of certainty and it differs from certainty as a part from a whole." (Jacob Bernoulli, "Ars Conjectandi", 1713)

"[…] to us, probability is the very guide of life." (Joseph Butler, "The Analogy of Religion", 1736) 

"Probability is relative in part to this ignorance, and in part to our knowledge." (Pierre-Simon Laplace, "Mémoire sur les Approximations des Formules qui sont Fonctions de Très Grands Nombres", 1783) 

"Probability is expectation founded upon partial knowledge." (George Boole, "The Laws of Thought", 1854)

"Probability is, so far as measurement is concerned, closely analogous to similarity." (John M Keynes, "A Treatise on Probability", 1921)

"The Theory of Probability is concerned with that part which we obtain by argument, and it treats of the different degrees in which the results so obtained are conclusive or inconclusive." (John M Keynes, "A Treatise on Probability", 1921)

"Probability is the most important concept in modern science, especially as nobody has the slightest notion what it means." (Bertrand Russell, 1929)

"Probability is truth in some degree […]" (Errol E Harris, "Hypothesis and Perception: The Roots of Scientific Method", 1970)

"Probability, too, if regarded as something endowed with some kind of objective existence, is no less a misleading misconception, an illusory attempt to exteriorize or materialize our true probabilistic beliefs." (Bruno de Finetti, "Theory of Probability", 1974)

"The logic of certainty furnishes us with the range of possibility (and the possible has no gradations); probability is an additional notion that one applies within the range of possibility, thus giving rise to graduations (‘more or less’ probable) that are meaningless in the logic of uncertainty."  (Bruno de Finetti, "Theory of Probability", 1974)

"Many modern philosophers claim that probability is relation between an hypothesis and the evidence for it." (Ian Hacking, "The Emergence of Probability", 1975)

"The theory of probability is the only mathematical tool available to help map the unknown and the uncontrollable." (Benoit Mandelbrot, "The Fractal Geometry of Nature", 1977)

"Probability is the mathematics of uncertainty." (Richard W Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"Probabilities are summaries of knowledge that is left behind when information is transferred to a higher level of abstraction." (Judea Pearl, Probabilistic Reasoning in Intelligent Systems: Network of Plausible, Inference, 1988)

"Probability is the branch of mathematics that describes randomness." (David S Moore, "Uncertainty", 1990)

"[...] probability is a style of thinking." (Richard W Hamming, "The Art of Probability for Scientists and Engineers", 1991)

"Probability is not about the odds, but about the belief in the existence of an alternative outcome, cause, or motive." (Nassim N Taleb, "Fooled by Randomness", 2001)

"Probability is a mathematical language for quantifying uncertainty." (Larry A Wasserman, "All of Statistics: A concise course in statistical inference", 2004)

"Probability is a liberal art; it is a child of skepticism, not a tool for people with calculators on their belts to satisfy their desire to produce fancy calculations and certainties." (Nassim N Taleb, “The Black Swan”, 2007)

"We have to be aware that probabilities are relative to a level of observation, and that what is most probable at one level is not necessarily so at another."(Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Probability is the science of uncertainty. It provides precise mathematical rules for understanding and analyzing our own ignorance." (Michael J Evans & Jeffrey S Rosenthal, "Probability and Statistics: The Science of Uncertainty", 2009)

08 July 2021

Edward R Dougherty - Collected Quotes

"An advantage of a deterministic theory is that, assuming sufficient knowledge, there is no uncertainty in the evolution of the state of the system. In practice, measurements are not perfectly precise, so there is always uncertainty as to the value of any variable." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016)

"Foretelling the future is the crux. A model may fit existing data, but the model must incorporate mathematical machinery that makes it predictive across time to be scientifically valid." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016) 

"Limitations on experimentation can result in limitations on the complexity or details of a theory. To be validated, a theory cannot exceed the experimentalist’s ability to conceive and perform appropriate experiments. With the uncertainty theory, modern physics appears to have brought us beyond the situation where limitations on observation result only from insufficient experimental apparatus to a point where limitations are unsurpassable in principle." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016)

"In the classical deterministic scenario, a model consists of a few variables and physical constants. The relational structure of the model is conceptualized by the scientist via intuition gained from thinking about the physical world. Intuition means that the scientist has some mental construct regarding the interactions beyond positing a skeletal mathematical system he believes is sufficiently rich to capture the interactions and then depending upon data to infer the relational structure and estimate a large number of parameters." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016) 

"Parameter estimation is a basic aspect of model construction and historically it has been assumed that data are sufficient to estimate the parameters, for instance, correlations that are part of the model; however, when the number of parameters is too large for the amount of data, accurate parameter estimation becomes impossible. The result is model uncertainty." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016)

"[…] people attempt to use highly flexible mathematical structures with large numbers of parameters that can be adjusted to fit the data, the result often being models that fit the data well but lack structural representation of the phenomena and thus are not predictive outside the range of the data. The situation is exacerbated by uncertainty regarding model parameters on account of insufficient data relative to model complexity, which in fact means uncertainty regarding the models themselves. More importantly from the standpoint of epistemology, the amount of available data is often miniscule in comparison to the amount needed for validation. The desire for knowledge has far outstripped experimental/observational capability. We are starved for data." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016)

"Scientific knowledge is worldly knowledge in the sense that it points into the future by making predictions about events that have yet to take place. Scientific knowledge is contingent, always awaiting the possibility of its invalidation. Its truth or falsity lies in the verity of its predictions and, since these predictions depend upon the outcomes of experiments, ultimately the validity of scientific knowledge is relative to the methodology of verification." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016)

"The foundations of a discipline are inseparable from the rules of its game, without which there is no discipline, just idle talk. The foundations of science reside in its epistemology, meaning that they lie in the mathematical formulation of knowledge, structured experimentation, and statistical characterization of validity. Rules impose limitations. These may be unpleasant, but they arise from the need to link ideas in the mind to natural phenomena. The mature scientist must overcome the desire for intuitive understanding and certainty, and must live with stringent limitations and radical uncertainty." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016)

"The model (conceptual system) is a creation of the imagination, in accordance with the rules of the game. The manner of this creation is not part of the scientific theory. The classical manner is that the scientist combines an appreciation of the problem with reflections upon relevant phenomena and, based on mathematical knowledge, creates a model." (Edward R Dougherty, "The Evolution of Scientific Knowledge: From certainty to uncertainty", 2016) 

05 July 2021

F David Peat - Collected Quotes

"A good poem has a unified structure, each word fits perfectly, there is nothing arbitrary about it, metaphors hold together and interlock, the sound of a word and its reflections of meaning complement each other. Likewise postmodern physics asks: How well does everything fit together in a theory? How inevitable are its arguments? Are the assumptions well founded or somewhat arbitrary? Is its overall mathematical form particularly elegant?" (F David Peat, "From Certainty to Uncertainty", 2002)

"A model is a simplified picture of physical reality; one in which, for example, certain contingencies such as friction, air resistance, and so on have been neglected. This model reproduces within itself some essential feature of the universe. While everyday events in nature are highly contingent and depend upon all sorts of external perturbations and contexts, the idealized model aims to produce the essence of phenomena." (F David Peat, "From Certainty to Uncertainty", 2002)

"A system at a bifurcation point, when pushed slightly, may begin to oscillate. Or the system may flutter around for a time and then revert to its normal, stable behavior. Or, alternatively it may move into chaos. Knowing a system within one range of circumstances may offer no clue as to how it will react in others. Nonlinear systems always hold surprises." (F David Peat, "From Certainty to Uncertainty", 2002)

"A theory makes certain predictions and allows calculations to be made that can be tested directly through experiments and observations. But a theory such as superstrings talks about quantum objects that exist in a multidimensional space and at incredibly short distances. Other grand unified theories would require energies close to those experienced during the creation of the universe to test their predictions." (F David Peat, "From Certainty to Uncertainty", 2002)

"Although the detailed moment-to-moment behavior of a chaotic system cannot be predicted, the overall pattern of its 'random' fluctuations may be similar from scale to scale. Likewise, while the fine details of a chaotic system cannot be predicted one can know a little bit about the range of its 'random' fluctuation." (F David Peat, "From Certainty to Uncertainty", 2002)

"An algorithm is a simple rule, or elementary task, that is repeated over and over again. In this way algorithms can produce structures of astounding complexity." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos itself is one form of a wide range of behavior that extends from simple regular order to systems of incredible complexity. And just as a smoothly operating machine can become chaotic when pushed too hard (chaos out of order), it also turns out that chaotic systems can give birth to regular, ordered behavior (order out of chaos). […] Chaos and chance don’t mean the absence of law and order, but rather the presence of order so complex that it lies beyond our abilities to grasp and describe it." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos theory explains the ways in which natural and social systems organize themselves into stable entities that have the ability to resist small disturbances and perturbations. It also shows that when you push such a system too far it becomes balanced on a metaphoric knife-edge. Step back and it remains stable; give it the slightest nudge and it will move into a radically new form of behavior such as chaos." (F David Peat, "From Certainty to Uncertainty", 2002)

"Giving people new mental tools to represent aspects of the world around them meant that they could now externalize and objectify that world. Proceeding in this way they could treat the world as external to themselves and as something to be contemplated within the imagination. The world now became an object to be manipulated within the theater of the mind, rather than an external tangible reality. This also meant that people could gain increasing control over the world around them, yet always at the expense of a loss of direct involvement. The more we objectify the world, the more we are in danger of losing touch with that sense of immediacy felt by active participants in nature." (F David Peat, "From Certainty to Uncertainty", 2002)

"In a linear system a tiny push produces a small effect, so that cause and effect are always proportional to each other. If one plotted on a graph the cause against the effect, the result would be a straight line. In nonlinear systems, however, a small push may produce a small effect, a slightly larger push produces a proportionately larger effect, but increase that push by a hair’s breadth and suddenly the system does something radically different." (F David Peat, "From Certainty to Uncertainty", 2002)

"In chaos theory this 'butterfly effect' highlights the extreme sensitivity of nonlinear systems at their bifurcation points. There the slightest perturbation can push them into chaos, or into some quite different form of ordered behavior. Because we can never have total information or work to an infinite number of decimal places, there will always be a tiny level of uncertainty that can magnify to the point where it begins to dominate the system. It is for this reason that chaos theory reminds us that uncertainty can always subvert our attempts to encompass the cosmos with our schemes and mathematical reasoning." (F David Peat, "From Certainty to Uncertainty", 2002)

"In essence, mathematicians wanted to prove two things: 1.Mathematics is consistent: Mathematics contains no internal contradictions. There are no slips of reason or ambiguities. No matter from what direction we approach the edifice of mathematics, it will always display the same rigor and truth. 2.Mathematics is complete: No mathematical truths are left hanging. Nothing needs adding to the system. Mathematicians can prove every theorem with total rigor so that nothing is excluded from the overall system." (F David Peat, "From Certainty to Uncertainty", 2002)

"It is not so much that particular languages evolve and then cause us to see the world in a given way, but that language and worldview develop side by side to the point where language becomes so ingrained that it constantly supports a specific way of seeing and structuring the world. In the end it becomes difficult to see the world in any other light."  (F David Peat, "From Certainty to Uncertainty", 2002)

"Lessons from chaos theory show that energy is always needed for reorganization. And for a new order to appear an organization must be willing to allow a measure of chaos to occur; chaos being that which no one can totally control. It means entering a zone where no one can predict the final outcome or be truly confident as to what will happen." (F David Peat, "From Certainty to Uncertainty", 2002)

"Mathematical fractals are generated by repeating the same simple steps at ever decreasing scales. In this way an apparently complex shape, containing endless detail, can be generated by the repeated application of a simple algorithm. In turn these fractals mimic some of the complex forms found in nature. After all, many organisms and colonies also grow though the repetition of elementary processes such as, for example, branching and division." (F David Peat, "From Certainty to Uncertainty", 2002)

"Quantum chance is absolute. […] Quantum chance is not a measure of ignorance but an inherent property. […] Chance in quantum theory is absolute and irreducible." (F David Peat, "From Certainty to Uncertainty", 2002)

"Science is like photographing a series of close-ups with your back to the sun. No matter which way you move, your shadow always falls across the scene you photograph. No matter what you do, you can never efface yourself from the photographed scene." (F David Peat, "From Certainty to Uncertainty", 2002)

"Science is that story our society tells itself about the cosmos. Science supposedly provides an objective account of the material world based upon measurement and quantification so that structure, process, movement, and transformation can be described mathematically in terms of fundamental laws." (F David Peat, "From Certainty to Uncertainty", 2002)

"Science proceeds by abstracting what is essential from the accidental details of matter and process. […] Science begins with our relationship to nature. The facts it discovers about the universe are answers to human questions and involve human-designed experiments." (F David Peat, "From Certainty to Uncertainty", 2002)

"The danger arises when a culture takes its own story as the absolute truth, and seeks to impose this truth on others as the yardstick of all knowledge and belief." (F David Peat, "From Certainty to Uncertainty", 2002)

"The quantum world is in a constant process of change and transformation. On the face of it, all possible processes and transformations could take place, but nature’s symmetry principles place limits on arbitrary transformation. Only those processes that do not violate certain very fundamental symmetry principles are allowed in the natural world." (F David Peat, "From Certainty to Uncertainty", 2002)

"The theories of science are all about idealized models and, in turn, these models give pictures of reality. […] But when we speak of the quantum world we find we are employing concepts that simply do not fit. When we discuss our models of reality we are continually importing ideas that are inappropriate and have no real meaning in the quantum domain." (F David Peat, "From Certainty to Uncertainty", 2002)

"There are endless examples of elaborate structures and apparently complex processes being generated through simple repetitive rules, all of which can be easily simulated on a computer. It is therefore tempting to believe that, because many complex patterns can be generated out of a simple algorithmic rule, all complexity is created in this way." (F David Peat, "From Certainty to Uncertainty", 2002)

"To make a quantum observation or to register a measurement in any way, at least one quantum of energy must be exchanged between apparatus and quantum object. But because a quantum is indivisible, it cannot be split or divided. At the moment of observation we cannot know if that quantum came from the measuring apparatus or from the quantum object." (F David Peat, "From Certainty to Uncertainty", 2002)

"Quantum theory forces us to see the limits of our abilities to make images, to create metaphors, and push language to its ends. As we struggle to gaze into the limits of nature we dimly begin to discern something hidden in the dark shadows. That something consists of ourselves, our minds, our language, our intellect, and our imagination, all of which have been stretched to their limits." (F David Peat, "From Certainty to Uncertainty", 2002)

"Quantum theory introduced uncertainty into physics; not an uncertainty that arises out of mere ignorance but a fundamental uncertainty about the very universe itself. Uncertainty is the price we pay for becoming participators in the universe. Ultimate knowledge may only be possible for ethereal beings who lie outside the universe and observe it from their ivory towers." (F David Peat, "From Certainty to Uncertainty", 2002) 

"Where we find certainty and truth in mathematics we also find beauty. Great mathematics is characterized by its aesthetics. Mathematicians delight in the elegance, economy of means, and logical inevitability of proof. It is as if the great mathematical truths can be no other way. This light of logic is also reflected back to us in the underlying structures of the physical world through the mathematics of theoretical physics." (F David Peat, "From Certainty to Uncertainty", 2002)

"[…] while chaos theory deals in regions of randomness and chance, its equations are entirely deterministic. Plug in the relevant numbers and out comes the answer. In principle at least, dealing with a chaotic system is no different from predicting the fall of an apple or sending a rocket to the moon. In each case deterministic laws govern the system. This is where the chance of chaos differs from the chance that is inherent in quantum theory." (F David Peat, "From Certainty to Uncertainty", 2002)

"While chaos theory is, in the last analysis, no more than a metaphor for human society, it can be a valuable metaphor. It makes us sensitive to the types of organizations we create and the way we deal with the situations that surround us." (F David Peat, "From Certainty to Uncertainty", 2002)

"Quantum theory stresses the irreducible link between observer and observed and the basic holism of all phenomena. Indigenous science also holds that there is no separation between the individual and society, between matter and spirit, between each one of us and the whole of nature." (F David Peat, "The Blackfoot Physics", 2006)

"Art and music make manifest, by bringing into conscious awareness, that which has previously been felt only tentatively and internally. Art, in its widest sense, is a form of play that lies at the origin of all making, of language, and of the mind's awareness of its place within the world. Art, in all its forms, makes manifest the spiritual dimension of the cosmos, and expresses our relationship to the natural world. This may have been the cause of that natural light which first illuminated the preconscious minds of early hominids." (F David Peat, "Pathways of Chance", 2007)

27 June 2021

Amos Tversky - Collected Quotes

"People have erroneous intuitions about the laws of chance. In particular, they regard a sample randomly drawn from a population as highly representative, that is, similar to the population in all essential characteristics. The prevalence of the belief and its unfortunate consequences for psychological research are illustrated by the responses of professional psychologists to a questionnaire concerning research decisions." (Amos Tversky & Daniel Kahneman, "Belief in the law of small numbers", Psychological Bulletin 76(2), 1971)

"Significance levels are usually computed and reported, but power and confidence limits are not. Perhaps they should be." (Amos Tversky & Daniel Kahneman, "Belief in the law of small numbers", Psychological Bulletin 76(2), 1971)

"The emphasis on significance levels tends to obscure a fundamental distinction between the size of an effect and its statistical significance." (Amos Tversky & Daniel Kahneman, "Belief in the law of small numbers", Psychological Bulletin 76(2), 1971)

"[...] the statistical power of many psychological studies is ridiculously low. This is a self-defeating practice: it makes for frustrated scientists and inefficient research. The investigator who tests a valid hypothesis but fails to obtain significant results cannot help but regard nature as untrustworthy or even hostile." (Amos Tversky & Daniel Kahneman, "Belief in the law of small numbers", Psychological Bulletin 76(2), 1971)

"[...] too many users of the analysis of variance seem to regard the reaching of a mediocre level of significance as more important than any descriptive specification of the underlying averages. Our thesis is that people have strong intuitions about random sampling; that these intuitions are wrong in fundamental respects; that these intuitions are shared by naive subjects and by trained scientists; and that they are applied with unfortunate consequences in the course of scientific inquiry. We submit that people view a sample randomly drawn from a population as highly representative, that is, similar to the population in all essential characteristics. Consequently, they expect any two samples drawn from a particular population to be more similar to one another and to the population than sampling theory predicts, at least for small samples." (Amos Tversky & Daniel Kahneman, "Belief in the law of small numbers", Psychological Bulletin 76(2), 1971)

"Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not 'corrected' as a chance process unfolds, they are merely diluted." (Amos Tversky & Daniel Kahneman, "Judgment Under Uncertainty: Heuristics and Biases", Science Vol. 185 (4157), 1974)

"Intuitive judgments of probability are based on a limited number of heuristics that are usually effective but sometimes lead to severe and systematic errors. Research shows, for example, that people judge the probability of a hypothesis by the degree to which it represents the evidence, with little or no regard for its prior probability. Other heuristics lead to an overestimation of the probabilities of highly available or salient events, and to overconfidence in the assessment of subjective probability distributions. These biases are not readily corrected, and they are shared by both naive and statistically sophisticated subjects." (Amos Tversky, "Assessing Uncertainty", Journal of the Royal Statistical Society B Vol. 36 (2), 1974) 

"The theory of expected utility is formulated in terms of an abstract set of consequences, that are the carriers of utilities. The axiomatic theory, by its very nature, leaves the consequences uninterpreted. Any application of the theory, of course, is based on a particular interpretation of the outcomes. Thus, the theory could be valid in one interpretation and invalid in another. The appropriateness of the interpretation, however, cannot be evaluated within the theory." (Amos Tversky, "A Critique of Expected Utility Theory: Descriptive and Normative Considerations", Erkenntnis Vol. 9 (2), 1975)

"A significant property of the value function, called loss aversion, is that the response to losses is more extreme than the response to gains. The common reluctance to accept a fair bet on the toss of a coin suggests that the displeasure of losing a sum of money exceeds the pleasure of winning the same amount. Thus the proposed value function is (i) defined on gains and losses, (ii) generally concave for gains and convex for losses, and (iii) steeper for losses than for gains." (Amos Tversky & Daniel Kahneman, "Rational Choice and the Framing of Decisions", The Journal of Business Vol. 59 (4), 1986)

"An essential condition for a theory of choice that claims normative status is the principle of invariance: different representations of the same choice problem should yield the same preference. That is, the preference between options should be independent of their description. Two characterizations that the decision maker, on reflection, would view as alternative descriptions of the same problem should lead to the same choice-even without the benefit of such reflection." (Amos Tversky & Daniel Kahneman, "Rational Choice and the Framing of Decisions", The Journal of Business Vol. 59 (4), 1986)

"Effective learning takes place only under certain conditions: it requires accurate and immediate feedback about the relation between the situational conditions and the appropriate response. The necessary feedback is often lacking for the decisions made by managers, entrepreneurs, and politicians because (i) outcomes are commonly delayed and not easily attributable to a particular action; (ii) variability in the environment degrades the reliability of the feedback, especially where outcomes of low probability are involved; (iii) there is often no information about what the outcome would have been if another decision had been taken; and (iv) most important decisions are unique and therefore provide little opportunity for learning." (Amos Tversky & Daniel Kahneman, "Rational Choice and the Framing of Decisions", The Journal of Business Vol. 59 (4), 1986)

"The modern theory of decision making under risk emerged from a logical analysis of games of chance rather than from a psychological analysis of risk and value. The theory was conceived as a normative model of an idealized decision maker, not as a description of the behavior of real people." (Amos Tversky & Daniel Kahneman, "Rational Choice and the Framing of Decisions", The Journal of Business Vol. 59 (4), 1986)

"The assumption of rationality has a favored position in economics. It is accorded all the methodological privileges of a self-evident truth, a reasonable idealization, a tautology, and a null hypothesis. Each of these interpretations either puts the hypothesis of rational action beyond question or places the burden of proof squarely on any alternative analysis of belief and choice. The advantage of the rational model is compounded because no other theory of judgment and decision can ever match it in scope, power, and simplicity." (Amos Tversky & Daniel Kahneman, "Rational Choice and the Framing of Decisions", The Journal of Business Vol. 59 (4), 1986)

"Theories of choice are at best approximate and incomplete. One reason for this pessimistic assessment is that choice is a constructive and contingent process. When faced with a complex problem, people employ a variety of heuristic procedures in order to simplify the representation and the evaluation of prospects. These procedures include computational shortcuts and editing operations, such as eliminating common components and discarding nonessential differences. The heuristics of choice do not readily lend themselves to formal analysis because their application depends on the formulation of the problem, the method of elicitation, and the context of choice." (Amos Tversky & Daniel Kahneman, "Advances in Prospect Theory: Cumulative Representation of Uncertainty" [in "Choices, Values, and Frames"], 2000)

"Whenever there is a simple error that most laymen fall for, there is always a slightly more sophisticated version of the same problem that experts fall for." (Amos Tversky)

17 June 2021

On Knowledge (1825-1849)

"It is true that of far the greater part of things, we must content ourselves with such knowledge as description may exhibit, or analogy supply; but it is true likewise, that these ideas are always incomplete, and that at least, till we have compared them with realities, we do not know them to be just. As we see more, we become possessed of more certainties, and consequently gain more principles of reasoning, and found a wider base of analogy." (Samuel Johnson, 1825)

"The first steps in the path of discovery, and the first approximate measures, are those which add most to the existing knowledge of mankind." (Charles Babbage, "Reflections on the Decline of Science in England", 1830)

"Our knowledge of circumstances has increased, but our uncertainty, instead of having diminished, has only increased. The reason of this is, that we do not gain all our experience at once, but by degrees; so our determinations continue to be assailed incessantly by fresh experience; and the mind, if we may use the expression, must always be under arms." (Carl von Clausewitz, "On War", 1832)

"Truth in itself is rarely sufficient to make men act. Hence the step is always long from cognition to volition, from knowledge to ability. The most powerful springs of action in men lie in his emotions." (Carl von Clausewitz, "On War", 1832)

"Science and knowledge are subject, in their extension and increase, to laws quite opposite to those which regulate the material world. Unlike the forces of molecular attraction, which cease at sensible distances; or that of gravity, which decreases rapidly with the increasing distance from the point of its origin; the farther we advance from the origin of our knowledge, the larger it becomes, and the greater power it bestows upon its cultivators, to add new fields to its dominions." (Charles Babbage, "On the Economy of Machinery and Manufactures", 1832)

"The peculiar character of mathematical truth is that it is necessarily and inevitably true; and one of the most important lessons which we learn from our mathematical studies is a knowledge that there are such truths." (William Whewell, "Principles of English University Education", 1838)

"[…] in order that the facts obtained by observation and experiment may be capable of being used in furtherance of our exact and solid knowledge, they must be apprehended and analysed according to some Conception which, applied for this purpose, gives distinct and definite results, such as can be steadily taken hold of and reasoned from […]" (William Whewell, "The Philosophy of the Inductive Sciences Founded Upon their History" Vol. 2, 1840)

"But a thousand unconnected observations have no more value, as a demonstrative proof, than a single one. If we do not succeed in discovering causes by our researches, we have no right to create them by the imagination; we must not allow mere fancy to proceed beyond the bounds of our knowledge."(Justus von Liebig, "The Lancet", 1844)

"[…] there do exist among us doctrines of solid and acknowledged certainty, and truths of which the discovery has been received with universal applause. These constitute what we commonly term Sciences; and of these bodies of exact and enduring knowledge, we have within our reach so large and varied a collection, that we may examine them, and the history of their formation, with good prospect of deriving from the study such instruction as we seek." (William Whewell, "The Philosophy of the Inductive Sciences Founded upon Their History" Vol. 1, 1847)

On Knowledge (1960-1969)

"Any pattern of activity in a network, regarded as consistent by some observer, is a system, Certain groups of observers, who share a common body of knowledge, and subscribe to a particular discipline, like 'physics' or 'biology' (in terms of which they pose hypotheses about the network), will pick out substantially the same systems. On the other hand, observers belonging to different groups will not agree about the activity which is a system." (Gordon Pask, "The Natural History of Networks", 1960)

"The most important maxim for data analysis to heed, and one which many statisticians seem to have shunned is this: ‘Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise.’ Data analysis must progress by approximate answers, at best, since its knowledge of what the problem really is will at best be approximate." (John W Tukey, "The Future of Data Analysis", Annals of Mathematical Statistics, Vol. 33, No. 1, 1962)

"Incomplete knowledge must be considered as perfectly normal in probability theory; we might even say that, if we knew all the circumstances of a phenomenon, there would be no place for probability, and we would know the outcome with certainty." (Félix E Borel, Probability and Certainty", 1963)

"When a science approaches the frontiers of its knowledge, it seeks refuge in allegory or in analogy." (Erwin Chargaff, "Essays on Nucleic Acids", 1963)

"In its efforts to learn as much as possible about nature, modem physics has found that certain things can never be ‘known’ with certainty. Much of our knowledge must always remain uncertain. The most we can know is in terms of probabilities." (Richard P Feynman, "The Feynman Lectures on Physics", 1964)

"A model is a useful (and often indispensable) framework on which to organize our knowledge about a phenomenon. […] It must not be overlooked that the quantitative consequences of any model can be no more reliable than the a priori agreement between the assumptions of the model and the known facts about the real phenomenon. When the model is known to diverge significantly from the facts, it is self-deceiving to claim quantitative usefulness for it by appeal to agreement between a prediction of the model and observation." (John R Philip, 1966)

"It is a commonplace of modern technology that there is a high measure of certainty that problems have solutions before there is knowledge of how they are to be solved." (John K Galbraith, "The New Industrial State", 1967)

"The aim of science is not so much to search for truth, or even truths, as to classify our knowledge and to establish relations between observable phenomena in order to be able to predict the future in a certain measure and to explain the sequence of phenomena in relation to ourselves." (Pierre L du Noüy, "Between Knowing and Believing", 1967)

"It [knowledge] is clearly related to information, which we can now measure; and an economist especially is tempted to regard knowledge as a kind of capital structure, corresponding to information as an income flow. Knowledge, that is to say, is some kind of improbable structure or stock made up essentially of patterns - that is, improbable arrangements, and the more improbable the arrangements, we might suppose, the more knowledge there is." (Kenneth E Boulding, "Beyond Economics: Essays on Society", 1968)

"Knowing reality means constructing systems of transformations that correspond, more or less adequately, to reality. They are more or less isomorphic to transformations of reality. The transformational structures of which knowledge consists are not copies of the transformations in reality; they are simply possible isomorphic models among which experience can enable us to choose. Knowledge, then, is a system of transformations that become progressively adequate." (Jean Piaget, "Genetic Epistemology", 1968)

"Scientific knowledge is not created solely by the piecemeal mining of discrete facts by uniformly accurate and reliable individual scientific investigations. The process of criticism and evaluation, of analysis and synthesis, are essential to the whole system. It is impossible for each one of us to be continually aware of all that is going on around us, so that we can immediately decide the significance of every new paper that is published. The job of making such judgments must therefore be delegated to the best and wisest among us, who speak, not with their own personal voices, but on behalf of the whole community of Science. […] It is impossible for the consensus - public knowledge - to be voiced at all, unless it is channeled through the minds of selected persons, and restated in their words for all to hear." (John M Ziman, "Public Knowledge: An Essay Concerning the Social Dimension of Science", 1968)

"The idea of knowledge as an improbable structure is still a good place to start. Knowledge, however, has a dimension which goes beyond that of mere information or improbability. This is a dimension of significance which is very hard to reduce to quantitative form. Two knowledge structures might be equally improbable but one might be much more significant than the other." (Kenneth E Boulding, "Beyond Economics: Essays on Society", 1968)

"Discovery always carries an honorific connotation. It is the stamp of approval on a finding of lasting value. Many laws and theories have come and gone in the history of science, but they are not spoken of as discoveries. […] Theories are especially precarious, as this century profoundly testifies. World views can and do often change. Despite these difficulties, it is still true that to count as a discovery a finding must be of at least relatively permanent value, as shown by its inclusion in the generally accepted body of scientific knowledge." (Richard J. Blackwell, "Discovery in the Physical Sciences", 1969)

"It is not enough to observe, experiment, theorize, calculate and communicate; we must also argue, criticize, debate, expound, summarize, and otherwise transform the information that we have obtained individually into reliable, well established, public knowledge." (John M Ziman, "Information, Communication, Knowledge", Nature Vol. 224 (5217), 1969)

"Models constitute a framework or a skeleton and the flesh and blood will have to be added by a lot of common sense and knowledge of details."(Jan Tinbergen, "The Use of Models: Experience," 1969)

"The 'flow of information' through human communication channels is enormous. So far no theory exists, to our knowledge, which attributes any sort of unambiguous measure to this 'flow'." (Anatol Rapoport, "Modern Systems Research for the Behavioral Scientist", 1969)

16 June 2021

On Knowledge (1970-1979)

"Inductive inference is the only process known to us by which essential new knowledge comes into the world." (Sir Ronald A Fisher, "The Design of Experiments", 1971)

"A discovery must be, by definition, at variance with existing knowledge." (Albert Szent-Gyorgyi, "Dionysians and Apollonians", Science 176, 1972)

"Nature is a network of happenings that do not unroll like a red carpet into time, but are intertwined between every part of the world; and we are among those parts. In this nexus, we cannot reach certainty because it is not there to be reached; it goes with the wrong model, and the certain answers ironically are the wrong answers. Certainty is a demand that is made by philosophers who contemplate the world from outside; and scientific knowledge is knowledge for action, not contemplation. There is no God’s eye view of nature, in relativity, or in any science: only a man’s eye view." (Jacob Bronowski, "The Identity of Man", 1972)

"The human condition can almost be summed up in the observation that, whereas all experiences are of the past, all decisions are about the future. It is the great task of human knowledge to bridge this gap and to find those patterns in the past which can be projected into the future as realistic images." (Kenneth E Boulding, [foreword] 1972)

"Human knowledge is personal and responsible, an unending adventure at the edge of uncertainty." (Jacob Bronowski, "The Ascent of Man", 1973)

"In moving from conjecture to experimental data, (D), experiments must be designed which make best use of the experimenter's current state of knowledge and which best illuminate his conjecture. In moving from data to modified conjecture, (A), data must be analyzed so as to accurately present information in a manner which is readily understood by the experimenter." (George E P Box & George C Tjao, "Bayesian Inference in Statistical Analysis", 1973)

"Discoveries are made by pursuing possibilities suggested by existing knowledge." (Michael Polanyi, "Meaning", 1975)

"Knowledge is not a series of self-consistent theories that converges toward an ideal view; it is rather an ever increasing ocean of mutually incompatible (and perhaps even incommensurable) alternatives, each single theory, each fairy tale, each myth that is part of the collection forcing the others into greater articulation and all of them contributing, via this process of competition, to the development of our consciousness." (Paul K Feyerabend, "Against Method: Outline of an Anarchistic Theory of Knowledge", 1975)

"Every judgment teeters on the brink of error. To claim absolute knowledge is to become monstrous. Knowledge is an unending adventure at the edge of uncertainty." (Frank Herbert, "Children of Dune", 1976)

"Owing to his lack of knowledge, the ordinary man cannot attempt to resolve conflicting theories of conflicting advice into a single organized structure. He is likely to assume the information available to him is on the order of what we might think of as a few pieces of an enormous jigsaw puzzle. If a given piece fails to fit, it is not because it is fraudulent; more likely the contradictions and inconsistencies within his information are due to his lack of understanding and to the fact that he possesses only a few pieces of the puzzle. Differing statements about the nature of things […] are to be collected eagerly and be made a part of the individual's collection of puzzle pieces. Ultimately, after many lifetimes, the pieces will fit together and the individual will attain clear and certain knowledge." (Alan R Beals, "Strategies of Resort to Curers in South India" [contributed in Charles M. Leslie (ed.), "Asian Medical Systems: A Comparative Study", 1976]) 

"Concepts form the basis for any science. These are ideas, usually somewhat vague (especially when first encountered), which often defy really adequate definition. The meaning of a new concept can seldom be grasped from reading a one-paragraph discussion. There must be time to become accustomed to the concept, to investigate it with prior knowledge, and to associate it with personal experience. Inability to work with details of a new subject can often be traced to inadequate understanding of its basic concepts." (William C Reynolds & Harry C Perkins, "Engineering Thermodynamics", 1977)

"Because of mathematical indeterminancy and the uncertainty principle, it may be a law of nature that no nervous system is capable of acquiring enough knowledge to significantly predict the future of any other intelligent system in detail. Nor can intelligent minds gain enough self-knowledge to know their own future, capture fate, and in this sense eliminate free will." (Edward O Wilson, "On Human Nature", 1978) 

"Certainty, simplicity, vividness originate in popular knowledge. That is where the expert obtains his faith in this triad as the ideal of knowledge. Therein lies the general epistemological significance of popular science." (Ludwik Fleck, "Genesis and Development of a Scientific Fact", 1979)

"It is hard for us today to assimilate all the new ideas that are being suggested in response to the new information we have. We must remember that our picture of the universe is based not only on our scientific knowledge but also on our culture and our philosophy. What new discoveries lie ahead no one can say. There may well be civilizations in other parts of our galaxy or in other galaxies that have already accomplished much of what lies ahead for mankind. Others may just be beginning. The universe clearly presents an unending challenge." (Necia H Apfel & J Allen Hynek, "Architecture of the Universe", 1979)

14 June 2021

On Puzzles (1990-1999)

"The voyage of discovery into our own solar system has taken us from clockwork precision into chaos and complexity. This still unfinished journey has not been easy, characterized as it is by twists, turns, and surprises that mirror the intricacies of the human mind at work on a profound puzzle. Much remains a mystery. We have found chaos, but what it means and what its relevance is to our place in the universe remains shrouded in a seemingly impenetrable cloak of mathematical uncertainty." (Ivars Peterson, "Newton’s Clock", 1993)

"Each of nature's patterns is a puzzle, nearly always a deep one. Mathematics is brilliant at helping us to solve puzzles. It is a more or less systematic way of digging out the rules and structures that lie behind some observed pattern or regularity, and then using those rules and structures to explain what's going on." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"However mathematics starts, whether it is in counting and measuring in everyday life, or in puzzles and riddles, or in scientific queries about projectiles, floating bodies, levers and balances, or magnetic lines of force, it eventually becomes detached from its roots and develops a life of its own. It becomes more powerful, because it can be applied not just to the situations in which it originated but to all other comparable situations. It also becomes more abstract, and more game-like." (David Wells, "You Are a Mathematician: A wise and witty introduction to the joy of numbers", 1995)

"No, nature is, in its own subtle way, simple. However, those simplicities do not present themselves to us directly. Instead, nature leaves clues for the mathematical detectives to puzzle over. It's a fascinating game, even to a spectator. And it's an absolutely irresistible one if you are a mathematical Sherlock Holmes." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Puzzle composers share another feature with mathematicians. They know that, generally speaking, the simpler a puzzle is to express, the more attractive it is likely to be found: similarly, simplicity is for both a desirable feature of the solution. Especially satisfying solutions are often described as 'elegant', a word that - no surprise here - is also used by scientists, engineers and designers, indeed by anyone with a problem to solve. However, simplicity is by no means the only reward of success. Far from it! Mathematicians (and scientists and others) can reasonably expect two further returns: they are (in no particular order) firstly the power to do things, and secondly the perception of connections which were never before suspected, leading in turn to the insight and illumination that mathematicians expect from their best arguments." (David Wells, "You Are a Mathematician: A wise and witty introduction to the joy of numbers", 1995) 

"When we visually perceive the world, we do not just process information; we have a subjective experience of color, shape, and depth. We have experiences associated with other senses (think of auditory experiences of music, or the ineffable nature of smell experiences), with bodily sensations (e.g., pains, tickles, and orgasms), with mental imagery (e.g., the colored shapes that appear when one tubs one's eyes), with emotion (the sparkle of happiness, the intensity of anger, the weight of despair), and with the stream of conscious thought." (David Chalmers, "The Puzzle of Conscious Experience", Scientific American, 1995)

"The art of science is knowing which observations to ignore and which are the key to the puzzle." (Edward W Kolb, "Blind Watchers of the Sky", 1996)

"Most people think of science as a series of steps forged in concrete, but it’s not. It’s a puzzle, and not all of the pieces will ever be firmly in place. When you’re able to fit some of the together, to see an answer, it’s thrilling." (Nora Roberts, "Homeport", 1998)

"A vision is a clear mental picture of a desired future outcome. If you have ever put together a large 1,000-piece jigsaw puzzle, the chances are you used the picture on the top of the puzzle box to guide the placement of the pieces. That picture on the top of the box is the end result or the vision of what you are trying to turn into a reality. It is much more difficult - if not impossible - to put the jigsaw puzzle together without ever looking at the picture." (Jane Flaherty & Peter B Stark, "The Manager's Pocket Guide to Leadership Skills", 1999)

"Accurate estimates depend at least as much upon the mental model used in forming the picture as upon the number of pieces of the puzzle that have been collected." (Richards J. Heuer Jr, "Psychology of Intelligence Analysis", 1999)

30 May 2021

On Conjecture (1975-1999)

"All knowledge, the sociologist could say, is conjectural and theoretical. Nothing is absolute and final. Therefore all knowledge is relative to the local situation of the thinkers who produce it: the ideas and conjectures that they are capable of producing: the problems that bother them; the interplay of assumptions and criticism in their milieu; their purposes and aims; the experiences they have and the standards and meanings they apply." (David Bloor, "Knowledge and Social Imagery", 1976)

"The essential function of a hypothesis consists in the guidance it affords to new observations and experiments, by which our conjecture is either confirmed or refuted." (Ernst Mach, "Knowledge and Error: Sketches on the Psychology of Enquiry", 1976)

"The verb 'to theorize' is now conjugated as follows: 'I built a model; you formulated a hypothesis; he made a conjecture.'" (John M Ziman, "Reliable Knowledge", 1978)

"All advances of scientific understanding, at every level, begin with a speculative adventure, an imaginative preconception of what might be true - a preconception that always, and necessarily, goes a little way (sometimes a long way) beyond anything which we have logical or factual authority to believe in. It is the invention of a possible world, or of a tiny fraction of that world. The conjecture is then exposed to criticism to find out whether or not that imagined world is anything like the real one. Scientific reasoning is therefore at all levels an interaction between two episodes of thought - a dialogue between two voices, the one imaginative and the other critical; a dialogue, as I have put it, between the possible and the actual, between proposal and disposal, conjecture and criticism, between what might be true and what is in fact the case." (Sir Peter B Medawar, "Pluto’s Republic: Incorporating the Art of the Soluble and Induction Intuition in Scientific Thought", 1982)

"So-called scientific knowledge is not knowledge, for it consists only of conjectures or hypotheses - even if some have gone through the crossfire of ingenious tests." (Karl R Popper, "Epistemology and the Problem of Peace", [lecture in "All Life is Problem Solving", 1999] 1985)

"Three shifts can be detected over time in the understanding of mathematics itself. One is a shift from completeness to incompleteness, another from certainty to conjecture, and a third from absolutism to relativity." (Leone Burton, "Femmes et Mathematiques: Y a–t–il une?",  Association for Women in Mathematics Newsletter, Intersection 18, 1988)

"A mathematical proof is a chain of logical deductions, all stemming from a small number of initial assumptions ('axioms') and subject to the strict rules of mathematical logic. Only such a chain of deductions can establish the validity of a mathematical law, a theorem. And unless this process has been satisfactorily carried out, no relation - regardless of how often it may have been confirmed by observation - is allowed to become a law. It may be given the status of a hypothesis or a conjecture, and all kinds of tentative results may be drawn from it, but no mathematician would ever base definitive conclusions on it. (Eli Maor, "e: The Story of a Number", 1994)

"The sequence for the understanding of mathematics may be: intuition, trial, error, speculation, conjecture, proof. The mixture and the sequence of these events differ widely in different domains, but there is general agreement that the end product is rigorous proof - which we know and can recognize, without the formal advice of the logicians. […] Intuition is glorious, but the heaven of mathematics requires much more. Physics has provided mathematics with many fine suggestions and new initiatives, but mathematics does not need to copy the style of experimental physics. Mathematics rests on proof - and proof is eternal." (Saunders Mac Lan, "Reponses to …", Bulletin of the American Mathematical Society Vol. 30 (2), 1994)

"The methods of science include controlled experiments, classification, pattern recognition, analysis, and deduction. In the humanities we apply analogy, metaphor, criticism, and (e)valuation. In design we devise alternatives, form patterns, synthesize, use conjecture, and model solutions." (Béla H Bánáthy, "Designing Social Systems in a Changing World", 1996)

"A proof of a mathematical theorem is a sequence of steps which leads to the desired conclusion. The rules to be followed [...] were made explicit when logic was formalized early in the this century [...] These rules can be used to disprove a putative proof by spotting logical errors; they cannot, however, be used to find the missing proof of a [...] conjecture. [...] Heuristic arguments are a common occurrence in the practice of mathematics. However... The role of heuristic arguments has not been acknowledged in the philosophy of mathematics despite the crucial role they play in mathematical discovery. [...] Our purpose is to bring out some of the features of mathematical thinking which are concealed beneath the apparent mechanics of proof." (Gian-Carlo Rota, "Indiscrete Thoughts", 1997)

"Architectural conjectures are mathematically precise assertions, as well milled as minted coins, provisionally usable in the commerce of logical arguments; less than ‘coins’ and more aptly, promissory notes to be paid in full by some future demonstration, or to be contradicted. These conjectures are expected to turn out to be true, as, of course, are all conjectures; their formulation is often away of "formally" packaging, or at least acknowledging, an otherwise shapeless body of mathematical experience that points to their truth." (Barry Mazur, "Conjecture", Synthese 111, 1997)

"The everyday usage of 'theory' is for an idea whose outcome is as yet undetermined, a conjecture, or for an idea contrary to evidence. But scientists use the word in exactly the opposite sense. [In science] 'theory' [...] refers only to a collection of hypotheses and predictions that is amenable to experimental test, preferably one that has been successfully tested. It has everything to do with the facts." (Tony Rothman & George Sudarshan, "Doubt and Certainty: The Celebrated Academy: Debates on Science, Mysticism, Reality, in General on the Knowable and Unknowable", 1998)

"A mathematician experiments, amasses information, makes a conjecture, finds out that it does not work, gets confused and then tries to recover. A good mathematician eventually does so - and proves a theorem." (Steven Krantz, "Conformal Mappings", American Scientist, 1999)

27 May 2021

On Induction (-1849)

"The only possible way to conceive universal is by induction, since we come to know abstractions by induction. But unless we have sense experience, we cannot make inductions. Even though sense perception relates to particular things, scientific knowledge concerning such can only be constructed by the successive steps of sense perception, induction, and formulation of universals." (Aristotle, "Posterior Analytics", cca. 350 BC)

"The Syllogism consists of propositions, propositions consist of words, words are symbols of notions. Therefore if the notions themselves (which is the root of the matter) are confused and over-hastily abstracted from the facts, there can be no firmness in the superstructure. Our only hope therefore lies in a true induction." (Francis Bacon, "The New Organon", 1620)

"In experimental philosophy, propositions gathered from phenomena by induction should be considered either exactly or very nearly true notwithstanding any contrary hypotheses, until yet other phenomena make such propositions either more exact or liable to exceptions." (Isaac Newton, "The Principia: Mathematical Principles of Natural Philosophy", 1687)

"As in Mathematics, so in Natural Philosophy, the Investigation of difficult Things by the Method of Analysis, ought ever to precede the Method of Composition. This Analysis consists in making Experiments and Observations, and in drawing general Conclusions from them by Induction, and admitting of no Objections against the Conclusions but such as are taken from Experiments, or other certain Truths." (Sir Isaac Newton, "Opticks", 1704)

"It is often in our Power to obtain an Analogy where we cannot have an Induction." (David Hartley, "Observations on Man, His Frame, His Duty, and His Expectations", 1749)

"Especially when we investigate the general laws of Nature, induction has very great power; & there is scarcely any other method beside it for the discovery of these laws. By its assistance, even the ancient philosophers attributed to all bodies extension, figurability, mobility, & impenetrability; & to these properties, by the use of the same method of reasoning, most of the later philosophers add inertia & universal gravitation. Now, induction should take account of every single case that can possibly happen, before it can have the force of demonstration; such induction as this has no place in establishing the laws of Nature. But use is made of an induction of a less rigorous type ; in order that this kind of induction may be employed, it must be of such a nature that in all those cases particularly, which can be examined in a manner that is bound to lead to a definite conclusion as to whether or no the law in question is followed, in all of them the same result is arrived at; & that these cases are not merely a few. Moreover, in the other cases, if those which at first sight appeared to be contradictory, on further & more accurate investigation, can all of them be made to agree with the law; although, whether they can be made to agree in this way better than in any Other whatever, it is impossible to know directly anyhow. If such conditions obtain, then it must be considered that the induction is adapted to establishing the law." (Roger J Boscovich, "De Lege Continuitatis" ["On the law of continuity"], 1754)

"A discovery in mathematics, or a successful induction of facts, when once completed, cannot be too soon given to the world. But […] an hypothesis is a work of fancy, useless in science, and fit only for the amusement of a vacant hour." (Henry Brougham, Edinburgh Review 1, 1803)

"The most important questions of life are, for the most part, really only problems of probability. Strictly speaking one may even say that nearly all our knowledge is problematical; and in the small number of things which we are able to know with certainty, even in the mathematical sciences themselves, induction and analogy, the principal means for discovering truth, are based on probabilities, so that the entire system of human knowledge is connected with this theory." (Pierre-Simon Laplace, "Theorie Analytique des Probabilités", 1812)

"Analysis and natural philosophy owe their most important discoveries to this fruitful means, which is called induction. Newton was indebted to it for his theorem of the binomial and the principle of universal gravity." (Pierre-Simon Laplace, "Philosophical Essay on Probabilities”, 1814)

"Induction, analogy, hypotheses founded upon facts and rectified continually by new observations, a happy tact given by nature and strengthened by numerous comparisons of its indications with experience, such are the principal means for arriving at truth." (Pierre-Simon Laplace, "A Philosophical Essay on Probabilities", 1814)

"One may even say, strictly speaking, that almost all our knowledge is only probable; and in the small number of things that we are able to know with certainty, in the mathematical sciences themselves, the principal means of arriving at the truth - induction and analogy - are based on probabilities, so that the whole system of human knowledge is tied up with the theory set out in this essay." (Pierre-Simon Laplace, "Philosophical Essay on Probabilities", 1814)

"It is characteristic of higher arithmetic that many of its most beautiful theorems can be discovered by induction with the greatest of ease but have proofs that lie anywhere but near at hand and are often found only after many fruitless investigations with the aid of deep analysis and lucky combinations." (Carl Friedrich Gauss, 1817)

"Such is the tendency of the human mind to speculation, that on the least idea of an analogy between a few phenomena, it leaps forward, as it were, to a cause or law, to the temporary neglect of all the rest; so that, in fact, almost all our principal inductions must be regarded as a series of ascents and descents, and of conclusions from a few cases, verified by trial on many." (Sir John Herschel, "A Preliminary Discourse on the Study of Natural Philosophy" , 1830)

"We have here spoken of the prediction of facts of the same kind as those from which our rule was collected. But the evidence in favour of our induction is of a much higher and more forcible character when it enables us to explain and determine cases of a kind different from those which were contemplated in the formation of our hypothesis. The instances in which this has occurred, indeed, impress us with a conviction that the truth of our hypothesis is certain. No accident could give rise to such an extraordinary coincidence. No false supposition could, after being adjusted to one class of phenomena, so exactly represent a different class, when the agreement was unforeseen and contemplated. That rules springing from remote and unconnected quarters should thus leap to the same point, can only arise from that being where truth resides." (William Whewell, "The Philosophy of the Inductive Sciences" Vol. 2, 1840)

"There is in every step of an arithmetical or algebraical calculation a real induction, a real inference from facts to facts, and what disguises the induction is simply its comprehensive nature, and the consequent extreme generality of its language." (John S Mill, "A System of Logic, Ratiocinative and Inductive", 1843)

"The Higher Arithmetic presents us with an inexhaustible storehouse of interesting truths - of truths, too, which are not isolated but stand in the closest relation to one another, and between which, with each successive advance of the science, we continually discover new and sometimes wholly unexpected points of contact. A great part of the theories of Arithmetic derive an additional charm from the peculiarity that we easily arrive by induction at important propositions which have the stamp of simplicity upon them but the demonstration of which lies so deep as not to be discovered until after many fruitless efforts; and even then it is obtained by some tedious and artificial process while the simpler methods of proof long remain hidden from us." (Carl F Gauss, [introduction to Gotthold Eisenstein’s "Mathematische Abhandlungen"] 1847)

26 May 2021

On Randomness XIX (Chaos II)

"The chaos theory will require scientists in all fields to, develop sophisticated mathematical skills, so that they will be able to better recognize the meanings of results. Mathematics has expanded the field of fractals to help describe and explain the shapeless, asymmetrical find randomness of the natural environment." (Theoni Pappas, "More Joy of Mathematics: Exploring mathematical insights & concepts", 1991)

"Chaos demonstrates that deterministic causes can have random effects […] There's a similar surprise regarding symmetry: symmetric causes can have asymmetric effects. […] This paradox, that symmetry can get lost between cause and effect, is called symmetry-breaking. […] From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"Systems, acting dynamically, produce (and incidentally, reproduce) their own boundaries, as structures which are complementary (necessarily so) to their motion and dynamics. They are liable, for all that, to instabilities chaos, as commonly interpreted of chaotic form, where nowadays, is remote from the random. Chaos is a peculiar situation in which the trajectories of a system, taken in the traditional sense, fail to converge as they approach their limit cycles or 'attractors' or 'equilibria'. Instead, they diverge, due to an increase, of indefinite magnitude, in amplification or gain.(Gordon Pask, "Different Kinds of Cybernetics", 1992)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Randomness, chaos, uncertainty, and chance are all a part of our lives. They reside at the ill-defined boundaries between what we know, what we can know, and what is beyond our knowing. They make life interesting." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"There are only patterns, patterns on top of patterns, patterns that affect other patterns. Patterns hidden by patterns. Patterns within patterns. If you watch close, history does nothing but repeat itself. What we call chaos is just patterns we haven't recognized. What we call random is just patterns we can't decipher. what we can't understand we call nonsense. What we can't read we call gibberish. There is no free will. There are no variables." (Chuck Palahniuk, "Survivor", 1999)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"Chaos is impatient. It's random. And above all it's selfish. It tears down everything just for the sake of change, feeding on itself in constant hunger. But Chaos can also be appealing. It tempts you to believe that nothing matters except what you want." (Rick Riordan, "The Throne of Fire", 2011)

"A system in which a few things interacting produce tremendously divergent behavior; deterministic chaos; it looks random but its not." (Chris Langton)
