
20 January 2025

On Chance: Definitions

"Chance is necessity hidden behind a veil." (Marie von Ebner-Eschenbach, Aphorisms, 1880/1893)

"Chance is only the measure of our ignorance." (Henri Poincaré," The Foundations of Science", 1913)

"[...] the conception of chance enters in the very first steps of scientific activity in virtue of the fact that no observation is absolutely correct. I think chance is a more fundamental conception that causality; for whether in a concrete case, a cause-effect relation holds or not can only be judged by applying the laws of chance to the observation." (Max Born, 1949)

"Chance is just as real as causation; both are modes of becoming. The way to model a random process is to enrich the mathematical theory of probability with a model of a random mechanism. In the sciences, probabilities are never made up or 'elicited' by observing the choices people make, or the bets they are willing to place.  The reason is that, in science and technology, interpreted probability exactifies objective chance, not gut feeling or intuition. No randomness, no probability." (Mario Bunge, "Chasing Reality: Strife over Realism", 2006)

"Chance is as relentless as necessity." (Simon Blackburn, Think, 1999) 

"Whether we shuffle cards or roll dice, chance is only a result of our human lack of deftness: we don't have enough control to immobilize a die at will or to individually direct the cards in a deck. The comparison is an important one nonetheless, and highlights the limits of this method of creating chance - it doesn't matter who rolls the dice, but we wouldn't let just anyone shuffle the cards." (Ivar Ekeland, "The Broken Dice, and Other Mathematical Tales of Chance", 1993)

"That randomness gives rise to innovation and diversity in nature is echoed by the notion that chance is also the source of invention in the arts and everyday affairs in which naturally occurring processes are balanced between tight organization, where redundancy is paramount, and volatility, in which little order is possible. One can argue that there is a difference in kind between the unconscious, and sometimes conscious, choices made by a writer or artist in creating a string of words or musical notes and the accidental succession of events taking place in the natural world. However, it is the perception of ambiguity in a string that matters, and not the process that generated it, whether it be man-made or from nature at large." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

Out of Context: On Chance (Definitions)

"Chance is a world void of sense; nothing can exist without a cause." (Voltaire, A Philosophical Dictionary, 1764)

"Our conception of chance is one of law and order in large numbers; it is not that idea of chaotic incidence which vexed the mediaeval mind." (Karl Pearson, "The Chances of Death", 1895)

"Chance is only the measure of our ignorance." (Henri Poincaré, "The Foundations of Science", 1913)

"Can there be laws of chance? The answer, it would seem should be negative, since chance is in fact defined as the characteristic of the phenomena which follow no law, phenomena whose causes are too complex to permit prediction." (Félix E Borel, "Probabilities and Life", 1943)

"Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not 'corrected' as a chance process unfolds, they are merely diluted." (Amos Tversky & Daniel Kahneman, "Judgment Under Uncertainty: Heuristics and Biases", Science Vol. 185 (4157), 1974)

"Quantum chance is absolute. […] Quantum chance is not a measure of ignorance but an inherent property. […] Chance in quantum theory is absolute and irreducible." (F David Peat, "From Certainty to Uncertainty", 2002)

On Chance: Gamblers III

"Behavioural research shows that we tend to use simplifying heuristics when making judgements about uncertain events. These are prone to biases and systematic errors, such as stereotyping, disregard of sample size, disregard for regression to the mean, deriving estimates based on the ease of retrieving instances of the event, anchoring to the initial frame, the gambler’s fallacy, and wishful thinking, which are all affected by our inability to consider more than a few aspects or dimensions of any phenomenon or situation at the same time." (Hans G Daellenbach & Donald C McNickle, "Management Science: Decision making through systems thinking", 2005)

"People sometimes appeal to the ‘law of averages’ to justify their faith in the gambler’s fallacy. They may reason that, since all outcomes are equally likely, in the long run they will come out roughly equal in frequency. However, the next throw is very much in the short run and the coin, die or roulette wheel has no memory of what went before." (Alan Graham, "Developing Thinking in Statistics", 2006)

"Another kind of error possibly related to the use of the representativeness heuristic is the gambler’s fallacy, otherwise known as the law of averages. If you are playing roulette and the last four spins of the wheel have led to the ball’s landing on black, you may think that the next ball is more likely than otherwise to land on red. This cannot be. The roulette wheel has no memory. The chance of black is just what it always is. The reason people tend to think otherwise may be that they expect the sequence of events to be representative of random sequences, and the typical random sequence at roulette does not have five blacks in a row." (Jonathan Baron, "Thinking and Deciding" 4th Ed, 2008)

"The theory of randomness is fundamentally a codification of common sense. But it is also a field of subtlety, a field in which great experts have been famously wrong and expert gamblers infamously correct. What it takes to understand randomness and overcome our misconceptions is both experience and a lot of careful thinking." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"[…] many gamblers believe in the fallacious law of averages because they are eager to find a profitable pattern in the chaos created by random chance." (Gary Smith, "Standard Deviations", 2014)

10 January 2025

On Chance: Gamblers II

"Pure mathematics is the world's best game. It is more absorbing than chess, more of a gamble than poker, and lasts longer than Monopoly. It's free. It can be played anywhere - Archimedes did it in a bathtub." (Richard J Trudeau, "Dots and Lines", 1976)

"Probability does pervade the universe, and in this sense, the old chestnut about baseball imitating life really has validity. The statistics of streaks and slumps, properly understood, do teach an important lesson about epistemology, and life in general. The history of a species, or any natural phenomenon, that requires unbroken continuity in a world of trouble, works like a batting streak. All are games of a gambler playing with a limited stake against a house with infinite resources. The gambler must eventually go bust. His aim can only be to stick around as long as possible, to have some fun while he's at it, and, if he happens to be a moral agent as well, to worry about staying the course with honor!" (Stephen J Gould, 1991)

"Gambling was the place where statistics and profound human consequences met most nakedly, after all, and cards, even more than dice or the numbers on a roulette wheel, seemed able to define and perhaps even dictate a player's... luck." (Tim Powers, "Last Call", 1992)

"Probability theory has a right and a left hand. On the right is the rigorous foundational work using the tools of measure theory. The left hand 'thinks probabilistically', reduces problems to gambling situations, coin-tossing, motions of a physical particle." (Leo Breiman, "Probability", 1992)

"Losing streaks and winning streaks occur frequently in games of chance, as they do in real life. Gamblers respond to these events in asymmetric fashion: they appeal to the law of averages to bring losing streaks to a speedy end. And they appeal to that same law of averages to suspend itself so that winning streaks will go on and on. The law of averages hears neither appeal. The last sequence of throws of the dice conveys absolutely no information about what the next throw will bring. Cards, coins, dice, and roulette wheels have no memory." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Time is the dominant factor in gambling. Risk and time are opposite sides of the same coin, for if there were no tomorrow there would be no risk. Time transforms risk, and the nature of risk is shaped by the time horizon: the future is the playing field." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"A random walk is one in which future steps or directions cannot be predicted on the basis of past history. When the term is applied to the stock market, it means that short-run changes in stock prices are unpredictable. Investment advisory services, earnings forecasts, and chart patterns are useless. [...] What are often called 'persistent patterns' in the stock market occur no more frequently than the runs of luck in the fortunes of any gambler playing a game of chance. This is what economists mean when they say that stock prices behave very much like a random walk." (Burton G Malkiel, "A Random Walk Down Wall Street", 1999)

"All this, though, is to miss the point of gambling, which is to accept the imbalance of chance in general yet deny it for the here and now. Individually we know, with absolute certainty, that 'the way things hap pen' and what actually happens to us are as different as sociology and poetry." (John Haigh," Taking Chances: Winning With Probability", 1999)

"The psychology of gambling includes both a conviction that the unusual must happen and a refusal to believe in it when it does. We are caught by the confusing nature of the long run; just as the imperturbable ocean seen from space will actually combine hurricanes and dead calms, so the same action, repeated over time, can show wide deviations from its normal expected results - deviations that do not themselves break the laws of probability. In fact, they have probabilities of their own." (John Haigh," Taking Chances: Winning With Probability", 1999)

"This notion of 'being due' - what is sometimes called the gambler’s fallacy - is a mistake we make because we cannot help it. The problem with life is that we have to live it from the beginning, but it makes sense only when seen from the end. As a result, our whole experience is one of coming to provisional conclusions based on insufficient evidence: read ing the signs, gauging the odds." (John Haigh," Taking Chances: Winning With Probability", 1999)

On Chance: Gamblers I

"The gambling reasoner is incorrigible; if he would but take to the squaring of the circle, what a load of misery would be saved." (Augustus De Morgan, "A Budget of Paradoxes", 1872)

"In moderation, gambling possesses undeniable virtues. Yet it presents a curious spectacle replete with contradictions. While indulgence in its pleasures has always lain beyond the pale of fear of Hell’s fires, the great laboratories and respectable insurance palaces stand as monuments to a science originally born of the dice cup." (Edward Kasner & James R Newman, "Mathematics and the Imagination", 1940)

"A misunderstanding of Bernoulli’s theorem is responsible for one of the commonest fallacies in the estimation of probabilities, the fallacy of the maturity of chances. When a coin has come down heads twice in succession, gamblers sometimes say that it is more likely to come down tails next time because ‘by the law of averages’ (whatever that may mean) the proportion of tails must be brought right some time." (William Kneale, "Probability and Induction", 1949)

"The classical theory of probability was devoted mainly to a study of the gamble's gain, which is again a random variable; in fact, every random variable can be interpreted as the gain of a real or imaginary gambler in a suitable game." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)

"The painful experience of many gamblers has taught us the lesson that no system of betting is successful in improving the gambler's chances. If the theory of probability is true to life, this experience must correspond to a provable statement." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)

"The picture of scientific method drafted by modern philosophy is very different from traditional conceptions. Gone is the ideal of a universe whose course follows strict rules, a predetermined cosmos that unwinds itself like an unwinding clock. Gone is the ideal of the scientist who knows the absolute truth. The happenings of nature are like rolling dice rather than like revolving stars; they are controlled by probability laws, not by causality, and the scientist resembles a gambler more than a prophet. He can tell you only his best posits - he never knows beforehand whether they will come true. He is a better gambler, though, than the man at the green table, because his statistical methods are superior. And his goal is staked higher - the goal of foretelling the rolling dice of the cosmos. If he is asked why he follows his methods, with what title he makes his predictions, he cannot answer that he has an irrefutable knowledge of the future; he can only lay his best bets. But he can prove that they are best bets, that making them is the best he can do - and if a man does his best, what else can you ask of him?" (Hans Reichenbach, "The Rise of Scientific Philosophy", 1951)

"Control is an attribute of a system. This word is not used in the way in which either an office manager or a gambler might use it; it is used as a name for connectiveness. That is, anything that consists of parts connected together will be called a system." (Stafford Beer, "Cybernetics and Management", 1959)

"There always remains an orbit that to the limited knowledge of man appears as an orbit of pure chance and marks life as a gamble. Man and his works are always exposed to the impact of unforeseen and uncontrollable events." (Ludwig von Mises, "The Ultimate Foundation of Economic Science: An Essay on Method", 1962)

John Haigh - Collected Quotes

"All this, though, is to miss the point of gambling, which is to accept the imbalance of chance in general yet deny it for the here and now. Individually we know, with absolute certainty, that 'the way things hap pen' and what actually happens to us are as different as sociology and poetry." (John Haigh," Taking Chances: Winning With Probability", 1999)

"As so often happens in mathematics, a convenient re-statement of a problem brings us suddenly up against the deepest questions of knowledge." (John Haigh," Taking Chances: Winning With Probability", 1999)

"But despite their frequently good intuition, many people go wrong in two places in particular. The first is in appreciating the real differences in magnitude that arise with rare events. If a chance is expressed as 'one in a thousand' or 'one in a million', the only message registered maybe that the chance is remote and yet one figure is a thousand times bigger than the other. Another area is in using partial information […]" (John Haigh," Taking Chances: Winning With Probability", 1999)

"It is the same with the numbers generated by roulette: the smoothness of probability in the long term allows any amount of local lumpiness on which to exercise our obsession with pattern. As the sequence of data lengthens, the relative proportions of odd or even, red or black, do indeed approach closer and closer to the 50-50 ratio predicted by probability, but the absolute discrepancy between one and the other will increase." (John Haigh," Taking Chances: Winning With Probability", 1999)

"Normal is safe; normal is central; normal is unexceptional. Yet it also means the pattern from which all others are drawn, the standard against which we measure the healthy specimen. In its simplest statistical form, normality is represented by the mean (often called the 'average') of a group of measurements." (John Haigh," Taking Chances: Winning With Probability", 1999)

"Probability therefore is a kind of corrective lens, allowing us, through an understanding of the nature of Chance, to refine our conclusions and approximate, if not achieve, the perfection of Design." (John Haigh," Taking Chances: Winning With Probability", 1999)

"The psychology of gambling includes both a conviction that the unusual must happen and a refusal to believe in it when it does. We are caught by the confusing nature of the long run; just as the imperturbable ocean seen from space will actually combine hurricanes and dead calms, so the same action, repeated over time, can show wide deviations from its normal expected results - deviations that do not themselves break the laws of probability. In fact, they have probabilities of their own." (John Haigh," Taking Chances: Winning With Probability", 1999)

"These so-called stochastic processes show up everywhere randomness is applied to the output of another random function. They provide, for instance, a method for describing the chance component of financial markets: not every value of the Dow is possible every day; the range of chance fluctuation centers on the opening price. Similarly, shuffling takes the output of the previous shuffle as its input. So, if you’re handed a deck in a given order, how much shuffling does it need to be truly random?" (John Haigh," Taking Chances: Winning With Probability", 1999)

"This notion of 'being due' - what is sometimes called the gambler’s fallacy - is a mistake we make because we cannot help it. The problem with life is that we have to live it from the beginning, but it makes sense only when seen from the end. As a result, our whole experience is one of coming to provisional conclusions based on insufficient evidence: read ing the signs, gauging the odds." (John Haigh," Taking Chances: Winning With Probability", 1999)

"We search for certainty and call what we find destiny. Everything is possible, yet only one thing happens - we live and die between these two poles, under the rule of probability. We prefer, though, to call it Chance: an old familiar embodied in gods and demons, harnessed in charms and rituals. We remind one another of fortune’s fickleness, each secretly believing himself exempt. I am master of my fate; you are dicing with danger; he is living in a fool’s paradise." (John Haigh," Taking Chances: Winning With Probability", 1999)

"Winning and losing is not simply a pastime; it is the model science uses to explore the universe. Flipping a coin or rolling a die is really asking a question: success or failure can be defined as getting a yes or no. So the distribution of probabilities in a game of chance is the same as that in any repeated test - even though the result of any one test is unpredictable." (John Haigh," Taking Chances: Winning With Probability", 1999)

02 June 2024

Francis Y Edgeworth - Collected Quotes

"[…] in the Law of Errors we are concerned only with the objective quantities about which mathematical reasoning is ordinarily exercised; whereas in the Method of Least Squares, as in the moral sciences, we are concerned with a psychical quantity - the greatest possible quantity of advantage." (Francis Y Edgeworth, "The method of least squares", 1883)

"It may be replied that the principles of greatest advantage and greatest proba￾bility do not coincide in .qeneral; that here, as in other depart￾ments of action~ when there is a discrepancy between the principle of utility and any other rul% the former should have precedence."  (Francis Y Edgeworth, "The method of least squares", 1883)

"The probable error, the mean error, the mean square of error, are forms divined to resemble in an essential feature the real object of which they are the imperfect symbols - the quantity of evil, the diminution of pleasure, incurred by error. The proper symbol, it is submitted, for the quantity of evil incurred by a simple error is not any power of the error, nor any definite function at all, but an almost arbitrary function, restricted only by the conditions that it should vanish when the independent variable, the error, vanishes, and continually increase with the increase of the error." (Francis Y Edgeworth, "The method of least squares", 1883)

"Our reasoning appears to become more accurate as our ignorance becomes more complete; that when we have embarked upon chaos we seem to drop down into a cosmos."  (Francis Y Edgeworth, "The Philosophy of Chance", Mind Vol. 9, 1884) 

"Probability may be described, agreeably to general usage, as importing partial incomplete belief." (Francis Y Edgeworth, "The Philosophy of Chance", Mind Vol. 9, 1884)

"Observations and statistics agree in being quantities grouped about a Mean; they differ, in that the Mean of observations is real, of statistics is fictitious. The mean of observations is a cause, as it were the source from which diverging errors emanate. The mean of statistics is a description, a representative quantity put for a whole group, the best representative of the group, that quantity which, if we must in practice put one quantity for many, minimizes the error unavoidably attending such practice. Thus measurements by the reduction of which we ascertain a real time, number, distance are observations. Returns of prices, exports and imports, legitimate and illegitimate marriages or births and so forth, the averages of which constitute the premises of practical reasoning, are statistics. In short, observations are different copies of one original; statistics are different originals affording one ‘generic portrait’. Different measurements of the same man are observations; but measurements of different men, grouped round l’homme moyen, are prima facie at least statistics." (Francis Y Edgeworth, 1885)

"What is required for the elimination of chance is not that the raw material of our observations should fulfill the law of error; but that they should be constant to any law." (Francis Y Edgeworth, 1885)

"The Calculus of Probabilities is an instrument which requires the living hand to direct it" (Francis Y Edgeworth, 1887)

"The swarm of probabilities flying hither and thither, does not settle down on any particular point" (Francis Y Edgeworth, 1887)

"However we define error, the idea of calculating its extent may appear paradoxical. A science of errors seems a contradiction in terms." (Francis Y Edgeworth, "The Element of Chance in Competitive Examinations", Journal of the Royal Statistical Society Vol. 53, 1890) 

"What real and permanent tendencies there are lie hid beneath the shifting superfices of chance, as it were a desert in which the inexperienced traveller mistakes the temporary agglomerations of drifting sand for the real configuration of the ground" (Francis Y Edgeworth, 1898)

"[...] the great objection to the geometric mean is its cumbrousness." (Francis Y Edgeworth, 1906)

24 October 2023

Ian Hacking - Collected Quotes

"A single observation that is inconsistent with some generalization points to the falsehood of the generalization, and thereby 'points to itself'." (Ian Hacking, "The Emergence Of Probability", 1975)

"Many modern philosophers claim that probability is relation between an hypothesis and the evidence for it." (Ian Hacking, "The Emergence of Probability", 1975)

"Determinism was eroded during the nineteenth century and a space was cleared for autonomous laws of chance. The idea of human nature was displaced by a model of normal people with laws of dispersion. These two transformations were parallel and fed into each other. Chance made the world seem less capricious; it was legitimated because it brought order out of chaos. The greater the level of indeterminism in our conception of the world and of people, the higher the expected level of control." (Ian Hacking, "The Taming of Chance", 1990)

"Epistemology is the theory of knowledge and belief." (Ian Hacking, "The Taming of Chance", 1990)

"Logic is the theory of inference and argument. For this purpose we use the deductive and often tautological unravelling of axioms provided by pure mathematics, but also, and for most practical affairs, we now employ- sometimes precisely, sometimes informally - the logic of statistical inference." (Ian Hacking, "The Taming of Chance", 1990)

"Metaphysics is the science of the ultimate states of the universe." (Ian Hacking, "The Taming of Chance", 1990)

"The systematic collection of data about people has affected not only the ways in which we conceive of a society, but also the ways in which we describe our neighbour. It has profoundly transformed what we choose to do, who we try to be, and what we think of ourselves." (Ian Hacking, "The Taming of Chance", 1990)

"There is a seeming paradox: the more the indeterminism, the more the control. This is obvious in the physical sciences. Quantum physics takes for granted that nature is at bottom irreducibly stochastic. Precisely that discovery has immeasurably enhanced our ability to interfere with and alter the course of nature." (Ian Hacking, "The Taming of Chance", 1990)

"I write of the taming of chance, that is, of the way in which apparently chance or irregular events have been brought under the control of natural or social law. The world became not more chancy, but far less so. Chance, which was once the superstition of the vulgar, became the centrepiece of natural and social science, or so genteel and rational people are led to believe." (Ian Hacking, "The Taming of Chance", 1990)

"The best reaction to a paradox is to invent a genuinely new and deep idea." (Ian Hacking, "An Introduction to Probability and Inductive Logic", 2001)

24 September 2023

On Laws III: The Laws of Chance

"The facts of greatest outcome are those we think simple; maybe they really are so, because they are influenced only by a small number of well-defined circumstances, maybe they take on an appearance of simplicity because the various circumstances upon which they depend obey the laws of chance and so come to mutually compensate." (Henri Poincaré, "The Foundations of Science", 1913)

"It is easy without any very profound logical analysis to perceive the difference between a succession of favorable deviations from the laws of chance, and on the other hand, the continuous and cumulative action of these laws. It is on the latter that the principle of Natural Selection relies." (Sir Ronald A Fisher, "The Genetical Theory of Natural Selection", 1930)

"In a sense, of course, probability theory in the form of the simple laws of chance is the key to the analysis of warfare; […] My own experience of actual operational research work, has however, shown that its is generally possible to avoid using anything more sophisticated. […] In fact the wise operational research worker attempts to concentrate his efforts in finding results which are so obvious as not to need elaborate statistical methods to demonstrate their truth. In this sense advanced probability theory is something one has to know about in order to avoid having to use it." (Patrick M S Blackett, "Operations Research", Physics Today, 1951)

"Indeed, the laws of chance are just as necessary as the causal laws themselves." (David Bohm, "Causality and Chance in Modern Physics", 1957)

"Can there be laws of chance? The answer, it would seem should be negative, since chance is in fact defined as the characteristic of the phenomena which follow no law, phenomena whose causes are too complex to permit prediction." (Félix E Borel, "Probabilities and Life", 1962)

"[In quantum mechanics] we have the paradoxical situation that observable events obey laws of chance, but that the probability for these events itself spreads according to laws which are in all essential features causal laws." (Max Born, "Natural Philosophy of Cause and Chance", 1949)

"[...] the conception of chance enters in the very first steps of scientific activity in virtue of the fact that no observation is absolutely correct. I think chance is a more fundamental conception that causality; for whether in a concrete case, a cause-effect relation holds or not can only be judged by applying the laws of chance to the observation." (Max Born, 1949)

16 March 2023

Ivar Ekeland - Collected Quotes

"It is true that every aspect of the roll of dice may be suspect: the dice themselves, the form and texture of the surface, the person throwing them. If we push the analysis to its extreme, we may even wonder what chance has to do with it at all. Neither the course of the dice nor their rebounds rely on chance; they are governed by the strict determinism of rational mechanics. Billiards is based on the same principles, and it has never been considered a game of chance. So in the final analysis, chance lies in the clumsiness, the inexperience, or the naiveté of the thrower - or in the eye of the observer." (Ivar Ekeland, "The Broken Dice, and Other Mathematical Tales of Chance", 1993)

"Whether we shuffle cards or roll dice, chance is only a result of our human lack of deftness: we don't have enough control to immobilize a die at will or to individually direct the cards in a deck. The comparison is an important one nonetheless, and highlights the limits of this method of creating chance - it doesn't matter who rolls the dice, but we wouldn't let just anyone shuffle the cards." (Ivar Ekeland, "The Broken Dice, and Other Mathematical Tales of Chance", 1993)

"A pendulum is simply a small load suspended to a string or to a rod fixed at one end. If left alone it ends up hanging vertically, and if we push it away from the vertical, it starts beating. Galileo found that all beats last the same time, called the period, which depends on the length of the pendulum, but not on the amplitude of the beats or on the weight of the load. It also states that the period varies as the square root of the length: to double its period, one should make the pendulum four times as long. Making it heavier, or pushing it farther away from the vertical, has no effect. This property is known as isochrony, and it is the main reason why we are able to measure time with accuracy." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"An equilibrium is not always an optimum; it might not even be good. This may be the most important discovery of game theory." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"Chaos cuts with two edges. We have seen how it is impossible to retrieve past history from current observations. We will now show that it is impossible to predict future states from the current observations." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"It is a testimony to the power of education that classical mechanics could operate for so long under a mistaken conception. Teaching and research concentrated on integrable systems, each feeding the other, until in the end we had no longer the tools nor the interest for studying nonintegrable systems." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"Nowadays, however, we are much more aware of the fact that the best proof in the world is worth no more than its premises: every scientific theory is transitory and provisional, in wait for a better one, and accepted only as long as the experimental results conform to its predictions." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"The measurement of time was the first example of a scientific discovery changing the technology." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"We do not discover mathematical truths; we remember them from our passages through this world outside our own." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

18 May 2022

Jacques Monod - Collected Quotes

"There are living systems; there is no living 'matter'. No substance, no single molecule, extracted and isolated from a living being possess, of its own, the aforementioned paradoxical properties. They are present in living systems only; that is to say, nowhere below the level of the cell." (Jacques Monod, "From Biology to Ethics", 1969)

"A totally blind process can by definition lead to anything; it can even lead to vision itself." (Jacques Monod, "Chance and Necessity: An Essay on the Natural Philosophy of Modern Biology", 1970)

"Among all the occurrences possible in the universe the a priori probability of any particular one of them verges upon zero. Yet the universe exists; particular events must take place in it, the probability of which (before the event) was infinitesimal. At the present time we have no legitimate grounds for either asserting or denying that life got off to but a single start on earth, and that, as a consequence, before it appeared its chances of occurring were next to nil. [...] Destiny is written concurrently with the event, not prior to it." (Jacques Monod, "Chance and Necessity: An Essay on the Natural Philosophy of Modern Biology", 1970)

"Even today a good many distinguished minds seem unable to accept or even to understand that from a source of noise natural selection alone and unaided could have drawn all the music of the biosphere. In effect natural selection operates upon the products of chance and can feed nowhere else; but it operates in a domain of very demanding conditions, and from this domain chance is barred. It is not to chance but to these conditions that evolution owes its generally progressive course, its successive conquests, and the impression it gives of a smooth and steady unfolding." (Jacques Monod, "Chance and Necessity: An Essay on the Natural Philosophy of Modern Biology", 1970)

"Every living being is also a fossil. Within it, all the way down to the microscopic structure of its proteins, it bears the traces if not the stigmata of its ancestry." (Jacques Monod, "Chance and Necessity: An Essay on the Natural Philosophy of Modern Biology", 1970)

"Evolution in the biosphere is therefore a necessarily irreversible process defining a direction in time; a direction which is the same as that enjoined by the law of increasing entropy, that is to say, the second law of thermodynamics. This is far more than a mere comparison: the second law is founded upon considerations identical to those which establish the irreversibility of evolution. Indeed, it is legitimate to view the irreversibility of evolution as an expression of the second law in the biosphere." (Jacques Monod, "Chance and Necessity: An Essay on the Natural Philosophy of Modern Biology", 1970)

"A curious aspect of the theory of evolution is that everybody thinks he understands it." (Jacques Monod, "On the Molecular Theory of Evolution", 1974)

"One of the great problems of philosophy, is the relationship between the realm of knowledge and the realm of values. Knowledge is what is; values are what ought to be. I would say that all traditional philosophies up to and including Marxism have tried to derive the “ought” from the “is.” My point of view is that this is impossible, this is a farce." (Jacques Monod)

"The scientific attitude implies the postulate of objectivity - that is to say, the fundamental postulate that there is no plan; that there is no intention in the universe." (Jacques Monod)

05 July 2021

F David Peat - Collected Quotes

"A good poem has a unified structure, each word fits perfectly, there is nothing arbitrary about it, metaphors hold together and interlock, the sound of a word and its reflections of meaning complement each other. Likewise postmodern physics asks: How well does everything fit together in a theory? How inevitable are its arguments? Are the assumptions well founded or somewhat arbitrary? Is its overall mathematical form particularly elegant?" (F David Peat, "From Certainty to Uncertainty", 2002)

"A model is a simplified picture of physical reality; one in which, for example, certain contingencies such as friction, air resistance, and so on have been neglected. This model reproduces within itself some essential feature of the universe. While everyday events in nature are highly contingent and depend upon all sorts of external perturbations and contexts, the idealized model aims to produce the essence of phenomena." (F David Peat, "From Certainty to Uncertainty", 2002)

"A system at a bifurcation point, when pushed slightly, may begin to oscillate. Or the system may flutter around for a time and then revert to its normal, stable behavior. Or, alternatively it may move into chaos. Knowing a system within one range of circumstances may offer no clue as to how it will react in others. Nonlinear systems always hold surprises." (F David Peat, "From Certainty to Uncertainty", 2002)

"A theory makes certain predictions and allows calculations to be made that can be tested directly through experiments and observations. But a theory such as superstrings talks about quantum objects that exist in a multidimensional space and at incredibly short distances. Other grand unified theories would require energies close to those experienced during the creation of the universe to test their predictions." (F David Peat, "From Certainty to Uncertainty", 2002)

"Although the detailed moment-to-moment behavior of a chaotic system cannot be predicted, the overall pattern of its 'random' fluctuations may be similar from scale to scale. Likewise, while the fine details of a chaotic system cannot be predicted one can know a little bit about the range of its 'random' fluctuation." (F David Peat, "From Certainty to Uncertainty", 2002)

"An algorithm is a simple rule, or elementary task, that is repeated over and over again. In this way algorithms can produce structures of astounding complexity." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos itself is one form of a wide range of behavior that extends from simple regular order to systems of incredible complexity. And just as a smoothly operating machine can become chaotic when pushed too hard (chaos out of order), it also turns out that chaotic systems can give birth to regular, ordered behavior (order out of chaos). […] Chaos and chance don’t mean the absence of law and order, but rather the presence of order so complex that it lies beyond our abilities to grasp and describe it." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos theory explains the ways in which natural and social systems organize themselves into stable entities that have the ability to resist small disturbances and perturbations. It also shows that when you push such a system too far it becomes balanced on a metaphoric knife-edge. Step back and it remains stable; give it the slightest nudge and it will move into a radically new form of behavior such as chaos." (F David Peat, "From Certainty to Uncertainty", 2002)

"Giving people new mental tools to represent aspects of the world around them meant that they could now externalize and objectify that world. Proceeding in this way they could treat the world as external to themselves and as something to be contemplated within the imagination. The world now became an object to be manipulated within the theater of the mind, rather than an external tangible reality. This also meant that people could gain increasing control over the world around them, yet always at the expense of a loss of direct involvement. The more we objectify the world, the more we are in danger of losing touch with that sense of immediacy felt by active participants in nature." (F David Peat, "From Certainty to Uncertainty", 2002)

"In a linear system a tiny push produces a small effect, so that cause and effect are always proportional to each other. If one plotted on a graph the cause against the effect, the result would be a straight line. In nonlinear systems, however, a small push may produce a small effect, a slightly larger push produces a proportionately larger effect, but increase that push by a hair’s breadth and suddenly the system does something radically different." (F David Peat, "From Certainty to Uncertainty", 2002)

"In chaos theory this 'butterfly effect' highlights the extreme sensitivity of nonlinear systems at their bifurcation points. There the slightest perturbation can push them into chaos, or into some quite different form of ordered behavior. Because we can never have total information or work to an infinite number of decimal places, there will always be a tiny level of uncertainty that can magnify to the point where it begins to dominate the system. It is for this reason that chaos theory reminds us that uncertainty can always subvert our attempts to encompass the cosmos with our schemes and mathematical reasoning." (F David Peat, "From Certainty to Uncertainty", 2002)

"In essence, mathematicians wanted to prove two things: 1.Mathematics is consistent: Mathematics contains no internal contradictions. There are no slips of reason or ambiguities. No matter from what direction we approach the edifice of mathematics, it will always display the same rigor and truth. 2.Mathematics is complete: No mathematical truths are left hanging. Nothing needs adding to the system. Mathematicians can prove every theorem with total rigor so that nothing is excluded from the overall system." (F David Peat, "From Certainty to Uncertainty", 2002)

"It is not so much that particular languages evolve and then cause us to see the world in a given way, but that language and worldview develop side by side to the point where language becomes so ingrained that it constantly supports a specific way of seeing and structuring the world. In the end it becomes difficult to see the world in any other light."  (F David Peat, "From Certainty to Uncertainty", 2002)

"Lessons from chaos theory show that energy is always needed for reorganization. And for a new order to appear an organization must be willing to allow a measure of chaos to occur; chaos being that which no one can totally control. It means entering a zone where no one can predict the final outcome or be truly confident as to what will happen." (F David Peat, "From Certainty to Uncertainty", 2002)

"Mathematical fractals are generated by repeating the same simple steps at ever decreasing scales. In this way an apparently complex shape, containing endless detail, can be generated by the repeated application of a simple algorithm. In turn these fractals mimic some of the complex forms found in nature. After all, many organisms and colonies also grow though the repetition of elementary processes such as, for example, branching and division." (F David Peat, "From Certainty to Uncertainty", 2002)

"Quantum chance is absolute. […] Quantum chance is not a measure of ignorance but an inherent property. […] Chance in quantum theory is absolute and irreducible." (F David Peat, "From Certainty to Uncertainty", 2002)

"Science is like photographing a series of close-ups with your back to the sun. No matter which way you move, your shadow always falls across the scene you photograph. No matter what you do, you can never efface yourself from the photographed scene." (F David Peat, "From Certainty to Uncertainty", 2002)

"Science is that story our society tells itself about the cosmos. Science supposedly provides an objective account of the material world based upon measurement and quantification so that structure, process, movement, and transformation can be described mathematically in terms of fundamental laws." (F David Peat, "From Certainty to Uncertainty", 2002)

"Science proceeds by abstracting what is essential from the accidental details of matter and process. […] Science begins with our relationship to nature. The facts it discovers about the universe are answers to human questions and involve human-designed experiments." (F David Peat, "From Certainty to Uncertainty", 2002)

"The danger arises when a culture takes its own story as the absolute truth, and seeks to impose this truth on others as the yardstick of all knowledge and belief." (F David Peat, "From Certainty to Uncertainty", 2002)

"The quantum world is in a constant process of change and transformation. On the face of it, all possible processes and transformations could take place, but nature’s symmetry principles place limits on arbitrary transformation. Only those processes that do not violate certain very fundamental symmetry principles are allowed in the natural world." (F David Peat, "From Certainty to Uncertainty", 2002)

"The theories of science are all about idealized models and, in turn, these models give pictures of reality. […] But when we speak of the quantum world we find we are employing concepts that simply do not fit. When we discuss our models of reality we are continually importing ideas that are inappropriate and have no real meaning in the quantum domain." (F David Peat, "From Certainty to Uncertainty", 2002)

"There are endless examples of elaborate structures and apparently complex processes being generated through simple repetitive rules, all of which can be easily simulated on a computer. It is therefore tempting to believe that, because many complex patterns can be generated out of a simple algorithmic rule, all complexity is created in this way." (F David Peat, "From Certainty to Uncertainty", 2002)

"To make a quantum observation or to register a measurement in any way, at least one quantum of energy must be exchanged between apparatus and quantum object. But because a quantum is indivisible, it cannot be split or divided. At the moment of observation we cannot know if that quantum came from the measuring apparatus or from the quantum object." (F David Peat, "From Certainty to Uncertainty", 2002)

"Quantum theory forces us to see the limits of our abilities to make images, to create metaphors, and push language to its ends. As we struggle to gaze into the limits of nature we dimly begin to discern something hidden in the dark shadows. That something consists of ourselves, our minds, our language, our intellect, and our imagination, all of which have been stretched to their limits." (F David Peat, "From Certainty to Uncertainty", 2002)

"Quantum theory introduced uncertainty into physics; not an uncertainty that arises out of mere ignorance but a fundamental uncertainty about the very universe itself. Uncertainty is the price we pay for becoming participators in the universe. Ultimate knowledge may only be possible for ethereal beings who lie outside the universe and observe it from their ivory towers." (F David Peat, "From Certainty to Uncertainty", 2002) 

"Where we find certainty and truth in mathematics we also find beauty. Great mathematics is characterized by its aesthetics. Mathematicians delight in the elegance, economy of means, and logical inevitability of proof. It is as if the great mathematical truths can be no other way. This light of logic is also reflected back to us in the underlying structures of the physical world through the mathematics of theoretical physics." (F David Peat, "From Certainty to Uncertainty", 2002)

"[…] while chaos theory deals in regions of randomness and chance, its equations are entirely deterministic. Plug in the relevant numbers and out comes the answer. In principle at least, dealing with a chaotic system is no different from predicting the fall of an apple or sending a rocket to the moon. In each case deterministic laws govern the system. This is where the chance of chaos differs from the chance that is inherent in quantum theory." (F David Peat, "From Certainty to Uncertainty", 2002)

"While chaos theory is, in the last analysis, no more than a metaphor for human society, it can be a valuable metaphor. It makes us sensitive to the types of organizations we create and the way we deal with the situations that surround us." (F David Peat, "From Certainty to Uncertainty", 2002)

"Quantum theory stresses the irreducible link between observer and observed and the basic holism of all phenomena. Indigenous science also holds that there is no separation between the individual and society, between matter and spirit, between each one of us and the whole of nature." (F David Peat, "The Blackfoot Physics", 2006)

"Art and music make manifest, by bringing into conscious awareness, that which has previously been felt only tentatively and internally. Art, in its widest sense, is a form of play that lies at the origin of all making, of language, and of the mind's awareness of its place within the world. Art, in all its forms, makes manifest the spiritual dimension of the cosmos, and expresses our relationship to the natural world. This may have been the cause of that natural light which first illuminated the preconscious minds of early hominids." (F David Peat, "Pathways of Chance", 2007)

07 June 2021

On Patterns (1990-1999)

"Mathematics is an exploratory science that seeks to understand every kind of pattern - patterns that occur in nature, patterns invented by the human mind, and even patterns created by other patterns." (Lynn A Steen, "The Future of Mathematics Education", 1990)

"Phenomena having uncertain individual outcomes but a regular pattern of outcomes in many repetitions are called random. 'Random' is not a synonym for 'haphazard' but a description of a kind of order different from the deterministic one that is popularly associated with science and mathematics. Probability is the branch of mathematics that describes randomness." (David S Moore, "Uncertainty", 1990)

"Systems thinking is a framework for seeing interrelationships rather than things, for seeing patterns rather than static snapshots. It is a set of general principles spanning fields as diverse as physical and social sciences, engineering and management." (Peter Senge, "The Fifth Discipline", 1990)

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"Chaos demonstrates that deterministic causes can have random effects […] There's a similar surprise regarding symmetry: symmetric causes can have asymmetric effects. […] This paradox, that symmetry can get lost between cause and effect, is called symmetry-breaking. […] From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"In everyday language, the words 'pattern' and 'symmetry' are used almost interchangeably, to indicate a property possessed by a regular arrangement of more-or-less identical units […]” (Ian Stewart & Martin Golubitsky, “Fearful Symmetry: Is God a Geometer?”, 1992)

"Scientists have discovered many peculiar things, and many beautiful things. But perhaps the most beautiful and the most peculiar thing that they have discovered is the pattern of science itself. Our scientific discoveries are not independent isolated facts; one scientific generalization finds its explanation in another, which is itself explained by yet another. By tracing these arrows of explanation back toward their source we have discovered a striking convergent pattern - perhaps the deepest thing we have yet learned about the universe." (Steven Weinberg, "Dreams of a Final Theory: The Scientist’s Search for the Ultimate Laws of Nature", 1992)

"Searching for patterns is a way of thinking that is essential for making generalizations, seeing relationships, and understanding the logic and order of mathematics. Functions evolve from the investigation of patterns and unify the various aspects of mathematics." (Marilyn Burns, "About Teaching Mathematics: A K–8 Resource", 1992)

"Symmetry is bound up in many of the deepest patterns of Nature, and nowadays it is fundamental to our scientific understanding of the universe. Conservation principles, such as those for energy or momentum, express a symmetry that (we believe) is possessed by the entire space-time continuum: the laws of physics are the same everywhere." (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"World view, a concept borrowed from cultural anthropology, refers to the culturally dependent, generally subconscious, fundamental organization of the mind. This conceptual organization manifests itself as a set of presuppositions that predispose one to feel, think, and act in predictable patterns." (Kenneth G Tobin, "The practice of constructivism in science education", 1993)

"[For] us to be able to speak and understand novel sentences, we have to store in our heads not just the words of our language but also the patterns of sentences possible in our language. These patterns, in turn, describe not just patterns of words but also patterns of patterns. Linguists refer to these patterns as the rules of language stored in memory; they refer to the complete collection of rules as the mental grammar of the language, or grammar for short." (Ray Jackendoff, "Patterns in the Mind", 1994)

"A neural network is characterized by A) its pattern of connections between the neurons (called its architecture), B) its method of determining the weights on the connections (called its training, or learning, algorithm), and C) its activation function." (Laurene Fausett, "Fundamentals of Neural Networks", 1994)

"At the other far extreme, we find many systems ordered as a patchwork of parallel operations, very much as in the neural network of a brain or in a colony of ants. Action in these systems proceeds in a messy cascade of interdependent events. Instead of the discrete ticks of cause and effect that run a clock, a thousand clock springs try to simultaneously run a parallel system. Since there is no chain of command, the particular action of any single spring diffuses into the whole, making it easier for the sum of the whole to overwhelm the parts of the whole. What emerges from the collective is not a series of critical individual actions but a multitude of simultaneous actions whose collective pattern is far more important. This is the swarm model." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Each of nature's patterns is a puzzle, nearly always a deep one. Mathematics is brilliant at helping us to solve puzzles. It is a more or less systematic way of digging out the rules and structures that lie behind some observed pattern or regularity, and then using those rules and structures to explain what's going on." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Human mind and culture have developed a formal system of thought for recognizing, classifying, and exploiting patterns. We call it mathematics. By using mathematics to organize and systematize our ideas about patterns, we have discovered a great secret: nature's patterns are not just there to be admired, they are vital clues to the rules that govern natural processes." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Patterns possess utility as well as beauty. Once we have learned to recognize a background pattern, exceptions suddenly stand out." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Self-organization refers to the spontaneous formation of patterns and pattern change in open, nonequilibrium systems. […] Self-organization provides a paradigm for behavior and cognition, as well as the structure and function of the nervous system. In contrast to a computer, which requires particular programs to produce particular results, the tendency for self-organization is intrinsic to natural systems under certain conditions." (J A Scott Kelso, "Dynamic Patterns : The Self-organization of Brain and Behavior", 1995)

"Symmetry is basically a geometrical concept. Mathematically it can be defined as the invariance of geometrical patterns under certain operations. But when abstracted, the concept applies to all sorts of situations. It is one of the ways by which the human mind recognizes order in nature. In this sense symmetry need not be perfect to be meaningful. Even an approximate symmetry attracts one's attention, and makes one wonder if there is some deep reason behind it." (Eguchi Tohru & ?K Nishijima , "Broken Symmetry: Selected Papers Of Y Nambu", 1995)

"Whatever the reasons, mathematics definitely is a useful way to think about nature. What do we want it to tell us about the patterns we observe? There are many answers. We want to understand how they happen; to understand why they happen, which is different; to organize the underlying patterns and regularities in the most satisfying way; to predict how nature will behave; to control nature for our own ends; and to make practical use of what we have learned about our world. Mathematics helps us to do all these things, and often it is indispensable." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"If we are to have meaningful, connected experiences; ones that we can comprehend and reason about; we must be able to discern patterns to our actions, perceptions, and conceptions. Underlying our vast network of interrelated literal meanings (all of those words about objects and actions) are those imaginative structures of understanding such as schema and metaphor, such as the mental imagery that allows us to extrapolate a path, or zoom in on one part of the whole, or zoom out until the trees merge into a forest." (William H Calvin, "The Cerebral Code", 1996)

"The methods of science include controlled experiments, classification, pattern recognition, analysis, and deduction. In the humanities we apply analogy, metaphor, criticism, and (e)valuation. In design we devise alternatives, form patterns, synthesize, use conjecture, and model solutions." (Béla H Bánáthy, "Designing Social Systems in a Changing World", 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"The role of science, like that of art, is to blend proximate imagery with more distant meaning, the parts we already understand with those given as new into larger patterns that are coherent enough to be acceptable as truth. Biologists know this relation by intuition during the course of fieldwork, as they struggle to make order out of the infinitely varying patterns of nature." (Edward O Wilson, "In Search of Nature", 1996)

"Mathematics can function as a telescope, a microscope, a sieve for sorting out the signal from the noise, a template for pattern perception, a way of seeking and validating truth. […] A knowledge of the mathematics behind our ideas can help us to fool ourselves a little less often, with less drastic consequences." (K C Cole, "The Universe and the Teacup: The Mathematics of Truth and Beauty", 1997)

"Mathematics is a way of thinking that can help make muddy relationships clear. It is a language that allows us to translate the complexity of the world into manageable patterns. In a sense, it works like turning off the houselights in a theater the better to see a movie. Certainly, something is lost when the lights go down; you can no longer see the faces of those around you or the inlaid patterns on the ceiling. But you gain a far better view of the subject at hand." (K C Cole, "The Universe and the Teacup: The Mathematics of Truth and Beauty", 1997)

"A formal system consists of a number of tokens or symbols, like pieces in a game. These symbols can be combined into patterns by means of a set of rules which defines what is or is not permissible (e.g. the rules of chess). These rules are strictly formal, i.e. they conform to a precise logic. The configuration of the symbols at any specific moment constitutes a ‘state’ of the system. A specific state will activate the applicable rules which then transform the system from one state to another. If the set of rules governing the behaviour of the system are exact and complete, one could test whether various possible states of the system are or are not permissible." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Mathematics, in the common lay view, is a static discipline based on formulas taught in the school subjects of arithmetic, geometry, algebra, and calculus. But outside public view, mathematics continues to grow at a rapid rate, spreading into new fields and spawning new applications. The guide to this growth is not calculation and formulas but an open-ended search for pattern." (Lynn A Steen, "The Future of Mathematics Education", 1998)

"A neural network consists of large numbers of simple neurons that are richly interconnected. The weights associated with the connections between neurons determine the characteristics of the network. During a training period, the network adjusts the values of the interconnecting weights. The value of any specific weight has no significance; it is the patterns of weight values in the whole system that bear information. Since these patterns are complex, and are generated by the network itself (by means of a general learning strategy applicable to the whole network), there is no abstract procedure available to describe the process used by the network to solve the problem. There are only complex patterns of relationships." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Mathematics has traditionally been described as the science of number and shape. […] When viewed in this broader context, we see that mathematics is not just about number and shape but about pattern and order of all sorts. Number and shape - arithmetic and geometry - are but two of many media in which mathematicians work. Active mathematicians seek patterns wherever they arise." (Lynn A Steen, "The Future of Mathematics Education", 1998)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Sequences of random numbers also inevitably display certain regularities. […] The trouble is, just as no real die, coin, or roulette wheel is ever likely to be perfectly fair, no numerical recipe produces truly random numbers. The mere existence of a formula suggests some sort of predictability or pattern." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"We use mathematics and statistics to describe the diverse realms of randomness. From these descriptions, we attempt to glean insights into the workings of chance and to search for hidden causes. With such tools in hand, we seek patterns and relationships and propose predictions that help us make sense of the world."  (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Complexity is looking at interacting elements and asking how they form patterns and how the patterns unfold. It’s important to point out that the patterns may never be finished. They’re open-ended. In standard science this hit some things that most scientists have a negative reaction to. Science doesn’t like perpetual novelty." (W Brian Arthur, 1999)

"Randomness is the very stuff of life, looming large in our everyday experience. […] The fascination of randomness is that it is pervasive, providing the surprising coincidences, bizarre luck, and unexpected twists that color our perception of everyday events." (Edward Beltrami, "Chaos and Order in Mathematics and Life", 1999)

"The first view of randomness is of clutter bred by complicated entanglements. Even though we know there are rules, the outcome is uncertain. Lotteries and card games are generally perceived to belong to this category. More troublesome is that nature's design itself is known imperfectly, and worse, the rules may be hidden from us, and therefore we cannot specify a cause or discern any pattern of order. When, for instance, an outcome takes place as the confluence of totally unrelated events, it may appear to be so surprising and bizarre that we say that it is due to blind chance." (Edward Beltrami. "What is Random?: Chance and Order in Mathematics and Life", 1999)

26 May 2021

On Randomness XIX (Chaos II)

"The chaos theory will require scientists in all fields to, develop sophisticated mathematical skills, so that they will be able to better recognize the meanings of results. Mathematics has expanded the field of fractals to help describe and explain the shapeless, asymmetrical find randomness of the natural environment." (Theoni Pappas, "More Joy of Mathematics: Exploring mathematical insights & concepts", 1991)

"Chaos demonstrates that deterministic causes can have random effects […] There's a similar surprise regarding symmetry: symmetric causes can have asymmetric effects. […] This paradox, that symmetry can get lost between cause and effect, is called symmetry-breaking. […] From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"Systems, acting dynamically, produce (and incidentally, reproduce) their own boundaries, as structures which are complementary (necessarily so) to their motion and dynamics. They are liable, for all that, to instabilities chaos, as commonly interpreted of chaotic form, where nowadays, is remote from the random. Chaos is a peculiar situation in which the trajectories of a system, taken in the traditional sense, fail to converge as they approach their limit cycles or 'attractors' or 'equilibria'. Instead, they diverge, due to an increase, of indefinite magnitude, in amplification or gain.(Gordon Pask, "Different Kinds of Cybernetics", 1992)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Randomness, chaos, uncertainty, and chance are all a part of our lives. They reside at the ill-defined boundaries between what we know, what we can know, and what is beyond our knowing. They make life interesting." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"There are only patterns, patterns on top of patterns, patterns that affect other patterns. Patterns hidden by patterns. Patterns within patterns. If you watch close, history does nothing but repeat itself. What we call chaos is just patterns we haven't recognized. What we call random is just patterns we can't decipher. what we can't understand we call nonsense. What we can't read we call gibberish. There is no free will. There are no variables." (Chuck Palahniuk, "Survivor", 1999)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"Chaos is impatient. It's random. And above all it's selfish. It tears down everything just for the sake of change, feeding on itself in constant hunger. But Chaos can also be appealing. It tempts you to believe that nothing matters except what you want." (Rick Riordan, "The Throne of Fire", 2011)

"A system in which a few things interacting produce tremendously divergent behavior; deterministic chaos; it looks random but its not." (Chris Langton)

09 May 2021

On Randomness XXVI (Universe)

"Random chance was not a sufficient explanation of the Universe - in fact, random chance was not sufficient to explain random chance; the pot could not hold itself." (Robert A Heinlein, "Stranger in a Strange Land", 1961)

"The line between inner and outer landscapes is breaking down. Earthquakes can result from seismic upheavals within the human mind. The whole random universe of the industrial age is breaking down into cryptic fragments." (William S Burroughs, [preface] 1972)

"There is no reason to assume that the universe has the slightest interest in intelligence -  or even in life. Both may be random accidental by-products of its operations like the beautiful patterns on a butterfly's wings. The insect would fly just as well without them […]" (Arthur C Clarke, "The Lost Worlds of 2001", 1972)

"It is tempting to wonder if our present universe, large as it is and complex though it seems, might not be merely the result of a very slight random increase in order over a very small portion of an unbelievably colossal universe which is virtually entirely in heat-death." (Isaac Asimov, 1976)

"Perhaps randomness is not merely an adequate description for complex causes that we cannot specify. Perhaps the world really works this way, and many events are uncaused in any conventional sense of the word." (Stephen J Gould, "Hen's Teeth and Horse's Toes", 1983)

"The world of science lives fairly comfortably with paradox. We know that light is a wave and also that light is a particle. The discoveries made in the infinitely small world of particle physics indicate randomness and chance, and I do not find it any more difficult to live with the paradox of a universe of randomness and chance and a universe of pattern and purpose than I do with light as a wave and light as a particle. Living with contradiction is nothing new to the human being." (Madeline L'Engle, "Two-Part Invention: The Story of a Marriage", 1988)

"Intriguingly, the mathematics of randomness, chaos, and order also furnishes what may be a vital escape from absolute certainty - an opportunity to exercise free will in a deterministic universe. Indeed, in the interplay of order and disorder that makes life interesting, we appear perpetually poised in a state of enticingly precarious perplexity. The universe is neither so crazy that we can’t understand it at all nor so predictable that there’s nothing left for us to discover." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1997)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"The first view of randomness is of clutter bred by complicated entanglements. Even though we know there are rules, the outcome is uncertain. Lotteries and card games are generally perceived to belong to this category. More troublesome is that nature's design itself is known imperfectly, and worse, the rules may be hidden from us, and therefore we cannot specify a cause or discern any pattern of order. When, for instance, an outcome takes place as the confluence of totally unrelated events, it may appear to be so surprising and bizarre that we say that it is due to blind chance." (Edward Beltrami. "What is Random?: Chance and Order in Mathematics and Life", 1999)

"The tissue of the world is built from necessities and randomness; the intellect of men places itself between both and can control them; it considers the necessity and the reason of its existence; it knows how randomness can be managed, controlled, and used." (Johann Wolfgang von Goethe)

30 April 2021

Statistical Tools III: Cards

"In short, absolute, so-called mathematical factors never find a firm basis in military calculations. From the very start there is an interplay of possibilities, probabilities, good luck and bad that weaves its way throughout the length and breadth of the tapestry. In the whole range the human activities war most closely resembles a game of cards." (Carl von Clausewitz, "On War", 1832)

"The law of large numbers is noted in events which are attributed to pure chance since we do not know their causes or because they are too complicated. Thus, games, in which the circumstances determining the occurrence of a certain card or certain number of points on a die infinitely vary, can not be subjected to any calculus. If the series of trials is continued for a long time, the different outcomes nevertheless appear in constant ratios. Then, if calculations according to the rules of a game are possible, the respective probabilities of eventual outcomes conform to the known Jakob Bernoulli theorem. However, in most problems of contingency a prior determination of chances of the various events is impossible and, on the contrary, they are calculated from the observed result." (Siméon-Denis Poisson, "Researches into the Probabilities of Judgements in Criminal and Civil Cases", 1837)

"As an instrument for selecting at random, I have found nothing superior to dice. It is most tedious to shuffle cards thoroughly be- tween each successive draw, and the method of mixing and stirring up marked balls in a bag is more tedious still. A teetotum or some form of roulette is preferable to these, but dice are better than all. When they are shaken and tossed in a basket, they hurtle so variously against one another and against the ribs of the basket-work that they tumble wildly about, and their positions at the outset afford no perceptible clue to what they will be after even a single good shake and toss." (Francis Galton, Nature vol. 42, 1890) 

"Scientific facts accumulate rapidly, and give rise to theories with almost equal rapidity. These theories are often wonderfully enticing, and one is apt to pass from one to another, from theory to theory, without taking care to establish each before passing on to the next, without assuring oneself that the foundation on which one is building is secure. Then comes the crash; the last theory breaks down utterly, and on attempting to retrace our steps to firm ground and start anew, we may find too late that one of the cards, possibly at the very foundation of the pagoda, is either faultily placed or in itself defective, and that this blemish easily remedied if detected in time has, neglected, caused the collapse of the whole structure on whose erection so much skill and perseverance have been spent." (Arthur M Marshall, 1894)

"If you take a pack of cards as it comes from the maker and shuffle it for a few minutes, all trace of the original systematic order disappears. The order will never come back however long you shuffle. Something has been done which cannot be undone, namely, the introduction of a random element in place of the arrangement." (Sir Arthur S Eddington, "The Nature of the Physical World", 1928)

"It seems hard to sneak a look at God's cards. But that He plays dice and uses 'telepathic' methods [...] is something that I cannot believe for a single moment." (Albert Einstein, [Letter to Cornel Lanczos] 1942)

"We must emphasize that such terms as 'select at random', 'choose at random', and the like, always mean that some mechanical device, such as coins, cards, dice, or tables of random numbers, is used." (Frederick Mosteller et al, "Principles of Sampling", Journal of the American Statistical Association Vol. 49 (265), 1954)

"A thorough understanding of game theory, should dim these greedy hopes. Knowledge of game theory does not make one a better card player, businessman or military strategist." (Anatol Rapoport, "The Use and Misuse of Game Theory," 1962)

"Life is like a game of cards. The hand that is dealt you represents determinism. The way you play it is free will." (Jawaharlal Nehru, Saturday Review, 1967)

"There may be such a thing as habitual luck. People who are said to be lucky at cards probably have certain hidden talents for those games in which skill plays a role. It is like hidden parameters in physics, this ability that does not surface and that I like to call 'habitual luck'." (Stanislaw Ulam, "Adventures of a Mathematician", 1976)

"Gambling was the place where statistics and profound human consequences met most nakedly, after all, and cards, even more than dice or the numbers on a roulette wheel, seemed able to define and perhaps even dictate a player's... luck." (Tim Powers, "Last Call", 1992)

"An example, which, like tossing a coin, is intimately associated with games of chance, is the shuffling of a deck of cards. […] the process is not completely random, if by what happens next we mean the outcome of the next single riffle, since one riffle cannot change any given order of the cards in the deck to any other given order. In particular, a single riffle cannot completely reverse the order of the cards, although a sufficient number of successive riffles, of course, can produce any order." (Edward N Lorenz, "The Essence of Chaos", 1993)

"Whether we shuffle cards or roll dice, chance is only a result of our human lack of deftness: we don't have enough control to immobilize a die at will or to individually direct the cards in a deck. The comparison is an important one nonetheless, and highlights the limits of this method of creating chance - it doesn't matter who rolls the dice, but we wouldn't let just anyone shuffle the cards." (Ivar Ekeland, "The Broken Dice, and Other Mathematical Tales of Chance", 1993)

"For several centuries that we know of, and probably for many centuries before that, flipping a coin (or rolling a die) has been the epitome of probability, the paradigm of randomness. You flip the coin (or roll the die), and nobody can accurately predict how it will fall. Nor can the most powerful computer predict correctly how it will fall, if it is flipped energetically enough. This is why cards, dice, and other gambling aids crop up so often in literature both directly and as metaphors. No doubt it is also the reason for the (perhaps excessive) popularity of gambling as entertainment. If anyone had any idea what numbers the lottery would show, or where the roulette ball will land, the whole industry would be a dead duck." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"We cannot really have a perfectly shuffled pack of perfect cards; this ‘collection of equally likely hands’ is actually a fiction. We create the idea, and then use the rules of arithmetic to calculate the required chances. This is characteristic of all mathematics, which concerns itself only with rules defining the behaviour of entities which are themselves undefined (such as ‘numbers’ or ‘points’)." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"To look at the development of physics since Newton is to observe a struggle to define the limits of science. Part of this process has been the intrusion of scientific methods and ideas into domains that have traditionally been the province of metaphysics or religion. In this conflict, Hawking’s phrase ‘to know the Mind of God’ is just one example of a border infringement. But by playing the God card, Hawking has cleverly fanned the flames of his own publicity, appealing directly to the popular allure of the scientist-as-priest." (Peter Coles, "Hawking and the Mind of God", 2000)

"In contrast, the system may be a pack of cards, and the dynamic may be to shuffle the pack and then take the top card. Imagine that the current top card is the ace of spades, and that after shuffling the pack the top card becomes the seven of diamonds. Does that imply that whenever the top card is the ace of spades then the next top card will always be the seven of diamonds? Of course not. So this system is random."(Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"In modelling terms, the difference between randomness and determinacy is clear enough. The randomness in the pack of cards arises from our failure to prescribe unique rules for getting from the current state to the next one. There are lots of different ways to shuffle a pack. The determinism of the cannonball is a combination of two things: fully prescribed rules of behaviour, and fully defined initial conditions. Notice that in both systems we are thinking on a very short timescale: it is the next state that matters - or, if time is flowing continuously, it is the state a tiny instant into the future. We don't need to consider long-term behaviour to distinguish randomness from determinacy."(Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"The randomness of the card-shuffle is of course caused by our lack of knowledge of the precise procedure used to shuffle the cards. But that is outside the chosen system, so in our practical sense it is not admissible. If we were to change the system to include information about the shuffling rule – for example, that it is given by some particular computer code for pseudo-random numbers, starting with a given ‘seed value’ – then the system would look deterministic. Two computers of the same make running the same ‘random shuffle’ program would actually produce the identical sequence of top cards."(Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"Players must accept the cards dealt to them. However, once they have those cards in hand, they alone choose how they will play them. They decide what risks and actions to take." (John C Maxwell, "The Difference Maker: Making Your Attitude Your Greatest Asset", 2006)

"It's a game of a million inferences. There are a lot of things to draw inferences from - cards played and not played. These inferences tell you something about the probabilities. It's got to be the best intellectual exercise out there. You're seeing through new situations every ten minutes. Bridge is about weighing gain/loss ratios. You're doing calculations all the time." (Warren Buffett)

"The card-player begins by arranging his hand for maximum sense. Scientists do the same with the facts they gather." (Isaac Asimov)

Statistical Tools I: Coins

"Equiprobability in the physical world is purely a hypothesis. We may exercise the greatest care and the most accurate of scientific instruments to determine whether or not a penny is symmetrical. Even if we are satisfied that it is, and that our evidence on that point is conclusive, our knowledge, or rather our ignorance, about the vast number of other causes which affect the fall of the penny is so abysmal that the fact of the penny’s symmetry is a mere detail. Thus, the statement 'head and tail are equiprobable' is at best an assumption." (Edward Kasner & James R Newman, "Mathematics and the Imagination", 1940)

"A misunderstanding of Bernoulli’s theorem is responsible for one of the commonest fallacies in the estimation of probabilities, the fallacy of the maturity of chances. When a coin has come down heads twice in succession, gamblers sometimes say that it is more likely to come down tails next time because ‘by the law of averages’ (whatever that may mean) the proportion of tails must be brought right some time." (William Kneale, "Probability and Induction", 1949)

"We must emphasize that such terms as 'select at random', 'choose at random', and the like, always mean that some mechanical device, such as coins, cards, dice, or tables of random numbers, is used." (Frederick Mosteller et al, "Principles of Sampling", Journal of the American Statistical Association Vol. 49 (265), 1954)

"And nobody can get [...] far without at least an acquaintance with the mathematics of probability, not to the extent of making its calculations and filling examination papers with typical equations, but enough to know when they can be trusted, and when they are cooked. For when their imaginary numbers correspond to exact quantities of hard coins unalterably stamped with heads and tails, they are safe within certain limits; for here we have solid certainty [...] but when the calculation is one of no constant and several very capricious variables, guesswork, personal bias, and pecuniary interests, come in so strong that those who began by ignorantly imagining that statistics cannot lie end by imagining equally ignorantly, that they never do anything else." (George B Shaw, "The World of Mathematics", 1956)

"[...] there can be such a thing as a simple probabilistic system. For example, consider the tossing of a penny. Here is a perfectly simple system, but one which is notoriously unpredictable. It maybe described in terms of a binary decision process, with a built-in even probability between the two possible outcomes." (Stafford Beer, "Cybernetics and Management", 1959)

"The shrewd guess, the fertile hypothesis, the courageous leap to a tentative conclusion - these are the most valuable coin of the thinker at work." (Jerome S Bruner, "The Process of Education", 1960)

"No Chancellor of the Exchequer could introduce his proposals for monetary and fiscal policy in the House of Commons by saying 'I have looked at all the forecasts, some go one way, some another; so I decided to toss a coin and assume inflationary tendencies if it came down heads and deflationary if it came down tails' [...] And statistics, however uncertain, can apparently provide some basis." (Ely Devons, "Essays in Economics", 1961)

"The equanimity of your average tosser of coins depends upon a law, or rather a tendency, or let us say a probability, or at any rate a mathematically calculable chance, which ensures that he will not upset himself by losing too much nor upset his opponent by winning too often." (Tom Stoppard, "Rosencrantz and Guildenstern Are Dead", 1967)

"A significant property of the value function, called loss aversion, is that the response to losses is more extreme than the response to gains. The common reluctance to accept a fair bet on the toss of a coin suggests that the displeasure of losing a sum of money exceeds the pleasure of winning the same amount. Thus the proposed value function is (i) defined on gains and losses, (ii) generally concave for gains and convex for losses, and (iii) steeper for losses than for gains." (Amos Tversky & Daniel Kahneman, "Rational Choice and the Framing of Decisions", The Journal of Business Vol. 59 (4), 1986)

"Flip a coin 100 times. Assume that 99 heads are obtained. If you ask a statistician, the response is likely to be: 'It is a biased coin'. But if you ask a probabilist, he may say: 'Wooow, what a rare event'." (Chamont Wang, "Sense and Nonsense of Statistical Inference", 1993)

"The coin is an example of complete randomness. It is the sort of randomness that one commonly has in mind when thinking of random numbers, or deciding to use a random-number generator." (Edward N Lorenz, "The Essence of Chaos", 1993)

"Losing streaks and winning streaks occur frequently in games of chance, as they do in real life. Gamblers respond to these events in asymmetric fashion: they appeal to the law of averages to bring losing streaks to a speedy end. And they appeal to that same law of averages to suspend itself so that winning streaks will go on and on. The law of averages hears neither appeal. The last sequence of throws of the dice conveys absolutely no information about what the next throw will bring. Cards, coins, dice, and roulette wheels have no memory." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"The dice and the roulette wheel, along with the stock market and the bond market, are natural laboratories for the study of risk because they lend themselves so readily to quantification; their language is the language of numbers." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"However, random walk theory also tells us that the chance that the balance never returns to zero - that is, that H stays in the lead for ever - is 0. This is the sense in which the 'law of averages' is true. If you wait long enough, then almost surely the numbers of heads and tails will even out. But this fact carries no implications about improving your chances of winning, if you're betting on whether H or T turns up. The probabilities are unchanged, and you don't know how long the 'long run' is going to be. Usually it is very long indeed." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"In everyday language, a fair coin is called random, but not a coin that shows head more often than tail. A coin that keeps a memory of its own record of heads and tails is viewed as even less random. This mental picture is present in the term random walk, especially as used in finance." (Benoit B Mandelbrot, "Fractals and Scaling in Finance: Discontinuity, concentration, risk", 1997)

"The basis of many misconceptions about probability is a belief in something usually referred to as 'the law of averages', which alleges that any unevenness in random events gets ironed out in the long run. For example, if a tossed coin keeps coming up heads, then it is widely believed that at some stage there will be a predominance of tails to balance things out." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"Sequences of random numbers also inevitably display certain regularities. […] The trouble is, just as no real die, coin, or roulette wheel is ever likely to be perfectly fair, no numerical recipe produces truly random numbers. The mere existence of a formula suggests some sort of predictability or pattern." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"For several centuries that we know of, and probably for many centuries before that, flipping a coin (or rolling a die) has been the epitome of probability, the paradigm of randomness. You flip the coin (or roll the die), and nobody can accurately predict how it will fall. Nor can the most powerful computer predict correctly how it will fall, if it is flipped energetically enough. This is why cards, dice, and other gambling aids crop up so often in literature both directly and as metaphors. No doubt it is also the reason for the (perhaps excessive) popularity of gambling as entertainment. If anyone had any idea what numbers the lottery would show, or where the roulette ball will land, the whole industry would be a dead duck." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"From the moment we first roll a die in a children’s board game, or pick a card (any card), we start to learn what probability is. But even as adults, it is not easy to tell what it is, in the general way." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"[...] the chance of a head (or a double six) is just a chance. The whole point of probability is to discuss uncertain eventualities before they occur. After this event, things are completely different. As the simplest illustration of this, note that even though we agree that if we flip a coin and roll two dice then the chance of a head is greater than the chance of a double six, nevertheless it may turn out that the coin shows a tail when the dice show a double six." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"We cannot really have a perfectly shuffled pack of perfect cards; this ‘collection of equally likely hands’ is actually a fiction. We create the idea, and then use the rules of arithmetic to calculate the required chances. This is characteristic of all mathematics, which concerns itself only with rules defining the behaviour of entities which are themselves undefined (such as ‘numbers’ or ‘points’)." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"If sinks, sources, saddles, and limit cycles are coins landing heads or tails, then the exceptions are a coin landing on edge. Yes, it might happen, in theory; but no, it doesn't, in practice." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"It's a bit like having a theory about coins that move in space, but only being able to measure their state by interrupting them with a table. We hypothesize that the coin may be able to revolve in space, a state that is neither ‘heads’ nor ‘tails’ but a kind of mixture. Our experimental proof is that when you stick a table in, you get heads half the time and tails the other half - randomly. This is by no means a perfect analogy with standard quantum theory - a revolving coin is not exactly in a superposition of heads and tails - but it captures some of the flavour." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"The chance events due to deterministic chaos, on the other hand, occur even within a closed system determined by immutable laws. Our most cherished examples of chance - dice, roulette, coin-tossing – seem closer to chaos than to the whims of outside events. So, in this revised sense, dice are a good metaphor for chance after all. It's just that we've refined our concept of randomness. Indeed, the deterministic but possibly chaotic stripes of phase space may be the true source of probability." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"The possibility of translating uncertainties into risks is much more restricted in the propensity view. Propensities are properties of an object, such as the physical symmetry of a die. If a die is constructed to be perfectly symmetrical, then the probability of rolling a six is 1 in 6. The reference to a physical design, mechanism, or trait that determines the risk of an event is the essence of the propensity interpretation of probability. Note how propensity differs from the subjective interpretation: It is not sufficient that someone’s subjective probabilities about the outcomes of a die roll are coherent, that is, that they satisfy the laws of probability. What matters is the die’s design. If the design is not known, there are no probabilities." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns - which frequently occur in random data - could equally well be derived from tossing a coin. So it is in the stock market as well." (Didier Sornette, "Why Stock Markets Crash: Critical events in complex financial systems", 2003)

"Suppose that while flipping a coin, a small black hole passed by and ate the coin. As long as we got to see the coin, the probabilities of heads and tails would add to one, but the possibility of a coin disappearing altogether into a black hole would have to be included. Once the coin crosses the event horizon of the black hole, it simply does not meaningfully exist in our universe anymore. Can we simply adjust our probabilistic interpretation to accommodate this outcome? Will we ever encounter negative probabilities?" (Leon M Lederman & Christopher T Hill, "Symmetry and the Beautiful Universe", 2004)

"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)

"Random number generators do not always need to be symmetrical. This misconception of assuming equal likelihood for each outcome is fostered in a restricted learning environment, where learners see only such situations (that is, dice, coins and spinners). It is therefore very important for learners to be aware of situations where the different outcomes are not equally likely (as with the drawing-pins example)." (Alan Graham, "Developing Thinking in Statistics", 2006)

"The objectivist view is that probabilities are real aspects of the universe - propensities of objects to behave in certain ways - rather than being just descriptions of an observer’s degree of belief. For example, the fact that a fair coin comes up heads with probability 0.5 is a propensity of the coin itself. In this view, frequentist measurements are attempts to observe these propensities. Most physicists agree that quantum phenomena are objectively probabilistic, but uncertainty at the macroscopic scale - e.g., in coin tossing - usually arises from ignorance of initial conditions and does not seem consistent with the propensity view." (Stuart J Russell & Peter Norvig, "Artificial Intelligence: A Modern Approach", 2010)

"A very different - and very incorrect - argument is that successes must be balanced by failures (and failures by successes) so that things average out. Every coin flip that lands heads makes tails more likely. Every red at roulette makes black more likely. […] These beliefs are all incorrect. Good luck will certainly not continue indefinitely, but do not assume that good luck makes bad luck more likely, or vice versa." (Gary Smith, "Standard Deviations", 2014)

"Remember that even random coin flips can yield striking, even stunning, patterns that mean nothing at all. When someone shows you a pattern, no matter how impressive the person’s credentials, consider the possibility that the pattern is just a coincidence. Ask why, not what. No matter what the pattern, the question is: Why should we expect to find this pattern?" (Gary Smith, "Standard Deviations", 2014)

"We are seduced by patterns and we want explanations for these patterns. When we see a string of successes, we think that a hot hand has made success more likely. If we see a string of failures, we think a cold hand has made failure more likely. It is easy to dismiss such theories when they involve coin flips, but it is not so easy with humans. We surely have emotions and ailments that can cause our abilities to go up and down. The question is whether these fluctuations are important or trivial." (Gary Smith, "Standard Deviations", 2014)

"When statisticians, trained in math and probability theory, try to assess likely outcomes, they demand a plethora of data points. Even then, they recognize that unless it’s a very simple and controlled action such as flipping a coin, unforeseen variables can exert significant influence." (Zachary Karabell, "The Leading Indicators: A short history of the numbers that rule our world", 2014)
