26 March 2023

Simone Weil - Collected Quotes

"Our science is like a store filled with the most subtle intellectual devices for solving the most complex problems, and yet we are almost incapable of applying the elementary principles of rational thought. In every sphere, we seem to have lost the very elements of intelligence: the ideas of limit, measure, degree, proportion, relation, comparison, contingency, interdependence, interrelation of means and ends." (Simone Weil, "The Power of Words", 1937) 

"Art is the symbol of the two noblest human efforts: to construct and to refrain from destruction." (Simone Weil, "The Pre-War Notebook 1933-1939")

"Attention consists of suspending our thought, leaving it detached, empty, and ready to be penetrated by the object; it means holding in our minds, within reach of this thought, but on a lower level and not in contact with it, the diverse knowledge we have acquired which we are forced to make use of." (Simone Weil, "Reflections on the Right Use of School Studies with a View to the Love of God", 1942)

"Paradoxical as it may seem, a Latin prose or a geometry problem, even though they are done wrong, may be of a great service one day, provided we devote the right kind of effort to them. Should the occasion arise, they can one day make us better able to give someone in affliction exactly the help required to save him, at the supreme moment of his need." (Simone Weil, "Reflections on the Right Use of School Studies with a View to the Love of God", 1942)

"In order to be exercised, the intelligence requires to be free to express itself without control by any authority. There must therefore be a domain of pure intellectual research, separate but accessible to all, where no authority intervenes." (Simone Weil, "Statement of Human Obligations", 1943)

"There is a reality outside the world, that is to say, outside space and time, outside man's mental universe, outside any sphere whatsoever that is accessible to human faculties. [...] That reality is the unique source of all the good that can exist in this world: that is to say, all beauty, all truth, all justice, all legitimacy, all order, and all human behavior that is mindful of obligations." (Simone Weil, "Draft for a Statement of Human Obligation", 1943)

"Concern for the symbol has completely disappeared from our science. And yet, if one were to give oneself the trouble, one could easily find, in certain parts at least of contemporary mathematics... symbols as clear, as beautiful, and as full of spiritual meaning as that of the circle and mediation. From modern thought to ancient wisdom the path would be short and direct, if one cared to take it." (Simone Weil, "The Need for Roots", 1949)

"The most important part of education - to teach the meaning of to know [in the scientific sense]" (Simone Weil, "Waiting on God", 1950)

"[…] algebra is the intellectual instrument which has been created for rendering clear the quantitative aspects of the world." (Simone Weil, "The Organization of Thought", 1974)

"There are necessities and impossibilities in reality which do not obtain in fiction, any more than the law of gravity to which we are subject controls what is represented in a picture. [...] It is the same with pure good; for a necessity as strong as gravity condemns man to evil and forbids him any good, or only within the narrowest limits and laboriously obtained and soiled and adulterated with evil. [...] The simplicity which makes the fictional good something insipid and unable to hold the attention becomes, in the real good, an unfathomable marvel." (Simone Weil, "Morality and Literature")

16 March 2023

Ivar Ekeland - Collected Quotes

"It is true that every aspect of the roll of dice may be suspect: the dice themselves, the form and texture of the surface, the person throwing them. If we push the analysis to its extreme, we may even wonder what chance has to do with it at all. Neither the course of the dice nor their rebounds rely on chance; they are governed by the strict determinism of rational mechanics. Billiards is based on the same principles, and it has never been considered a game of chance. So in the final analysis, chance lies in the clumsiness, the inexperience, or the naiveté of the thrower - or in the eye of the observer." (Ivar Ekeland, "The Broken Dice, and Other Mathematical Tales of Chance", 1993)

"Whether we shuffle cards or roll dice, chance is only a result of our human lack of deftness: we don't have enough control to immobilize a die at will or to individually direct the cards in a deck. The comparison is an important one nonetheless, and highlights the limits of this method of creating chance - it doesn't matter who rolls the dice, but we wouldn't let just anyone shuffle the cards." (Ivar Ekeland, "The Broken Dice, and Other Mathematical Tales of Chance", 1993)

"A pendulum is simply a small load suspended to a string or to a rod fixed at one end. If left alone it ends up hanging vertically, and if we push it away from the vertical, it starts beating. Galileo found that all beats last the same time, called the period, which depends on the length of the pendulum, but not on the amplitude of the beats or on the weight of the load. It also states that the period varies as the square root of the length: to double its period, one should make the pendulum four times as long. Making it heavier, or pushing it farther away from the vertical, has no effect. This property is known as isochrony, and it is the main reason why we are able to measure time with accuracy." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"An equilibrium is not always an optimum; it might not even be good. This may be the most important discovery of game theory." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"Chaos cuts with two edges. We have seen how it is impossible to retrieve past history from current observations. We will now show that it is impossible to predict future states from the current observations." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"It is a testimony to the power of education that classical mechanics could operate for so long under a mistaken conception. Teaching and research concentrated on integrable systems, each feeding the other, until in the end we had no longer the tools nor the interest for studying nonintegrable systems." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"Nowadays, however, we are much more aware of the fact that the best proof in the world is worth no more than its premises: every scientific theory is transitory and provisional, in wait for a better one, and accepted only as long as the experimental results conform to its predictions." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"The measurement of time was the first example of a scientific discovery changing the technology." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

"We do not discover mathematical truths; we remember them from our passages through this world outside our own." (Ivar Ekeland, "The Best of All Possible Worlds", 2006)

15 March 2023

Roger J Barlow - Collected Quotes

"Averaging results, whether weighted or not, needs to be done with due caution and commonsense. Even though a measurement has a small quoted error it can still be, not to put too fine a point on it, wrong. If two results are in blatant and obvious disagreement, any average is meaningless and there is no point in performing it. Other cases may be less outrageous, and it may not be clear whether the difference is due to incompatibility or just unlucky chance." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"In everyday life, 'estimation' means a rough and imprecise procedure leading to a rough and imprecise result. You 'estimate' when you cannot measure exactly. In statistics, on the other hand, 'estimation' is a technical term. It means a precise and accurate procedure, leading to a result which may be imprecise, but where at least the extent of the imprecision is known. It has nothing to do with approximation. You have some data, from which you want to draw conclusions and produce a 'best' value for some particular numerical quantity (or perhaps for several quantities), and you probably also want to know how reliable this value is, i.e. what the error is on your estimate." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"Least squares' means just what it says: you minimise the (suitably weighted) squared difference between a set of measurements and their predicted values. This is done by varying the parameters you want to estimate: the predicted values are adjusted so as to be close to the measurements; squaring the differences means that greater importance is placed on removing the large deviations." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"Probabilities are pure numbers. Probability densities, on the other hand, have dimensions, the inverse of those of the variable x to which they apply." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"Science is supposed to explain to us what is actually happening, and indeed what will happen, in the world. Unfortunately as soon as you try and do something useful with it, sordid arithmetical numbers start getting in the way and messing up the basic scientific laws." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"Statistics is a tool. In experimental science you plan and carry out experiments, and then analyse and interpret the results. To do this you use statistical arguments and calculations. Like any other tool - an oscilloscope, for example, or a spectrometer, or even a humble spanner - you can use it delicately or clumsily, skillfully or ineptly. The more you know about it and understand how it works, the better you will be able to use it and the more useful it will be." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"Subjective probability, also known as Bayesian statistics, pushes Bayes' theorem further by applying it to statements of the type described as 'unscientific' in the frequency definition. The probability of a theory (e.g. that it will rain tomorrow or that parity is not violated) is considered to be a subjective 'degree of belief - it can perhaps be measured by seeing what odds the person concerned will offer as a bet. Subsequent experimental evidence then modifies the initial degree of belief, making it stronger or weaker according to whether the results agree or disagree with the predictions of the theory in question." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"The principle of maximum likelihood is not a rule that requires justification - it does not need to be proved. It is merely a sensible way of producing an estimator. But although the name 'maximum likelihood' has a nice ring to it - it suggest that your estimate is the 'most likely' value - this is an unfair interpretation; it is the estimate that makes your data most likely - another thing altogether." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"There is a technical difference between a bar chart and a histogram in that the number represented is proportional to the length of bar in the former and the area in the latter. This matters if non-uniform binning is used. Bar charts can be used for qualitative or quantitative data, whereas histograms can only be used for quantitative data, as no meaning can be attached to the width of the bins if the data are qualitative." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989) 

"There is an obvious parallel between an expectation value and the mean of a data sample. The former is a sum over a theoretical probability distribution and the latter is a (similar) sum over a real data sample. The law of large numbers ensures that if a data sample is described by a theoretical distribution, then as N, the size of the data sample, goes to infinity […]." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

"When you want to use some data to give the answer to a question, the first step is to formulate the question precisely by expressing it as a hypothesis. Then you consider the consequences of that hypothesis, and choose a suitable test to apply to the data. From the result of the test you accept or reject the hypothesis according to prearranged criteria. This cannot be infallible, and there is always a chance of getting the wrong answer, so you try and reduce the chance of such a mistake to a level which you consider reasonable." (Roger J Barlow, "Statistics: A guide to the use of statistical methods in the physical sciences", 1989)

12 March 2023

On Heuristics IV

"Design problems - generating or discovering alternatives - are complex largely because they involve two spaces, an action space and a state space, that generally have completely different structures. To find a design requires mapping the former of these on the latter. For many, if not most, design problems in the real world systematic algorithms are not known that guarantee solutions with reasonable amounts of computing effort. Design uses a wide range of heuristic devices - like means-end analysis, satisficing, and the other procedures that have been outlined - that have been found by experience to enhance the efficiency of search. Much remains to be learned about the nature and effectiveness of these devices." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)

"Heuristics are rules of thumb that help constrain the problem in certain ways (in other words they help you to avoid falling back on blind trial and error), but they don't guarantee that you will find a solution. Heuristics are often contrasted with algorithms that will guarantee that you find a solution - it may take forever, but if the problem is algorithmic you will get there. However, heuristics are also algorithms." (S Ian Robertson, "Problem Solving", 2001)

"A heuristic is defined as a simple rule that exploits both evolved abilities to act fast and struc- tures of the environment to act accurately and frugally. The complexity and uncertainty of an environment cannot be determined independently of the actor. What matters is the degree of complexity and uncertainty encountered by the decision maker." (Christoph Engel & Gerd Gigerenzer, "Law and Heuristics: An interdisciplinary venture" [in "Heuristics and the Law", 2006)

"Heuristics are needed in situations where the world does not permit optimization. For many real-world problems (as opposed to optimization-tuned textbook problems), optimal solutions are unknown because the problems are computationally intractable or poorly defined." (Christoph Engel & Gerd Gigerenzer, "Law and Heuristics: An interdisciplinary venture" [in "Heuristics and the Law", 2006)

"Perception and memory are imprecise filters of information, and the way in which information is presented, that is, the frame, influences how it is received. Because too much information is difficult to deal with, people have developed shortcuts or heuristics in order to come up with reasonable decisions. Unfortunately, sometimes these heuristics lead to bias, especially when used outside their natural domains." (Lucy F Ackert & Richard Deaves, "Behavioral Finance: Psychology, Decision-Making, and Markets", 2010)

"A rule of thumb, or heuristic, enables us to make a decision fast, without much searching for information, but nevertheless with high accuracy. [...] A heuristic can be safer and more accurate than a calculation, and the same heuristic can underlie both conscious and unconscious decisions." (Gerd Gigerenzer, "Risk Savvy: How to make good decisions", 2014)

"A heuristic is a strategy that ignores part of the information, with the goal of making decisions more quickly, frugally, and/or accurately than more complex methods." (Gerd Gigerenzer et al, "Simply Rational: Decision Making in the Real World", 2015)

"Probability theory is not the only tool for rationality. In situations of uncertainty, as opposed to risk, simple heuristics can lead to more accurate judgments, in addition to being faster and more frugal. Under uncertainty, optimal solutions do not exist (except in hindsight) and, by definition, cannot be calculated. Thus, it is illusory to model the mind as a general optimizer, Bayesian or otherwise. Rather, the goal is to achieve satisficing solutions, such as meeting an aspiration level or coming out ahead of a competitor."  (Gerd Gigerenzer et al, "Simply Rational: Decision Making in the Real World", 2015)

"Judgments made in difficult circumstances can be based on a limited number of simple, rapidly-arrived-at rules ('heuristics'), rather than formal, extensive algorithmic calculus and programs. Often, even complex problems can be solved quickly and accurately using such 'quick and dirty' heuristics. However, equally often, such heuristics can be beset by systematic errors or biases." (Jérôme Boutang & Michel De Lara, "The Biased Mind", 2016)

11 March 2023

Gerd Gigerenzer - Collected Quotes

"A heuristic is ecologically rational to the degree that it is adapted to the structure of an environment. Thus, simple heuristics and environmental structure can both work hand in hand to provide a realistic alternative to the ideal of optimization, whether unbounded or constrained." (Gerd Gigerenzer & Peter M Todd, "Fast and Frugal Heuristics: The Adaptive Toolbox" [in "Simple Heuristics That Make Us Smart"], 1999)

"Fast and frugal heuristics employ a minimum of time, knowledge, and computation to make adaptive choices in real environments. They can be used to solve problems of sequential search through objects or options, as in satisficing. They can also be used to make choices between simultaneously available objects, where the search for information (in the form of cues, features, consequences, etc.) about the possible options must be limited, rather than the search for the options themselves. Fast and frugal heuristics limit their search of objects or information using easily computable stopping rules, and they make their choices with easily computable decision rules." (Gerd Gigerenzer & Peter M Todd, "Fast and Frugal Heuristics: The Adaptive Toolbox" [in "Simple Heuristics That Make Us Smart"], 1999)

"At present, high school education in some countries covers only little, if any, statistical thinking. Algebra, geometry, and calculus teach thinking in a world of certainty - not in the real world, which is uncertain. [...] Furthermore, in the medical and social sciences, data analysis is typically taught as a set of statistical rituals rather than a set of methods for statistical thinking." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"External representations are not just passive inputs to an active mind. They can do part of the reasoning or calculation, making relevant properties of the same information more accessible." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Good numeric representation is a key to effective thinking that is not limited to understanding risks. Natural languages show the traces of various attempts at finding a proper representation of numbers. [...] The key role of representation in thinking is often downplayed because of an ideal of rationality that dictates that whenever two statements are mathematically or logically the same, representing them in different forms should not matter. Evidence that it does matter is regarded as a sign of human irrationality. This view ignores the fact that finding a good representation is an indispensable part of problem solving and that playing with different representations is a tool of creative thinking." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Ignorance of relevant risks and miscommunication of those risks are two aspects of innumeracy. A third aspect of innumeracy concerns the problem of drawing incorrect inferences from statistics. This third type of innumeracy occurs when inferences go wrong because they are clouded by certain risk representations. Such clouded thinking becomes possible only once the risks have been communicated." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"In my view, the problem of innumeracy is not essentially 'inside' our minds as some have argued, allegedly because the innate architecture of our minds has not evolved to deal with uncertainties. Instead, I suggest that innumeracy can be traced to external representations of uncertainties that do not match our mind’s design - just as the breakdown of color constancy can be traced to artificial illumination. This argument applies to the two kinds of innumeracy that involve numbers: miscommunication of risks and clouded thinking. The treatment for these ills is to restore the external representation of uncertainties to a form that the human mind is adapted to." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Information needs representation. The idea that it is possible to communicate information in a 'pure' form is fiction. Successful risk communication requires intuitively clear representations. Playing with representations can help us not only to understand numbers (describe phenomena) but also to draw conclusions from numbers (make inferences). There is no single best representation, because what is needed always depends on the minds that are doing the communicating." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Natural frequencies facilitate inferences made on the basis of numerical information. The representation does part of the reasoning, taking care of the multiplication the mind would have to perform if given probabilities. In this sense, insight can come from outside the mind." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Overcoming innumeracy is like completing a three-step program to statistical literacy. The first step is to defeat the illusion of certainty. The second step is to learn about the actual risks of relevant events and actions. The third step is to communicate the risks in an understandable way and to draw inferences without falling prey to clouded thinking. The general point is this: Innumeracy does not simply reside in our minds but in the representations of risk that we choose." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Statistical innumeracy is the inability to think with numbers that represent uncertainties. Ignorance of risk, miscommunication of risk, and clouded thinking are forms of innumeracy. Like illiteracy, innumeracy is curable. Innumeracy is not simply a mental defect 'inside' an unfortunate mind, but is in part produced by inadequate 'outside' representations of numbers. Innumeracy can be cured from the outside." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"The creation of certainty seems to be a fundamental tendency of human minds. The perception of simple visual objects reflects this tendency. At an unconscious level, our perceptual systems automatically transform uncertainty into certainty, as depth ambiguities and depth illusions illustrate." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002) 

"The key role of representation in thinking is often downplayed because of an ideal of rationality that dictates that whenever two statements are mathematically or logically the same, representing them in different forms should not matter. Evidence that it does matter is regarded as a sign of human irrationality. This view ignores the fact that finding a good representation is an indispensable part of problem solving and that playing with different representations is a tool of creative thinking." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"The possibility of translating uncertainties into risks is much more restricted in the propensity view. Propensities are properties of an object, such as the physical symmetry of a die. If a die is constructed to be perfectly symmetrical, then the probability of rolling a six is 1 in 6. The reference to a physical design, mechanism, or trait that determines the risk of an event is the essence of the propensity interpretation of probability. Note how propensity differs from the subjective interpretation: It is not sufficient that someone’s subjective probabilities about the outcomes of a die roll are coherent, that is, that they satisfy the laws of probability. What matters is the die’s design. If the design is not known, there are no probabilities." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"When does an uncertainty qualify as a risk? The answer depends on one’s interpretation of probability, of which there are three major versions: degree of belief, propensity, and frequency. Degrees of belief are sometimes called subjective probabilities. Of the three interpretations of probability, the subjective interpretation is most liberal about expressing uncertainties as quantitative probabilities, that is, risks. Subjective probabilities can be assigned even to unique or novel events." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"When natural frequencies are transformed into conditional probabilities, the base rate information is taken out (this is called normalization). The benefit of this normalization is that the resulting values fall within the uniform range of 0 and 1. The cost, however, is that when drawing inferences from probabilities (as opposed to natural frequencies), one has to put the base rates back in by multiplying the conditional probabilities by their respective base rates." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Why does representing information in terms of natural frequencies rather than probabilities or percentages foster insight? For two reasons. First, computational simplicity: The representation does part of the computation. And second, evolutionary and developmental primacy: Our minds are adapted to natural frequencies." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"A heuristic is defined as a simple rule that exploits both evolved abilities to act fast and struc- tures of the environment to act accurately and frugally. The complexity and uncertainty of an environment cannot be determined independently of the actor. What matters is the degree of complexity and uncertainty encountered by the decision maker." (Christoph Engel & Gerd Gigerenzer, "Law and Heuristics: An interdisciplinary venture" [in "Heuristics and the Law", 2006)

"Heuristics are needed in situations where the world does not permit optimization. For many real-world problems (as opposed to optimization-tuned textbook problems), optimal solutions are unknown because the problems are computationally intractable or poorly defined." (Christoph Engel & Gerd Gigerenzer, "Law and Heuristics: An interdisciplinary venture" [in "Heuristics and the Law", 2006)

"[...] a gut feeling is not caprice or a sixth sense. An executive may be buried under a mountain of information, some of which is contradictory, some of questionable reliability, and some that makes one wonder why it was sent in the first place. But despite the excess of data, there is no algorithm to calculate the best decision. In this situation, an experienced executive may have a feeling of what the best action is - a gut feeling. By definition, the reasons behind this feeling are unconscious." (Gerd Gigerenzer, "Risk Savvy: How to make good decisions", 2014)

"A rule of thumb, or heuristic, enables us to make a decision fast, without much searching for information, but nevertheless with high accuracy. [...] A heuristic can be safer and more accurate than a calculation, and the same heuristic can underlie both conscious and unconscious decisions." (Gerd Gigerenzer, "Risk Savvy: How to make good decisions", 2014)

"Probability theory provides the best answer only when the rules of the game are certain, when all alternatives, consequences, and probabilities are known or can be calculated. [...] In the real game, probability theory is not enough. Good intuitions are needed, which can be more challenging than calculations. One way to reduce uncertainty is to rely on rules of thumb." (Gerd Gigerenzer, "Risk Savvy: How to make good decisions", 2014)

"Teaching statistical thinking means giving people tools for problem solving in the real world. It should not be taught as pure mathematics. Instead of mechanically solving a dozen problems with the help of a particular formula, children and adolescents should be asked to find solutions to real-life problems. That’s what teaches them how to solve problems, and also shows that there may be more than one good answer in the first place. Equally important is encouraging curiosity, such as asking for answers to questions by doing experiments." (Gerd Gigerenzer, "Risk Savvy: How to make good decisions", 2014)

"The taming of chance created mathematical probability. [...] Probability is not one of a kind; it was born with three faces: frequency, physical design, and degrees of belief. [...] n the first of its identities, probability is about counting. [...] Second, probability is about constructing. For example, if a die is constructed to be perfectly symmetrical, then the probability of rolling a six is one in six. You don’t have to count. [...] Probabilities by design are called propensities. Historically, games of chance were the prototype for propensity. These risks are known because people crafted, not counted, them. [...] Third, probability is about degrees of belief. A degree of belief can be based on anything from experience to personal impression." (Gerd Gigerenzer, "Risk Savvy: How to make good decisions", 2014)

"There is a mathematical theory that tells us why and when simple is better. It is called the bias-variance dilemma. [...] How far we go in simplifying depends on three features. First, the more uncertainty, the more we should make it simple. The less uncertainty, the more complex it should be. [...] Second, the more alternatives, the more we should simplify; the fewer, the more complex it can be. The reason is that complex methods need to estimate risk factors, and more alternatives mean that more factors need to be estimated, which leads to more estimation errors being made. [...] Finally, the more past data there are, the more beneficial for the complex methods." (Gerd Gigerenzer, "Risk Savvy: How to make good decisions", 2014)

"To calculate a risk is one thing, to communicate it is another. Risk communication is an important skill for laypeople and experts alike. Because it’s rarely taught, misinterpreting numbers is the rule rather than the exception. Each of the three kinds of probability - relative frequency, design, or degree of belief - can be framed in either a confusing or a transparent way. So far we have seen two communication tools for reporting risks: Use frequencies instead of single-event probabilities. Use absolute instead of relative risks."  (Gerd Gigerenzer, "Risk Savvy: How to make good decisions", 2014)

"A heuristic is a strategy that ignores part of the information, with the goal of making decisions more quickly, frugally, and/or accurately than more complex methods." (Gerd Gigerenzer et al, "Simply Rational: Decision Making in the Real World", 2015)

"Probability theory is not the only tool for rationality. In situations of uncertainty, as opposed to risk, simple heuristics can lead to more accurate judgments, in addition to being faster and more frugal. Under uncertainty, optimal solutions do not exist (except in hindsight) and, by definition, cannot be calculated. Thus, it is illusory to model the mind as a general optimizer, Bayesian or otherwise. Rather, the goal is to achieve satisficing solutions, such as meeting an aspiration level or coming out ahead of a competitor."  (Gerd Gigerenzer et al, "Simply Rational: Decision Making in the Real World", 2015)

On Statistics: Skewness/Kurtosis

"It is common for positive data to be skewed to the right: some values bunch together at the low end of the scale and others trail off to the high end with increasing gaps between the values as they get higher. Such data can cause severe resolution problems on graphs, and the common remedy is to take logarithms. Indeed, it is the frequent success of this remedy that partly accounts for the large use of logarithms in graphical data display." (William S Cleveland, "The Elements of Graphing Data", 1985)

"Skewness is a measure of symmetry. For example, it's zero for the bell-shaped normal curve, which is perfectly symmetric about its mean. Kurtosis is a measure of the peakedness, or fat-tailedness, of a distribution. Thus, it measures the likelihood of extreme values." (John L Casti, "Reality Rules: Picturing the world in mathematics", 1992)

"Data that are skewed toward large values occur commonly. Any set of positive measurements is a candidate. Nature just works like that. In fact, if data consisting of positive numbers range over several powers of ten, it is almost a guarantee that they will be skewed. Skewness creates many problems. There are visualization problems. A large fraction of the data are squashed into small regions of graphs, and visual assessment of the data degrades. There are characterization problems. Skewed distributions tend to be more complicated than symmetric ones; for example, there is no unique notion of location and the median and mean measure different aspects of the distribution. There are problems in carrying out probabilistic methods. The distribution of skewed data is not well approximated by the normal, so the many probabilistic methods based on an assumption of a normal distribution cannot be applied." (William S Cleveland, "Visualizing Data", 1993)

"The logarithm is one of many transformations that we can apply to univariate measurements. The square root is another. Transformation is a critical tool for visualization or for any other mode of data analysis because it can substantially simplify the structure of a set of data. For example, transformation can remove skewness toward large values, and it can remove monotone increasing spread. And often, it is the logarithm that achieves this removal." (William S Cleveland, "Visualizing Data", 1993)

"When the distributions of two or more groups of univariate data are skewed, it is common to have the spread increase monotonically with location. This behavior is monotone spread. Strictly speaking, monotone spread includes the case where the spread decreases monotonically with location, but such a decrease is much less common for raw data. Monotone spread, as with skewness, adds to the difficulty of data analysis. For example, it means that we cannot fit just location estimates to produce homogeneous residuals; we must fit spread estimates as well. Furthermore, the distributions cannot be compared by a number of standard methods of probabilistic inference that are based on an assumption of equal spreads; the standard t-test is one example. Fortunately, remedies for skewness can cure monotone spread as well." (William S Cleveland, "Visualizing Data", 1993)

"Use a logarithmic scale when it is important to understand percent change or multiplicative factors. […] Showing data on a logarithmic scale can cure skewness toward large values." (Naomi B Robbins, "Creating More effective Graphs", 2005)

"Before calculating a confidence interval for a mean, first check that one of the situations just described holds. To determine whether the data are bell-shaped or skewed, and to check for outliers, plot the data using a histogram, dotplot, or stemplot. A boxplot can reveal outliers and will sometimes reveal skewness, but it cannot be used to determine the shape otherwise. The sample mean and median can also be compared to each other. Differences between the mean and the median usually occur if the data are skewed - that is, are much more spread out in one direction than in the other." (Jessica M Utts & Robert F Heckard, "Mind on Statistics", 2007)

"Symmetry and skewness can be judged, but boxplots are not entirely useful for judging shape. It is not possible to use a boxplot to judge whether or not a dataset is bell-shaped, nor is it possible to judge whether or not a dataset may be bimodal." (Jessica M Utts & Robert F Heckard, "Mind on Statistics", 2007)

"Given the important role that correlation plays in structural equation modeling, we need to understand the factors that affect establishing relationships among multivariable data points. The key factors are the level of measurement, restriction of range in data values (variability, skewness, kurtosis), missing data, nonlinearity, outliers, correction for attenuation, and issues related to sampling variation, confidence intervals, effect size, significance, sample size, and power." (Randall E Schumacker & Richard G Lomax, "A Beginner’s Guide to Structural Equation Modeling" 3rd Ed., 2010)

"A histogram represents the frequency distribution of the data. Histograms are similar to bar charts but group numbers into ranges. Also, a histogram lets you show the frequency distribution of continuous data. This helps in analyzing the distribution (for example, normal or Gaussian), any outliers present in the data, and skewness." (Umesh R Hodeghatta & Umesha Nayak, "Business Analytics Using R: A Practical Approach", 2017)

"New information is constantly flowing in, and your brain is constantly integrating it into this statistical distribution that creates your next perception (so in this sense 'reality' is just the product of your brain’s ever-evolving database of consequence). As such, your perception is subject to a statistical phenomenon known in probability theory as kurtosis. Kurtosis in essence means that things tend to become increasingly steep in their distribution [...] that is, skewed in one direction. This applies to ways of seeing everything from current events to ourselves as we lean 'skewedly' toward one interpretation, positive or negative. Things that are highly kurtotic, or skewed, are hard to shift away from. This is another way of saying that seeing differently isn’t just conceptually difficult - it’s statistically difficult." (Beau Lotto, "Deviate: The Science of Seeing Differently", 2017)

"Many statistical procedures perform more effectively on data that are normally distributed, or at least are symmetric and not excessively kurtotic (fat-tailed), and where the mean and variance are approximately constant. Observed time series frequently require some form of transformation before they exhibit these distributional properties, for in their 'raw' form they are often asymmetric." (Terence C. Mills, "Applied Time Series Analysis: A practical guide to modeling and forecasting", 2019)

"With skewed data, quantiles will reflect the skew, while adding standard deviations assumes symmetry in the distribution and can be misleading." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)

"Skewed data means data that is shifted in one direction or the other. Skewness can cause machine learning models to underperform. Many machine learning models assume normally distributed data or data structures to follow the Gaussian structure. Any deviation from the assumed Gaussian structure, which is the popular bell curve, can affect model performance. A very effective area where we can apply feature engineering is by looking at the skewness of data and then correcting the skewness through normalization of the data." (Anthony So et al, "The Data Science Workshop" 2nd Ed., 2020)

On Fractals IV

"Fractal geometry will make you see everything differently. There is danger in reading further. You risk the loss of your childhood vision of clouds, forests, flowers, galaxies, leaves, feathers, rocks, mountains, torrents of water, carpets, bricks, and much else besides. Never again will your interpretation of these things be quite the same." (Michael F Barnsley, "Fractals Everywhere" 1988)

"Very often a strange attractor is a fractal object, whose geometric structure is invariant under the time evolution maps."  (David Ruelle, "Chaotic Evolution and Strange Attractors: The statistical analysis of time series for deterministic nonlinear systems", 1989)

"The unifying concept underlying fractals, chaos, and power laws is self-similarity. Self-similarity, or invariance against changes in scale or size, is an attribute of many laws of nature and innumerable phenomena in the world around us. Self-similarity is, in fact, one of the decisive symmetries that shape our universe and our efforts to comprehend it." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"[…] the world is not complete chaos. Strange attractors often do have structure: like the Sierpinski gasket, they are self-similar or  approximately so. And they have fractal dimensions that hold important clues for our attempts to understand chaotic systems such as the weather." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"First, strange attractors look strange: they are not smooth curves or surfaces but have 'non-integer dimension' - or, as Benoit Mandelbrot puts it, they are fractal objects. Next, and more importantly, the motion on a strange attractor has sensitive dependence on initial condition. Finally, while strange attractors have only finite dimension, the time-frequency analysis reveals a continuum of frequencies." (David Ruelle, "Chance and Chaos", 1991)

"[…] chaos and fractals are part of an even grander subject known as dynamics. This is the subject that deals with change, with systems that evolve in time. Whether the system in question settles down to equilibrium, keeps repeating in cycles, or does something more complicated, it is dynamics that we use to analyze the behavior." (Steven H Strogatz, "Non-Linear Dynamics and Chaos, 1994)

"What is renormalization? First of all, if scaling is present we can go to smaller scales and get exactly the same result. In a sense we are looking at the system with a microscope of increasing power. If you take the limit of such a process you get a stability that is not otherwise present. In short, in the renormalized system, the self-similarity is exact, not approximate as it usually is. So renormalization gives stability and exactness." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"Mathematics is sometimes described as the science which generates eternal notions and concepts for the scientific method: derivatives‚ continuity‚ powers‚ logarithms are examples. The notions of chaos‚ fractals and strange attractors are not yet mathematical notions in that sense‚ because their final definitions are not yet agreed upon." (Heinz-Otto Peitgen et al, "Chaos and Fractals: New Frontiers of Science", 2004)

"In the telephone system a century ago, messages dispersed across the network in a pattern that mathematicians associate with randomness. But in the last decade, the flow of bits has become statistically more similar to the patterns found in self-organized systems. For one thing, the global network exhibits self-similarity, also known as a fractal pattern. We see this kind of fractal pattern in the way the jagged outline of tree branches look similar no matter whether we look at them up close or far away. Today messages disperse through the global telecommunications system in the fractal pattern of self-organization." (Kevin Kelly, "What Technology Wants", 2010)

"The theory of fractality is of importance from two distinct but related points of view: its origins and its results. Fractals are the fruit of the breaking down of traditional thought and philosophy that had governed mathematics and the sciences for centuries. They in turn had a revolutionary effect on diverse  sciences, mathematics, thought and arts in a very short period  of time. They upended linear philosophical conceptions of true or false, high or low, ordered or disordered, beautiful or ugly." (Mehrdad Garousi, "The Postmodern Beauty of Fractals", Leonardo Vol. 45 (1), 2012)

10 March 2023

On Chaos: Does God Play Dice?

"God's dice always have a lucky roll." (Sophocles, 5th century BC)

"A perfect equity adjusts its balance in all parts of life. The dice of God are always loaded. The world looks like a multiplication-table, or a mathematical equation, which, turn it how you will, balances itself." (Ralph W Emerson, "Compensation", 1841)

"It seems hard to sneak a look at God's cards. But that He plays dice and uses 'telepathic' methods [...] is something that I cannot believe for a single moment." (Albert Einstein, [Letter to Cornel Lanczos] 1942)

"You believe in the God who plays dice, and I in complete law and order in a world that objectively exists, and which I, in a wildly speculative way, am trying to capture. [...] Even the great initial success of the quantum theory does not make me believe in the fundamental dice-game, although I am well aware that our younger colleagues interpret this as a consequence of senility. No doubt the day will come when we will see whose instinctive attitude was the correct one." (Albert Einstein, [Letter to Max Born] 1944)

"If God has made the world a perfect mechanism, He has at least conceded so much to our imperfect intellect that in order to predict little parts of it, we need not solve innumerable differential equations, but can use dice with fair success." (Max Born, "Albert Einstein: Philosopher-Scientist", 1949)

"Consideration of particle emission from black holes would seem to suggest that God not only plays dice, but also sometimes throws them where they cannot be seen." (Stephen Hawking, "The Quantum Mechanics of Black Holes", Scientific American, 1977)

"Not only does God play dice with the world - He does not let us see what He has rolled." (Stanisław Lem, "Imaginary Magnitude", 1981)

"Specialists [...] are slowly coming to the realization that the universe is biased and leans to the left. [...] Many scientists have come to believe that this odd state of affairs has somethittg to do with the weak nuclear force. It seems that the weak force tends to impart a left-handed spin to electrons, and this effect may bias some kinds of molecular synthesis to the left. [...] But scientific speculation of this ilk leads to a deeper question. Was it purely a matter of chance that left-handedness became the preferred direction in our universe, or is there some reason behind it? Did the sinister bent of existence that scientists have observed stem from a roll of the dice, or is God a semiambidextrous southpaw?" (Malcolm W Browne, 1986)

"So Einstein was wrong when he said, 'God does not play dice'. Consideration of black holes suggests, not only that God does play dice, but that he sometimes confuses us by throwing them where they can't be seen." (Stephen Hawking, 1994)

"Yet, Einstein's theories are also not the last word: quantum theory and relativity are inconsistent, and Einstein himself, proclaiming that 'God does not play dice!', rejected the basic reliance of quantum theory on chance events, and looked forward to a theory which would be deterministic. Recent experiments suggest that this view of Einstein's conflicts with his other deeply held beliefs about the nature of the physical universe. Certain it is that somewhere, beyond physicists' current horizons, are even more powerful theories of how the world is." (David Wells, "You Are a Mathematician: A wise and witty introduction to the joy of numbers", 1995)

"Chaos teaches us that anybody, God or cat, can play dice deterministically, while the naïve onlooker imagines that something random is going on." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 1997)

"Indeed a deterministic die behaves very much as if it has six attractors, the steady states corresponding to its six faces, all of whose basins are intertwined. For technical reasons that can't quite be true, but it is true that deterministic systems with intertwined basins are wonderful substitutes for dice; in fact they're super-dice, behaving even more ‘randomly’ - apparently - than ordinary dice. Super-dice are so chaotic that they are uncomputable. Even if you know the equations for the system perfectly, then given an initial state, you cannot calculate which attractor it will end up on. The tiniest error of approximation - and there will always be such an error - will change the answer completely." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 1997)

"Perhaps God can play dice, and create a universe of complete law and order, in the same breath." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 1997)

"Simple laws may not produce simple behaviour. Deterministic laws can produce behaviour that appears random. Order can breed its own kind of chaos. The question is not so much whether God plays dice, but how God plays dice.", 1997)

"The chance events due to deterministic chaos, on the other hand, occur even within a closed system determined by immutable laws. Our most cherished examples of chance - dice, roulette, coin-tossing - seem closer to chaos than to the whims of outside events. So, in this revised sense, dice are a good metaphor for chance after all. It's just that we've refined our concept of randomness. Indeed, the deterministic but possibly chaotic stripes of phase space may be the true source of probability." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 1997)

"[…] we would like to observe that the butterfly effect lies at the root of many events which we call random. The final result of throwing a dice depends on the position of the hand throwing it, on the air resistance, on the base that the die falls on, and on many other factors. The result appears random because we are not able to take into account all of these factors with sufficient accuracy. Even the tiniest bump on the table and the most imperceptible move of the wrist affect the position in which the die finally lands. It would be reasonable to assume that chaos lies at the root of all random phenomena." (Iwo Bialynicki-Birula & Iwona Bialynicka-Birula, "Modeling Reality: How Computers Mirror Life", 2004)

"Freedom, then, is the ability to choose at will from the greatest number of options available to us. In other words, entropy is freedom; and the equal opportunity (rather than equality per se) that maximizes the number of options available is the second law of thermodynamics. When the number of options available to us is infinite, choice becomes random and the microstate in which we exist is our fate that is determined by God’s game of dice." (Oded Kafri & Hava Kafri, "Entropy: God's Dice Game", 2013)

"There is no such thing as randomness. No one who could detect every force operating on a pair of dice would ever play dice games, because there would never be any doubt about the outcome. The randomness, such as it is, applies to our ignorance of the possible outcomes. It doesn’t apply to the outcomes themselves. They are 100% determined and are not random in the slightest. Scientists have become so confused by this that they now imagine that things really do happen randomly, i.e. for no reason at all." (Thomas Stark, "God Is Mathematics: The Proofs of the Eternal Existence of Mathematics", 2018)

"God may not play dice with the universe, but something strange is going on with the prime numbers." (Paul Erdos)

"God plays dice with the universe, but they’re loaded dice. And the main objective of physics now is to find out by what rules were they loaded and how can we use them for our own ends." (Joseph Ford)

"I shall never believe that God plays dice with the world." (Albert Einstein)

Alexander von Humboldt - Collected Quotes

"Whatever relates to extent and quantity may be represented by geometrical figures. Statistical projections which speak to the senses w...