30 August 2022

On Risk II: Trivia

"A mind that questions everything, unless strong enough to bear the weight of its ignorance, risks questioning itself and being engulfed in doubt." (Émile Durkheim, "Suicide: A Study in Sociology", 1897)

"It is a matter of primary importance in the cultivation of those sciences in which truth is discoverable by the human intellect that the investigator should be free, independent, unshackled in his movement; that he should be allowed and enabled to fix his mind intently, nay, exclusively, on his special object, without the risk of being distracted every other minute in the process and progress of his inquiry by charges of temerariousness, or by warnings against extravagance or scandal." (John H Newman, "The Idea of a University Defined and Illustrated", 1905)

"The final truth about phenomena resides in the mathematical description of it; so long as there is no imperfection in this, our knowledge is complete. We go beyond the mathematical formula at our own risk; we may find a [nonmathematical] model or picture that helps us to understand it, but we have no right to expect this, and our failure to find such a model or picture need not indicate that either our reasoning or our knowledge is at fault." (James Jeans, "The Mysterious Universe", 1930)

"It is easy to obtain confirmations, or verifications, for nearly every theory - if we look for confirmations. Confirmations should count only if they are the result of risky predictions. […] A theory which is not refutable by any conceivable event is non-scientific. Irrefutability is not a virtue of a theory (as people often think) but a vice. Every genuine test of a theory is an attempt to falsify it, or refute it." (Karl R Popper,"Conjectures and Refutations: The Growth of Scientific Knowledge", 1963)

"Taking no action to solve these problems is equivalent of taking strong action. Every day of continued exponential growth brings the world system closer to the ultimate limits of that growth. A decision to do nothing is a decision to increase the risk of collapse." (Donella Meadows et al, "The Limits to Growth", 1972)

"Demonstrative reasoning differs from plausible reasoning just as the fact differs from the supposition, just as actual existence differs from the possibility of existence. Demonstrative reasoning is reliable, incontrovertible and final. Plausible reasoning is conditional, arguable and oft-times risky." (Yakov Khurgin, "Did You Say Mathematics?", 1974)

"In reasoning, as in every other activity, it is, of course, easy to fall into error. In order to reduce this risk, at least to some extent, it is useful to support intuition with suitable superstructures: in this case, the superstructure is logic (or, to be precise, the logic of certainty)." (Bruno de Finetti, "Theory of Probability", 1974)

"When you are confronted by any complex social system […] with things about it that you’re dissatisfied with and anxious to fix, you cannot just step in and set about fixing with much hope of helping. This realization is one of the sore discouragements of our century […] You cannot meddle with one part of a complex system from the outside without the almost certain risk of setting off disastrous events that you hadn’t counted on in other, remote parts. If you want to fix something you are first obliged to understand […] the whole system. […] Intervening is a way of causing trouble." (Lewis Thomas, "The Medusa and the Snail: More Notes of a Biology Watcher", 1974)

"I find it more difficult, but also much more fun, to get the right answer by indirect reasoning and before all the evidence is in. It’s what a theoretician does in science. But the conclusions drawn in this way are obviously more risky than those drawn by direct measurement, and most scientists withhold judgment until there is more direct evidence available. The principal function of such detective work - apart from entertaining the theoretician - is probably to so annoy and enrage the observationalists that they are forced, in a fury of disbelief, to perform the critical measurements." (Carl Sagan, "The Cosmic Connection: An Extraterrestrial Perspective", 1975)

"Human language is a vehicle of truth but also of error, deception, and nonsense. Its use, as in the present discussion, thus requires great prudence. One can improve the precision of language by explicit definition of the terms used. But this approach has its limitations: the definition of one term involves other terms, which should in turn be defined, and so on. Mathematics has found a way out of this infinite regression: it bypasses the use of definitions by postulating some logical relations (called axioms) between otherwise undefined mathematical terms. Using the mathematical terms introduced with the axioms, one can then define new terms and proceed to build mathematical theories. Mathematics need, not, in principle rely on a human language. It can use, instead, a formal presentation in which the validity of a deduction can be checked mechanically and without risk of error or deception." (David Ruelle,"The Mathematician's Brain", 2007)

"Technology is the result of antifragility, exploited by risk-takers in the form of tinkering and trial and error, with nerd-driven design confined to the backstage." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"This is the central illusion in life: that randomness is risky, that it is a bad thing - and that eliminating randomness is done by eliminating randomness. Randomness is distributed rather than concentrated." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012) 

On Risk I

"The Risk of losing any Sum is the reverse of Expectation; and the true measure of it is, the product of the Sum adventured multiplied by the Probability of the Loss." (Abraham de Moivre, "The Doctrine of Chances", 1718)

"The modern theory of decision making under risk emerged from a logical analysis of games of chance rather than from a psychological analysis of risk and value. The theory was conceived as a normative model of an idealized decision maker, not as a description of the behavior of real people." (Amos Tversky & Daniel Kahneman, "Rational Choice and the Framing of Decisions", The Journal of Business Vol. 59 (4), 1986)

"Conversely, there are few features of life, the universe, or anything, in which chance is not in some way crucial. Nor is this merely some abstruse academic point; assessing risks and taking chances are inescapable facets of everyday existence. It is a trite maxim to say that life is a lottery; it would be more true to say that life offers a collection of lotteries that we can all, to some extent, choose to enter or avoid. And as the information at our disposal increases, it does not reduce the range of choices but in fact increases them." (David Stirzaker, "Probability and Random Variables: A Beginner's Guide", 1999)

"Players must accept the cards dealt to them. However, once they have those cards in hand, they alone choose how they will play them. They decide what risks and actions to take." (John C Maxwell, "The Difference Maker: Making Your Attitude Your Greatest Asset", 2006)

"When confronted with multiple models, I find it revealing to pose the resulting uncertainty as a two-stage lottery. For the purposes of my discussion, there is no reason to distinguish unknown models from unknown parameters of a given model. I will view each parameter configuration as a distinct model. Thus a model, inclusive of its parameter values, assigns probabilities to all events or outcomes within the model’s domain. The probabilities are often expressed by shocks with known distributions and outcomes are functions of these shocks. This assignment of probabilities is what I will call risk. By contrast there may be many such potential models. Consider a two-stage lottery where in stage one we select a model and in stage two we draw an outcome using the model probabilities. Call stage one model ambiguity and stage two risk that is internal to a model." (Lars P Hansen, "Uncertainty Outside and Inside Economic Models", [Nobel lecture] 2013)

"Without context, data is useless, and any visualization you create with it will also be useless. Using data without knowing anything about it, other than the values themselves, is like hearing an abridged quote secondhand and then citing it as a main discussion point in an essay. It might be okay, but you risk finding out later that the speaker meant the opposite of what you thought." (Nathan Yau, "Data Points: Visualization That Means Something", 2013)

"Mental Modeling enables discovery of people’s mental models in a structured, rigorous, respectful manner. Mental Modeling has been recognized as one of the premier methods for informing the development of strategies and communications that precisely address people’s current thinking, judgment, decision making, and behavior on complex issues , including risk issues. Broadly, Mental Modeling works from the"inside out," starting with an in-depth understanding of people’s mental models, and then using that insight to develop focused strategies and communication that builds on where people are at in their thinking today, reinforcing what they know about a topic and addressing critical gaps. Broadly stated, the goal is to help people make well-informed decisions and take appropriate actions on the topic at hand." (Matthew D Wood, An Introduction to Mental Modeling, [in "Mental Modeling Approach: Risk Management Application Case Studies"], 2017)

"Premature enumeration is an equal-opportunity blunder: the most numerate among us may be just as much at risk as those who find their heads spinning at the first mention of a fraction. Indeed, if you’re confident with numbers you may be more prone than most to slicing and dicing, correlating and regressing, normalizing and rebasing, effortlessly manipulating the numbers on the spreadsheet or in the statistical package - without ever realizing that you don’t fully understand what these abstract quantities refer to. Arguably this temptation lay at the root of the last financial crisis: the sophistication of mathematical risk models obscured the question of how, exactly, risks were being measured, and whether those measurements were something you’d really want to bet your global banking system on." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Sample error reflects the risk that, purely by chance, a randomly chosen sample of opinions does not reflect the true views of the population. The 'margin of error' reported in opinion polls reflects this risk, and the larger the sample, the smaller the margin of error. […] sampling error has a far more dangerous friend: sampling bias. Sampling error is when a randomly chosen sample doesn’t reflect the underlying population purely by chance; sampling bias is when the sample isn’t randomly chosen at all." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

22 August 2022

On Regression III: Regression toward the Mean II

"'Regression to the mean' describes a natural phenomenon whereby, after a short period of success, things tend to return to normal immediately afterwards. This notion applies particularly to random events." (Alan Graham, "Developing Thinking in Statistics", 2006)

"A naive interpretation of regression to the mean is that heights, or baseball records, or other variable phenomena necessarily become more and more 'average' over time. This view is mistaken because it ignores the error in the regression predicting y from x. For any data point xi, the point prediction for its yi will be regressed toward the mean, but the actual yi that is observed will not be exactly where it is predicted. Some points end up falling closer to the mean and some fall further." (Andrew Gelman & Jennifer Hill, "Data Analysis Using Regression and Multilevel/Hierarchical Models", 2007)

"Regression toward the mean. That is, in any series of random events an extraordinary event is most likely to be followed, due purely to chance, by a more ordinary one." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"Regression does not describe changes in ability that happen as time passes […]. Regression is caused by performances fluctuating about ability, so that performances far from the mean reflect abilities that are closer to the mean." (Gary Smith, "Standard Deviations", 2014)

"We encounter regression in many contexts - pretty much whenever we see an imperfect measure of what we are trying to measure. Standardized tests are obviously an imperfect measure of ability. [...] Each experimental score is an imperfect measure of “ability,” the benefits from the layout. To the extent there is randomness in this experiment - and there surely is - the prospective benefits from the layout that has the highest score are probably closer to the mean than was the score." (Gary Smith, "Standard Deviations", 2014)

"When a trait, such as academic or athletic ability, is measured imperfectly, the observed differences in performance exaggerate the actual differences in ability. Those who perform the best are probably not as far above average as they seem. Nor are those who perform the worst as far below average as they seem. Their subsequent performances will consequently regress to the mean." (Gary Smith, "Standard Deviations", 2014)

"The term shrinkage is used in regression modeling to denote two ideas. The first meaning relates to the slope of a calibration plot, which is a plot of observed responses against predicted responses. When a dataset is used to fit the model parameters as well as to obtain the calibration plot, the usual estimation process will force the slope of observed versus predicted values to be one. When, however, parameter estimates are derived from one dataset and then applied to predict outcomes on an independent dataset, overfitting will cause the slope of the calibration plot (i.e., the shrinkage factor ) to be less than one, a result of regression to the mean. Typically, low predictions will be too low and high predictions too high. Predictions near the mean predicted value will usually be quite accurate. The second meaning of shrinkage is a statistical estimation method that preshrinks regression coefficients towards zero so that the calibration plot for new data will not need shrinkage as its calibration slope will be one." (Frank E. Harrell Jr., "Regression Modeling Strategies: With Applications to Linear Models, Logistic and Ordinal Regression, and Survival Analysis" 2nd Ed, 2015)

"Often when people relate essentially the same variable in two different groups, or at two different times, they see this same phenomenon - the tendency of the response variable to be closer to the mean than the predicted value. Unfortunately, people try to interpret this by thinking that the performance of those far from the mean is deteriorating, but it’s just a mathematical fact about the correlation. So, today we try to be less judgmental about this phenomenon and we call it regression to the mean. We managed to get rid of the term 'mediocrity', but the name regression stuck as a name for the whole least squares fitting procedure - and that’s where we get the term regression line." (Richard D De Veaux et al, "Stats: Data and Models", 2016)

"Regression toward the mean is pervasive. In sports, excellent performance tends to be followed by good, but less outstanding, performance. [...] By contrast, the good news about regression toward the mean is that very poor performance tends to be followed by improved performance. If you got the worst score in your statistics class on the first exam, you probably did not do so poorly on the second exam (but you were probably still below the mean)." (Alan Agresti et al, Statistics: The Art and Science of Learning from Data" 4th Ed., 2018)

On Regression II: Regression toward the Mean I

"Whenever we make any decision based on the expectation that matters will return to 'normal', we are employing the notion of regression to the mean." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Regression to the mean occurs when the process produces results that are statistically independent or negatively correlated. With strong negative serial correlation, extremes are likely to be reversed each time (which would reinforce the instructors' error). In contrast, with strong positive dependence, extreme results are quite likely to be clustered together." (Dan Trietsch, "Statistical Quality Control : A loss minimization approach", 1998) 

"Unfortunately, people are poor intuitive scientists, generally failing to reason in accordance with the principles of scientific method. For example, people do not generate sufficient alternative explanations or consider enough rival hypotheses. People generally do not adequately control for confounding variables when they explore a novel environment. People’s judgments are strongly affected by the frame in which the information is presented, even when the objective information is unchanged. People suffer from overconfidence in their judgments (underestimating uncertainty), wishful thinking (assessing desired outcomes as more likely than undesired outcomes), and the illusion of control (believing one can predict or influence the outcome of random events). People violate basic rules of probability, do not understand basic statistical concepts such as regression to the mean, and do not update beliefs according to Bayes’ rule. Memory is distorted by hindsight, the availability and salience of examples, and the desirability of outcomes. And so on."  (John D Sterman, "Business Dynamics: Systems thinking and modeling for a complex world", 2000)

 "People often attribute meaning to phenomena governed only by a regression to the mean, the mathematical tendency for an extreme value of an at least partially chance-dependent quantity to be followed by a value closer to the average. Sports and business are certainly chancy enterprises and thus subject to regression. So is genetics to an extent, and so very tall parents can be expected to have offspring who are tall, but probably not as tall as they are. A similar tendency holds for the children of very short parents." (John A Paulos, "A Mathematician Plays the Stock Market", 2003)

"'Regression to the mean' […] says that, in any series of events where chance is involved, very good or bad performances, high or low scores, extreme events, etc. tend on the average, to be followed by more average performance or less extreme events. If we do extremely well, we're likely to do worse the next time, while if we do poorly, we're likely to do better the next time. But regression to the mean is not a natural law. Merely a statistical tendency. And it may take a long time before it happens." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger",  2003)

"Another aspect of representativeness that is misunderstood or ignored is the tendency of regression to the mean. Stochastic phenomena where the outcomes vary randomly around stable values (so-called stationary processes) exhibit the general tendency that extreme outcomes are more likely to be followed by an outcome closer to the mean or mode than by other extreme values in the same direction. For example, even a bright student will observe that her or his performance in a test following an especially outstanding outcome tends to be less brilliant. Similarly, extremely low or extremely high sales in a given period tend to be followed by sales that are closer to the stable mean or the stable trend." (Hans G Daellenbach & Donald C McNickle, "Management Science: Decision making through systems thinking", 2005)

"Behavioural research shows that we tend to use simplifying heuristics when making judgements about uncertain events. These are prone to biases and systematic errors, such as stereotyping, disregard of sample size, disregard for regression to the mean, deriving estimates based on the ease of retrieving instances of the event, anchoring to the initial frame, the gambler’s fallacy, and wishful thinking, which are all affected by our inability to consider more than a few aspects or dimensions of any phenomenon or situation at the same time." (Hans G Daellenbach & Donald C McNickle, "Management Science: Decision making through systems thinking", 2005)

"Concluding that the population is becoming more centralized by observing behavior at the extremes is called the 'Regression to the Mean' Fallacy. […] When looking for a change in a population, do not look only at the extremes; there you will always find a motion to the mean. Look at the entire population." (Charles Livingston & Paul Voakes, "Working with Numbers and Statistics: A handbook for journalists", 2005)

"regression to the mean: The fact that unexpectedly high or low numbers from the mean are an exception and are usually followed by numbers that are closer to the mean. Over the long haul, we tend to get relatively more numbers that are near the mean compared to numbers that are far from the mean." (Hari Singh, "Framed! Solve an Intriguing Mystery and Master How to Make Smart Choices", 2006)

21 August 2022

Peter Bevelin - Collected Quotes

"Changes in size or time influences form, function and behavior. If something of a certain size is made bigger or smaller, it may not work the same way. Some things get better and others get worse." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger",  2003)

"Every action has consequences. Both intended and unintended. No matter how carefully we plan, we can't anticipate everything. Often we fail to consider what other events are likely to occur as a result of some action. […] By solving one problem, we generate another one and sometimes create an even worse one." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"It is hard to predict something when we don't (or can't) foresee or understand how an entire system works, what key variables are involved, their attributes, how they influence one another and their impact. Even if we know the key variables, their values may be impossible to estimate. They may also change over time and be dependent on context. It may also be impossible to estimate how they will interact as a whole." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Many systems fail because they focus on the machines, not the people that use them. […] Humans are involved in designing, execution and follow-up. Excluding ignorance and insufficient knowledge, given the complexity of human and non-human factors interacting, there is a multitude of ways in which things can go wrong." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Often we try to get too much information, including misinformation, or information of no use to explain or predict. We also focus on details and what's irrelevant or unknowable and overlook the obvious truths. Dealing with what's important forces us to prioritize. There are often just a few actions that produce most of what we are trying to achieve. There are only a few decisions of real importance." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003) 

"Optimization of one variable may cause the whole system to work less efficiently. Why? The performance of most systems is constrained by the performance of its weakest link. A variable that limits the system from achieving its goal or optimum performance. […] When trying to improve the performance of a system, first find out the system's key contraint(s)- which may be physical (capacity, material, the market) or non-physical (policies, rules, measurements) -and its cause and effect relationship with the system. Maybe the constraint is based on faulty assumptions that can be corrected. Then try to "strengthen" or change the weakest link. Watch out for other effects - wanted or unwanted - that pop up as a consequence. Always consider the effects on the whole system." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Predictions about the future are often just projections of past curves and present trends. This is natural since our predictions about the future are made in the present. We therefore assume the future will be much like the present. But the future can't be known until it arrives. It is contingent on events we can't see." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"'Regression to the mean' […] says that, in any series of events where chance is involved, very good or bad performances, high or low scores, extreme events, etc. tend on the average, to be followed by more average performance or less extreme events. If we do extremely well, we're likely to do worse the next time, while if we do poorly, we're likely to do better the next time. But regression to the mean is not a natural law. Merely a statistical tendency. And it may take a long time before it happens." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Science works by elimination. To avoid drowning in low-information observations or experiments, scientists think in advance about what the most important and conclusive experiments would be: What are we trying to achieve or prove, and how can we reach these ends? What can't happen? This way, they narrow down the possibilities." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Some systems are more prone to accidents than others because of the number of components, their connections and interactions. The more variables we add to a system, and the more they interact, the more complicated we make it and the more opportunity the system has to fail. Improving certain parts in highly interconnected systems may do little to eliminate future problems. There is always the possibility of multiple simultaneous failures and the more complicated the system, the harder it is to predict all possible failures." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Try to optimize the whole and not a system's individual parts. Think through what other variables may change when we alter a factor in a system. Trace out the short and long-term consequences in numbers and effects of a proposed action to see if the net result agrees with our ultimate goal."  (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

14 August 2022

Laws II: The Law of Large Numbers

"[…] probability as a measurable degree of certainty; necessity and chance; moral versus mathematical expectation; a priori an a posteriori probability; expectation of winning when players are divided according to dexterity; regard of all available arguments, their valuation, and their calculable evaluation; law of large numbers […]" (Jacob Bernoulli, "Ars Conjectandi" ["The Art of Conjecturing"], 1713)

"Things of all kinds are subject to a universal law which may be called the law of large numbers. It consists in the fact that, if one observes very considerable numbers of events of the same nature, dependent on constant causes and causes which vary irregularly, sometimes in one direction, sometimes in the other, it is to say without their variation being progressive in any definite direction, one shall find, between these numbers, relations which are almost constant." (Siméon-Denis Poisson, "Poisson’s Law of Large Numbers", 1837)

"It is a common fallacy to believe that the law of large numbers acts as a force endowed with memory seeking to return to the original state, and many wrong conclusions have been drawn from this assumption." (William Feller, "An Introduction To Probability Theory And Its Applications", 1950)

“We know the laws of trial and error, of large numbers and probabilities. We know that these laws are part of the mathematical and mechanical fabric of the universe, and that they are also at play in biological processes. But, in the name of the experimental method and out of our poor knowledge, are we really entitled to claim that everything happens by chance, to the exclusion of all other possibilities?” (Albert Claude, [Nobel Prize Lecture], 1974)

"The law of truly large numbers states: With a large enough sample, any outrageous thing is likely to happen." (Frederick Mosteller, Methods for Studying Coincidences Journal of the American Statistical Association, Volume 84, 1989)

"All the law [of large numbers] tells us is that the average of a large number of throws will be more likely than the average of a small number of throws to differ from the true average by less than some stated amount. And there will always be a possibility that the observed result will differ from the true average by a larger amount than the specified bound." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Jacob Bernoulli's theorem for calculating probabilities a posteriori is known as the Law of Large Numbers. Contrary to the popular view, this law does not provide a method for validating observed facts, which are only an incomplete representation of the whole truth. Nor does it say that an increasing number of observations will increase the probability that what you see is what you are going to get. The law is not a design for improving the quality of empirical tests […]." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"The Law of Large Numbers does not tell you that the average of your throws will approach 50% as you increase the number of throws; simple mathematics can tell you that, sparing you the tedious business of tossing the coin over and over. Rather, the law states that increasing the number of throws will correspondingly increase the probability that the ratio of heads thrown to total throws will vary from 50% by less than some stated amount, no matter how small. The word 'vary' is what matters. The search is not for the true mean of 50% but for the probability that the error between the observed average and the true average will be less than, say, 2% - in other words, that increasing the number of throws will increase the probability that the observed average will fall within 2% of the true average." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"The law of small numbers is not really a law. It is a sarcastic name describing the misguided attempt to apply the law of large numbers when the numbers aren't large." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"The law of large numbers is a law of mathematical statistics. It states that when random samples are sufficiently large they match the population extremely closely. […] The 'law' of small numbers is a widespread human misconception that even small samples match the population closely." (Geoff Cumming, "Understanding the New Statistics", 2012)

13 August 2022

On Laws I: The Law of Averages

"Except under controlled conditions, or in circumstances where it is possible to ignore individuals and consider only large numbers and the law of averages, any kind of accurate foresight is impossible." (Aldous Huxley, "Time Must Have a Stop", 1944)

"A misunderstanding of Bernoulli’s theorem is responsible for one of the commonest fallacies in the estimation of probabilities, the fallacy of the maturity of chances. When a coin has come down heads twice in succession, gamblers sometimes say that it is more likely to come down tails next time because ‘by the law of averages’ (whatever that may mean) the proportion of tails must be brought right some time." (William Kneale, "Probability and Induction", 1949)

"Only when there is a substantial number of trials involved is the law of averages a useful description or prediction." (Darell Huff, "How to Lie with Statistics", 1954)

"The equanimity of your average tosser of coins depends upon a law, or rather a tendency, or let us say a probability, or at any rate a mathematically calculable chance, which ensures that he will not upset himself by losing too much nor upset his opponent by winning too often." (Tom Stoppard, "Rosencrantz and Guildenstern Are Dead", 1967)

"This faulty intuition as well as many modern applications of probability theory are under the strong influence of traditional misconceptions concerning the meaning of the law of large numbers and of a popular mystique concerning a so-called law of averages." (William Feller, "An Introduction to Probability Theory and Its Applications", 1968)

"I take the view that life is a nonspiritual, almost mathematical property that can emerge from networklike arrangements of matter. It is sort of like the laws of probability; if you get enough components together, the system will behave like this, because the law of averages dictates so. Life results when anything is organized according to laws only now being uncovered; it follows rules as strict as those that light obeys." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The slightly chaotic character of mind goes even deeper, to a degree our egos may find uncomfortable. It is very likely that intelligence, at bottom, is a probabilistic or statistical phenomenon — on par with the law of averages. The distributed mass of ricocheting impulses which form the foundation of intelligence forbid deterministic results for a given starting point. Instead of repeatable results, outcomes are merely probabilistic. Arriving at a particular thought, then, entails a bit of luck." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Losing streaks and winning streaks occur frequently in games of chance, as they do in real life. Gamblers respond to these events in asymmetric fashion: they appeal to the law of averages to bring losing streaks to a speedy end. And they appeal to that same law of averages to suspend itself so that winning streaks will go on and on. The law of averages hears neither appeal. The last sequence of throws of the dice conveys absolutely no information about what the next throw will bring. Cards, coins, dice, and roulette wheels have no memory." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"However, random walk theory also tells us that the chance that the balance never returns to zero - that is, that H stays in the lead for ever - is 0. This is the sense in which the 'law of averages' is true. If you wait long enough, then almost surely the numbers of heads and tails will even out. But this fact carries no implications about improving your chances of winning, if you're betting on whether H or T turns up. The probabilities are unchanged, and you don't know how long the 'long run' is going to be. Usually it is very long indeed." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"The basis of many misconceptions about probability is a belief in something usually referred to as 'the law of averages', which alleges that any unevenness in random events gets ironed out in the long run. For example, if a tossed coin keeps coming up heads, then it is widely believed that at some stage there will be a predominance of tails to balance things out." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"The 'law of averages' asserts itself not by removing imbalances, but by swamping them. Random walk theory tells us that if you wait long enough - on average, infinitely long - then eventually the numbers will balance out. If you stop at that very instant, then you may imagine that your intuition about a 'law of averages' is justified. But you're cheating: you stopped when you got the answer you wanted. Random walk theory also tells us that if you carry on for long enough, you will reach a situation where the number of H's is a billion more than the number of T's." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"People sometimes appeal to the 'law of averages' to justify their faith in the gambler’s fallacy. They may reason that, since all outcomes are equally likely, in the long run they will come out roughly equal in frequency. However, the next throw is very much in the short run and the coin, die or roulette wheel has no memory of what went before." (Alan Graham, "Developing Thinking in Statistics", 2006)

"Another kind of error possibly related to the use of the representativeness heuristic is the gambler’s fallacy, otherwise known as the law of averages. If you are playing roulette and the last four spins of the wheel have led to the ball’s landing on black, you may think that the next ball is more likely than otherwise to land on red. This cannot be. The roulette wheel has no memory. The chance of black is just what it always is. The reason people tend to think otherwise may be that they expect the sequence of events to be representative of random sequences, and the typical random sequence at roulette does not have five blacks in a row." (Jonathan Baron, "Thinking and Deciding" 4th Ed, 2008)

"The 'law of averages' asserts that an event is more likely if it has not occurred for a long time. Perhaps belief in this bit of folk wisdom is based on confusion of different types of experiments." (Glenn Ledder, "Mathematics for the Life Sciences: Calculus, Modeling, Probability, and Dynamical Systems", 2013)

"A very different - and very incorrect - argument is that successes must be balanced by failures (and failures by successes) so that things average out. Every coin flip that lands heads makes tails more likely. Every red at roulette makes black more likely. […] These beliefs are all incorrect. Good luck will certainly not continue indefinitely, but do not assume that good luck makes bad luck more likely, or vice versa." (Gary Smith, "Standard Deviations", 2014)

"[…] many gamblers believe in the fallacious law of averages because they are eager to find a profitable pattern in the chaos created by random chance." (Gary Smith, "Standard Deviations", 2014)


Peter L Bernstein - Collected Quotes

"A normal distribution is most unlikely, although not impossible, when the observations are dependent upon one another - that is, when the probability of one event is determined by a preceding event. The observations will fail to distribute themselves symmetrically around the mean." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"All the law [of large numbers] tells us is that the average of a large number of throws will be more likely than the average of a small number of throws to differ from the true average by less than some stated amount. And there will always be a possibility that the observed result will differ from the true average by a larger amount than the specified bound." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"But real-life situations often require us to measure probability in precisely this fashion - from sample to universe. In only rare cases does life replicate games of chance, for which we can determine the probability of an outcome before an event even occurs - a priori […] . In most instances, we have to estimate probabilities from what happened after the fact - a posteriori. The very notion of a posteriori implies experimentation and changing degrees of belief." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"In a mathematical sense a zero-sum game is a loser's game when it is valued in terms of utility. The best decision for both is to refuse to play this game." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Jacob Bernoulli's theorem for calculating probabilities a posteriori is known as the Law of Large Numbers. Contrary to the popular view, this law does not provide a method for validating observed facts, which are only an incomplete representation of the whole truth. Nor does it say that an increasing number of observations will increase the probability that what you see is what you are going to get. The law is not a design for improving the quality of empirical tests […]." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Losing streaks and winning streaks occur frequently in games of chance, as they do in real life. Gamblers respond to these events in asymmetric fashion: they appeal to the law of averages to bring losing streaks to a speedy end. And they appeal to that same law of averages to suspend itself so that winning streaks will go on and on. The law of averages hears neither appeal. The last sequence of throws of the dice conveys absolutely no information about what the next throw will bring. Cards, coins, dice, and roulette wheels have no memory." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Pascal's Triangle and all the early work in probability answered only one question: what is the probability of such-and-such an outcome? The answer to that question has limited value in most cases, because it leaves us with no sense of generality." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996) 

"Probability has always carried this double meaning, one looking into the future, the other interpreting the past, one concerned with our opinions, the other concerned with what we actually know." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Probability theory is a serious instrument for forecasting, but the devil, as they say, is in the details - in the quality of information that forms the basis of probability estimates." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Reality is a series of connected events, each dependent on another, radically different from games of chance in which the outcome of any single throw has zero influence on the outcome of the next throw. Games of chance reduce everything to a hard number, but in real life we use such measures as 'a little', 'a lot', or 'not too much, please' much more often than we use a precise quantitative measure." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"So we pour in data from the past to fuel the decision-making mechanisms created by our models, be they linear or nonlinear. But therein lies the logician's trap: past data from real life constitute a sequence of events rather than a set of independent observations, which is what the laws of probability demand. [...] It is in those outliers and imperfections that the wildness lurks." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"The dice and the roulette wheel, along with the stock market and the bond market, are natural laboratories for the study of risk because they lend themselves so readily to quantification; their language is the language of numbers." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"The Law of Large Numbers does not tell you that the average of your throws will approach 50% as you increase the number of throws; simple mathematics can tell you that, sparing you the tedious business of tossing the coin over and over. Rather, the law states that increasing the number of throws will correspondingly increase the probability that the ratio of heads thrown to total throws will vary from 50% by less than some stated amount, no matter how small. The word 'vary' is what matters. The search is not for the true mean of 50% but for the probability that the error between the observed average and the true average will be less than, say, 2% - in other words, that increasing the number of throws will increase the probability that the observed average will fall within 2% of the true average." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"The resolution of how to divide the stakes in an uncompleted game marked the beginning of a systematic analysis of probability - the measure of our confidence that something is going to happen. It brings us to the threshold of the quantification of risk." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"The theory of probability can define the probabilities at the gaming casino or in a lottery - there is no need to spin the roulette wheel or count the lottery tickets to estimate the nature of the outcome - but in real life relevant information is essential. And the bother is that we never have all the information we would like. Nature has established patterns, but only for the most part. Theory, which abstracts from nature, is kinder: we either have the information we need or else we have no need for information." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Time is the dominant factor in gambling. Risk and time are opposite sides of the same coin, for if there were no tomorrow there would be no risk. Time transforms risk, and the nature of risk is shaped by the time horizon: the future is the playing field." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Under conditions of uncertainty, both rationality and measurement are essential to decision-making. Rational people process information objectively: whatever errors they make in forecasting the future are random errors rather than the result of a stubborn bias toward either optimism or pessimism. They respond to new information on the basis of a clearly defined set of preferences. They know what they want, and they use the information in ways that support their preferences." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Under conditions of uncertainty, the choice is not between rejecting a hypothesis and accepting it, but between reject and not - reject. You can decide that the probability that you are wrong is so small that you should not reject the hypothesis. You can decide that the probability that you are wrong is so large that you should reject the hypothesis. But with any probability short of zero that you are wrong - certainty rather than uncertainty - you cannot accept a hypothesis." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Until we can distinguish between an event that is truly random and an event that is the result of cause and effect, we will never know whether what we see is what we'll get, nor how we got what we got. When we take a risk, we are betting on an outcome that will result from a decision we have made, though we do not know for certain what the outcome will be. The essence of risk management lies in maximizing the areas where we have some control over the outcome while minimizing the areas where we have absolutely no control over the outcome and the linkage between effect and cause is hidden from us." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"We can assemble big pieces of information and little pieces, but we can never get all the pieces together. We never know for sure how good our sample is. That uncertainty is what makes arriving at judgments so difficult and acting on them so risky. […] When information is lacking, we have to fall back on inductive reasoning and try to guess the odds." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Whenever we make any decision based on the expectation that matters will return to 'normal', we are employing the notion of regression to the mean." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"Without numbers, there are no odds and no probabilities; without odds and probabilities, the only way to deal with risk is to appeal to the gods and the fates. Without numbers, risk is wholly a matter of gut." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

07 August 2022

On Principles VI: Uncertainty Principle

"The uncertainty principle refers to the degree of indeterminateness in the possible present knowledge of the simultaneous values of various quantities with which the quantum theory deals; it does not restrict, for example, the exactness of a position measurement alone or a velocity measurement alone." (Werner Heisenberg, "The Uncertainty Principle", [in James R Newman, "The World of Mathematics" Vol. II], 1956)

"Both the uncertainty principle and the negentropy principle of information make Laplace's scheme [of exact determinism] completely unrealistic. The problem is an artificial one; it belongs to imaginative poetry, not to experimental science." (Léon Brillouin, "Science and Information Theory" 2nd Ed., 1962)

"No branch of number theory is more saturated with mystery than the study of prime numbers: those exasperating, unruly integers that refuse to be divided evenly by any integers except themselves and 1. Some problems concerning primes are so simple that a child can understand them and yet so deep and far from solved that many mathematicians now suspect they have no solution. Perhaps they are 'undecideable'. Perhaps number theory, like quantum mechanics, has its own uncertainty principle that makes it necessary, in certain areas, to abandon exactness for probabilistic formulations." (Martin Gardner, "The remarkable lore of the prime numbers", Scientific American, 1964)

"In particular, the uncertainty principle has stood for a generation, barring the way to more detailed descriptions of nature; and yet, with the lesson of parity still fresh in our minds, how can anyone be quite so sure of its universal validity when we note that, to this day, it has never been subjected to even one direct experimental test?" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"Because of mathematical indeterminancy and the uncertainty principle, it may be a law of nature that no nervous system is capable of acquiring enough knowledge to significantly predict the future of any other intelligent system in detail. Nor can intelligent minds gain enough self-knowledge to know their own future, capture fate, and in this sense eliminate free will." (Edward O Wilson, "On Human Nature", 1978)

"In physics, there are numerous phenomena that are said to be 'true on all scales', such as the Heisenberg uncertainty relation, to which no exception has been found over vast ranges of the variables involved (such as energy versus time, or momentum versus position). But even when the size ranges are limited, as in galaxy clusters (by the size of the universe) or the magnetic domains in a piece of iron near the transition point to ferromagnetism (by the size of the magnet), the concept true on all scales is an important postulate in analyzing otherwise often obscure observations." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"A bell curve shows the 'spread' or variance in our knowledge or certainty. The wider the bell the less we know. An infinitely wide bell is a flat line. Then we know nothing. The value of the quantity, position, or speed could lie anywhere on the axis. An infinitely narrow bell is a spike that is infinitely tall. Then we have complete knowledge of the value of the quantity. The uncertainty principle says that as one bell curve gets wider the other gets thinner. As one curve peaks the other spreads. So if the position bell curve becomes a spike and we have total knowledge of position, then the speed bell curve goes flat and we have total uncertainty (infinite variance) of speed." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"According to quantum theory, the ground state, or lowest energy state, of a pendulum is not just sitting at the lowest energy point, pointing straight down. That would have both a definite position and a definite velocity, zero. This would be a violation of the uncertainty principle, which forbids the precise measurement of both position and velocity at the same time. The uncertainty in the position multiplied by the uncertainty in the momentum must be greater than a certain quantity, known as Planck's constant - a number that is too long to keep writing down, so we use a symbol for it: ħ." (Stephen W Hawking, "The Universe in a Nutshell", 2001)

"The uncertainty principle expresses a seesaw relationship between the fluctuations of certain pairs of variables, such as an electron's position and its speed. Anything that lowers the uncertainty of one must necessarily raise the uncertainty of the other; you can't push both down at the same time. For example, the more tightly you confine an electron, the more wildly it thrashes. By lowering the position end of the seesaw, you force the velocity end to lift up. On the other hand, if you try to constrain the electron's velocity instead, its position becomes fuzzier and fuzzier; the electron can turn up almost anywhere.(Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The inherent nature of complexity is to doubt certainty and any pretense to finite and flawless data. Put another way, under uncertainty principles, any attempt by political systems to 'impose order' has an equal chance to instead 'impose disorder'." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

Decision Theory I

"Years ago a statistician might have claimed that statistics deals with the processing of data [...] today’s statistician will be more likely to say that statistics is concerned with decision making in the face of uncertainty." (Herman Chernoff & Lincoln E Moses, "Elementary Decision Theory", 1959)

"Another approach to management theory, undertaken by a growing and scholarly group, might be referred to as the decision theory school. This group concentrates on rational approach to decision-the selection from among possible alternatives of a course of action or of an idea. The approach of this school may be to deal with the decision itself, or to the persons or organizational group making the decision, or to an analysis of the decision process. Some limit themselves fairly much to the economic rationale of the decision, while others regard anything which happens in an enterprise the subject of their analysis, and still others expand decision theory to cover the psychological and sociological aspect and environment of decisions and decision-makers." (Harold Koontz, "The Management Theory Jungle," 1961)

"The term hypothesis testing arises because the choice as to which process is observed is based on hypothesized models. Thus hypothesis testing could also be called model testing. Hypothesis testing is sometimes called decision theory. The detection theory of communication theory is a special case." (Fred C Scweppe, "Uncertain dynamic systems", 1973)

"In decision theory, mathematical analysis shows that once the sampling distribution, loss function, and sample are specified, the only remaining basis for a choice among different admissible decisions lies in the prior probabilities. Therefore, the logical foundations of decision theory cannot be put in fully satisfactory form until the old problem of arbitrariness (sometimes called 'subjectiveness') in assigning prior probabilities is resolved." (Edwin T Jaynes, "Prior Probabilities", 1978)

"Decision theory, as it has grown up in recent years, is a formalization of the problems involved in making optimal choices. In a certain sense - a very abstract sense, to be sure - it incorporates among others operations research, theoretical economics, and wide areas of statistics, among others." (Kenneth Arrow, "The Economics of Information", 1984) 

"Cybernetics is concerned with scientific investigation of systemic processes of a highly varied nature, including such phenomena as regulation, information processing, information storage, adaptation, self-organization, self-reproduction, and strategic behavior. Within the general cybernetic approach, the following theoretical fields have developed: systems theory (system), communication theory, game theory, and decision theory." (Fritz B Simon et al, "Language of Family Therapy: A Systemic Vocabulary and Source Book", 1985)

"A field of study that includes a methodology for constructing computer simulation models to achieve better under-standing of social and corporate systems. It draws on organizational studies, behavioral decision theory, and engineering to provide a theoretical and empirical base for structuring the relationships in complex systems." (Virginia Anderson & Lauren Johnson, "Systems Thinking Basics: From Concepts to Casual Loops", 1997) 

"A decision theory that rests on the assumptions that human cognitive capabilities are limited and that these limitations are adaptive with respect to the decision environments humans frequently encounter. Decision are thought to be made usually without elaborate calculations, but instead by using fast and frugal heuristics. These heuristics certainly have the advantage of speed and simplicity, but if they are well matched to a decision environment, they can even outperform maximizing calculations with respect to accuracy. The reason for this is that many decision environments are characterized by incomplete information and noise. The information we do have is usually structured in a specific way that clever heuristics can exploit." (E Ebenhoh, "Agent-Based Modelnig with Boundedly Rational Agents", 2007)

On Concepts IX

"[In mathematics] we behold the conscious logical activity of the human mind in its purest and most perfect form. Here we learn to realize the laborious nature of the process, the great care with which it must proceed, the accuracy which is necessary to determine the exact extent of the general propositions arrived at, the difficulty of forming and comprehending abstract concepts; but here we learn also to place confidence in the certainty, scope and fruitfulness of such intellectual activity." (Hermann von Helmholtz, "Über das Verhältnis der Naturwissenschaften zur Gesammtheit der Wissenschaft", 1896)

"Former ages thought in terms of images of the imagination, whereas we moderns have concepts. Formerly the guiding ideas of life presented themselves in concrete visual form as divinities, whereas today they are conceptualized. The ancients excelled in creation; our own strength lies rather in destruction, in analysis." (Johann Wolfgang von Goethe, 1806)

"'You cannot base a general mathematical theory on imprecisely defined concepts. You can make some progress that way; but sooner or later the theory is bound to dissolve in ambiguities which prevent you from extending it further.' Failure to recognize this fact has another unfortunate consequence which is, in a practical sense, even more disastrous: 'Unless the conceptual problems of a field have been clearly resolved, you cannot say which mathematical problems are the relevant ones worth working on; and your efforts are more than likely to be wasted.'" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"To master a concept means to be able to recognize it, that is, to be able to determine whether or not any given situation belongs to the set that characterizes this concept." (Valentin F Turchin, "The Phenomenon of Science: A cybernetic approach to human evolution", 1977)

"A conceptual system is an integrated system of concepts that supports a coherent vision of some aspect of the world. A conceptual system is personal; it is a 'way of seeing', that is, a 'way of knowing'. [...] You cannot do mathematics or science without a conceptual system but such systems are not objective and permanent. They are subject to change and development. Therefore we cannot claim that the reality that we experience and work with in science is independent of the mind of the scientist." (William Byers, "Deep Thinking: What Mathematics Can Teach Us About the Mind", 2015)

"A mathematical concept, then, is an organised pattern of ideas that are somehow interrelated, drawing on the experience of concepts already established. Psychologists call such an organised pattern of ideas a ‘schema’. " (Ian Stewart & David Tall, "The Foundations of Mathematics" 2nd Ed., 2015)

"Facts and concepts only acquire real meaning and significance when viewed through the lens of a conceptual system. [...] Facts do not exist independently of knowledge and understanding for without some conceptual basis one would not know what data to even consider. The very act of choosing implies some knowledge. One could say that data, knowledge, and understanding are different ways of describing the same situation depending on the type of human involvement implied - 'data' means a de-emphasis on the human dimension whereas 'understanding' highlights it." (William Byers, "Deep Thinking: What Mathematics Can Teach Us About the Mind", 2015)

"The problem of teaching is the problem of introducing concepts and conceptual systems. In this crucial task the procedures of formal mathematical argument are of little value. The way we reason in formal mathematics is itself a conceptual system - deductive logic - but it is a huge mistake to identify this with mathematics. [...] Mathematics lives in its concepts and conceptual systems, which need to be explicitly addressed in the teaching of mathematics." (William Byers, "Deep Thinking: What Mathematics Can Teach Us About the Mind", 2015)

Edwin T Jaynes - Collected Quotes

"In conventional statistical mechanics the energy plays a preferred role among all dynamical quantities because it is conserved both in the time development of isolated systems and in the interaction of different systems. Since, however, the principles of maximum-entropy inference are independent of any physical properties, it appears that in subjective statistical mechanics all measurable quantities may be treated on the same basis, subject to certain precautions." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"Just as in applied statistics the crux of a problem is often the devising of some method of sampling that avoids bias, our problem is that of finding a probability assignment which avoids bias, while agreeing with whatever information is given. The great advance provided by information theory lies in the discovery that there is a unique, unambiguous criterion for the 'amount of uncertainty' represented by a discrete probability distribution, which agrees with our intuitive notions that a broad distribution represents more uncertainty than does a sharply peaked one, and satisfies all other conditions which make it reasonable." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"On the other hand, the 'subjective' school of thought, regards probabilities as expressions of human ignorance; the probability of an event is merely a formal expression of our expectation that the event will or did occur, based on whatever information is available. To the subjectivist, the purpose of probability theory is to help us in forming plausible conclusions in cases where there is not enough information available to lead to certain conclusions; thus detailed verification is not expected. The test of a good subjective probability distribution is does it correctly represent our state of knowledge as to the value of x?" (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"The mere fact that the same mathematical expression -Σ pi log(pi) [i is index], occurs both in statistical mechanics and in information theory does not in itself establish any connection between these fields. This can be done only by finding new viewpoints from which thermodynamic entropy and information-theory entropy appear as the same concept." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"[...] thermodynamics knows of no such notion as the 'entropy of a physical system'. Thermodynamics does have the concept of the entropy of a thermodynamic system; but a given physical system corresponds to many different thermodynamic systems." (Edwin T Jaynes, "Gibbs vs Boltzmann Entropies", 1964)

"In particular, the uncertainty principle has stood for a generation, barring the way to more detailed descriptions of nature; and yet, with the lesson of parity still fresh in our minds, how can anyone be quite so sure of its universal validity when we note that, to this day, it has never been subjected to even one direct experimental test?" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"'You cannot base a general mathematical theory on imprecisely defined concepts. You can make some progress that way; but sooner or later the theory is bound to dissolve in ambiguities which prevent you from extending it further.' Failure to recognize this fact has another unfortunate consequence which is, in a practical sense, even more disastrous: 'Unless the conceptual problems of a field have been clearly resolved, you cannot say which mathematical problems are the relevant ones worth working on; and your efforts are more than likely to be wasted.'" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"In decision theory, mathematical analysis shows that once the sampling distribution, loss function, and sample are specified, the only remaining basis for a choice among different admissible decisions lies in the prior probabilities. Therefore, the logical foundations of decision theory cannot be put in fully satisfactory form until the old problem of arbitrariness (sometimes called 'subjectiveness') in assigning prior probabilities is resolved." (Edwin T Jaynes, "Prior Probabilities", 1978)

"It appears to be a quite general principle that, whenever there is a randomized way of doing something, then there is a nonrandomized way that delivers better performance but requires more thought." (Edwin T Jaynes, "Probability Theory: The Logic of Science", 1979)

"The semiliterate on the next bar stool will tell you with absolute, arrogant assurance just how to solve the world's problems; while the scholar who has spent a lifetime studying their causes is not at all sure how to do this." (Edwin T Jaynes, "Probability Theory: The Logic of Science", 1979)

"The difference is that energy is a property of the microstates, and so all observers, whatever macroscopic variables they may choose to define their thermodynamic states, must ascribe the same energy to a system in a given microstate. But they will ascribe different entropies to that microstate, because entropy is not a property of the microstate, but rather of the reference class in which it is embedded. As we learned from Boltzmann, Planck, and Einstein, the entropy of a thermodynamic state is a measure of the number of microstates compatible with the macroscopic quantities that you or I use to define the thermodynamic state." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

"There is no end to this search for the ultimate ‘true’ entropy until we have reached the point where we control the location of each atom independently. But just at that point the notion of entropy collapses, and we are no longer talking thermodynamics." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

On Conservation Laws (1950-1999)

"In conventional statistical mechanics the energy plays a preferred role among all dynamical quantities because it is conserved both in the time development of isolated systems and in the interaction of different systems. Since, however, the principles of maximum-entropy inference are independent of any physical properties, it appears that in subjective statistical mechanics all measurable quantities may be treated on the same basis, subject to certain precautions." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"It is common knowledge today that in general a symmetry principle (or equivalently an invariance principle) generates a conservation law. For example, the invariance of physical laws under space displacement has as a consequence the conservation of momentum, the invariance under space rotation has as a consequence the conservation of angular momentum." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957)

"Whereas the continuous symmetries always lead to conservation laws in classical mechanics, a discrete symmetry does not. With the introduction of quantum mechanics, however, this difference between the discrete and continuous symmetries disappears. The law of right-left symmetry then leads also to a conservation law: the conservation of parity." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957)

"The law of causality is no longer applied in quantum theory and the law of conservation of matter is no longer true for the elementary particles." (Werner K Heisenberg, "Physics and Philosophy: The revolution in modern science", 1958)

"In the everyday world, energy is always unalterably fixed; the law of energy conservation is a cornerstone of classical physics. But in the quantum microworld, energy can appear and disappear out of nowhere in a spontaneous and unpredictable fashion." (Paul C W Davies, "God and the New Physics", 1983)

"There is a fact, or if you wish, a law governing all natural phenomena that are known to date. There is no known exception to this law - it is exact as far as we know. The law is called the conservation of energy. It states that there is a certain quantity, which we call energy, that does not change in the manifold changes which nature undergoes. That is a most abstract idea, because it is a mathematical principle; it says that there is a numerical quantity which does not change when something happens." (Richard P Feynman et al, "The Feynman Lectures on Physics" Vol. 1, 1983)

"The most abstract conservation laws of physics come into their being in describing equilibrium in the most extreme conditions. They are the most rigorous conservation laws, the last to break down. The more extreme the conditions, the fewer the conserved structures. [...] In a deep sense, we understand the interior of the sun better that the interior of the earth, and the early stages of the big bang best of all." (Frank Wilczek, "Longing for the Harmonies: Themes and Variations from Modern Physics", 1987)

"What is conserved, in modern physics, is not any particular substance or material but only much more abstract entities such as energy, momentum, and electric charge. The permanent aspects of reality are not particular materials or structures but rather the possible forms of structures and the rules for their transformation." (Frank Wilczek, "Longing for the Harmonies: Themes and Variations from Modern Physics", 1987)

06 August 2022

On Symmetry XI

"The investigation of the symmetries of a given mathematical structure has always yielded the most powerful results. Symmetries are maps which preserve certain properties." (Emil Artin, "Geometric Algebra", 1957)

"Whereas the continuous symmetries always lead to conservation laws in classical mechanics, a discrete symmetry does not. With the introduction of quantum mechanics, however, this difference between the discrete and continuous symmetries disappears. The law of right-left symmetry then leads also to a conservation law: the conservation of parity." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957)

"A physical system is said to possess a symmetry if one can make a change in the system such that, after the change, the system is exactly the same as it was before. We call the change we are making to the system a symmetry operation or a symmetry transformation. If a system stays the same when we do a transformation to it, we say that the system is invariant under the transformation." (Leon M Lederman & Christopher T Hill, "Symmetry and the Beautiful Universe", 2004)

"So, a scientist's definition of symmetry would be something like this: symmetry is an invariance of an object or system to a transformation. The invariance is the sameness or constancy of the system in form, appearance, composition, arrangement, and so on, and a transformation is the abstract action we apply to the system that takes it from one state into another, equivalent, one. There are often numerous transformations we can apply on a given system that take it into an equivalent state." (Leon M Lederman & Christopher T Hill, "Symmetry and the Beautiful Universe", 2004)

"Symmetry is ubiquitous. Symmetry has myriad incarnations in the innumerable patterns designed by nature. It is a key element, often the central or defining theme, in art, music, dance, poetry, or architecture. Symmetry permeates all of science, occupying a prominent place in chemistry, biology, physiology, and astronomy. Symmetry pervades the inner world of the structure of matter, the outer world of the cosmos, and the abstract world of mathematics itself. The basic laws of physics, the most fundamental statements we can make about nature, are founded upon symmetry." (Leon M Lederman & Christopher T Hill, "Symmetry and the Beautiful Universe", 2004)

"The symmetries that we sense and observe in the world around us affirm the notion of the existence of a perfect order and harmony underlying everything in the universe. Through symmetry we sense an apparent logic at work in the universe, external to, yet resonant with, our own minds. [...] Symmetry gives wings to our creativity. It provides organizing principles for our artistic impulses and our thinking, and it is a source of hypotheses that we can make to understand the physical world." (Leon M Lederman & Christopher T Hill, "Symmetry and the Beautiful Universe", 2004)

"A symmetry of some mathematical structure is a transformation of that structure, of a specified kind, that leaves specified properties of the structure unchanged." (Ian Stewart, "Symmetry: A Very Short Introduction", 2013)

"A system governed by a deterministic theory can only evolve along a single trajectory - namely, that dictated by its laws and initial conditions; all other trajectories are excluded. Symmetry principles, on the other hand, fit the freedom-inducing model. Rather than distinguishing what is excluded from what is bound to happen, these principles distinguish what is excluded from what is possible. In other words, although they place restrictions on what is possible, they do not usually determine a single trajectory." (Yemima Ben-Menahem, "Causation in Science", 2018)

"Symmetries are transformations that keep certain parameters (properties, equations, and so on) invariant, that is, the parameters they refer to are conserved under these transformations. It is to be expected, therefore, that the identification of conserved quantities is inseparable from the identification of fundamental symmetries in the laws of nature. Symmetries single out 'privileged' operations, conservation laws single out 'privileged' quantities or properties that correspond to these operations. Yet the specific connections between a particular symmetry and the invariance it entails are far from obvious. For instance, the isotropy of space (the indistinguishability of its directions) is intuitive enough, but the conservation of angular momentum based on that symmetry, and indeed, the concept of angular momentum, are far less intuitive." (Yemima Ben-Menahem, "Causation in Science", 2018)

Chen-Ning Yang - Collected Quotes

"It is common knowledge today that in general a symmetry principle (or equivalently an invariance principle) generates a conservation law. For example, the invariance of physical laws under space displacement has as a consequence the conservation of momentum, the invariance under space rotation has as a consequence the conservation of angular momentum." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957)

"Nature possesses an order that one may aspire to comprehend." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957)

"[...] nature seems to take advantage of the simple mathematical representations of the symmetry laws. When one pauses to consider the elegance and the beautiful perfection of the mathematical reasoning involved and contrast it with the complex and far-reaching physical consequences, a deep sense of respect for the power of the symmetry laws never fails to develop." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957)

"The quantum numbers that designate the states of a system are often identical with those that represent the symmetries of the system." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957)

"Whereas the continuous symmetries always lead to conservation laws in classical mechanics, a discrete symmetry does not. With the introduction of quantum mechanics, however, this difference between the discrete and continuous symmetries disappears. The law of right-left symmetry then leads also to a conservation law: the conservation of parity." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957) 

"With the advent of special and general relativity, the symmetry laws gained new importance. Their connection with the dynamic laws of physics takes on a much more integrated and interdependent relationship than in classical mechanics, where logically the symmetry laws were only consequences of the dynamical laws that by chance possess the symmetries. Also in the relativity theories the realm of the symmetry laws was greatly enriched to include invariances that were by no means apparent from daily experience. Their validity rather was deduced from, or was later confirmed by complicated experimentation. Let me emphasize that the conceptual simplicity and intrinsic beauty of the symmetries that so evolve from complex experiments are for the physicists great sources of encouragement. One learns to hope that Nature possesses an order that one may aspire to comprehend." (Chen-Ning Yang, "The Law of Parity Conservation and Other Symmetry Laws of Physics", [Nobel lecture] 1957)

On Conservation Laws (2000-)

"Perhaps the most profound synthesis of physical sciences came from the realization that everything could be understood from 'conservation laws' and symmetry principals." (Didier Sornette, "Why Stock Markets Crash: Critical Events in Complex Systems", 2003)

"According to a 'sociological' view of mathematics, a system, in general, should be able to do whatever is permitted by the laws governing it: the normal state of anarchy is chaos! From this point of view, we should expect that, in the absence of conservation laws, typical motions should be dense in the space available to them; Kolomogorov’s theorem denies this, saying that when the laws are relaxed a bit, the majority of motions stay 'pretty much' where they were, as if in fear of a non-existent police force." (John H Hubbard, "The KAM Theorem", 2004)

"A great deal of the results in many areas of physics are presented in the form of conservation laws, stating that some quantities do not change during evolution of the system. However, the formulations in cybernetical physics are different. Since the results in cybernetical physics establish how the evolution of the system can be changed by control, they should be formulated as transformation laws, specifying the classes of changes in the evolution of the system attainable by control function from the given class, i.e., specifying the limits of control." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"Each of the most basic physical laws that we know corresponds to some invariance, which in turn is equivalent to a collection of changes which form a symmetry group. […] whilst leaving some underlying theme unchanged. […] for example, the conservation of energy is equivalent to the invariance of the laws of motion with respect to translations backwards or forwards in time […] the conservation of linear momentum is equivalent to the invariance of the laws of motion with respect to the position of your laboratory in space, and the conservation of angular momentum to an invariance with respect to directional orientation [...] discovery of conservation laws indicated that Nature possessed built-in sustaining principles which prevented the world from just ceasing to be." (John D Barrow, "New Theories of Everything", 2007)

"The methodology of feedback design is borrowed from cybernetics (control theory). It is based upon methods of controlled system model’s building, methods of system states and parameters estimation (identification), and methods of feedback synthesis. The models of controlled system used in cybernetics differ from conventional models of physics and mechanics in that they have explicitly specified inputs and outputs. Unlike conventional physics results, often formulated as conservation laws, the results of cybernetical physics are formulated in the form of transformation laws, establishing the possibilities and limits of changing properties of a physical system by means of control." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"Entanglement (non-separability) has been interpreted in several non-physical ways, including recourse to telekinesis; it has also been claimed that it refutes realism and confirms holism. In my view, all entanglement does is to confirm the thesis Once a system, always a system. However, this is not an independent postulate, but a consequence of conservation laws." (Mario Bunge, "Matter and Mind: A Philosophical Inquiry", 2010)

"Symmetries are transformations that keep certain parameters (properties, equations, and so on) invariant, that is, the parameters they refer to are conserved under these transformations. It is to be expected, therefore, that the identification of conserved quantities is inseparable from the identification of fundamental symmetries in the laws of nature. Symmetries single out 'privileged' operations, conservation laws single out 'privileged' quantities or properties that correspond to these operations. Yet the specific connections between a particular symmetry and the invariance it entails are far from obvious. For instance, the isotropy of space (the indistinguishability of its directions) is intuitive enough, but the conservation of angular momentum based on that symmetry, and indeed, the concept of angular momentum, are far less intuitive." (Yemima Ben-Menahem, "Causation in Science", 2018)

05 August 2022

On Conservation Laws (-1949)

"A true philosopher does not engage in vain disputes about the nature of motion; rather, he wishes to know the laws by which it is distributed, conserved or destroyed, knowing that such laws is the basis for all natural philosophy." (Pierre L Maupertuis, "Les Loix du Mouvement et du Repos, déduites d'un Principe Métaphysique", 1746) 

"The supreme Being is everywhere; but He is not equally visible everywhere. Let us seek Him in the simplest things, in the most fundamental laws of Nature, in the universal rules by which movement is conserved, distributed or destroyed; and let us not seek Him in phenomena that are merely complex consequences of these laws." (Pierre L Maupertuis, "Les Loix du Mouvement et du Repos, déduites d'un Principe Métaphysique", 1746) 

"Nature as a whole possesses a store of force which cannot in any way be either increased or diminished [...] therefore, the quantity of force in Nature is just as eternal and unalterable as the quantity of matter [...]. I have named [this] general law 'The Principle of the Conservation of Force'." (Hermann von Helmholtz, "Uber die Erhaltung der Kraft", 1847)

"Energy really is only an integral; now, what we want to have is a substantial definition, like that of Leibniz, and this demand is justifiable to a certain degree, since our very conviction of the conservation of energy rests in great part on this foundation. [..] And so the manuals of physics contain really two discordant definitions of energy, the first which is verbal, intelligible, capable of establishing our conviction, and false; and the second which is mathematical, exact, but lacking verbal expression." (Emile Meyerson, "Identity & Reality", 1908)

"The miracles of religion are to be discredited, not because we cannot conceive of them, but because they run counter to all the rest of our knowledge; while the mysteries of science, such as chemical affinity, the conservation of energy, the indivisibility of the atom, the change of the non-living into the living […] extend the boundaries of our knowledge, though the modus operandi of the changes remains hidden." (John Burroughs, "Scientific Faith", The Atlantic Monthly, 1915)

"The most important result of a general character to which the special theory has led is concerned with the conception of mass. Before the advent of relativity, physics recognized two conservation laws of fundamental importance, namely, the law of conservation of energy and the law of the conservation of mass; these two fundamental laws appeared to be quite independent of each other. By means of the theory of relativity they have been united into one law." (Albert Einstein, 1920)

"Matter [...] could be measured as a quantity and [...] its characteristic expression as a substance was the Law of Conservation of Matter [...] This, which has hitherto represented our knowledge of space and matter, and which was in many quarters claimed by philosophers as a priori knowledge, absolutely general and necessary, stands to-day a tottering structure." (Hermann Weyl, "Space, Time, Matter", 1922)

Brian Christian - Collected Quotes

"As with all issues involving overfitting, how early to stop depends on the gap between what you can measure and what really matters. If you have all the facts, they’re free of all error and uncertainty, and you can directly assess whatever is important to you, then don’t stop early. Think long and hard: the complexity and effort are appropriate." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

"Bayes’s Rule tells us that when it comes to making predictions based on limited evidence, few things are as important as having good priors - that is, a sense of the distribution from which we expect that evidence to have come. Good predictions thus begin with having good instincts about when we’re dealing with a normal distribution and when with a power-law distribution. As it turns out, Bayes’s Rule offers us a simple but dramatically different predictive rule of thumb for each." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)
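
Later in the same book, the authors name these rules of thumb: a 'Multiplicative Rule' for power-law priors and an 'Average Rule' for normal priors. The sketch below is a loose, simplified paraphrase with illustrative numbers, not the authors' code:

```python
# Hedged sketch of the two predictive rules of thumb described above;
# the scenarios and constants are illustrative assumptions.

def multiplicative_rule(observed_so_far, factor=2.0):
    """Power-law prior (e.g., box-office grosses): predict a constant
    multiple of the quantity observed so far."""
    return factor * observed_so_far

def average_rule(observed_so_far, population_average):
    """Normal prior (e.g., life spans): predict roughly the distribution's
    average, but never less than what has already been observed."""
    return max(population_average, observed_so_far)

print(multiplicative_rule(10.0))   # power law: double what you have seen
print(average_rule(6, 76))         # far below average: predict the average
print(average_rule(80, 76))        # above average: observed value dominates
```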

"Game theory covers an incredibly broad spectrum of scenarios of cooperation and competition, but the field began with those resembling heads-up poker: two-person contests where one player’s gain is another player’s loss. Mathematicians analyzing these games seek to identify a so-called equilibrium: that is, a set of strategies that both players can follow such that neither player would want to change their own play, given the play of their opponent. It’s called an equilibrium because it’s stable—no amount of further reflection by either player will bring them to different choices. I’m content with my strategy, given yours, and you’re content with your strategy, given mine." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

"In other words, overfitting poses a danger any time we’re dealing with noise or mismeasurement—and we almost always are. There can be errors in how the data were collected, or in how they were reported. Sometimes the phenomena being investigated, such as human happiness, are hard to even define, let alone measure. Thanks to their flexibility, the most complex models available to us can fit any patterns that appear in the data, but this means that they will also do so even when those patterns are mere phantoms and mirages in the noise." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

"Many prediction algorithms, for instance, start out by searching for the single most important factor rather than jumping to a multi-factor model. Only after finding that first factor do they look for the next most important factor to add to the model, then the next, and so on. Their models can therefore be kept from becoming overly complex simply by stopping the process short, before overfitting has had a chance to creep in. A related approach to calculating predictions considers one data point at a time, with the model tweaked to account for each new point before more points are added; there, too, the complexity of the model increases gradually, so stopping the process short can help keep it from overfitting." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

"Randomness seems like the opposite of reason - a form of giving up on a problem, a last resort. Far from it. The surprising and increasingly important role of randomness in computer science shows us that making use of chance can be a deliberate and effective part of approaching the hardest sets of problems. In fact, there are times when nothing else will do."  (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

"The effectiveness of regularization in all kinds of machine-learning tasks suggests that we can make better decisions by deliberately thinking and doing less. If the factors we come up with first are likely to be the most important ones, then beyond a certain point thinking more about a problem is not only going to be a waste of time and effort - it will lead us to worse solutions. Early Stopping provides the foundation for a reasoned argument against reasoning, the thinking person’s case against thought. But turning this into practical advice requires answering one more question: when should we stop thinking?" (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

"The greater the uncertainty, the bigger the gap between what you can measure and what matters, the more you should watch out for overfitting - that is, the more you should prefer simplicity." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

On Certainty (2000-)

"Information entropy has its own special interpretation and is defined as the degree of unexpectedness in a message. The more unexpected words or phrases, the higher the entropy. It may be calculated with the regular binary logarithm on the number of existing alternatives in a given repertoire. A repertoire of 16 alternatives therefore gives a maximum entropy of 4 bits. Maximum entropy presupposes that all probabilities are equal and independent of each other. Minimum entropy exists when only one possibility is expected to be chosen. When uncertainty, variety or entropy decreases it is thus reasonable to speak of a corresponding increase in information." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Most physical systems, particularly those complex ones, are extremely difficult to model by an accurate and precise mathematical formula or equation due to the complexity of the system structure, nonlinearity, uncertainty, randomness, etc. Therefore, approximate modeling is often necessary and practical in real-world applications. Intuitively, approximate modeling is always possible. However, the key questions are what kind of approximation is good, where the sense of 'goodness' has to be first defined, of course, and how to formulate such a good approximation in modeling a system such that it is mathematically rigorous and can produce satisfactory results in both theory and applications." (Guanrong Chen & Trung Tat Pham, "Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems", 2001) 

"In fact, H [entropy] measures the amount of uncertainty that exists in the phenomenon. If there were only one event, its probability would be equal to 1, and H would be equal to 0 - that is, there is no uncertainty about what will happen in a phenomenon with a single event because we always know what is going to occur. The more events that a phenomenon possesses, the more uncertainty there is about the state of the phenomenon. In other words, the more entropy, the more information." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"The storytelling mind is allergic to uncertainty, randomness, and coincidence. It is addicted to meaning. If the storytelling mind cannot find meaningful patterns in the world, it will try to impose them. In short, the storytelling mind is a factory that churns out true stories when it can, but will manufacture lies when it can't." (Jonathan Gottschall, "The Storytelling Animal: How Stories Make Us Human", 2012)

"The data is a simplification - an abstraction - of the real world. So when you visualize data, you visualize an abstraction of the world, or at least some tiny facet of it. Visualization is an abstraction of data, so in the end, you end up with an abstraction of an abstraction, which creates an interesting challenge. […] Just like what it represents, data can be complex with variability and uncertainty, but consider it all in the right context, and it starts to make sense." (Nathan Yau, "Data Points: Visualization That Means Something", 2013)

"Without precise predictability, control is impotent and almost meaningless. In other words, the lesser the predictability, the harder the entity or system is to control, and vice versa. If our universe actually operated on linear causality, with no surprises, uncertainty, or abrupt changes, all future events would be absolutely predictable in a sort of waveless orderliness." (Lawrence K Samuels, "Defense of Chaos", 2013)

"We have minds that are equipped for certainty, linearity and short-term decisions, that must instead make long-term decisions in a non-linear, probabilistic world." (Paul Gibbons, "The Science of Successful Organizational Change", 2015)

"The greater the uncertainty, the bigger the gap between what you can measure and what matters, the more you should watch out for overfitting - that is, the more you should prefer simplicity." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

"A notable difference between many fields and data science is that in data science, if a customer has a wish, even an experienced data scientist may not know whether it’s possible. Whereas a software engineer usually knows what tasks software tools are capable of performing, and a biologist knows more or less what the laboratory can do, a data scientist who has not yet seen or worked with the relevant data is faced with a large amount of uncertainty, principally about what specific data is available and about how much evidence it can provide to answer any given question. Uncertainty is, again, a major factor in the data scientific process and should be kept at the forefront of your mind when talking with customers about their wishes."  (Brian Godsey, "Think Like a Data Scientist", 2017)

"The elements of this cloud of uncertainty (the set of all possible errors) can be described in terms of probability. The center of the cloud is the number zero, and elements of the cloud that are close to zero are more probable than elements that are far away from that center. We can be more precise in this definition by defining the cloud of uncertainty in terms of a mathematical function, called the probability distribution." (David S Salsburg, "Errors, Blunders, and Lies: How to Tell the Difference", 2017)

"Uncertainty is an adversary of coldly logical algorithms, and being aware of how those algorithms might break down in unusual circumstances expedites the process of fixing problems when they occur - and they will occur. A data scientist’s main responsibility is to try to imagine all of the possibilities, address the ones that matter, and reevaluate them all as successes and failures happen." (Brian Godsey, "Think Like a Data Scientist", 2017)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"Estimates based on data are often uncertain. If the data were intended to tell us something about a wider population (like a poll of voting intentions before an election), or about the future, then we need to acknowledge that uncertainty. This is a double challenge for data visualization: it has to be calculated in some meaningful way and then shown on top of the data or statistics without making it all too cluttered." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)


Alexander von Humboldt - Collected Quotes

"Whatever relates to extent and quantity may be represented by geometrical figures. Statistical projections which speak to the senses w...