24 October 2023

On Noise VI

"In a white-noise process, every value of the process (e.g., the successive frequencies of a melody) is completely independent of its past - it is a total surprise. By contrast, in 'brown music' (a term derived from Brownian motion), only the increments are independent of the past, giving rise to a rather boring tune." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"In contrast to gravitation, interatomic forces are typically modeled as inhomogeneous power laws with at least two different exponents. Such laws (and exponential laws, too) are not scale-free; they necessarily introduce a characteristic length, related to the size of the atoms. Power laws also govern the power spectra of all kinds of noises, most intriguing among them the ubiquitous (but sometimes difficult to explain)." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"We might expect that the noise will 'smear out' each data point and make it difficult for the network to fit individual data points precisely and hence will reduce over-fitting. In practice it has been demonstrated that training with noise can indeed lead to improvements in network generalization." (Christopher M Bishop, "Neural Networks for Pattern Recognition", 1995)

"A moderate amount of noise leads to enhanced order in excitable systems, manifesting itself in a nearly periodic spiking of single excitable systems, enhancement of synchronized oscillations in coupled systems, and noise-induced stability of spatial pattens in reaction-diffusion systems." (Benjamin Lindner et al, "Effects of Noise in Excitable Systems", Physical Reports. vol. 392, 2004)

"Linear systems do not benefit from noise because the output of a linear system is just a simple scaled version of the input [...] Put noise in a linear system and you get out noise. Sometimes you get out a lot more noise than you put in. This can produce explosive effects in feedback systems that take their own outputs as inputs." (Bart Kosko, "Noise", 2006)

"One person’s signal is another person’s noise and vice versa. We call this relative role reversal the noise-signal duality." (Bart Kosko, "Noise", 2006)

"Many scientists who work not just with noise but with probability make a common mistake: They assume that a bell curve is automatically Gauss's bell curve. Empirical tests with real data can often show that such an assumption is false. The result can be a noise model that grossly misrepresents the real noise pattern. It also favors a limited view of what counts as normal versus non-normal or abnormal behavior. This assumption is especially troubling when applied to human behavior. It can also lead one to dismiss extreme data as error when in fact the data is part of a pattern." (Bart Kosko, "Noise", 2006)

"Noise is an unwanted signal. A signal is anything that conveys information or ultimately anything that has energy. The universe consists of a great deal of energy. Indeed a working definition of the universe is all energy anywhere ever. So the answer turns on how one defines what it means to be wanted and by whom." (Bart Kosko, "Noise", 2006)

"The noise takes its toll on the message as it randomly turns some of the 1 bits into 0 bits and randomly turns some of the 0 bits into 1 bits: Noise randomly flips bits. [...] But noise can be subtler in a digital system. Noise can disturb the timing of when a bit value arrives at a receiver as well as randomly flipping that bit value." (Bart Kosko, "Noise", 2006)

"[...] in the statistical world, what we see and measure around us can be considered as the sum of a systematic mathematical idealized form plus some random contribution that cannot yet be explained. This is the classic idea of the signal and the noise." (David Spiegelhalter, "The Art of Statistics: Learning from Data", 2019)

02 September 2023

Bart Kosko - Collected Quotes

"A bell curve shows the 'spread' or variance in our knowledge or certainty. The wider the bell the less we know. An infinitely wide bell is a flat line. Then we know nothing. The value of the quantity, position, or speed could lie anywhere on the axis. An infinitely narrow bell is a spike that is infinitely tall. Then we have complete knowledge of the value of the quantity. The uncertainty principle says that as one bell curve gets wider the other gets thinner. As one curve peaks the other spreads. So if the position bell curve becomes a spike and we have total knowledge of position, then the speed bell curve goes flat and we have total uncertainty (infinite variance) of speed." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"A fuzzy cognitive map or FCM draws a causal picture. It ties facts and things and processes to values and policies and objectives. And it lets you predict how complex events interact and play out. [...] Neural nets give a shortcut to tuning an FCM. The trick is to let the fuzzy causal edges change as if they were synapses in a neural net. They cannot change with the same math laws because FCM edges stand for causal effect not signal flow. We bombard the FCM nodes with real data. The data state which nodes are on or off and to which degree at each moment in time. Then the edges grow among the nodes."  (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"A neural net acts like the eyes and ears of an adaptive fuzzy system, a fuzzy system whose rules change with experience. The neural net senses fuzzy patterns in the data and learns to associate the patterns. The associations are rules: If fuzzy set A, then fuzzy set B." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Bivalence trades accuracy for simplicity. Binary outcomes of yes and no, white and black, true and false simplify math and computer processing. You can work with strings of 0s and 1s more easily than you can work with fractions. But bivalence requires some force fitting and rounding off [...] Bivalence holds at cube corners. Multivalence holds everywhere else." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Descriptions split into two groups, the logical and the factual, or the mathematical and the scientific, or the coherent and the correspondent. The split depends on accuracy. Logical statements are completely accurate or completely inaccurate. They alone are all or none. Factual statements are partially accurate or partially inaccurate." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Fuzziness has a formal name in science: multivalence. The opposite of fuzziness is bivalence or two-valuedness, two ways to answer each question, true or false, 1 or 0. Fuzziness means multivalence. It means three or more options, perhaps an infinite spectrum of options, instead of just two extremes. It means analog instead of binary, infinite shades of gray between black and white." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Fuzzy entropy measures the fuzziness of a fuzzy set. It answers the question 'How fuzzy is a fuzzy set?' And it is a matter of degree. Some fuzzy sets are fuzzier than others. Entropy means the uncertainty or disorder in a system. A set describes a system or collection of things. When the set is fuzzy, when elements belong to it to some degree, the set is uncertain or vague to some degree. Fuzzy entropy measures this degree. And it is simple enough that you can see it in a picture of a cube." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"[...] fuzzy things resemble fuzzy nonthings. A resembles not-A. Fuzzy things have vague boundaries with their opposites, with nonthings. The more a thing resembles its opposite, the fuzzier it is. In the fuzziest case the thing equals its opposite: the glass of water half empty and half full [...]" (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"I called this the mismatch problem: The world is gray but science is black and white. We talk in zeroes and ones but the truth lies in between. Fuzzy world, nonfuzzy description. The statements of formal logic and computer programming are all true or all false, 1 or 0. But statements about the world differ." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Laws of science are not laws at all. They are not laws in the sense of logical laws like two plus two equals four. Logic does not legislate them. Laws of science state tendencies we have recently observed in our corner of the universe. The best you can say about them is so far, so good. In the next instant every 'law' of science may change. Their truth is a matter of degree and is always up for grabs." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Math is a formal system. We can manipulate math symbols and not understand what they mean. We can just apply the syntax rules as a computer does when it adds up numbers or proves a theorem. The computer shows the truth of the theorem but does not 'understand' its 'meaning'." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Science undercuts ethics because we have made science the measure of all things. Truth means truth of science. Truth means logical truth or factual truth. Truth means math proof or data test. The truth can be a matter of degree. But that does not help ethics." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Scientific claims or statements are inexact and provisional. They depend on dozens of simplifying assumptions and on a particular choice of words and symbols and on 'all other things being equal'." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Scientists try to make things simple. That is in good part why we are stuck with bivalence. Scientists' first instinct is to fit a linear model to a nonlinear world. This creates another mismatch problem, the math modeler's dilemma: linear math, nonlinear world." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Somewhere in the process wishful thinking seems to take over. Scientists start to believe they do math when they do science. This holds to greatest degree in an advanced science like physics or at the theoretical frontier of any science where the claims come as math claims. The first victim is truth. What was inaccurate or fuzzy truth all along gets bumped up a letter grade to the all-or-none status of binary logic. Most scientists draw the line at giving up the tentative status of science. They will concede that it can all go otherwise in the next experiment. But most have crossed the bivalent line by this point and believe that in the next experiment a statement or hypothesis or theory may jump from TRUE to FALSE, from 1 to 0." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"The binary logic of modern computers often falls short when describing the vagueness of the real world. Fuzzy logic offers more graceful alternatives." (Bart Kosko & Satoru Isaka, "Fuzzy Logic,” Scientific American Vol. 269, 1993)

"We can put black-and-white labels on these things. But labels will pass from accurate to inaccurate as the things change. Language ties a string between a word and the thing it stands for. When the thing changes to a nonthing, the string stretches or breaks or tangles with other strings." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)

"A signal has a finite-length frequency spectrum only if it lasts infinitely long in time. So a finite spectrum implies infinite time and vice versa. The reverse also holds in the ideal world of mathematics: A signal is finite in time only if it has a frequency spectrum that is infinite in extent." (Bart Kosko, "Noise", 2006)

"Adaptive systems learn by enlightened trial and error. The system can take a long time to learn well just as it can take a human a long time to learn to properly swing a golf club even with the help of the best golf instructor. But this iterative learning can also produce solutions that we could not find or at least could not find easily by pure mathematical analysis."  (Bart Kosko, "Noise", 2006)

"Any technical discussion of noise begins with white noise because white noise is pure or ideal noise. White noise serves as the gold standard of noise. Scientists and engineers have explored hundreds of other noise types but most of these deviate from white noise in some specific way. White noise is noisy because it has a wide and flat band of frequencies if one looks at its spectrum. This reflects the common working definition of noise as a so-called wideband signal. Good signals or wanted signals concentrate their energy on a comparatively narrow band of the frequency spectrum. Hence good signals tend to be so-called narrowband signals at least relative to the wide band of white noise. White noise is so noisy because its spectrum is as wide as possible - it runs the whole infinite length of the frequency spectrum itself. So pure or ideal white noise exists only as a mathematical abstraction. It cannot exist physically because it would require infinite energy." (Bart Kosko, "Noise", 2006)

"Bell curves don't differ that much in their bells. They differ in their tails. The tails describe how frequently rare events occur. They describe whether rare events really are so rare. This leads to the saying that the devil is in the tails." (Bart Kosko, "Noise", 2006)

"Chaos can leave statistical footprints that look like noise. This can arise from simple systems that are deterministic and not random. [...] The surprising mathematical fact is that most systems are chaotic. Change the starting value ever so slightly and soon the system wanders off on a new chaotic path no matter how close the starting point of the new path was to the starting point of the old path. Mathematicians call this sensitivity to initial conditions but many scientists just call it the butterfly effect. And what holds in math seems to hold in the real world - more and more systems appear to be chaotic." (Bart Kosko, "Noise", 2006)

"'Chaos' refers to systems that are very sensitive to small changes in their inputs. A minuscule change in a chaotic communication system can flip a 0 to a 1 or vice versa. This is the so-called butterfly effect: Small changes in the input of a chaotic system can produce large changes in the output. Suppose a butterfly flaps its wings in a slightly different way. can change its flight path. The change in flight path can in time change how a swarm of butterflies migrates." (Bart Kosko, "Noise", 2006)

"I wage war on noise every day as part of my work as a scientist and engineer. We try to maximize signal-to-noise ratios. We try to filter noise out of measurements of sounds or images or anything else that conveys information from the world around us. We code the transmission of digital messages with extra 0s and 1s to defeat line noise and burst noise and any other form of interference. Wc design sophisticated algorithms to track noise and then cancel it in headphones or in a sonogram. Some of us even teach classes on how to defeat this nemesis of the digital age. Such action further conditions our anti-noise reflexes." (Bart Kosko, "Noise", 2006)

"Is the universe noise? That question is not as strange as it sounds. Noise is an unwanted signal. A signal is anything that conveys information or ultimately anything that has energy. The universe consists of a great deal of energy. Indeed a working definition of the universe is all energy anywhere ever. So the answer turns on how one defines what it means to be wanted and by whom." (Bart Kosko, "Noise", 2006)

"Linear systems do not benefit from noise because the output of a linear system is just a simple scaled version of the input [...] Put noise in a linear system and you get out noise. Sometimes you get out a lot more noise than you put in. This can produce explosive effects in feedback systems that take their own outputs as inputs." (Bart Kosko, "Noise", 2006)

"Many scientists who work not just with noise but with probability make a common mistake: They assume that a bell curve is automatically Gauss's bell curve. Empirical tests with real data can often show that such an assumption is false. The result can be a noise model that grossly misrepresents the real noise pattern. It also favors a limited view of what counts as normal versus non-normal or abnormal behavior. This assumption is especially troubling when applied to human behavior. It can also lead one to dismiss extreme data as error when in fact the data is part of a pattern." (Bart Kosko, "Noise", 2006)

"Mutual information is the receiver's entropy minus the conditional entropy of what the receiver receives - given what message the sender sends through the noisy channel. Conditioning or getting data can only reduce uncertainty and so this gap is always positive or zero. It can never be negative. You can only learn from further experience. Information theorists capture this theorem in a slogan: Conditioning reduces entropy. The channel capacity itself is the largest gap given all possible probability descriptions of what [the sender] sent. It is the most information that on average you could ever get out of the noisy channel." (Bart Kosko, "Noise", 2006)

"Noise is a signal we don't like. Noise has two parts. The first has to do with the head and the second with the heart. The first part is the scientific or objective part: Noise is a signal. [...] The second part of noise is the subjective part: It deals with values. It deals with how we draw the fuzzy line between good signals and bad signals. Noise signals are the bad signals. They are the unwanted signals that mask or corrupt our preferred signals. They not only interfere but they tend to interfere at random." (Bart Kosko, "Noise", 2006)

"Noise is an unwanted signal. A signal is anything that conveys information or ultimately anything that has energy. The universe consists of a great deal of energy. Indeed a working definition of the universe is all energy anywhere ever. So the answer turns on how one defines what it means to be wanted and by whom." (Bart Kosko, "Noise", 2006)

"One person’s signal is another person’s noise and vice versa. We call this relative role reversal the noise-signal duality." (Bart Kosko, "Noise", 2006)

"One of the ironies of mathematics is that the ratio of two Gaussian quantities gives a Cauchy quantity. So you get Cauchy noise if you divide one Gaussian white noise process by another. [...] There is a still deeper relationship between the Cauchy and Gaussian bell curves. Both belong to a special family of probability curves called stable distributions [...] Gaussian quantities [...] are closed under addition. If you add two Gaussian noises then the result is still a Gaussian noise. This 'stable' property is not true for most noise or probability types. It is true for Cauchy processes." (Bart Kosko, "Noise", 2006)

"The central limit theorem differs from laws of large numbers because random variables vary and so they differ from constants such as population means. The central limit theorem says that certain independent random effects converge not to a constant population value such as the mean rate of unemployment but rather they converge to a random variable that has its own Gaussian bell-curve description." (Bart Kosko, "Noise", 2006)

"The flaw in the classical thinking is the assumption that variance equals dispersion. Variance tends to exaggerate outlying data because it squares the distance between the data and their mean. This mathematical artifact gives too much weight to rotten apples. It can also result in an infinite value in the face of impulsive data or noise. [...] Yet dispersion remains an elusive concept. It refers to the width of a probability bell curve in the special but important case of a bell curve. But most probability curves don't have a bell shape. And its relation to a bell curve's width is not exact in general. We know in general only that the dispersion increases as the bell gets wider. A single number controls the dispersion for stable bell curves and indeed for all stable probability curves - but not all bell curves are stable curves." (Bart Kosko, "Noise", 2006)

"The noise takes its toll on the message as it randomly turns some of the 1 bits into 0 bits and randomly turns some of the 0 bits into 1 bits: Noise randomly flips bits. [...] But noise can be subtler in a digital system. Noise can disturb the timing of when a bit value arrives at a receiver as well as randomly flipping that bit value." (Bart Kosko, "Noise", 2006)

"The universe is noisy on all scales even if the universe itself is not noise. The fading noise of the ancient big bang explosion fills the cosmos. It still gently hisses and crackles all around us in the form of junk microwave radiation." (Bart Kosko, "Noise", 2006)

On Noise V

"If the channel is noisy it is not in general possible to reconstruct the original message or the transmitted signal with certainty by any operation on the received signal. There are ways, however, of transmitting the information which are optimal in combating noise." (Claude E. Shannon, "A Mathematical Theory of Communication", Bell System Technical journal, 1948)

"Black-noise phenomena govern natural and unnatural catastrophes like floods, droughts, bear markets, and various outrageous outages, such as those of electrical power. Because of their black spectra, such disasters often come in clusters." (Manfred R Schroeder, "Fractals, Chaos, Power Laws", 1991)

"Engineers have sought to minimize the effects of noise in electronic circuits and communication systems. But recent research has established that noise can play a constructive role in the detection of weak periodic signals." (Kurt Wiesenfeld & Frank Moss, "Stochastic Resonance and the Benefits of Noise: From Ice Ages to Crayfish and SQUIDs", Nature vol. 373, 1995)

"Most engineering systems in communication, control, and signal processing are developed under the often erroneous assumption that the interfering noise is Gaussian. Many physical environments are more accurately modeled as impulsive, characterized by heavy-tailed non-Gaussian distributions. The performances of systems developed under the assumption of Gaussian noise can be severely degraded by the non-Gaussian noise due to potent deviation from normality in the tails." (Seong Rag Kim & Adam Efron, "Adaptive Robust Impulse Noise Filtering", IEEE Transactions on Signal Processing vol. 43 (8), 1995)

"Stochastic resonance simply stands for a new paradigm wherein noise represents a useful tool rather than a nuisance." (Luca Gammaitoni et al, "Stochastic Resonance" Reviews of Modern Physics vol. 70 (l), 1998)

"Uncovering the mysteries of natural phenomena that were formerly someone else's 'noise' is a recurring theme in science." (Alfred Bedard Jr & Thomas Georges, "Atmospheric Infrasound", Physics Today vol. 53 (3), 2000)

"Not all systems crackle. Some respond to external fines with many similar-sized small events (popcorn popping it is heated). Others give way in one single event (chalk snapping as it is stressed). Crackling noise is between these two limits." (James P. Sethna et al, "Crackling Noise", Nature vol. 410, 2001)

"Apart from intrinsic noise sources at the level of an individual neuron there are also sources of noise that are due to signal transmission and network effects. Synaptic transmission failures, for instance, seem to impose a substantial limitation within a neuronal network." (Wulfram Gerstner & Werner Kistler, "Spiking Neuron Models: Single Neurons, Population, Plasticity", 2002)

"Any technical discussion of noise begins with white noise because white noise is pure or ideal noise. White noise serves as the gold standard of noise. Scientists and engineers have explored hundreds of other noise types but most of these deviate from white noise in some specific way. White noise is noisy because it has a wide and flat band of frequencies if one looks at its spectrum. This reflects the common working definition of noise as a so-called wideband signal. Good signals or wanted signals concentrate their energy on a comparatively narrow band of the frequency spectrum. Hence good signals tend to be so-called narrowband signals at least relative to the wide band of white noise. White noise is so noisy because its spectrum is as wide as possible - it runs the whole infinite length of the frequency spectrum itself. So pure or ideal white noise exists only as a mathematical abstraction. It cannot exist physically because it would require infinite energy." (Bart Kosko, "Noise", 2006)

"Noise is a signal we don't like. Noise has two parts. The first has to do with the head and the second with the heart. The first part is the scientific or objective part: Noise is a signal. [...] The second part of noise is the subjective part: It deals with values. It deals with how we draw the fuzzy line between good signals and bad signals. Noise signals are the bad signals. They are the unwanted signals that mask or corrupt our preferred signals. They not only interfere but they tend to interfere at random." (Bart Kosko, "Noise", 2006)

05 April 2023

On Noise IV

"Experiments usually are looking for 'signals' of truth, and the search is always ham pered by 'noise' of one kind or another. In judging someone else's experimental results it's important to find out whether they represent a true signal or whether they are just so much noise." (Robert Hooke, "How to Tell the Liars from the Statisticians", 1983)

"In a real experiment the noise present in a signal is usually considered to be the result of the interplay of a large number of degrees of freedom over which one has no control. This type of noise can be reduced by improving the experimental apparatus. But we have seen that another type of noise, which is not removable by any refinement of technique, can be present. This is what we have called the deterministic noise. Despite its intractability it provides us with a way to describe noisy signals by simple mathematical models, making possible a dynamical system approach to the problem of turbulence." (David Ruelle, "Chaotic Evolution and Strange Attractors: The statistical analysis of time series for deterministic nonlinear systems", 1989)

"Fitting is essential to visualizing hypervariate data. The structure of data in many dimensions can be exceedingly complex. The visualization of a fit to hypervariate data, by reducing the amount of noise, can often lead to more insight. The fit is a hypervariate surface, a function of three or more variables. As with bivariate and trivariate data, our fitting tools are loess and parametric fitting by least-squares. And each tool can employ bisquare iterations to produce robust estimates when outliers or other forms of leptokurtosis are present." (William S Cleveland, "Visualizing Data", 1993)

"Noise is a problem in most signals. [...] It's easy to see that noise is random; it fluctuates erratically with no pattern." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"Although the shape of chaos is nightmarish, its voice is oddly soothing. When played through a loudspeaker, chaos sounds like white noise, like the soft static that helps insomniacs fall asleep." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Before you can even consider creating a data story, you must have a meaningful insight to share. One of the essential attributes of a data story is a central or main insight. Without a main point, your data story will lack purpose, direction, and cohesion. A central insight is the unifying theme (telos appeal) that ties your various findings together and guides your audience to a focal point or climax for your data story. However, when you have an increasing amount of data at your disposal, insights can be elusive. The noise from irrelevant and peripheral data can interfere with your ability to pinpoint the important signals hidden within its core." (Brent Dykes, "Effective Data Storytelling: How to Drive Change with Data, Narrative and Visuals", 2019)

"In addition to managing how the data is visualized to reduce noise, you can also decrease the visual interference by minimizing the extraneous cognitive load. In these cases, the nonrelevant information and design elements surrounding the data can cause extraneous noise. Poor design or display decisions by the data storyteller can inadvertently interfere with the communication of the intended signal. This form of noise can occur at both a macro and micro level." (Brent Dykes, "Effective Data Storytelling: How to Drive Change with Data, Narrative and Visuals", 2019)

"A defining feature of system noise is that it is unwanted, and we should stress right here that variability in judgments is not always unwanted." (Daniel Kahneman, "Noise: A Flaw in Human Judgment", 2021)

"A general property of noise is that you can recognize and measure it while knowing nothing about the target or bias." (Daniel Kahneman, "Noise: A Flaw in Human Judgment", 2021) 

"Bias and noise - systematic deviation and random scatter - are different components of error. […] To understand error in judgment, we must understand both bias and noise. Sometimes, as we will see, noise is the more important problem. But in public conversations about human error and in organizations all over the world, noise is rarely recognized. Bias is the star of the show. Noise is a bit player, usually offstage. […] Wherever you look at human judgments, you are likely to find noise. To improve the quality of our judgments, we need to overcome noise as well as bias." (Daniel Kahneman, "Noise: A Flaw in Human Judgment", 2021)

09 January 2023

John R Pierce - Collected Quotes

"A valid scientific theory seldom if ever offers the solution to the pressing problems which we repeatedly state. It seldom supplies a sensible answer to our multitudinous questions. Rather than rationalizing our ideas, it discards them entirely, or, rather, it leaves them as they were. It tells us in a fresh and new way what aspects of our experience can profitably be related and simply understood." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Communication theory deals with certain important but abstract aspects of communication. Communication theory proceeds from clear and definite assumptions to theorems concerning information sources and communication channels. In this it is essentially mathematical, and in order to understand it we must understand the idea of a theorem as a statement which must be proved, that is, which must be shown to be the necessary consequence of a set of initial assumptions. This is an idea which is the very heart of mathematics as mathematicians understand it." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Communication theory tells us how many bits of information can be sent per second over perfect and imperfect communication channels in terms of rather abstract descriptions of the properties of these channels. Communication theory tells us how to measure the rate at which a message source, such as a speaker or a writer, generates information. Communication theory tells us how to represent, or encode, messages from a particular message source efficiently for transmission over a particular sort of channel, such as an electrical circuit, and it tells us when we can avoid errors in transmission." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"However, it turns out that a one-to-one mapping of the points in a square into the points on a line cannot be continuous. As we move smoothly along a curve through the square, the points on the line which represent the successive points on the square necessarily jump around erratically, not only for the mapping described above but for any one-to-one mapping whatever. Any one-to-one mapping of the square onto the line is discontinuous." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"In communication theory we consider a message source, such as a writer or a speaker, which may produce on a given occasion any one of many possible messages. The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Mathematics is a way of finding out, step by step, facts which are inherent in the statement of the problem but which are not immediately obvious. Usually, in applying mathematics one must first hit on the facts and then verify them by proof. Here we come upon a knotty problem, for the proofs which satisfied mathematicians of an earlier day do not satisfy modem mathematicians." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Mathematicians start out with certain assumptions and definitions, and then by means of mathematical arguments or proofs they are able to show that certain statements or theorems are true." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"One of these is that many of the most general and powerful discoveries of science have arisen, not through the study of phenomena as they occur in nature, but, rather, through the study of phenomena in man-made devices, in products of technology, if you will. This is because the phenomena in man’s machines are simplified and ordered in comparison with those occurring naturally, and it is these simplified phenomena that man understands most easily." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Ordinarily, while mathematicians may suspect or conjecture the truth of certain statements, they have to prove theorems in order to be certain." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"The ideas and assumptions of a theory determine the generalityof the theory, that is, to how wide a range of phenomena the theory applies." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"The fact that network theory evolved from the study of idealized electrical systems rather than from the study of idealized mechanical systems is a matter of history, not of necessity." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Theories are strongly physical when they describe very completely some range of physical phenomena, which in practice is always limited. Theories become more mathematical or abstract when they deal with an idealized class of phenomena or with only certain aspects of phenomena." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Thus, in physics, entropy is associated with the possibility of converting thermal energy into mechanical energy. If the entropy does not change during a process, the process is reversible. If the entropy increases, the available energy decreases. Statistical mechanics interprets an increase of entropy as a decrease in order or, if we wish, as a decrease in our knowledge." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Thus, information is sometimes associated with the idea of knowledge through its popular use rather than with uncertainty and the resolution of uncertainty, as it is in communication theory." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

08 June 2021

On Patterns (2010-2019)

"Because the question for me was always whether that shape we see in our lives was there from the beginning or whether these random events are only called a pattern after the fact. Because otherwise we are nothing." (Cormac McCarthy, "All the Pretty Horses", 2010)

"The human mind delights in finding pattern - so much so that we often mistake coincidence or forced analogy for profound meaning. No other habit of thought lies so deeply within the soul of a small creature trying to make sense of a complex world not constructed for it." (Stephen J Gould, "The Flamingo's Smile: Reflections in Natural History", 2010)

"What advantages do diagrams have over verbal descriptions in promoting system understanding? First, by providing a diagram, massive amounts of information can be presented more efficiently. A diagram can strip down informational complexity to its core - in this sense, it can result in a parsimonious, minimalist description of a system. Second, a diagram can help us see patterns in information and data that may appear disordered otherwise. For example, a diagram can help us see mechanisms of cause and effect or can illustrate sequence and flow in a complex system. Third, a diagram can result in a less ambiguous description than a verbal description because it forces one to come up with a more structured description." (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)

"A surprising proportion of mathematicians are accomplished musicians. Is it because music and mathematics share patterns that are beautiful?" (Martin Gardner, "The Dover Math and Science Newsletter", 2011)

"It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern." (Daniel Kahneman, "Thinking, Fast and Slow", 2011)

"Knowing the importance of luck, you should be particularly suspicious when highly consistent patterns emerge from the comparison of successful and less successful firms. In the presence of randomness, regular patterns can only be mirages." (Daniel Kahneman, "Thinking, Fast and Slow", 2011)

"Once a myth becomes established, it forms part of our mental model of the world and alters our perception, the way our brains interpret the fleeting patterns our eyes pick up." (Jeremy Wade, "River Monsters: True Stories of the Ones that Didn't Get Away", 2011)

"Randomness might be defined in terms of order - its absence, that is. […] Everything we care about lies somewhere in the middle, where pattern and randomness interlace." (James Gleick, "The Information: A History, a Theory, a Flood", 2011)

"Equations have hidden powers. They reveal the innermost secrets of nature. […] The power of equations lies in the philosophically difficult correspondence between mathematics, a collective creation of human minds, and an external physical reality. Equations model deep patterns in the outside world. By learning to value equations, and to read the stories they tell, we can uncover vital features of the world around us." (Ian Stewart, "In Pursuit of the Unknown", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Mathematical intuition is the mind’s ability to sense form and structure, to detect patterns that we cannot consciously perceive. Intuition lacks the crystal clarity of conscious logic, but it makes up for that by drawing attention to things we would never have consciously considered." (Ian Stewart, "Visions of Infinity", 2013)

"Proof, in fact, is the requirement that makes great problems problematic. Anyone moderately competent can carry out a few calculations, spot an apparent pattern, and distil its essence into a pithy statement. Mathematicians demand more evidence than that: they insist on a complete, logically impeccable proof. Or, if the answer turns out to be negative, a disproof. It isn’t really possible to appreciate the seductive allure of a great problem without appreciating the vital role of proof in the mathematical enterprise. Anyone can make an educated guess. What’s hard is to prove it’s right. Or wrong." (Ian Stewart, "Visions of Infinity", 2013)

"Swarm intelligence illustrates the complex and holistic way in which the world operates. Order is created from chaos; patterns are revealed; and systems are free to work out their errors and problems at their own level. What natural systems can teach humanity is truly amazing." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"To put it simply, we communicate when we display a convincing pattern, and we discover when we observe deviations from our expectations. These may be explicit in terms of a mathematical model or implicit in terms of a conceptual model. How a reader interprets a graphic will depend on their expectations. If they have a lot of background knowledge, they will view the graphic differently than if they rely only on the graphic and its surrounding text." (Andrew Gelman & Antony Unwin, "Infovis and Statistical Graphics: Different Goals, Different Looks", Journal of Computational and Graphical Statistics Vol. 22(1), 2013)

"Another way to secure statistical significance is to use the data to discover a theory. Statistical tests assume that the researcher starts with a theory, collects data to test the theory, and reports the results - whether statistically significant or not. Many people work in the other direction, scrutinizing the data until they find a pattern and then making up a theory that fits the pattern." (Gary Smith, "Standard Deviations", 2014)

"Intersections of lines, for example, remain intersections, and the hole in a torus (doughnut) cannot be transformed away. Thus a doughnut may be transformed topologically into a coffee cup (the hole turning into a handle) but never into a pancake. Topology, then, is really a mathematics of relationships, of unchangeable, or 'invariant', patterns." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"One of the remarkable features of these complex systems created by replicator dynamics is that infinitesimal differences in starting positions create vastly different patterns. This sensitive dependence on initial conditions is often called the butterfly-effect aspect of complex systems - small changes in the replicator dynamics or in the starting point can lead to enormous differences in outcome, and they change one’s view of how robust the current reality is. If it is complex, one small change could have led to a reality that is quite different." (David Colander & Roland Kupers, "Complexity and the art of public policy : solving society’s problems from the bottom up", 2014)

"[…] regard it in fact as the great advantage of the mathematical technique that it allows us to describe, by means of algebraic equations, the general character of a pattern even where we are ignorant of the numerical values which will determine its particular manifestation." (Friedrich A von Hayek, "The Market and Other Orders", 2014)

"We are genetically predisposed to look for patterns and to believe that the patterns we observe are meaningful. […] Don’t be fooled into thinking that a pattern is proof. We need a logical, persuasive explanation and we need to test the explanation with fresh data." (Gary Smith, "Standard Deviations", 2014)

"We are hardwired to make sense of the world around us - to notice patterns and invent theories to explain these patterns. We underestimate how easily pat - terns can be created by inexplicable random events - by good luck and bad luck." (Gary Smith, "Standard Deviations", 2014)

"A pattern is a design or model that helps grasp something. Patterns help connect things that may not appear to be connected. Patterns help cut through complexity and reveal simpler understandable trends. […] Patterns can be temporal, which is something that regularly occurs over time. Patterns can also be spatial, such as things being organized in a certain way. Patterns can be functional, in that doing certain things leads to certain effects. Good patterns are often symmetric. They echo basic structures and patterns that we are already aware of." (Anil K Maheshwari, "Business Intelligence and Data Mining", 2015)

"The human mind builds up theories by recognising familiar patterns and glossing over details that are well understood, so that it can concentrate on the new material. In fact it is limited by the amount of new information it can hold at any one time, and the suppression of familiar detail is often essential for a grasp of the total picture. In a written proof, the step-by-step logical deduction is therefore foreshortened where it is already a part of the reader's basic technique, so that they can comprehend the overall structure more easily." (Ian Stewart & David Tall, "The Foundations of Mathematics" 2nd Ed., 2015)

"Why do mathematicians care so much about pi? Is it some kind of weird circle fixation? Hardly. The beauty of pi, in part, is that it puts infinity within reach. Even young children get this. The digits of pi never end and never show a pattern. They go on forever, seemingly at random - except that they can’t possibly be random, because they embody the order inherent in a perfect circle. This tension between order and randomness is one of the most tantalizing aspects of pi." (Steven Strogatz, "Why PI Matters" 2015)

"Without chaos there would be no creation, no structure and no existence. After all, order is merely the repetition of patterns; chaos is the process that establishes those patterns. Without this creative self-organizing force, the universe would be devoid of biological life, the birth of stars and galaxies - everything we have come to know. (Lawrence K Samuels, "Chaos Gets a Bad Rap: Importance of Chaology to Liberty", 2015)

"A mental representation is a mental structure that corresponds to an object, an idea, a collection of information, or anything else, concrete or abstract, that the brain is thinking about. […] Because the details of mental representations can differ dramatically from field to field, it’s hard to offer an overarching definition that is not too vague, but in essence these representations are preexisting patterns of information - facts, images, rules, relationships, and so on - that are held in long-term memory and that can be used to respond quickly and effectively in certain types of situations." (Anders Ericsson & Robert Pool," Peak: Secrets from  the  New  Science  of  Expertise", 2016)

"String theory today looks almost fractal. The more closely people explore any one corner, the more structure they find. Some dig deep into particular crevices; others zoom out to try to make sense of grander patterns. The upshot is that string theory today includes much that no longer seems stringy. Those tiny loops of string whose harmonics were thought to breathe form into every particle and force known to nature (including elusive gravity) hardly even appear anymore on chalkboards at conferences." (K C Cole, "The Strange Second Life of String Theory", Quanta Magazine", 2016)

"The relationship of math to the real world has been a conundrum for philosophers for centuries, but it is also an inspiration for poets. The patterns of mathematics inhabit a liminal space - they were initially derived from the natural world and yet seem to exist in a separate, self-contained system standing apart from that world. This makes them a source of potential metaphor: mapping back and forth between the world of personal experience and the world of mathematical patterns opens the door to novel connections." (Alice Major, "Mapping from e to Metaphor", 2018)

"Apart from the technical challenge of working with the data itself, visualization in big data is different because showing the individual observations is just not an option. But visualization is essential here: for analysis to work well, we have to be assured that patterns and errors in the data have been spotted and understood. That is only possible by visualization with big data, because nobody can look over the data in a table or spreadsheet." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)

21 January 2021

Nate Silver - Collected Quotes

"A forecaster should almost never ignore data, especially when she is studying rare events […]. Ignoring data is often a tip-off that the forecaster is overconfident, or is overfitting her model - that she is interested in showing off rather than trying to be accurate."  (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Complex systems seem to have this property, with large periods of apparent stasis marked by sudden and catastrophic failures. These processes may not literally be random, but they are so irreducibly complex (right down to the last grain of sand) that it just won’t be possible to predict them beyond a certain level. […] And yet complex processes produce order and beauty when you zoom out and look at them from enough distance." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Data-driven predictions can succeed-and they can fail. It is when we deny our role in the process that the odds of failure rise. Before we demand more of our data, we need to demand more of ourselves." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The instinctual shortcut that we take when we have 'too much information' is to engage with it selectively, picking out the parts we like and ignoring the remainder, making allies with those who have made the same choices and enemies of the rest." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The most basic tenet of chaos theory is that a small change in initial conditions - a butterfly flapping its wings in Brazil - can produce a large and unexpected divergence in outcomes - a tornado in Texas. This does not mean that the behavior of the system is random, as the term 'chaos' might seem to imply. Nor is chaos theory some modern recitation of Murphy’s Law ('whatever can go wrong will go wrong'). It just means that certain types of systems are very hard to predict." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The signal is the truth. The noise is what distracts us from the truth." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The systems are dynamic, meaning that the behavior of the system at one point in time influences its behavior in the future; And they are nonlinear, meaning they abide by exponential rather than additive relationships." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"We forget - or we willfully ignore - that our models are simplifications of the world. We figure that if we make a mistake, it will be at the margin. In complex systems, however, mistakes are not measured in degrees but in whole orders of magnitude." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"We need to stop, and admit it: we have a prediction problem. We love to predict things—and we aren't very good at it." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Whether information comes in a quantitative or qualitative flavor is not as important as how you use it. [...] The key to making a good forecast […] is not in limiting yourself to quantitative information. Rather, it’s having a good process for weighing the information appropriately. […] collect as much information as possible, but then be as rigorous and disciplined as possible when analyzing it. [...] Many times, in fact, it is possible to translate qualitative information into quantitative information." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Statistics is the science of finding relationships and actionable insights from data." (Nate Silver)

20 December 2020

On Noise III

"Economists should study financial markets as they actually operate, not as they assume them to operate - observing the way in which information is actually processed, observing the serial correlations, bonanzas, and sudden stops, not assuming these away as noise around the edges of efficient and rational markets." (Adair Turner, "Economics after the Crisis: Objectives and means", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The signal is the truth. The noise is what distracts us from the truth." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"When some systems are stuck in a dangerous impasse, randomness and only randomness can unlock them and set them free. You can see here that absence of randomness equals guaranteed death. The idea of injecting random noise into a system to improve its functioning has been applied across fields. By a mechanism called stochastic resonance, adding random noise to the background makes you hear the sounds (say, music) with more accuracy." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"A signal is a useful message that resides in data. Data that isn’t useful is noise. […] When data is expressed visually, noise can exist not only as data that doesn’t inform but also as meaningless non-data elements of the display (e.g. irrelevant attributes, such as a third dimension of depth in bars, color variation that has no significance, and artificial light and shadow effects)." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)

"Data contain descriptions. Some are true, some are not. Some are useful, most are not. Skillful use of data requires that we learn to pick out the pieces that are true and useful. [...] To find signals in data, we must learn to reduce the noise - not just the noise that resides in the data, but also the noise that resides in us. It is nearly impossible for noisy minds to perceive anything but noise in data." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)

"When we find data quality issues due to valid data during data exploration, we should note these issues in a data quality plan for potential handling later in the project. The most common issues in this regard are missing values and outliers, which are both examples of noise in the data." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, worked examples, and case studies", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"Repeated observations of the same phenomenon do not always produce the same results, due to random noise or error. Sampling errors result when our observations capture unrepresentative circumstances, like measuring rush hour traffic on weekends as well as during the work week. Measurement errors reflect the limits of precision inherent in any sensing device. The notion of signal to noise ratio captures the degree to which a series of observations reflects a quantity of interest as opposed to data variance. As data scientists, we care about changes in the signal instead of the noise, and such variance often makes this problem surprisingly difficult." (Steven S Skiena, "The Data Science Design Manual", 2017)

"Using noise (the uncorrelated variables) to fit noise (the residual left from a simple model on the genuinely correlated variables) is asking for trouble." (Steven S Skiena, "The Data Science Design Manual", 2017)

On Noise II

"Noise signals are unwanted signals that are always present in a transmission system." (John R Pierce, "Signals: The Telephone and Beyond", 1981)

"Neither noise nor information is predictable." (Ray Kurzweil, "The Age of Spiritual Machines: When Computers Exceed Human Intelligence", 1999)

"No matter what the data, and no matter how the values are arranged and presented, you must always use some method of analysis to come up with an interpretation of the data. While every data set contains noise, some data sets may contain signals. Therefore, before you can detect a signal within any given data set, you must first filter out the noise." (Donald J Wheeler," Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"We analyze numbers in order to know when a change has occurred in our processes or systems. We want to know about such changes in a timely manner so that we can respond appropriately. While this sounds rather straightforward, there is a complication - the numbers can change even when our process does not. So, in our analysis of numbers, we need to have a way to distinguish those changes in the numbers that represent changes in our process from those that are essentially noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"While all data contain noise, some data contain signals. Before you can detect a signal, you must filter out the noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re- coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"In fact, an information theory that leaves out the issue of noise turns out to have no content." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"This phenomenon, common to chaos theory, is also known as sensitive dependence on initial conditions. Just a small change in the initial conditions can drastically change the long-term behavior of a system. Such a small amount of difference in a measurement might be considered experimental noise, background noise, or an inaccuracy of the equipment." (Greg Rae, Chaos Theory: A Brief Introduction, 2006)

"Data analysis is not generally thought of as being simple or easy, but it can be. The first step is to understand that the purpose of data analysis is to separate any signals that may be contained within the data from the noise in the data. Once you have filtered out the noise, anything left over will be your potential signals. The rest is just details." (Donald J Wheeler," Myths About Data Analysis", International Lean & Six Sigma Conference, 2012)

"Distinguishing the signal from the noise requires both scientific knowledge and self-knowledge." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

On Noise I

"Noise is the most impertinent of all forms of interruption. It is not only an interruption, but also a disruption of thought." (Arthur Schopenhauer, "Parerga and Paralipomena", 1851)

"Mathematics is the predominant science of our time; its conquests grow daily, though without noise; he who does not employ it for himself, will some day find it employed against himself." (Johann F Herbart, Werke, 1890)

"Life pushes its way through this fatalistically determined world like a river flowing upstream. It is a system of utterly improbable order, a message in a world of noise." (Joseph H Rush, "The Dawn of Life", 1957)

"Higher, directed forms of energy (e.g., mechanical, electric, chemical) are dissipated, that is, progressively converted into the lowest form of energy, i.e., undirected heat movement of molecules; chemical systems tend toward equilibria with maximum entropy; machines wear out owing to friction; in communication channels, information can only be lost by conversion of messages into noise but not vice versa, and so forth." (Ludwig von Bertalanffy, "Robots, Men and Minds", 1967)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of 'nois' is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"An essential element of dynamics systems is a positive feedback that self-enhances the initial deviation from the mean. The avalanche is proverbial. Cities grow since they attract more people, and in the universe, a local accumulation of dust may attract more dust, eventually leading to the birth of a star. Earlier or later, self-enhancing processes evoke an antagonistic reaction. A collapsing stock market stimulates the purchase of shares at a low price, thereby stabilizing the market. The increasing noise, dirt, crime and traffic jams may discourage people from moving into a big city." (Hans Meinhardt, "The Algorithmic Beauty of Sea Shells", 1995)

"Rather mathematicians like to look for patterns, and the primes probably offer the ultimate challenge. When you look at a list of them stretching off to infinity, they look chaotic, like weeds growing through an expanse of grass representing all numbers. For centuries mathematicians have striven to find rhyme and reason amongst this jumble. Is there any music that we can hear in this random noise? Is there a fast way to spot that a particular number is prime? Once you have one prime, how much further must you count before you find the next one on the list? These are the sort of questions that have tantalized generations." (Marcus du Sautoy, "The Music of the Primes", 1998)

"Data are collected as a basis for action. Yet before anyone can use data as a basis for action the data have to be interpreted. The proper interpretation of data will require that the data be presented in context, and that the analysis technique used will filter out the noise."  (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"Data are generally collected as a basis for action. However, unless potential signals are separated from probable noise, the actions taken may be totally inconsistent with the data. Thus, the proper use of data requires that you have simple and effective methods of analysis which will properly separate potential signals from probable noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

05 December 2020

Information Theory I

"Cybernetics is concerned primarily with the construction of theories and models in science, without making a hard and fast distinction between the physical and the biological sciences. The theories and models occur both in symbols and in hardware, and by 'hardware’ we shall mean a machine or computer built in terms of physical or chemical, or indeed any handleable parts. Most usually we shall think of hardware as meaning electronic parts such as valves and relays. Cybernetics insists, also, on a further and rather special condition that distinguishes it from ordinary scientific theorizing: it demands a certain standard of effectiveness. In this respect it has acquired some of the same motive power that has driven research on modern logic, and this is especially true in the construction and application of artificial languages and the use of operational definitions. Always the search is for precision and effectiveness, and we must now discuss the question of effectiveness in some detail. It should be noted that when we talk in these terms we are giving pride of place to the theory of automata at the expense, at least to some extent, of feedback and information theory." (Frank H George, "The Brain As A Computer", 1962)

"The general notion in communication theory is that of information. In many cases, the flow of information corresponds to a flow of energy, e. g. if light waves emitted by some objects reach the eye or a photoelectric cell, elicit some reaction of the organism or some machinery, and thus convey information." (Ludwig von Bertalanffy, "General System Theory", 1968) 

"The 'flow of information' through human communication channels is enormous. So far no theory exists, to our knowledge, which attributes any sort of unambiguous measure to this 'flow'." (Anatol Rapoport, "Modern Systems Research for the Behavioral Scientist", 1969)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of "noise" is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"The field of 'information theory' began by using the old hardware paradigm of transportation of data from point to point." (Marshall McLuhan & Eric McLuhan, Laws of Media: The New Science, 1988)

"Without an understanding of causality there can be no theory of communication. What passes as information theory today is not communication at all, but merely transportation." (Marshall McLuhan & Eric McLuhan, "Laws of Media: The New Science", 1988)

"If quantum communication and quantum computation are to flourish, a new information theory will have to be developed." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In fact, an information theory that leaves out the issue of noise turns out to have no content." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In an information economy, entrepreneurs master the science of information in order to overcome the laws of the purely physical sciences. They can succeed because of the surprising power of the laws of information, which are conducive to human creativity. The central concept of information theory is a measure of freedom of choice. The principle of matter, on the other hand, is not liberty but limitation - it has weight and occupies space." (George Gilder, "Knowledge and Power: The Information Theory of Capitalism and How it is Revolutionizing our World", 2013)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)
