"A bell curve shows the 'spread' or variance in our knowledge or certainty. The wider the bell the less we know. An infinitely wide bell is a flat line. Then we know nothing. The value of the quantity, position, or speed could lie anywhere on the axis. An infinitely narrow bell is a spike that is infinitely tall. Then we have complete knowledge of the value of the quantity. The uncertainty principle says that as one bell curve gets wider the other gets thinner. As one curve peaks the other spreads. So if the position bell curve becomes a spike and we have total knowledge of position, then the speed bell curve goes flat and we have total uncertainty (infinite variance) of speed." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"A fuzzy cognitive map or FCM draws a causal picture. It ties facts and things and processes to values and policies and objectives. And it lets you predict how complex events interact and play out. [...] Neural nets give a shortcut to tuning an FCM. The trick is to let the fuzzy causal edges change as if they were synapses in a neural net. They cannot change with the same math laws because FCM edges stand for causal effect not signal flow. We bombard the FCM nodes with real data. The data state which nodes are on or off and to which degree at each moment in time. Then the edges grow among the nodes." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"A neural net acts like the eyes and ears of an adaptive fuzzy system, a fuzzy system whose rules change with experience. The neural net senses fuzzy patterns in the data and learns to associate the patterns. The associations are rules: If fuzzy set A, then fuzzy set B." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Bivalence trades accuracy for simplicity. Binary outcomes of yes and no, white and black, true and false simplify math and computer processing. You can work with strings of 0s and 1s more easily than you can work with fractions. But bivalence requires some force fitting and rounding off [...] Bivalence holds at cube corners. Multivalence holds everywhere else." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Descriptions split into two groups, the logical and the factual, or the mathematical and the scientific, or the coherent and the correspondent. The split depends on accuracy. Logical statements are completely accurate or completely inaccurate. They alone are all or none. Factual statements are partially accurate or partially inaccurate." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Fuzziness has a formal name in science: multivalence. The opposite of fuzziness is bivalence or two-valuedness, two ways to answer each question, true or false, 1 or 0. Fuzziness means multivalence. It means three or more options, perhaps an infinite spectrum of options, instead of just two extremes. It means analog instead of binary, infinite shades of gray between black and white." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Fuzzy entropy measures the fuzziness of a fuzzy set. It answers the question 'How fuzzy is a fuzzy set?' And it is a matter of degree. Some fuzzy sets are fuzzier than others. Entropy means the uncertainty or disorder in a system. A set describes a system or collection of things. When the set is fuzzy, when elements belong to it to some degree, the set is uncertain or vague to some degree. Fuzzy entropy measures this degree. And it is simple enough that you can see it in a picture of a cube." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"[...] fuzzy things resemble fuzzy nonthings. A resembles not-A. Fuzzy things have vague boundaries with their opposites, with nonthings. The more a thing resembles its opposite, the fuzzier it is. In the fuzziest case the thing equals its opposite: the glass of water half empty and half full [...]" (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"I called this the mismatch problem: The world is gray but science is black and white. We talk in zeroes and ones but the truth lies in between. Fuzzy world, nonfuzzy description. The statements of formal logic and computer programming are all true or all false, 1 or 0. But statements about the world differ." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Laws of science are not laws at all. They are not laws in the sense of logical laws like two plus two equals four. Logic does not legislate them. Laws of science state tendencies we have recently observed in our corner of the universe. The best you can say about them is so far, so good. In the next instant every 'law' of science may change. Their truth is a matter of degree and is always up for grabs." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Math is a formal system. We can manipulate math symbols and not understand what they mean. We can just apply the syntax rules as a computer does when it adds up numbers or proves a theorem. The computer shows the truth of the theorem but does not 'understand' its 'meaning'." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Science undercuts ethics because we have made science the measure of all things. Truth means truth of science. Truth means logical truth or factual truth. Truth means math proof or data test. The truth can be a matter of degree. But that does not help ethics." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Scientific claims or statements are inexact and provisional. They depend on dozens of simplifying assumptions and on a particular choice of words and symbols and on 'all other things being equal'." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Scientists try to make things simple. That is in good part why we are stuck with bivalence. Scientists' first instinct is to fit a linear model to a nonlinear world. This creates another mismatch problem, the math modeler's dilemma: linear math, nonlinear world." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"Somewhere in the process wishful thinking seems to take over. Scientists start to believe they do math when they do science. This holds to greatest degree in an advanced science like physics or at the theoretical frontier of any science where the claims come as math claims. The first victim is truth. What was inaccurate or fuzzy truth all along gets bumped up a letter grade to the all-or-none status of binary logic. Most scientists draw the line at giving up the tentative status of science. They will concede that it can all go otherwise in the next experiment. But most have crossed the bivalent line by this point and believe that in the next experiment a statement or hypothesis or theory may jump from TRUE to FALSE, from 1 to 0." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"The binary logic of modern computers often falls short when describing the vagueness of the real world. Fuzzy logic offers more graceful alternatives." (Bart Kosko & Satoru Isaka, "Fuzzy Logic,” Scientific American Vol. 269, 1993)
"We can put black-and-white labels on these things. But labels will pass from accurate to inaccurate as the things change. Language ties a string between a word and the thing it stands for. When the thing changes to a nonthing, the string stretches or breaks or tangles with other strings." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)
"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)
"A signal has a finite-length frequency spectrum only if it lasts infinitely long in time. So a finite spectrum implies infinite time and vice versa. The reverse also holds in the ideal world of mathematics: A signal is finite in time only if it has a frequency spectrum that is infinite in extent." (Bart Kosko, "Noise", 2006)
"Adaptive systems learn by enlightened trial and error. The system can take a long time to learn well just as it can take a human a long time to learn to properly swing a golf club even with the help of the best golf instructor. But this iterative learning can also produce solutions that we could not find or at least could not find easily by pure mathematical analysis." (Bart Kosko, "Noise", 2006)
"Any technical discussion of noise begins with white noise because white noise is pure or ideal noise. White noise serves as the gold standard of noise. Scientists and engineers have explored hundreds of other noise types but most of these deviate from white noise in some specific way. White noise is noisy because it has a wide and flat band of frequencies if one looks at its spectrum. This reflects the common working definition of noise as a so-called wideband signal. Good signals or wanted signals concentrate their energy on a comparatively narrow band of the frequency spectrum. Hence good signals tend to be so-called narrowband signals at least relative to the wide band of white noise. White noise is so noisy because its spectrum is as wide as possible - it runs the whole infinite length of the frequency spectrum itself. So pure or ideal white noise exists only as a mathematical abstraction. It cannot exist physically because it would require infinite energy." (Bart Kosko, "Noise", 2006)
"Bell curves don't differ that much in their bells. They differ in their tails. The tails describe how frequently rare events occur. They describe whether rare events really are so rare. This leads to the saying that the devil is in the tails." (Bart Kosko, "Noise", 2006)
"Chaos can leave statistical footprints that look like noise. This can arise from simple systems that are deterministic and not random. [...] The surprising mathematical fact is that most systems are chaotic. Change the starting value ever so slightly and soon the system wanders off on a new chaotic path no matter how close the starting point of the new path was to the starting point of the old path. Mathematicians call this sensitivity to initial conditions but many scientists just call it the butterfly effect. And what holds in math seems to hold in the real world - more and more systems appear to be chaotic." (Bart Kosko, "Noise", 2006)
"'Chaos' refers to systems that are very sensitive to small changes in their inputs. A minuscule change in a chaotic communication system can flip a 0 to a 1 or vice versa. This is the so-called butterfly effect: Small changes in the input of a chaotic system can produce large changes in the output. Suppose a butterfly flaps its wings in a slightly different way. can change its flight path. The change in flight path can in time change how a swarm of butterflies migrates." (Bart Kosko, "Noise", 2006)
"I wage war on noise every day as part of my work as a scientist and engineer. We try to maximize signal-to-noise ratios. We try to filter noise out of measurements of sounds or images or anything else that conveys information from the world around us. We code the transmission of digital messages with extra 0s and 1s to defeat line noise and burst noise and any other form of interference. Wc design sophisticated algorithms to track noise and then cancel it in headphones or in a sonogram. Some of us even teach classes on how to defeat this nemesis of the digital age. Such action further conditions our anti-noise reflexes." (Bart Kosko, "Noise", 2006)
"Is the universe noise? That question is not as strange as it sounds. Noise is an unwanted signal. A signal is anything that conveys information or ultimately anything that has energy. The universe consists of a great deal of energy. Indeed a working definition of the universe is all energy anywhere ever. So the answer turns on how one defines what it means to be wanted and by whom." (Bart Kosko, "Noise", 2006)
"Linear systems do not benefit from noise because the output of a linear system is just a simple scaled version of the input [...] Put noise in a linear system and you get out noise. Sometimes you get out a lot more noise than you put in. This can produce explosive effects in feedback systems that take their own outputs as inputs." (Bart Kosko, "Noise", 2006)
"Many scientists who work not just with noise but with probability make a common mistake: They assume that a bell curve is automatically Gauss's bell curve. Empirical tests with real data can often show that such an assumption is false. The result can be a noise model that grossly misrepresents the real noise pattern. It also favors a limited view of what counts as normal versus non-normal or abnormal behavior. This assumption is especially troubling when applied to human behavior. It can also lead one to dismiss extreme data as error when in fact the data is part of a pattern." (Bart Kosko, "Noise", 2006)
"Mutual information is the receiver's entropy minus the conditional entropy of what the receiver receives - given what message the sender sends through the noisy channel. Conditioning or getting data can only reduce uncertainty and so this gap is always positive or zero. It can never be negative. You can only learn from further experience. Information theorists capture this theorem in a slogan: Conditioning reduces entropy. The channel capacity itself is the largest gap given all possible probability descriptions of what [the sender] sent. It is the most information that on average you could ever get out of the noisy channel." (Bart Kosko, "Noise", 2006)
"Noise is a signal we don't like. Noise has two parts. The first has to do with the head and the second with the heart. The first part is the scientific or objective part: Noise is a signal. [...] The second part of noise is the subjective part: It deals with values. It deals with how we draw the fuzzy line between good signals and bad signals. Noise signals are the bad signals. They are the unwanted signals that mask or corrupt our preferred signals. They not only interfere but they tend to interfere at random." (Bart Kosko, "Noise", 2006)
"Noise is an unwanted signal. A signal is anything that conveys information or ultimately anything that has energy. The universe consists of a great deal of energy. Indeed a working definition of the universe is all energy anywhere ever. So the answer turns on how one defines what it means to be wanted and by whom." (Bart Kosko, "Noise", 2006)
"One person’s signal is another person’s noise and vice versa. We call this relative role reversal the noise-signal duality." (Bart Kosko, "Noise", 2006)
"One of the ironies of mathematics is that the ratio of two Gaussian quantities gives a Cauchy quantity. So you get Cauchy noise if you divide one Gaussian white noise process by another. [...] There is a still deeper relationship between the Cauchy and Gaussian bell curves. Both belong to a special family of probability curves called stable distributions [...] Gaussian quantities [...] are closed under addition. If you add two Gaussian noises then the result is still a Gaussian noise. This 'stable' property is not true for most noise or probability types. It is true for Cauchy processes." (Bart Kosko, "Noise", 2006)
"The central limit theorem differs from laws of large numbers because random variables vary and so they differ from constants such as population means. The central limit theorem says that certain independent random effects converge not to a constant population value such as the mean rate of unemployment but rather they converge to a random variable that has its own Gaussian bell-curve description." (Bart Kosko, "Noise", 2006)
"The flaw in the classical thinking is the assumption that variance equals dispersion. Variance tends to exaggerate outlying data because it squares the distance between the data and their mean. This mathematical artifact gives too much weight to rotten apples. It can also result in an infinite value in the face of impulsive data or noise. [...] Yet dispersion remains an elusive concept. It refers to the width of a probability bell curve in the special but important case of a bell curve. But most probability curves don't have a bell shape. And its relation to a bell curve's width is not exact in general. We know in general only that the dispersion increases as the bell gets wider. A single number controls the dispersion for stable bell curves and indeed for all stable probability curves - but not all bell curves are stable curves." (Bart Kosko, "Noise", 2006)
"The noise takes its toll on the message as it randomly turns some of the 1 bits into 0 bits and randomly turns some of the 0 bits into 1 bits: Noise randomly flips bits. [...] But noise can be subtler in a digital system. Noise can disturb the timing of when a bit value arrives at a receiver as well as randomly flipping that bit value." (Bart Kosko, "Noise", 2006)
"The universe is noisy on all scales even if the universe itself is not noise. The fading noise of the ancient big bang explosion fills the cosmos. It still gently hisses and crackles all around us in the form of junk microwave radiation." (Bart Kosko, "Noise", 2006)