Showing posts with label prediction.

12 December 2023

On Prediction XV: Systems III

"The state of a system at a given moment depends on two things - its initial state, and the law according to which that state varies. If we know both this law and this initial state, we have a simple mathematical problem to solve, and we fall back upon our first degree of ignorance. Then it often happens that we know the law and do not know the initial state. It may be asked, for instance, what is the present distribution of the minor planets? We know that from all time they have obeyed the laws of Kepler, but we do not know what was their initial distribution. In the kinetic theory of gases we assume that the gaseous molecules follow rectilinear paths and obey the laws of impact and elastic bodies; yet as we know nothing of their initial velocities, we know nothing of their present velocities. The calculus of probabilities alone enables us to predict the mean phenomena which will result from a combination of these velocities. This is the second degree of ignorance. Finally it is possible, that not only the initial conditions but the laws themselves are unknown. We then reach the third degree of ignorance, and in general we can no longer affirm anything at all as to the probability of a phenomenon. It often happens that instead of trying to discover an event by means of a more or less imperfect knowledge of the law, the events may be known, and we want to find the law; or that, instead of deducing effects from causes, we wish to deduce the causes." (Henri Poincaré, "Science and Hypothesis", 1902)

"Although a system may exhibit sensitive dependence on initial conditions, this does not mean that everything is unpredictable about it. In fact, finding what is predictable in a background of chaos is a deep and important problem. (Which means that, regrettably, it is unsolved.) In dealing with this deep and important problem, and for want of a better approach, we shall use common sense." (David Ruelle, "Chance and Chaos", 1991)

"Although the detailed moment-to-moment behavior of a chaotic system cannot be predicted, the overall pattern of its 'random' fluctuations may be similar from scale to scale. Likewise, while the fine details of a chaotic system cannot be predicted one can know a little bit about the range of its 'random' fluctuation." (F David Peat, "From Certainty to Uncertainty", 2002)

"[…] while chaos theory deals in regions of randomness and chance, its equations are entirely deterministic. Plug in the relevant numbers and out comes the answer. In principle at least, dealing with a chaotic system is no different from predicting the fall of an apple or sending a rocket to the moon. In each case deterministic laws govern the system. This is where the chance of chaos differs from the chance that is inherent in quantum theory." (F David Peat, "From Certainty to Uncertainty", 2002)

"Chaos is a phenomenon encountered in science and mathematics wherein a deterministic (rule-based) system behaves unpredictably. That is, a system which is governed by fixed, precise rules, nevertheless behaves in a way which is, for all practical purposes, unpredictable in the long run. The mathematical use of the word 'chaos' does not align well with its more common usage to indicate lawlessness or the complete absence of order. On the contrary, mathematically chaotic systems are, in a sense, perfectly ordered, despite their apparent randomness. This seems like nonsense, but it is not." (David P Feldman, "Chaos and Fractals: An Elementary Introduction", 2012)

02 July 2023

On Linearity II

"Why are nonlinear systems so much harder to analyze than linear ones? The essential difference is that linear systems can be broken down into parts. Then each part can be solved separately and finally recombined to get the answer. This idea allows a fantastic simplification of complex problems, and underlies such methods as normal modes, Laplace transforms, superposition arguments, and Fourier analysis. In this sense, a linear system is precisely equal to the sum of its parts." (Steven H Strogatz, "Non-Linear Dynamics and Chaos, 1994)

"When we examine the modeling literature, its most striking aspect is the predominance of 'flat' linear models. Why is this the case? After all, from a singularity theory viewpoint these linear objects are mathematical rarities. On mathematical grounds we should certainly not expect to see them put forth as credible representations of reality. Yet they are. And the reason is simple: linearity is a neutral assumption that leads to mathematically tractable models. So unless there is good reason to do otherwise, why not use a linear model?"  (John L Casti, "Five Golden Rules", 1995)

"In a linear system a tiny push produces a small effect, so that cause and effect are always proportional to each other. If one plotted on a graph the cause against the effect, the result would be a straight line. In nonlinear systems, however, a small push may produce a small effect, a slightly larger push produces a proportionately larger effect, but increase that push by a hair’s breadth and suddenly the system does something radically different." (F David Peat, "From Certainty to Uncertainty", 2002)

"Linearity means that the rule that determines what a piece of a system is going to do next is not influenced by what it is doing now. More precisely, this is intended in a differential or incremental sense: For a linear spring, the increase of its tension is proportional to the increment whereby it is stretched, with the ratio of these increments exactly independent of how much it has already been stretched. Such a spring can be stretched arbitrarily far, and in particular will never snap or break. Accordingly, no real spring is linear." (Heinz-Otto Peitgen et al, "Chaos and Fractals: New Frontiers of Science" 2nd Ed., 2004) 

"Most long-range forecasts of what is technically feasible in future time periods dramatically underestimate the power of future developments because they are based on what I call the 'intuitive linear' view of history rather than the 'historical exponential' view." (Ray Kurzweil, "The Singularity is Near", 2005)

"Linear systems do not benefit from noise because the output of a linear system is just a simple scaled version of the input [...] Put noise in a linear system and you get out noise. Sometimes you get out a lot more noise than you put in. This can produce explosive effects in feedback systems that take their own outputs as inputs." (Bart Kosko, "Noise", 2006)

"On a linear system like a scale, the whole is equal to the sum of the parts. That’s the first key property of linearity. The second is that causes are proportional to effects. […] These two properties - the proportionality between cause and effect, and the equality of the whole to the sum of the parts - are the essence of what it means to be linear. […] The great advantage of linearity is that it allows for reductionist thinking. To solve a linear problem, we can break it down to its simplest parts, solve each part separately, and put the parts back together to get the answer." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

"[...] perhaps one of the most important features of complex systems, which is a key differentiator when comparing with chaotic systems, is the concept of emergence. Emergence 'breaks' the notion of determinism and linearity because it means that the outcome of these interactions is naturally unpredictable. In large systems, macro features often emerge in ways that cannot be traced back to any particular event or agent. Therefore, complexity theory is based on interaction, emergence and iterations." (Luis Tomé & Şuay Nilhan Açıkalın, "Complexity Theory as a New Lens in IR: System and Change" [in "Chaos, Complexity and Leadership 2017", Şefika Şule Erçetin & Nihan Potas], 2019)

"With a linear growth of errors, improving the measurements could always keep pace with the desire for longer prediction. But when errors grow exponentially fast, a system is said to have sensitive dependence on its initial conditions. Then long-term prediction becomes impossible. This is the philosophically disturbing message of chaos." (Steven H Strogatz, "Infinite Powers: The Story of Calculus - The Most Important Discovery in Mathematics", 2019)

20 August 2021

John D Kelleher - Collected Quotes

"A predictive model overfits the training set when at least some of the predictions it returns are based on spurious patterns present in the training data used to induce the model. Overfitting happens for a number of reasons, including sampling variance and noise in the training set. The problem of overfitting can affect any machine learning algorithm; however, the fact that decision tree induction algorithms work by recursively splitting the training data means that they have a natural tendency to segregate noisy instances and to create leaf nodes around these instances. Consequently, decision trees overfit by splitting the data on irrelevant features that only appear relevant due to noise or sampling variance in the training data. The likelihood of overfitting occurring increases as a tree gets deeper because the resulting predictions are based on smaller and smaller subsets as the dataset is partitioned after each feature test in the path." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies", 2015)

"Decision trees are also discriminative models. Decision trees are induced by recursively partitioning the feature space into regions belonging to the different classes, and consequently they define a decision boundary by aggregating the neighboring regions belonging to the same class. Decision tree model ensembles based on bagging and boosting are also discriminative models." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies", 2015)

"Decision trees are also considered nonparametric models. The reason for this is that when we train a decision tree from data, we do not assume a fixed set of parameters prior to training that define the tree. Instead, the tree branching and the depth of the tree are related to the complexity of the dataset it is trained on. If new instances were added to the dataset and we rebuilt the tree, it is likely that we would end up with a (potentially very) different tree." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies", 2015)

"It is important to remember that predictive data analytics models built using machine learning techniques are tools that we can use to help make better decisions within an organization and are not an end in themselves. It is paramount that, when tasked with creating a predictive model, we fully understand the business problem that this model is being constructed to address and ensure that it does address it." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, worked examples, and case studies", 2015)

"The main advantage of decision tree models is that they are interpretable. It is relatively easy to understand the sequences of tests a decision tree carried out in order to make a prediction. This interpretability is very important in some domains. [...] Decision tree models can be used for datasets that contain both categorical and continuous descriptive features. A real advantage of the decision tree approach is that it has the ability to model the interactions between descriptive features. This arises from the fact that the tests carried out at each node in the tree are performed in the context of the results of the tests on the other descriptive features that were tested at the preceding nodes on the path from the root. Consequently, if there is an interaction effect between two or more descriptive features, a decision tree can model this."  (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies", 2015)

"There are two kinds of mistakes that an inappropriate inductive bias can lead to: underfitting and overfitting. Underfitting occurs when the prediction model selected by the algorithm is too simplistic to represent the underlying relationship in the dataset between the descriptive features and the target feature. Overfitting, by contrast, occurs when the prediction model selected by the algorithm is so complex that the model fits to the dataset too closely and becomes sensitive to noise in the data."(John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies", 2015)

"Tree pruning identifies and removes subtrees within a decision tree that are likely to be due to noise and sample variance in the training set used to induce it. In cases where a subtree is deemed to be overfitting, pruning the subtree means replacing the subtree with a leaf node that makes a prediction based on the majority target feature level (or average target feature value) of the dataset created by merging the instances from all the leaf nodes in the subtree. Obviously, pruning will result in decision trees being created that are not consistent with the training set used to build them. In general, however, we are more interested in creating prediction models that generalize well to new data rather than that are strictly consistent with training data, so it is common to sacrifice consistency for generalization capacity." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies", 2015)

"When datasets are small, a parametric model may perform well because the strong assumptions made by the model - if correct - can help the model to avoid overfitting. However, as the size of the dataset grows, particularly if the decision boundary between the classes is very complex, it may make more sense to allow the data to inform the predictions more directly. Obviously the computational costs associated with nonparametric models and large datasets cannot be ignored. However, support vector machines are an example of a nonparametric model that, to a large extent, avoids this problem. As such, support vector machines are often a good choice in complex domains with lots of data." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, Worked Examples, and Case Studies", 2015)

"When we find data quality issues due to valid data during data exploration, we should note these issues in a data quality plan for potential handling later in the project. The most common issues in this regard are missing values and outliers, which are both examples of noise in the data." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, worked examples, and case studies", 2015)

"A neural network consists of a set of neurons that are connected together. A neuron takes a set of numeric values as input and maps them to a single output value. At its core, a neuron is simply a multi-input linear-regression function. The only significant difference between the two is that in a neuron the output of the multi-input linear-regression function is passed through another function that is called the activation function." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"Data scientists should have some domain expertise. Most data science projects begin with a real-world, domain-specific problem and the need to design a data-driven solution to this problem. As a result, it is important for a data scientist to have enough domain expertise that they understand the problem, why it is important, an dhow a data science solution to the problem might fit into an organization’s processes. This domain expertise guides the data scientist as she works toward identifying an optimized solution." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"One of the biggest myths is the belief that data science is an autonomous process that we can let loose on our data to find the answers to our problems. In reality, data science requires skilled human oversight throughout the different stages of the process. [...] The second big myth of data science is that every data science project needs big data and needs to use deep learning. In general, having more data helps, but having the right data is the more important requirement. [...] A third data science myth is that modern data science software is easy to use, and so data science is easy to do. [...] The last myth about data science [...] is the belief that data science pays for itself quickly. The truth of this belief depends on the context of the organization. Adopting data science can require significant investment in terms of developing data infrastructure and hiring staff with data science expertise. Furthermore, data science will not give positive results on every project." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"One of the most important skills for a data scientist is the ability to frame a real-world problem as a standard data science task." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"Presenting data in a graphical format makes it much easier to see and understand what is happening with the data. Data visualization applies to all phases of the data science process."  (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"The goal of data science is to improve decision making by basing decisions on insights extracted from large data sets. As a field of activity, data science encompasses a set of principles, problem definitions, algorithms, and processes for extracting nonobvious and useful patterns from large data sets. It is closely related to the fields of data mining and machine learning, but it is broader in scope." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"The patterns that we extract using data science are useful only if they give us insight into the problem that enables us to do something to help solve the problem." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"The promise of data science is that it provides a way to understand the world through data." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"Using data science, we can uncover the important patterns in a data set, and these patterns can reveal the important attributes in the domain. The reason why data science is used in so many domains is that it doesn’t matter what the problem domain is: if the right data are available and the problem can be clearly defined, then data science can help."  (John D Kelleher & Brendan Tierney, "Data Science", 2018)

"We humans are reasonably good at defining rules that check one, two, or even three attributes (also commonly referred to as features or variables), but when we go higher than three attributes, we can start to struggle to handle the interactions between them. By contrast, data science is often applied in contexts where we want to look for patterns among tens, hundreds, thousands, and, in extreme cases, millions of attributes." (John D Kelleher & Brendan Tierney, "Data Science", 2018)

08 June 2021

On Patterns (2000-2009)

"In a linear world of equilibrium and predictability, the sparse research into an evidence base for management prescriptions and the confused findings it produces would be a sign of incompetence; it would not make much sense. Nevertheless, if organizations are actually patterns of nonlinear interaction between people; if small changes could produce widespread major consequences; if local interaction produces emergent global pattern; then it will not be possible to provide a reliable evidence base. In such a world, it makes no sense to conduct studies looking for simple causal relationships between an action and an outcome. I suggest that the story of the last few years strongly indicates that human action is nonlinear, that time and place matter a great deal, and that since this precludes simple evidence bases we do need to rethink the nature of organizations and the roles of managers and leaders in them." (Ralph D Stacey, "Complexity and Organizational Reality", 2000)

"The central proposition in [realistic thinking] is that human actions and interactions are processes, not systems, and the coherent patterning of those processes becomes what it becomes because of their intrinsic capacity, the intrinsic capacity of interaction and relationship, to form coherence. That emergent form is radically unpredictable, but it emerges in a controlled or patterned way because of the characteristic of relationship itself, creation and destruction in conditions at the edge of chaos." (Ralph D Stacey et al, "Complexity and Management: Fad or Radical Challenge to Systems Thinking?", 2000)

"Although the detailed moment-to-moment behavior of a chaotic system cannot be predicted, the overall pattern of its 'random' fluctuations may be similar from scale to scale. Likewise, while the fine details of a chaotic system cannot be predicted one can know a little bit about the range of its 'random' fluctuation." (F David Peat, "From Certainty to Uncertainty", 2002)

"There are endless examples of elaborate structures and apparently complex processes being generated through simple repetitive rules, all of which can be easily simulated on a computer. It is therefore tempting to believe that, because many complex patterns can be generated out of a simple algorithmic rule, all complexity is created in this way." (F David Peat, "From Certainty to Uncertainty", 2002)

"Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns - which frequently occur in random data - could equally well be derived from tossing a coin. So it is in the stock market as well." (Didier Sornette, "Why Stock Markets Crash: Critical events in complex financial systems", 2003)

"Learning is the process of creating networks. Nodes are external entities which we can use to form a network. Or nodes may be people, organizations, libraries, web sites, books, journals, database, or any other source of information. The act of learning (things become a bit tricky here) is one of creating an external network of nodes - where we connect and form information and knowledge sources. The learning that happens in our heads is an internal network (neural). Learning networks can then be perceived as structures that we create in order to stay current and continually acquire, experience, create, and connect new knowledge (external). And learning networks can be perceived as structures that exist within our minds (internal) in connecting and creating patterns of understanding." (George Siemens, "Knowing Knowledge", 2006)

"Some number patterns, like even and odd numbers, lie on the surface. But the more you learn about numbers, both experimentally and theoretically, the more you discover patterns that are not so obvious. […] After a hidden pattern is exposed, it can be used to find more hidden patterns. At the end of a long chain of patterned reasoning, you can get to very difficult theorems, exploring facts about numbers that you otherwise would not know were true." (Avner Ash & Robert Gross, "Fearless Symmetry: Exposing the hidden patterns of numbers", 2006)

"Still, in the end, we find ourselves drawn to the beauty of the patterns themselves, and the amazing fact that we humans are smart enough to prove even a feeble fraction of all possible theorems about them. Often, greater than the contemplation of this beauty for the active mathematician is the excitement of the chase. Trying to discover first what patterns actually do or do not occur, then finding the correct statement of a conjecture, and finally proving it - these things are exhilarating when accomplished successfully. Like all risk-takers, mathematicians labor months or years for these moments of success." (Avner Ash & Robert Gross, "Fearless Symmetry: Exposing the hidden patterns of numbers", 2006)

"There is a big debate as to whether logic is part of mathematics or mathematics is part of logic. We use logic to think. We notice that our thinking, when it is valid, goes in certain patterns. These patterns can be studied mathematically. Thus, logic is a part of mathematics, called 'mathematical logic'." (Avner Ash & Robert Gross, "Fearless Symmetry: Exposing the hidden patterns of numbers", 2006) 

"The system is highly sensitive to some small changes and blows them up into major alterations in weather patterns. This is popularly known as the butterfly effect in that it is possible for a butterfly to flap its wings in São Paolo, so making a tiny change to air pressure there, and for this tiny change to escalate up into a hurricane over Miami. You would have to measure the flapping of every butterfly’s wings around the earth with infinite precision in order to be able to make long-term forecasts. The tiniest error made in these measurements could produce spurious forecasts. However, short-term forecasts are possible because it takes time for tiny differences to escalate."  (Ralph D Stacey, "Strategic Management and Organisational Dynamics: The Challenge of Complexity" 5th Ed. , 2007)

"Perception requires imagination because the data people encounter in their lives are never complete and always equivocal. [...] We also use our imagination and take shortcuts to fill gaps in patterns of nonvisual data. As with visual input, we draw conclusions and make judgments based on uncertain and incomplete information, and we conclude, when we are done analyzing the patterns, that out picture is clear and accurate. But is it?" (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"Why is the human need to be in control relevant to a discussion of random patterns? Because if events are random, we are not in control, and if we are in control of events, they are not random. There is therefore a fundamental clash between our need to feel we are in control and our ability to recognize randomness. That clash is one of the principal reasons we misinterpret random events."  (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"In emergent processes, the whole is greater than the sum of the parts. A mathematical phenomenon that appears in certain dynamic systems also occurs within biological systems, from molecular interactions within the cells to the cognitive processes that we use to move within society. [...] Emergent patterns of ideas, beauty, desires, or tragicomedy wait, ready to trap the next traveler in their complex domain of neatly patterned squares - the never-ending world of chess metaphors." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Obviously, the final goal of scientists and mathematicians is not simply the accumulation of facts and lists of formulas, but rather they seek to understand the patterns, organizing principles, and relationships between these facts to form theorems and entirely new branches of human thought." (Clifford A Pickover, "The Math Book", 2009)

"The master of chess is deeply familiar with these patterns and knows very well the position that would be beneficial to reach. The rest is thinking in a logical way (calculating) about how each piece should be moved to reach the new pattern that has already taken shape in the chess player’s mind. This way of facing chess is closely related to the solving of theorems in mathematics. For example, a mathematician who wishes to prove an equation needs to imagine how the terms on each side of the equal sign can be manipulated so that one is reduced to the other. The enterprise is far from easy, to judge by the more than two hundred years that have been needed to solve theorems such as that of Fermat (z^n = x^n + y^n), using diverse tricks to prove the equation." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

07 June 2021

On Patterns (1990-1999)

"Mathematics is an exploratory science that seeks to understand every kind of pattern - patterns that occur in nature, patterns invented by the human mind, and even patterns created by other patterns." (Lynn A Steen, "The Future of Mathematics Education", 1990)

"Phenomena having uncertain individual outcomes but a regular pattern of outcomes in many repetitions are called random. 'Random' is not a synonym for 'haphazard' but a description of a kind of order different from the deterministic one that is popularly associated with science and mathematics. Probability is the branch of mathematics that describes randomness." (David S Moore, "Uncertainty", 1990)

"Systems thinking is a framework for seeing interrelationships rather than things, for seeing patterns rather than static snapshots. It is a set of general principles spanning fields as diverse as physical and social sciences, engineering and management." (Peter Senge, "The Fifth Discipline", 1990)

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"Chaos demonstrates that deterministic causes can have random effects […] There's a similar surprise regarding symmetry: symmetric causes can have asymmetric effects. […] This paradox, that symmetry can get lost between cause and effect, is called symmetry-breaking. […] From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"In everyday language, the words 'pattern' and 'symmetry' are used almost interchangeably, to indicate a property possessed by a regular arrangement of more-or-less identical units […]” (Ian Stewart & Martin Golubitsky, “Fearful Symmetry: Is God a Geometer?”, 1992)

"Scientists have discovered many peculiar things, and many beautiful things. But perhaps the most beautiful and the most peculiar thing that they have discovered is the pattern of science itself. Our scientific discoveries are not independent isolated facts; one scientific generalization finds its explanation in another, which is itself explained by yet another. By tracing these arrows of explanation back toward their source we have discovered a striking convergent pattern - perhaps the deepest thing we have yet learned about the universe." (Steven Weinberg, "Dreams of a Final Theory: The Scientist’s Search for the Ultimate Laws of Nature", 1992)

"Searching for patterns is a way of thinking that is essential for making generalizations, seeing relationships, and understanding the logic and order of mathematics. Functions evolve from the investigation of patterns and unify the various aspects of mathematics." (Marilyn Burns, "About Teaching Mathematics: A K–8 Resource", 1992)

"Symmetry is bound up in many of the deepest patterns of Nature, and nowadays it is fundamental to our scientific understanding of the universe. Conservation principles, such as those for energy or momentum, express a symmetry that (we believe) is possessed by the entire space-time continuum: the laws of physics are the same everywhere." (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"World view, a concept borrowed from cultural anthropology, refers to the culturally dependent, generally subconscious, fundamental organization of the mind. This conceptual organization manifests itself as a set of presuppositions that predispose one to feel, think, and act in predictable patterns." (Kenneth G Tobin, "The practice of constructivism in science education", 1993)

"[For] us to be able to speak and understand novel sentences, we have to store in our heads not just the words of our language but also the patterns of sentences possible in our language. These patterns, in turn, describe not just patterns of words but also patterns of patterns. Linguists refer to these patterns as the rules of language stored in memory; they refer to the complete collection of rules as the mental grammar of the language, or grammar for short." (Ray Jackendoff, "Patterns in the Mind", 1994)

"A neural network is characterized by A) its pattern of connections between the neurons (called its architecture), B) its method of determining the weights on the connections (called its training, or learning, algorithm), and C) its activation function." (Laurene Fausett, "Fundamentals of Neural Networks", 1994)

"At the other far extreme, we find many systems ordered as a patchwork of parallel operations, very much as in the neural network of a brain or in a colony of ants. Action in these systems proceeds in a messy cascade of interdependent events. Instead of the discrete ticks of cause and effect that run a clock, a thousand clock springs try to simultaneously run a parallel system. Since there is no chain of command, the particular action of any single spring diffuses into the whole, making it easier for the sum of the whole to overwhelm the parts of the whole. What emerges from the collective is not a series of critical individual actions but a multitude of simultaneous actions whose collective pattern is far more important. This is the swarm model." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Each of nature's patterns is a puzzle, nearly always a deep one. Mathematics is brilliant at helping us to solve puzzles. It is a more or less systematic way of digging out the rules and structures that lie behind some observed pattern or regularity, and then using those rules and structures to explain what's going on." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Human mind and culture have developed a formal system of thought for recognizing, classifying, and exploiting patterns. We call it mathematics. By using mathematics to organize and systematize our ideas about patterns, we have discovered a great secret: nature's patterns are not just there to be admired, they are vital clues to the rules that govern natural processes." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Patterns possess utility as well as beauty. Once we have learned to recognize a background pattern, exceptions suddenly stand out." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Self-organization refers to the spontaneous formation of patterns and pattern change in open, nonequilibrium systems. […] Self-organization provides a paradigm for behavior and cognition, as well as the structure and function of the nervous system. In contrast to a computer, which requires particular programs to produce particular results, the tendency for self-organization is intrinsic to natural systems under certain conditions." (J A Scott Kelso, "Dynamic Patterns : The Self-organization of Brain and Behavior", 1995)

"Symmetry is basically a geometrical concept. Mathematically it can be defined as the invariance of geometrical patterns under certain operations. But when abstracted, the concept applies to all sorts of situations. It is one of the ways by which the human mind recognizes order in nature. In this sense symmetry need not be perfect to be meaningful. Even an approximate symmetry attracts one's attention, and makes one wonder if there is some deep reason behind it." (Eguchi Tohru & ?K Nishijima , "Broken Symmetry: Selected Papers Of Y Nambu", 1995)

"Whatever the reasons, mathematics definitely is a useful way to think about nature. What do we want it to tell us about the patterns we observe? There are many answers. We want to understand how they happen; to understand why they happen, which is different; to organize the underlying patterns and regularities in the most satisfying way; to predict how nature will behave; to control nature for our own ends; and to make practical use of what we have learned about our world. Mathematics helps us to do all these things, and often it is indispensable." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"If we are to have meaningful, connected experiences; ones that we can comprehend and reason about; we must be able to discern patterns to our actions, perceptions, and conceptions. Underlying our vast network of interrelated literal meanings (all of those words about objects and actions) are those imaginative structures of understanding such as schema and metaphor, such as the mental imagery that allows us to extrapolate a path, or zoom in on one part of the whole, or zoom out until the trees merge into a forest." (William H Calvin, "The Cerebral Code", 1996)

"The methods of science include controlled experiments, classification, pattern recognition, analysis, and deduction. In the humanities we apply analogy, metaphor, criticism, and (e)valuation. In design we devise alternatives, form patterns, synthesize, use conjecture, and model solutions." (Béla H Bánáthy, "Designing Social Systems in a Changing World", 1996)

"The more complex the network is, the more complex its pattern of interconnections, the more resilient it will be." (Fritjof Capra, "The Web of Life: A New Scientific Understanding of Living Systems", 1996)

"The role of science, like that of art, is to blend proximate imagery with more distant meaning, the parts we already understand with those given as new into larger patterns that are coherent enough to be acceptable as truth. Biologists know this relation by intuition during the course of fieldwork, as they struggle to make order out of the infinitely varying patterns of nature." (Edward O Wilson, "In Search of Nature", 1996)

"Mathematics can function as a telescope, a microscope, a sieve for sorting out the signal from the noise, a template for pattern perception, a way of seeking and validating truth. […] A knowledge of the mathematics behind our ideas can help us to fool ourselves a little less often, with less drastic consequences." (K C Cole, "The Universe and the Teacup: The Mathematics of Truth and Beauty", 1997)

"Mathematics is a way of thinking that can help make muddy relationships clear. It is a language that allows us to translate the complexity of the world into manageable patterns. In a sense, it works like turning off the houselights in a theater the better to see a movie. Certainly, something is lost when the lights go down; you can no longer see the faces of those around you or the inlaid patterns on the ceiling. But you gain a far better view of the subject at hand." (K C Cole, "The Universe and the Teacup: The Mathematics of Truth and Beauty", 1997)

"A formal system consists of a number of tokens or symbols, like pieces in a game. These symbols can be combined into patterns by means of a set of rules which defines what is or is not permissible (e.g. the rules of chess). These rules are strictly formal, i.e. they conform to a precise logic. The configuration of the symbols at any specific moment constitutes a ‘state’ of the system. A specific state will activate the applicable rules which then transform the system from one state to another. If the set of rules governing the behaviour of the system are exact and complete, one could test whether various possible states of the system are or are not permissible." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Mathematics, in the common lay view, is a static discipline based on formulas taught in the school subjects of arithmetic, geometry, algebra, and calculus. But outside public view, mathematics continues to grow at a rapid rate, spreading into new fields and spawning new applications. The guide to this growth is not calculation and formulas but an open-ended search for pattern." (Lynn A Steen, "The Future of Mathematics Education", 1998)

"A neural network consists of large numbers of simple neurons that are richly interconnected. The weights associated with the connections between neurons determine the characteristics of the network. During a training period, the network adjusts the values of the interconnecting weights. The value of any specific weight has no significance; it is the patterns of weight values in the whole system that bear information. Since these patterns are complex, and are generated by the network itself (by means of a general learning strategy applicable to the whole network), there is no abstract procedure available to describe the process used by the network to solve the problem. There are only complex patterns of relationships." (Paul Cilliers, "Complexity and Postmodernism: Understanding Complex Systems", 1998)

"Mathematics has traditionally been described as the science of number and shape. […] When viewed in this broader context, we see that mathematics is not just about number and shape but about pattern and order of all sorts. Number and shape - arithmetic and geometry - are but two of many media in which mathematicians work. Active mathematicians seek patterns wherever they arise." (Lynn A Steen, "The Future of Mathematics Education", 1998)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Sequences of random numbers also inevitably display certain regularities. […] The trouble is, just as no real die, coin, or roulette wheel is ever likely to be perfectly fair, no numerical recipe produces truly random numbers. The mere existence of a formula suggests some sort of predictability or pattern." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"We use mathematics and statistics to describe the diverse realms of randomness. From these descriptions, we attempt to glean insights into the workings of chance and to search for hidden causes. With such tools in hand, we seek patterns and relationships and propose predictions that help us make sense of the world."  (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"Complexity is looking at interacting elements and asking how they form patterns and how the patterns unfold. It’s important to point out that the patterns may never be finished. They’re open-ended. In standard science this hit some things that most scientists have a negative reaction to. Science doesn’t like perpetual novelty." (W Brian Arthur, 1999)

"Randomness is the very stuff of life, looming large in our everyday experience. […] The fascination of randomness is that it is pervasive, providing the surprising coincidences, bizarre luck, and unexpected twists that color our perception of everyday events." (Edward Beltrami, "Chaos and Order in Mathematics and Life", 1999)

"The first view of randomness is of clutter bred by complicated entanglements. Even though we know there are rules, the outcome is uncertain. Lotteries and card games are generally perceived to belong to this category. More troublesome is that nature's design itself is known imperfectly, and worse, the rules may be hidden from us, and therefore we cannot specify a cause or discern any pattern of order. When, for instance, an outcome takes place as the confluence of totally unrelated events, it may appear to be so surprising and bizarre that we say that it is due to blind chance." (Edward Beltrami. "What is Random?: Chance and Order in Mathematics and Life", 1999)

27 May 2021

On Randomness VI (Systems II)

"Systems, acting dynamically, produce (and incidentally, reproduce) their own boundaries, as structures which are complementary (necessarily so) to their motion and dynamics. They are liable, for all that, to instabilities chaos, as commonly interpreted of chaotic form, where nowadays, is remote from the random. Chaos is a peculiar situation in which the trajectories of a system, taken in the traditional sense, fail to converge as they approach their limit cycles or 'attractors' or 'equilibria'. Instead, they diverge, due to an increase, of indefinite magnitude, in amplification or gain.(Gordon Pask, "Different Kinds of Cybernetics", 1992)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"For the study of the topology of the interactions of a complex system it is of central importance to have proper random null models of networks, i.e., models of how a graph arises from a random process. Such models are needed for comparison with real world data. When analyzing the structure of real world networks, the null hypothesis shall always be that the link structure is due to chance alone. This null hypothesis may only be rejected if the link structure found differs significantly from an expectation value obtained from a random model. Any deviation from the random null model must be explained by non-random processes." (Jörg Reichardt, "Structure in Complex Networks", 2009)

"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini et al, "Chaos: From Simple Models to Complex Systems", 2010)

"Chaos is a phenomenon encountered in science and mathematics wherein a deterministic (rule-based) system behaves unpredictably. That is, a system which is governed by fixed, precise rules, nevertheless behaves in a way which is, for all practical purposes, unpredictable in the long run. The mathematical use of the word 'chaos' does not align well with its more common usage to indicate lawlessness or the complete absence of order. On the contrary, mathematically chaotic systems are, in a sense, perfectly ordered, despite their apparent randomness. This seems like nonsense, but it is not." (David P Feldman, "Chaos and Fractals: An Elementary Introduction", 2012)

"Systems subjected to randomness - and unpredictability - build a mechanism beyond the robust to opportunistically reinvent themselves each generation, with a continuous change of population and species." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"When some systems are stuck in a dangerous impasse, randomness and only randomness can unlock them and set them free. You can see here that absence of randomness equals guaranteed death. The idea of injecting random noise into a system to improve its functioning has been applied across fields. By a mechanism called stochastic resonance, adding random noise to the background makes you hear the sounds (say, music) with more accuracy." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"A system in which a few things interacting produce tremendously divergent behavior; deterministic chaos; it looks random but its not." (Christopher Langton)

09 May 2021

On Randomness VIII (Events II)

"Our lives today are not conducted in linear terms. They are much more quantified; a stream of random events is taking place." (James G Ballard, [Conversation with George MacBeth on Third Programme - BBC], 1967)

"Events may appear to us to be random, but this could be attributed to human ignorance about the details of the processes involved." (Brain S Everitt, "Chance Rules", 1999)

"That randomness gives rise to innovation and diversity in nature is echoed by the notion that chance is also the source of invention in the arts and everyday affairs in which naturally occurring processes are balanced between tight organization, where redundancy is paramount, and volatility, in which little order is possible. One can argue that there is a difference in kind between the unconscious, and sometimes conscious, choices made by a writer or artist in creating a string of words or musical notes and the accidental succession of events taking place in the natural world. However, it is the perception of ambiguity in a string that matters, and not the process that generated it, whether it be man-made or from nature at large." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"The subject of probability begins by assuming that some mechanism of uncertainty is at work giving rise to what is called randomness, but it is not necessary to distinguish between chance that occurs because of some hidden order that may exist and chance that is the result of blind lawlessness. This mechanism, figuratively speaking, churns out a succession of events, each individually unpredictable, or it conspires to produce an unforeseeable outcome each time a large ensemble of possibilities is sampled."  (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"Random events often come like the raisins in a box of cereal - in groups, streaks, and clusters. And although Fortune is fair in potentialities, she is not fair in outcomes." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"The outline of our lives, like the candles flame, is continuously coaxed in new directions by a variety of random events that, along with our responses to them, determine our fate." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"Why is the human need to be in control relevant to a discussion of random patterns? Because if events are random, we are not in control, and if we are in control of events, they are not random. There is therefore a fundamental clash between our need to feel we are in control and our ability to recognize randomness. That clash is one of the principal reasons we misinterpret random events."  (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"Quantum physicists today are reconciled to randomness at the individual event level, but to expect causality to underlie statistical quantum phenomena is reasonable. Suppose a person shakes an ink pen such that ink spots are formed on a white wall, in what appears for all intents and purposes, randomly. Let us further suppose the random ink spots accumulate to form precise pictures of different known persons' faces every time. We will not regard the overall result to be a happenchance; we are apt to suspect there must be a 'method' to the person who is shaking the ink pen." (Ravi Gomatam) [response to Nobel Laureate Steven Weinberg's article "Einstein's Mistakes", Physics Today Vol. 59 (4), 2005]

"We are hardwired to make sense of the world around us - to notice patterns and invent theories to explain these patterns. We underestimate how easily pat - terns can be created by inexplicable random events - by good luck and bad luck." (Gary Smith, "Standard Deviations", 2014)

On Randomness XXVI (Universe)

"Random chance was not a sufficient explanation of the Universe - in fact, random chance was not sufficient to explain random chance; the pot could not hold itself." (Robert A Heinlein, "Stranger in a Strange Land", 1961)

"The line between inner and outer landscapes is breaking down. Earthquakes can result from seismic upheavals within the human mind. The whole random universe of the industrial age is breaking down into cryptic fragments." (William S Burroughs, [preface] 1972)

"There is no reason to assume that the universe has the slightest interest in intelligence -  or even in life. Both may be random accidental by-products of its operations like the beautiful patterns on a butterfly's wings. The insect would fly just as well without them […]" (Arthur C Clarke, "The Lost Worlds of 2001", 1972)

"It is tempting to wonder if our present universe, large as it is and complex though it seems, might not be merely the result of a very slight random increase in order over a very small portion of an unbelievably colossal universe which is virtually entirely in heat-death." (Isaac Asimov, 1976)

"Perhaps randomness is not merely an adequate description for complex causes that we cannot specify. Perhaps the world really works this way, and many events are uncaused in any conventional sense of the word." (Stephen J Gould, "Hen's Teeth and Horse's Toes", 1983)

"The world of science lives fairly comfortably with paradox. We know that light is a wave and also that light is a particle. The discoveries made in the infinitely small world of particle physics indicate randomness and chance, and I do not find it any more difficult to live with the paradox of a universe of randomness and chance and a universe of pattern and purpose than I do with light as a wave and light as a particle. Living with contradiction is nothing new to the human being." (Madeline L'Engle, "Two-Part Invention: The Story of a Marriage", 1988)

"Intriguingly, the mathematics of randomness, chaos, and order also furnishes what may be a vital escape from absolute certainty - an opportunity to exercise free will in a deterministic universe. Indeed, in the interplay of order and disorder that makes life interesting, we appear perpetually poised in a state of enticingly precarious perplexity. The universe is neither so crazy that we can’t understand it at all nor so predictable that there’s nothing left for us to discover." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1997)

"Often, we use the word random loosely to describe something that is disordered, irregular, patternless, or unpredictable. We link it with chance, probability, luck, and coincidence. However, when we examine what we mean by random in various contexts, ambiguities and uncertainties inevitably arise. Tackling the subtleties of randomness allows us to go to the root of what we can understand of the universe we inhabit and helps us to define the limits of what we can know with certainty." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"The first view of randomness is of clutter bred by complicated entanglements. Even though we know there are rules, the outcome is uncertain. Lotteries and card games are generally perceived to belong to this category. More troublesome is that nature's design itself is known imperfectly, and worse, the rules may be hidden from us, and therefore we cannot specify a cause or discern any pattern of order. When, for instance, an outcome takes place as the confluence of totally unrelated events, it may appear to be so surprising and bizarre that we say that it is due to blind chance." (Edward Beltrami. "What is Random?: Chance and Order in Mathematics and Life", 1999)

"The tissue of the world is built from necessities and randomness; the intellect of men places itself between both and can control them; it considers the necessity and the reason of its existence; it knows how randomness can be managed, controlled, and used." (Johann Wolfgang von Goethe)

30 April 2021

Statistical Tools I: Coins

"Equiprobability in the physical world is purely a hypothesis. We may exercise the greatest care and the most accurate of scientific instruments to determine whether or not a penny is symmetrical. Even if we are satisfied that it is, and that our evidence on that point is conclusive, our knowledge, or rather our ignorance, about the vast number of other causes which affect the fall of the penny is so abysmal that the fact of the penny’s symmetry is a mere detail. Thus, the statement 'head and tail are equiprobable' is at best an assumption." (Edward Kasner & James R Newman, "Mathematics and the Imagination", 1940)

"A misunderstanding of Bernoulli’s theorem is responsible for one of the commonest fallacies in the estimation of probabilities, the fallacy of the maturity of chances. When a coin has come down heads twice in succession, gamblers sometimes say that it is more likely to come down tails next time because ‘by the law of averages’ (whatever that may mean) the proportion of tails must be brought right some time." (William Kneale, "Probability and Induction", 1949)

"We must emphasize that such terms as 'select at random', 'choose at random', and the like, always mean that some mechanical device, such as coins, cards, dice, or tables of random numbers, is used." (Frederick Mosteller et al, "Principles of Sampling", Journal of the American Statistical Association Vol. 49 (265), 1954)

"And nobody can get [...] far without at least an acquaintance with the mathematics of probability, not to the extent of making its calculations and filling examination papers with typical equations, but enough to know when they can be trusted, and when they are cooked. For when their imaginary numbers correspond to exact quantities of hard coins unalterably stamped with heads and tails, they are safe within certain limits; for here we have solid certainty [...] but when the calculation is one of no constant and several very capricious variables, guesswork, personal bias, and pecuniary interests, come in so strong that those who began by ignorantly imagining that statistics cannot lie end by imagining equally ignorantly, that they never do anything else." (George B Shaw, "The World of Mathematics", 1956)

"[...] there can be such a thing as a simple probabilistic system. For example, consider the tossing of a penny. Here is a perfectly simple system, but one which is notoriously unpredictable. It maybe described in terms of a binary decision process, with a built-in even probability between the two possible outcomes." (Stafford Beer, "Cybernetics and Management", 1959)

"The shrewd guess, the fertile hypothesis, the courageous leap to a tentative conclusion - these are the most valuable coin of the thinker at work." (Jerome S Bruner, "The Process of Education", 1960)

"No Chancellor of the Exchequer could introduce his proposals for monetary and fiscal policy in the House of Commons by saying 'I have looked at all the forecasts, some go one way, some another; so I decided to toss a coin and assume inflationary tendencies if it came down heads and deflationary if it came down tails' [...] And statistics, however uncertain, can apparently provide some basis." (Ely Devons, "Essays in Economics", 1961)

"The equanimity of your average tosser of coins depends upon a law, or rather a tendency, or let us say a probability, or at any rate a mathematically calculable chance, which ensures that he will not upset himself by losing too much nor upset his opponent by winning too often." (Tom Stoppard, "Rosencrantz and Guildenstern Are Dead", 1967)

"A significant property of the value function, called loss aversion, is that the response to losses is more extreme than the response to gains. The common reluctance to accept a fair bet on the toss of a coin suggests that the displeasure of losing a sum of money exceeds the pleasure of winning the same amount. Thus the proposed value function is (i) defined on gains and losses, (ii) generally concave for gains and convex for losses, and (iii) steeper for losses than for gains." (Amos Tversky & Daniel Kahneman, "Rational Choice and the Framing of Decisions", The Journal of Business Vol. 59 (4), 1986)

"Flip a coin 100 times. Assume that 99 heads are obtained. If you ask a statistician, the response is likely to be: 'It is a biased coin'. But if you ask a probabilist, he may say: 'Wooow, what a rare event'." (Chamont Wang, "Sense and Nonsense of Statistical Inference", 1993)

"The coin is an example of complete randomness. It is the sort of randomness that one commonly has in mind when thinking of random numbers, or deciding to use a random-number generator." (Edward N Lorenz, "The Essence of Chaos", 1993)

"Losing streaks and winning streaks occur frequently in games of chance, as they do in real life. Gamblers respond to these events in asymmetric fashion: they appeal to the law of averages to bring losing streaks to a speedy end. And they appeal to that same law of averages to suspend itself so that winning streaks will go on and on. The law of averages hears neither appeal. The last sequence of throws of the dice conveys absolutely no information about what the next throw will bring. Cards, coins, dice, and roulette wheels have no memory." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"The dice and the roulette wheel, along with the stock market and the bond market, are natural laboratories for the study of risk because they lend themselves so readily to quantification; their language is the language of numbers." (Peter L Bernstein, "Against the Gods: The Remarkable Story of Risk", 1996)

"However, random walk theory also tells us that the chance that the balance never returns to zero - that is, that H stays in the lead for ever - is 0. This is the sense in which the 'law of averages' is true. If you wait long enough, then almost surely the numbers of heads and tails will even out. But this fact carries no implications about improving your chances of winning, if you're betting on whether H or T turns up. The probabilities are unchanged, and you don't know how long the 'long run' is going to be. Usually it is very long indeed." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"In everyday language, a fair coin is called random, but not a coin that shows head more often than tail. A coin that keeps a memory of its own record of heads and tails is viewed as even less random. This mental picture is present in the term random walk, especially as used in finance." (Benoit B Mandelbrot, "Fractals and Scaling in Finance: Discontinuity, concentration, risk", 1997)

"The basis of many misconceptions about probability is a belief in something usually referred to as 'the law of averages', which alleges that any unevenness in random events gets ironed out in the long run. For example, if a tossed coin keeps coming up heads, then it is widely believed that at some stage there will be a predominance of tails to balance things out." (Ian Stewart, The Magical Maze: Seeing the world through mathematical eyes", 1997)

"Sequences of random numbers also inevitably display certain regularities. […] The trouble is, just as no real die, coin, or roulette wheel is ever likely to be perfectly fair, no numerical recipe produces truly random numbers. The mere existence of a formula suggests some sort of predictability or pattern." (Ivars Peterson, "The Jungles of Randomness: A Mathematical Safari", 1998)

"For several centuries that we know of, and probably for many centuries before that, flipping a coin (or rolling a die) has been the epitome of probability, the paradigm of randomness. You flip the coin (or roll the die), and nobody can accurately predict how it will fall. Nor can the most powerful computer predict correctly how it will fall, if it is flipped energetically enough. This is why cards, dice, and other gambling aids crop up so often in literature both directly and as metaphors. No doubt it is also the reason for the (perhaps excessive) popularity of gambling as entertainment. If anyone had any idea what numbers the lottery would show, or where the roulette ball will land, the whole industry would be a dead duck." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"From the moment we first roll a die in a children’s board game, or pick a card (any card), we start to learn what probability is. But even as adults, it is not easy to tell what it is, in the general way." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"[...] the chance of a head (or a double six) is just a chance. The whole point of probability is to discuss uncertain eventualities before they occur. After this event, things are completely different. As the simplest illustration of this, note that even though we agree that if we flip a coin and roll two dice then the chance of a head is greater than the chance of a double six, nevertheless it may turn out that the coin shows a tail when the dice show a double six." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"We cannot really have a perfectly shuffled pack of perfect cards; this ‘collection of equally likely hands’ is actually a fiction. We create the idea, and then use the rules of arithmetic to calculate the required chances. This is characteristic of all mathematics, which concerns itself only with rules defining the behaviour of entities which are themselves undefined (such as ‘numbers’ or ‘points’)." (David Stirzaker, "Probability and Random Variables: A Beginner’s Guide", 1999)

"If sinks, sources, saddles, and limit cycles are coins landing heads or tails, then the exceptions are a coin landing on edge. Yes, it might happen, in theory; but no, it doesn't, in practice." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"It's a bit like having a theory about coins that move in space, but only being able to measure their state by interrupting them with a table. We hypothesize that the coin may be able to revolve in space, a state that is neither ‘heads’ nor ‘tails’ but a kind of mixture. Our experimental proof is that when you stick a table in, you get heads half the time and tails the other half - randomly. This is by no means a perfect analogy with standard quantum theory - a revolving coin is not exactly in a superposition of heads and tails - but it captures some of the flavour." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"The chance events due to deterministic chaos, on the other hand, occur even within a closed system determined by immutable laws. Our most cherished examples of chance - dice, roulette, coin-tossing – seem closer to chaos than to the whims of outside events. So, in this revised sense, dice are a good metaphor for chance after all. It's just that we've refined our concept of randomness. Indeed, the deterministic but possibly chaotic stripes of phase space may be the true source of probability." (Ian Stewart, "Does God Play Dice: The New Mathematics of Chaos", 2002)

"The possibility of translating uncertainties into risks is much more restricted in the propensity view. Propensities are properties of an object, such as the physical symmetry of a die. If a die is constructed to be perfectly symmetrical, then the probability of rolling a six is 1 in 6. The reference to a physical design, mechanism, or trait that determines the risk of an event is the essence of the propensity interpretation of probability. Note how propensity differs from the subjective interpretation: It is not sufficient that someone’s subjective probabilities about the outcomes of a die roll are coherent, that is, that they satisfy the laws of probability. What matters is the die’s design. If the design is not known, there are no probabilities." (Gerd Gigerenzer, "Calculated Risks: How to know when numbers deceive you", 2002)

"Randomness is a difficult notion for people to accept. When events come in clusters and streaks, people look for explanations and patterns. They refuse to believe that such patterns - which frequently occur in random data - could equally well be derived from tossing a coin. So it is in the stock market as well." (Didier Sornette, "Why Stock Markets Crash: Critical events in complex financial systems", 2003)

"Suppose that while flipping a coin, a small black hole passed by and ate the coin. As long as we got to see the coin, the probabilities of heads and tails would add to one, but the possibility of a coin disappearing altogether into a black hole would have to be included. Once the coin crosses the event horizon of the black hole, it simply does not meaningfully exist in our universe anymore. Can we simply adjust our probabilistic interpretation to accommodate this outcome? Will we ever encounter negative probabilities?" (Leon M Lederman & Christopher T Hill, "Symmetry and the Beautiful Universe", 2004)

"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)

"Random number generators do not always need to be symmetrical. This misconception of assuming equal likelihood for each outcome is fostered in a restricted learning environment, where learners see only such situations (that is, dice, coins and spinners). It is therefore very important for learners to be aware of situations where the different outcomes are not equally likely (as with the drawing-pins example)." (Alan Graham, "Developing Thinking in Statistics", 2006)

"The objectivist view is that probabilities are real aspects of the universe - propensities of objects to behave in certain ways - rather than being just descriptions of an observer’s degree of belief. For example, the fact that a fair coin comes up heads with probability 0.5 is a propensity of the coin itself. In this view, frequentist measurements are attempts to observe these propensities. Most physicists agree that quantum phenomena are objectively probabilistic, but uncertainty at the macroscopic scale - e.g., in coin tossing - usually arises from ignorance of initial conditions and does not seem consistent with the propensity view." (Stuart J Russell & Peter Norvig, "Artificial Intelligence: A Modern Approach", 2010)

"A very different - and very incorrect - argument is that successes must be balanced by failures (and failures by successes) so that things average out. Every coin flip that lands heads makes tails more likely. Every red at roulette makes black more likely. […] These beliefs are all incorrect. Good luck will certainly not continue indefinitely, but do not assume that good luck makes bad luck more likely, or vice versa." (Gary Smith, "Standard Deviations", 2014)

"Remember that even random coin flips can yield striking, even stunning, patterns that mean nothing at all. When someone shows you a pattern, no matter how impressive the person’s credentials, consider the possibility that the pattern is just a coincidence. Ask why, not what. No matter what the pattern, the question is: Why should we expect to find this pattern?" (Gary Smith, "Standard Deviations", 2014)

"We are seduced by patterns and we want explanations for these patterns. When we see a string of successes, we think that a hot hand has made success more likely. If we see a string of failures, we think a cold hand has made failure more likely. It is easy to dismiss such theories when they involve coin flips, but it is not so easy with humans. We surely have emotions and ailments that can cause our abilities to go up and down. The question is whether these fluctuations are important or trivial." (Gary Smith, "Standard Deviations", 2014)

"When statisticians, trained in math and probability theory, try to assess likely outcomes, they demand a plethora of data points. Even then, they recognize that unless it’s a very simple and controlled action such as flipping a coin, unforeseen variables can exert significant influence." (Zachary Karabell, "The Leading Indicators: A short history of the numbers that rule our world", 2014)

10 April 2021

On Generalization (1920-1929)

"If we are not content with the dull accumulation of experimental facts, if we make any deductions or generalizations, if we seek for any theory to guide us, some degree of speculation cannot be avoided. Some will prefer to take the interpretation which seems to be most immediately indicated and at once adopted as an hypothesis; others will rather seek to explore and classify the widest possibilities which are not definitely inconsistent with the facts. Either choice has its dangers: the first may be too narrow a view and lead progress into a cul-de-sac; the second may be so broad that it is useless as a guide and diverge indefinitely from experimental knowledge." (Sir Arthur S Eddington, "The Internal Constitution of the Stars Observatory", Vol. 43, 1920)

"It is well to be explicit when a positive generalization is made from negative experimental evidence." (Arthur Eddington, "Space, Time and Gravitation: An Outline of the General Relativity", 1920)

"Generalization is the golden thread which binds many facts into one simple description." (Joseph W Mellor, "A Comprehensive Treatise on Inorganic and Theoretical Chemistry", 1922)

"[…] a history of mathematics is largely a history of discoveries which no longer exist as separate items, but are merged into some more modern generalization, these discoveries have not been forgotten or made valueless. They are not dead, but transmuted." (John W N Sullivan, "The History of Mathematics in Europe", 1925)

"Number knows no limitations, either from the side of the infinitely great or from the side of the infinitely small, and the facility it offers for generalization is too great for us not to be tempted by it." (Émile Borel, "Space and Time", 1926)

"[…] the statistical prediction of the future from the past cannot be generally valid, because whatever is future to any given past, is in tum past for some future. That is, whoever continually revises his judgment of the probability of a statistical generalization by its successively observed verifications and failures, cannot fail to make more successful predictions than if he should disregard the past in his anticipation of the future. This might be called the ‘Principle of statistical accumulation’." (Clarence I Lewis, "Mind and the World-Order: Outline of a Theory of Knowledge", 1929)

"The true method of discovery is like the flight of an aeroplane. It starts from the ground of particular observation; it makes a flight in the thin air of imaginative generalization; and it again lands for renewed observation rendered acute by rational interpretation." (Alfred N Whitehead, "Process and Reality", 1929)

"Without doubt, if we are to go back to that ultimate, integral experience, unwarped by the sophistications of theory, that experience whose elucidation is the final aim of philosophy, the flux of things is one ultimate generalization around which we must weave our philosophical system." (Alfred N Whitehead, "Process and Reality: An Essay in Cosmology", 1929)

On Generalization (1970-1979)

"Accordingly there are two main types of science, exact science [...] and empirical science [...] seeking laws which are generalizations from particular experiences and are verifiable (or, more strictly, 'probabilities') only by observation and experiment." (Errol E Harris, "Hypothesis and Perception: The Roots of Scientific Method", 1970)

"One often hears that successive theories grow ever closer to, or approximate more and more closely to, the truth. Apparently, generalizations like that refer not to the puzzle-solutions and the concrete predictions derived from a theory but rather to its ontology, to the match, that is, between the entities with which the theory populates nature and what is ‘really there’." (Thomas S Kuhn, "The Structure of Scientific Revolutions", 1970)

"Science uses the senses but does not enjoy them; finally buries them under theory, abstraction, mathematical generalization." (Theodore Roszak, "Where the Wasteland Ends", 1972)

"A single observation that is inconsistent with some generalization points to the falsehood of the generalization, and thereby 'points to itself'." (Ian Hacking, "The Emergence Of Probability", 1975)

"The sciences have started to swell. Their philosophical basis has never been very strong. Starting as modest probing operations to unravel the works of God in the world, to follow its traces in nature, they were driven gradually to ever more gigantic generalizations. Since the pieces of the giant puzzle never seemed to fit together perfectly, subsets of smaller, more homogeneous puzzles had to be constructed, in each of which the fit was better." (Erwin Chargaff, "Voices in the Labyrinth", 1975)

"The word generalization in literature usually means covering too much territory too thinly to be persuasive, let alone convincing. In science, however, a generalization means a principle that has been found to hold true in every special case. [...] The principle of leverage is a scientific generalization." (Buckminster Fuller, "Synergetics: Explorations in the Geometry of Thinking", 1975)

"And when such claims are extraordinary, that is, revolutionary in their implications for established scientific generalizations already accumulated and verified, we must demand extraordinary proof." (Marcello Truzzi, Zetetic Scholar, Vol. 1 (1), 1976)

"If it is to be effective as a tool of thought, a notation must allow convenient expression not only of notions arising directly from a problem, but also of those arising in subsequent analysis, generalization, and specialization." (Kenneth E Iverson, "Notation as a Tool of Thought", 1979)

"Prediction can never be absolutely valid and therefore science can never prove some generalization or even test a single descriptive statement and in that way arrive at final truth." (Gregory Bateson, "Mind and Nature, A necessary unity", 1979)

On Generalization (1930-1949)

"The steady progress of physics requires for its theoretical formulation a mathematics which get continually more advanced. […] it was expected that mathematics would get more and more complicated, but would rest on a permanent basis of axioms and definitions, while actually the modern physical developments have required a mathematics that continually shifts its foundation and gets more abstract. Non-Euclidean geometry and noncommutative algebra, which were at one time were considered to be purely fictions of the mind and pastimes of logical thinkers, have now been found to be very necessary for the description of general facts of the physical world. It seems likely that this process of increasing abstraction will continue in the future and the advance in physics is to be associated with continual modification and generalisation of the axioms at the base of mathematics rather than with a logical development of any one mathematical scheme on a fixed foundation." (Paul A M Dirac, "Quantities singularities in the electromagnetic field", Proceedings of the Royal Society of London, 1931)

"It is time, therefore, to abandon the superstition that natural science cannot be regarded as logically respectable until philosophers have solved the problem of induction. The problem of induction is, roughly speaking, the problem of finding a way to prove that certain empirical generalizations which are derived from past experience will hold good also in the future." (Alfred J Ayer, "Language, Truth and Logic", 1936)

"The problem of induction is, roughly speaking, the problem of finding a way to prove that certain empirical generalizations which are derived from past experience will hold good also in the future. There are only two ways of approaching this problem on the assumption that it is a genuine problem, and it is easy to see that neither of them can lead to its solution." (Alfred J Ayer, "Language, Truth, and Logic", 1936)

"The ethos of science involves the functionally necessary demand that theories or generalizations be evaluated in [terms of] their logical consistency and consonance with facts." (Robert K Merton, "Science and the Social Order", Philosophy of Science Vol 5 (3), 1938)

"The former distrust of specialization has been supplanted by its opposite, a distrust of generalization. Not only has man become a specialist in practice, he is being taught that special facts represent the highest form of knowledge." (Richard Weaver, "Ideas have Consequences", 1948)

04 April 2021

On Technology II

"The 'message' of any medium or technology is the change of scale or pace or pattern that it introduces into human affairs." (Marshall McLuhan, "Understanding Media", 1964)

"Our technology forces us to live mythically, but we continue to think fragmentarily, and on single, separate planes." (Marshall McLuhan, "The Medium is the Massage: An inventory of effects", 1967)

"Modern scientific principle has been drawn from the investigation of natural laws, technology has developed from the experience of doing, and the two have been combined by means of mathematical system to form what we call engineering." (George S Emmerson, "Engineering Education: A Social History", 1973)

"The system of nature, of which man is a part, tends to be self-balancing, self-adjusting, self-cleansing. Not so with technology." (Ernst F Schumacher, "Small is Beautiful", 1973)

"Technology has not advanced because people are starved for instruments to make a better civilization, but because they are starved for entertainment - technology is still mostly a toy factory for grown-ups." (Eugene J Martin, 1977-1978)

"People’s views of the world, of themselves, of their own capabilities, and of the tasks that they are asked to perform, or topics they are asked to learn, depend heavily on the conceptualizations that they bring to the task. In interacting with the environment, with others, and with the artifacts of technology, people form internal, mental models of themselves and of the things with which they are interacting. These models provide predictive and explanatory power for understanding the interaction." (Donald A Norman, "Some observations on Mental Models", 1983)

"With the changes in technological complexity, especially in information technology, the leadership task has changed. Leadership in a networked organization is a fundamentally different thing from leadership in a traditional hierarchy." (Edgar Schein, "Organizational Culture and Leadership", 1985)

"The new information technologies can be seen to drive societies toward increasingly dynamic high-energy regions further and further from thermodynamical equilibrium, characterized by decreasing specific entropy and increasingly dense free-energy flows, accessed and processed by more and more complex social, economic, and political structures." (Ervin László, "Information Technology and Social Change: An Evolutionary Systems Analysis", Behavioral Science 37, 1992) 

"Now that knowledge is taking the place of capital as the driving force in organizations worldwide, it is all too easy to confuse data with knowledge and information technology with information." (Peter Drucker, "Managing in a Time of Great Change", 1995)

"Commonly, the threats to strategy are seen to emanate from outside a company because of changes in technology or the behavior of competitors. Although external changes can be the problem, the greater threat to strategy often comes from within. A sound strategy is undermined by a misguided view of competition, by organizational failures, and, especially, by the desire to grow." (Michael E Porter, "What is Strategy?", Harvard Business Review, 1996)

17 March 2021

Mathematical Models III

"Mathematical model making is an art. If the model is too small, a great deal of analysis and numerical solution can be done, but the results, in general, can be meaningless. If the model is too large, neither analysis nor numerical solution can be carried out, the interpretation of the results is in any case very difficult, and there is great difficulty in obtaining the numerical values of the parameters needed for numerical results." (Richard E Bellman, "Eye of the Hurricane: An Autobiography", 1984)

"Symmetries abound in nature, in technology, and - especially - in the simplified mathematical models we study so assiduously. Symmetries complicate things and simplify them. They complicate them by introducing exceptional types of behavior, increasing the number of variables involved, and making vanish things that usually do not vanish. They simplify them by introducing exceptional types of behavior, increasing the number of variables involved, and making vanish things that usually do not vanish. They violate all the hypotheses of our favorite theorems, yet lead to natural generalizations of those theorems. It is now standard to study the 'generic' behavior of dynamical systems. Symmetry is not generic. The answer is to work within the world of symmetric systems and to examine a suitably restricted idea of genericity." (Ian Stewart, "Bifurcation with symmetry", 1988)

"Pedantry and sectarianism aside, the aim of theoretical physics is to construct mathematical models such as to enable us, from the use of knowledge gathered in a few observations, to predict by logical processes the outcomes in many other circumstances. Any logically sound theory satisfying this condition is a good theory, whether or not it be derived from ‘ultimate’ or ‘fundamental’ truth." (Clifford Truesdell & Walter Noll, "The Non-Linear Field Theories of Mechanics" 2nd Ed., 1992)

"[…] interval mathematics and fuzzy logic together can provide a promising alternative to mathematical modeling for many physical systems that are too vague or too complicated to be described by simple and crisp mathematical formulas or equations. When interval mathematics and fuzzy logic are employed, the interval of confidence and the fuzzy membership functions are used as approximation measures, leading to the so-called fuzzy systems modeling." (Guanrong Chen & Trung Tat Pham, "Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems", 2001)

"Modeling, in a general sense, refers to the establishment of a description of a system (a plant, a process, etc.) in mathematical terms, which characterizes the input-output behavior of the underlying system. To describe a physical system […] we have to use a mathematical formula or equation that can represent the system both qualitatively and quantitatively. Such a formulation is a mathematical representation, called a mathematical model, of the physical system." (Guanrong Chen & Trung Tat Pham, "Introduction to Fuzzy Sets, Fuzzy Logic, and Fuzzy Control Systems", 2001)

"An important aspect of the global theory of dynamical systems is the stability of the orbit structure as a whole. The motivation for the corresponding theory comes from applied mathematics. Mathematical models always contain simplifying assumptions. Dominant features are modeled; supposed small disturbing forces are ignored. Thus, it is natural to ask if the qualitative structure of the set of solutions - the phase portrait - of a model would remain the same if small perturbations were included in the model. The corresponding mathematical theory is called structural stability." (Carmen Chicone, "Stability Theory of Ordinary Differential Equations" [Mathematics of Complexity and Dynamical Systems, 2012])

"Models do not and need not match reality in all of its aspects and details to be adequate. A mathematical model is usually developed for a specific class of target systems, and its validity is determined relative to its intended applications. A model is considered valid within its intended domain of applicability provided that its predictions in that domain fall within an acceptable range of error, specified prior to the model’s development or identification." (Zoltan Domotor, "Mathematical Models in Philosophy of Science" [Mathematics of Complexity and Dynamical Systems, 2012])

"Simplified description of a real world system in mathematical terms, e. g., by means of differential equations or other suitable mathematical structures." (Benedetto Piccoli, Andrea Tosin, "Vehicular Traffic: A Review of Continuum Mathematical Models" [Mathematics of Complexity and Dynamical Systems, 2012])

"Stated loosely, models are simplified, idealized and approximate representations of the structure, mechanism and behavior of real-world systems. From the standpoint of set-theoretic model theory, a mathematical model of a target system is specified by a nonempty set - called the model’s domain, endowed with some operations and relations, delineated by suitable axioms and intended empirical interpretation." (Zoltan Domotor, "Mathematical Models in Philosophy of Science" [Mathematics of Complexity and Dynamical Systems, 2012])

"The standard view among most theoretical physicists, engineers and economists is that mathematical models are syntactic (linguistic) items, identified with particular systems of equations or relational statements. From this perspective, the process of solving a designated system of (algebraic, difference, differential, stochastic, etc.) equations of the target system, and interpreting the particular solutions directly in the context of predictions and explanations are primary, while the mathematical structures of associated state and orbit spaces, and quantity algebras – although conceptually important, are secondary." (Zoltan Domotor, "Mathematical Models in Philosophy of Science" [Mathematics of Complexity and Dynamical Systems, 2012])

08 March 2021

On Machines XII (Mind vs. Machines IV)

"In other words then, if a machine is expected to be infallible, it cannot also be intelligent. There are several theorems which say almost exactly that. But these theorems say nothing about how much intelligence may be displayed if a machine makes no pretense at infallibility." (Alan M Turing, 1946)

"The brain has been compared to a digital computer because the neuron, like a switch or valve, either does or does not complete a circuit. But at that point the similarity ends. The switch in the digital computer is constant in its effect, and its effect is large in proportion to the total output of the machine. The effect produced by the neuron varies with its recovery from [the] refractory phase and with its metabolic state. The number of neurons involved in any action runs into millions so that the influence of any one is negligible. [...] Any cell in the system can be dispensed with. [...] The brain is an analogical machine, not digital. Analysis of the integrative activities will probably have to be in statistical terms. (Karl S Lashley, "The problem of serial order in behavior", 1951)

"Although it sounds implausible, it might turn out that above a certain level of complexity, a machine ceased to be predictable, even in principle, and started doing things on its own account, or, to use a very revealing phrase, it might begin to have a mind of its own." (John R Lucas, "Minds, Machines and Gödel", 1959)

"There are now machines in the world that think, that learn and create. Moreover, their ability to do these things is going to increase rapidly until - in the visible future - the range of problems they can handle will be coextensive with the range to which the human mind has been applied." (Allen Newell & Herbert A Simon, "Human problem solving", 1976)

"We can divide those who uphold the doctrine that men are machines, or a similar doctrine, into two categories: those who deny the existence of mental events, or personal experiences, or of consciousness; [...] and those who admit the existence of mental events, but assert that they are 'epiphenomena' - that everything can be explained without them, since the material world is causally closed." (Karl Popper & John Eccles, "The self and its brain", 1977)

"It is essential to realize that a computer is not a mere 'number cruncher', or supercalculating arithmetic machine, although this is how computers are commonly regarded by people having no familiarity with artificial intelligence. Computers do not crunch numbers; they manipulate symbols. [...] Digital computers originally developed with mathematical problems in mind, are in fact general purpose symbol manipulating machines." (Margaret A Boden, "Minds and mechanisms", 1981)

"What makes people smarter than machines? They certainly are not quicker or more precise. Yet people are far better at perceiving objects in natural scenes and noting their relations, at understanding language and retrieving contextually appropriate information from memory, at making plans and carrying out contextually appropriate actions, and at a wide range of other natural cognitive tasks. People are also far better at learning to do these things more accurately and fluently through processing experience." (James L McClelland et al, "The appeal of parallel distributed processing", 1986)

"A popular myth says that the invention of the computer diminishes our sense of ourselves, because it shows that rational thought is not special to human beings, but can be carried on by a mere machine. It is a short stop from there to the conclusion that intelligence is mechanical, which many people find to be an affront to all that is most precious and singular about their humanness." (Jeremy Campbell, "The improbable machine", 1989)

"Looking at ourselves from the computer viewpoint, we cannot avoid seeing that natural language is our most important 'programming language'. This means that a vast portion of our knowledge and activity is, for us, best communicated and understood in our natural language. [...] One could say that natural language was our first great original artifact and, since, as we increasingly realize, languages are machines, so natural language, with our brains to run it, was our primal invention of the universal computer. One could say this except for the sneaking suspicion that language isn’t something we invented but something we became, not something we constructed but something in which we created, and recreated, ourselves. (Justin Leiber, "Invitation to cognitive science", 1991)

"On the other hand, those who design and build computers know exactly how the machines are working down in the hidden depths of their semiconductors. Computers can be taken apart, scrutinized, and put back together. Their activities can be tracked, analyzed, measured, and thus clearly understood - which is far from possible with the brain. This gives rise to the tempting assumption on the part of the builders and designers that computers can tell us something about brains, indeed, that the computer can serve as a model of the mind, which then comes to be seen as some manner of information processing machine, and possibly not as good at the job as the machine." (Theodore Roszak, "The Cult of Information", 1994)

