07 February 2025

On Entropy: Definitions

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." ("G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"In the physics [entropy is the] rate of system's messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)

05 February 2025

Out of Context: On Patterns (Definitions)

"From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"By using mathematics to organize and systematize our ideas about patterns, we have discovered a great secret: nature's patterns are not just there to be admired, they are vital clues to the rules that govern natural processes." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"When someone shows you a pattern, no matter how impressive the person’s credentials, consider the possibility that the pattern is just a coincidence. Ask why, not what. No matter what the pattern, the question is: Why should we expect to find this pattern?" (Gary Smith, "Standard Deviations", 2014)

"Don’t be fooled into thinking that a pattern is proof. We need a logical, persuasive explanation and we need to test the explanation with fresh data." (Gary Smith, "Standard Deviations", 2014)

"A pattern is a design or model that helps grasp something. Patterns help connect things that may not appear to be connected. Patterns help cut through complexity and reveal simpler understandable trends." (Anil K. Maheshwari, "Business Intelligence and Data Mining", 2015)

"By using mathematics to organize and systematize our ideas about patterns, we have discovered a great secret: nature's patterns are not just there to be admired, they are vital clues to the rules that govern natural processes." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Thanks to their flexibility, the most complex models available to us can fit any patterns that appear in the data, but this means that they will also do so even when those patterns are mere phantoms and mirages in the noise." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

04 February 2025

Out of Context: On Diagrams (Definitions)

 "Diagrams are of great utility for illustrating certain questions of vital statistics by conveying ideas on the subject through the eye, which cannot be so readily grasped when contained in figures." (Florence Nightingale, "Mortality of the British Army", 1857)

"Diagrams are sometimes used, not merely to convey several pieces of information such as several time series on one chart, but also to provide visual evidence of relationships between the series." (Alfred R Ilersic, "Statistics", 1959)

"Diagrams, whether representational or symbolic, are meaningless unless attached to some body of theory. On the other hand theories are in no need of diagrams save for psychological purposes. Let us then keep theoretical models apart from visual analogues."  (Mario Bunge, "Philosophy of Physics", 1973)

"Schematic diagrams are more abstract than pictorial drawings, showing symbolic elements and their interconnection to make clear the configuration and/or operation of a system." (Ernest O Doebelin, "Engineering experimentation: planning, execution, reporting", 1995)

"[...] (4) Diagrams are psychologically useful, but prove nothing; (5) Diagrams can even be misleading [...]" (James R Brown,"Philosophy of Mathematics", 1999)

"A model diagram declares some sets and binary relations, and imposes some basic constraints on them. A diagram is a good way to convey the outline of a model, but diagrams aren’t expressive enough to include detailed constraints." (Daniel Jackson, "Software Abstractions", 2006) 

"[...] diagrams are models, graphical in nature, that are used to illustrate structure (e.g., how components are physically interconnected); they do not capture functional behavior of a system. "  (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)

On Diagrams: Definitions

"A diagram is a representamen [representation] which is predominantly an icon of relations and is aided to be so by conventions. Indices are also more or less used. It should be carried out upon a perfectly consistent system of representation, founded upon a simple and easily intelligible basic idea." (Charles S Peirce, 1903)

"A diagram is an icon or schematic image embodying the meaning of a general predicate; and from the observation of this icon we are supposed to construct a new general predicate." (Charles S Peirce, "New Elements" ["Kaina stoiceia"], 1904)

"[The diagram] is only an heuristic to prompt certain trains of inference; [...] it is dispensable as a proof-theoretic device; indeed, [...] it has no proper place in the proof as such. For the proof is a syntactic object consisting only of sentences arranged in a finite and inspectable array." (Neil Tennant, "The withering away of formal semantics", Mind and Language Vol. 1 (4), 1986)

"Diagrams are a means of communication and explanation, and they facilitate brainstorming. They serve these ends best if they are minimal. Comprehensive diagrams of the entire object model fail to communicate or explain; they overwhelm the reader with detail and they lack meaning." (Eric Evans, "Domain-Driven Design: Tackling complexity in the heart of software", 2003)

"A diagram is a graphic shorthand. Though it is an ideogram, it is not necessarily an abstraction. It is a representation of something in that it is not the thing itself. In this sense, it cannot help but be embodied. It can never be free of value or meaning, even when it attempts to express relationships of formation and their processes. At the same time, a diagram is neither a structure nor an abstraction of structure." (Peter Eisenman, "Written Into the Void: Selected Writings", 1990-2004, 2007)

"Diagrams are information graphics that are made up primarily of geometric shapes, such as rectangles, circles, diamonds, or triangles, that are typically (but not always) interconnected by lines or arrows. One of the major purposes of a diagram is to show how things, people, ideas, activities, etc. interrelate and interconnect. Unlike quantitative charts and graphs, diagrams are used to show interrelationships in a qualitative way." (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)

"[...] diagrams are models, graphical in nature, that are used to illustrate structure (e.g., how components are physically interconnected); they do not capture functional behavior of a system. "  (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)



20 January 2025

On Chance: Definitions

"Chance is necessity hidden behind a veil." (Marie von Ebner-Eschenbach, Aphorisms, 1880/1893)

"Chance is only the measure of our ignorance." (Henri Poincaré," The Foundations of Science", 1913)

"[...] the conception of chance enters in the very first steps of scientific activity in virtue of the fact that no observation is absolutely correct. I think chance is a more fundamental conception that causality; for whether in a concrete case, a cause-effect relation holds or not can only be judged by applying the laws of chance to the observation." (Max Born, 1949)

"Chance is just as real as causation; both are modes of becoming. The way to model a random process is to enrich the mathematical theory of probability with a model of a random mechanism. In the sciences, probabilities are never made up or 'elicited' by observing the choices people make, or the bets they are willing to place.  The reason is that, in science and technology, interpreted probability exactifies objective chance, not gut feeling or intuition. No randomness, no probability." (Mario Bunge, "Chasing Reality: Strife over Realism", 2006)

"Chance is as relentless as necessity." (Simon Blackburn, Think, 1999) 

"Whether we shuffle cards or roll dice, chance is only a result of our human lack of deftness: we don't have enough control to immobilize a die at will or to individually direct the cards in a deck. The comparison is an important one nonetheless, and highlights the limits of this method of creating chance - it doesn't matter who rolls the dice, but we wouldn't let just anyone shuffle the cards." (Ivar Ekeland, "The Broken Dice, and Other Mathematical Tales of Chance", 1993)

"That randomness gives rise to innovation and diversity in nature is echoed by the notion that chance is also the source of invention in the arts and everyday affairs in which naturally occurring processes are balanced between tight organization, where redundancy is paramount, and volatility, in which little order is possible. One can argue that there is a difference in kind between the unconscious, and sometimes conscious, choices made by a writer or artist in creating a string of words or musical notes and the accidental succession of events taking place in the natural world. However, it is the perception of ambiguity in a string that matters, and not the process that generated it, whether it be man-made or from nature at large." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

Out of Context: On Chance (Definitions)

"Chance is a world void of sense; nothing can exist without a cause." (Voltaire, A Philosophical Dictionary, 1764)

"Our conception of chance is one of law and order in large numbers; it is not that idea of chaotic incidence which vexed the mediaeval mind." (Karl Pearson, "The Chances of Death", 1895)

"Chance is only the measure of our ignorance." (Henri Poincaré, "The Foundations of Science", 1913)

"Can there be laws of chance? The answer, it would seem should be negative, since chance is in fact defined as the characteristic of the phenomena which follow no law, phenomena whose causes are too complex to permit prediction." (Félix E Borel, "Probabilities and Life", 1943)

"Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not 'corrected' as a chance process unfolds, they are merely diluted." (Amos Tversky & Daniel Kahneman, "Judgment Under Uncertainty: Heuristics and Biases", Science Vol. 185 (4157), 1974)

"Quantum chance is absolute. […] Quantum chance is not a measure of ignorance but an inherent property. […] Chance in quantum theory is absolute and irreducible." (F David Peat, "From Certainty to Uncertainty", 2002)

On Chance: Gamblers III

"Behavioural research shows that we tend to use simplifying heuristics when making judgements about uncertain events. These are prone to biases and systematic errors, such as stereotyping, disregard of sample size, disregard for regression to the mean, deriving estimates based on the ease of retrieving instances of the event, anchoring to the initial frame, the gambler’s fallacy, and wishful thinking, which are all affected by our inability to consider more than a few aspects or dimensions of any phenomenon or situation at the same time." (Hans G Daellenbach & Donald C McNickle, "Management Science: Decision making through systems thinking", 2005)

"People sometimes appeal to the ‘law of averages’ to justify their faith in the gambler’s fallacy. They may reason that, since all outcomes are equally likely, in the long run they will come out roughly equal in frequency. However, the next throw is very much in the short run and the coin, die or roulette wheel has no memory of what went before." (Alan Graham, "Developing Thinking in Statistics", 2006)

"Another kind of error possibly related to the use of the representativeness heuristic is the gambler’s fallacy, otherwise known as the law of averages. If you are playing roulette and the last four spins of the wheel have led to the ball’s landing on black, you may think that the next ball is more likely than otherwise to land on red. This cannot be. The roulette wheel has no memory. The chance of black is just what it always is. The reason people tend to think otherwise may be that they expect the sequence of events to be representative of random sequences, and the typical random sequence at roulette does not have five blacks in a row." (Jonathan Baron, "Thinking and Deciding" 4th Ed, 2008)

"The theory of randomness is fundamentally a codification of common sense. But it is also a field of subtlety, a field in which great experts have been famously wrong and expert gamblers infamously correct. What it takes to understand randomness and overcome our misconceptions is both experience and a lot of careful thinking." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"[…] many gamblers believe in the fallacious law of averages because they are eager to find a profitable pattern in the chaos created by random chance." (Gary Smith, "Standard Deviations", 2014)
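
The "no memory" claim that Graham and Baron both make is easy to verify empirically (a minimal sketch, not drawn from either book): condition on a streak of four heads and check the frequency of heads on the very next flip.

```python
import random

random.seed(7)
flips = [random.choice("HT") for _ in range(200_000)]

# Collect the outcome that follows every window of four heads in a row.
after_streak = [
    flips[i + 4]
    for i in range(len(flips) - 4)
    if flips[i:i + 4] == ["H"] * 4
]
freq = after_streak.count("H") / len(after_streak)
# The coin has no memory: after four heads, heads still comes up
# about half the time - red is not "due" at the roulette wheel either.
print(f"P(H | four heads in a row) ≈ {freq:.3f}")
```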

17 January 2025

On Algorithms: Definitions

"Algorithms are a set of procedures to generate the answer to a problem." (Stuart Kauffman, "At Home in the Universe: The Search for Laws of Complexity", 1995)

"An algorithm is a simple rule, or elementary task, that is repeated over and over again. In this way algorithms can produce structures of astounding complexity." (F David Peat, "From Certainty to Uncertainty", 2002)

"An algorithm refers to a successive and finite procedure by which it is possible to solve a certain problem. Algorithms are the operational base for most computer programs. They consist of a series of instructions that, thanks to programmers’ prior knowledge about the essential characteristics of a problem that must be solved, allow a step-by-step path to the solution." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009) 

"[...] algorithms, which are abstract or idealized process descriptions that ignore details and practicalities. An algorithm is a precise and unambiguous recipe. It’s expressed in terms of a fixed set of basic operations whose meanings are completely known and specified. It spells out a sequence of steps using those operations, with all possible situations covered, and it’s guaranteed to stop eventually." (Brian W Kernighan, "Understanding the Digital World", 2017)

"An algorithm is the computer science version of a careful, precise, unambiguous recipe or tax form, a sequence of steps that is guaranteed to compute a result correctly." (Brian W Kernighan, "Understanding the Digital World", 2017)

"Algorithms describe the solution to a problem in terms of the data needed to represent the  problem instance and a set of steps necessary to produce the intended result." (Bradley N Miller et al, "Python Programming in Context", 2019)

"An algorithm, meanwhile, is a step-by-step recipe for performing a series of actions, and in most cases 'algorithm' means simply 'computer program'." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

16 January 2025

On Chaos: Definitions

"Chaos is but unperceived order; it is a word indicating the limitations of the human mind and the paucity of observational facts. The words ‘chaos’, ‘accidental’, ‘chance’, ‘unpredictable’ are conveniences behind which we hide our ignorance." (Harlow Shapley, "Of Stars and Men", 1958) 

"The term chaos is used in a specific sense where it is an inherently random pattern of behaviour generated by fixed inputs into deterministic (that is fixed) rules (relationships). The rules take the form of non-linear feedback loops. Although the specific path followed by the behaviour so generated is random and hence unpredictable in the long-term, it always has an underlying pattern to it, a 'hidden' pattern, a global pattern or rhythm. That pattern is self-similarity, that is a constant degree of variation, consistent variability, regular irregularity, or more precisely, a constant fractal dimension. Chaos is therefore order (a pattern) within disorder (random behaviour)." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"The term chaos is also used in a general sense to describe the body of chaos theory, the complete sequence of behaviours generated by feed-back rules, the properties of those rules and that behaviour." (Ralph D Stacey, "The Chaos Frontier: Creative Strategic Control for Business", 1991)

"What is chaos? Everyone has an impression of what the word means, but scientifically chaos is more than random behavior, lack of control, or complete disorder. [...] Scientifically, chaos is defined as extreme sensitivity to initial conditions. If a system is chaotic, when you change the initial state of the system by a tiny amount you change its future significantly." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"Chaos is a phenomenon encountered in science and mathematics wherein a deterministic (rule-based) system behaves unpredictably. That is, a system which is governed by fixed, precise rules, nevertheless behaves in a way which is, for all practical purposes, unpredictable in the long run. The mathematical use of the word 'chaos' does not align well with its more common usage to indicate lawlessness or the complete absence of order. On the contrary, mathematically chaotic systems are, in a sense, perfectly ordered, despite their apparent randomness. This seems like nonsense, but it is not." (David P Feldman, "Chaos and Fractals: An Elementary Introduction", 2012)

"Chaos provides order. Chaotic agitation and motion are needed to create overall, repetitive order. This ‘order through fluctuations’ keeps dynamic markets stable and evolutionary processes robust. In essence, chaos is a phase transition that gives spontaneous energy the means to achieve repetitive and structural order." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

15 January 2025

On Continuity: Definitions

"The Law of Continuity, as we here deal with it, consists in the idea that [...] any quantity, in passing from one magnitude to another, must pass through all intermediate magnitudes of the same class. The same notion is also commonly expressed by saying that the passage is made by intermediate stages or steps; [...] the idea should be interpreted as follows: single states correspond to single instants of time, but increments or decrements only to small areas of continuous time." (Roger J Boscovich, "Philosophiae Naturalis Theoria Redacta Ad Unicam Legera Virium in Natura Existentium", 1758)

"An essential difference between continuity and differentiability is whether numbers are involved or not. The concept of continuity is characterized by the qualitative property that nearby objects are mapped to nearby objects. However, the concept of differentiation is obtained by using the ratio of infinitesimal increments. Therefore, we see that differentiability essentially involves numbers." (Kenji Ueno & Toshikazu Sunada, "A Mathematical Gift, III: The Interplay Between Topology, Functions, Geometry, and Algebra", Mathematical World Vol. 23, 1996)

"[…] continuity appears when we try to mathematically express continuously changing phenomena, and differentiability is the result of expressing smoothly changing phenomena." (Kenji Ueno & Toshikazu Sunada, "A Mathematical Gift, III: The Interplay Between Topology, Functions, Geometry, and Algebra", Mathematical World Vol. 23, 1996)

"Intuitively speaking, a visual representation associated with the concept of continuity is the property that a near object is sent to a corresponding near object, that is, a convergent sequence is sent to a corresponding convergent sequence." (Kenji Ueno & Toshikazu Sunada, "A Mathematical Gift, III: The Interplay Between Topology, Functions, Geometry, and Algebra", Mathematical World Vol. 23, 1996)

"A continuous function preserves closeness of points. A discontinuous function maps arbitrarily close points to points that are not close. The precise definition of continuity involves the relation of distance between pairs of points. […] continuity, a property of functions that allows stretching, shrinking, and folding, but preserves the closeness relation among points." (Robert Messer & Philip Straffin, "Topology Now!", 2006)

"Continuity is the rigorous formulation of the intuitive concept of a function that varies with no abrupt breaks or jumps. A function is a relationship in which every value of an independent variable - say x - is associated with a value of a dependent variable - say y. Continuity of a function is sometimes expressed by saying that if the x-values are close together, then the y-values of the function will also be close. But if the question 'How close?' is asked, difficulties arise." (Erik Gregersen [Ed.], "Math Eplained: The Britannica Guide to Analysis and Calculus", 2011)

"Continuity is only a mathematical technique for approximating very finely grained things. The world is subtly discrete, not continuous." (Carlo Rovelli, "The Order of Time", 2018)
