21 March 2025

On Optimization I

"The real problem is that programmers have spent far too much time worrying about efficiency in the wrong places and at the wrong times; premature optimization is the root of all evil (or at least most of it) in programming." (Donald E Knuth, "Computer Programming as an Art", 1968)

"In most engineering problems, particularly when solving optimization problems, one must have the opportunity of comparing different variants quantitatively. It is therefore important to be able to state a clear-cut quantitative criterion." (Yakov Khurgin, "Did You Say Mathematics?", 1974)

"Linear programming is viewed as a revolutionary development giving man the ability to state general objectives and to find, by means of the simplex method, optimal policy decisions for a broad class of practical decision problems of great complexity. In the real world, planning tends to be ad hoc because of the many special-interest groups with their multiple objectives." (George Dantzig, "Reminiscences about the origins of linear programming", Mathematical programming: the state of the art", 1983) 

"It remains an unhappy fact that there is no best method for finding the solution to general nonlinear optimization problems. About the best general procedure yet devised is one that relies upon imbedding the original problem within a family of problems, and then developing relations linking one member of the family to another. If this can be done adroitly so that one family member is easily solvable, then these relations can be used to step forward from the solution of the easy problem to that of the original problem. This is the key idea underlying dynamic programming, the most flexible and powerful of all optimization methods." (John L Casti, "Five Golden Rules", 1995)

"Heuristic methods may aim at local optimization rather than at global optimization, that is, the algorithm optimizes the solution stepwise, finding the best solution at each small step of the solution process and 'hoping' that the global solution, which comprises the local ones, would be satisfactory." (Nikola K Kasabov, "Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering", 1996)

"Mathematical programming (or optimization theory) is that branch of mathematics dealing with techniques for maximizing or minimizing an objective function subject to linear, nonlinear, and integer constraints on the variables." (George B Dantzig & Mukund N Thapa, "Linear Programming" Vol I, 1997)

"A heuristic is ecologically rational to the degree that it is adapted to the structure of an environment. Thus, simple heuristics and environmental structure can both work hand in hand to provide a realistic alternative to the ideal of optimization, whether unbounded or constrained." (Gerd Gigerenzer & Peter M Todd, "Fast and Frugal Heuristics: The Adaptive Toolbox" [in "Simple Heuristics That Make Us Smart"], 1999)

On Optimization II

"A model is an imitation of reality and a mathematical model is a particular form of representation. We should never forget this and get so distracted by the model that we forget the real application which is driving the modelling. In the process of model building we are translating our real world problem into an equivalent mathematical problem which we solve and then attempt to interpret. We do this to gain insight into the original real world situation or to use the model for control, optimization or possibly safety studies." (Ian T Cameron & Katalin Hangos, "Process Modelling and Model Analysis", 2001)

"Heuristics are needed in situations where the world does not permit optimization. For many real-world problems (as opposed to optimization-tuned textbook problems), optimal solutions are unknown because the problems are computationally intractable or poorly defined." (Christoph Engel & Gerd Gigerenzer, "Law and Heuristics: An interdisciplinary venture" [in "Heuristics and the Law", 2006) 

"It remains an unhappy fact that there is no best method for finding the solution to general nonlinear optimization problems. About the best general procedure yet devised is one that relies upon imbedding the original problem within a family of problems, and then developing relations linking one member of the family to another. If this can be done adroitly so that one family member is easily solvable, then these relations can be used to step forward from the solution of the easy problem to that of the original problem. This is the key idea underlying dynamic programming, the most flexible and powerful of all optimization methods." (John L Casti, "Five Golden Rules", 1995)

"Mathematical programming (or optimization theory) is that branch of mathematics dealing with techniques for maximizing or minimizing an objective function subject to linear, nonlinear, and integer constraints on the variables."  (George B Dantzig & Mukund N Thapa, "Linear Programming" Vol I, 1997)

"A heuristic is ecologically rational to the degree that it is adapted to the structure of an environment. Thus, simple heuristics and environmental structure can both work hand in hand to provide a realistic alternative to the ideal of optimization, whether unbounded or constrained." (Gerd Gigerenzer & Peter M Todd, "Fast and Frugal Heuristics: The Adaptive Toolbox" [in "Simple Heuristics That Make Us Smart"], 1999)

"Optimization by individual agents, often used to derive competitive equilibria, are unnecessary for an actual economy to approximately attain such equilibria. From the failure of humans to optimize in complex tasks, one need not conclude that the equilibria derived from the competitive model are descriptively irrelevant. We show that even in complex economic systems, such equilibria can be attained under a range of surprisingly weak assumptions about agent behavior." (Antoni Bosch-Domènech & Shyam Sunder, "Tracking the Invisible Hand", 2000)

"[...] a general-purpose universal optimization strategy is theoretically impossible, and the only way one strategy can outperform another is if it is specialized to the specific problem under consideration." Yu-Chi Ho & David L Pepyne, "Simple explanation of the no-free-lunch theorem and its implications", Journal of Optimization Theory and Applications 115, 2002)

"Optimization of one variable may cause the whole system to work less efficiently. Why? The performance of most systems is constrained by the performance of its weakest link. A variable that limits the system from achieving its goal or optimum performance. […] When trying to improve the performance of a system, first find out the system's key contraint(s)- which may be physical (capacity, material, the market) or non-physical (policies, rules, measurements) -and its cause and effect relationship with the system. Maybe the constraint is based on faulty assumptions that can be corrected. Then try to "strengthen" or change the weakest link. Watch out for other effects - wanted or unwanted - that pop up as a consequence. Always consider the effects on the whole system." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Try to optimize the whole and not a system's individual parts. Think through what other variables may change when we alter a factor in a system. Trace out the short and long-term consequences in numbers and effects of a proposed action to see if the net result agrees with our ultimate goal."  (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Heuristics are needed in situations where the world does not permit optimization. For many real-world problems (as opposed to optimization-tuned textbook problems), optimal solutions are unknown because the problems are computationally intractable or poorly defined." (Christoph Engel & Gerd Gigerenzer, "Law and Heuristics: An interdisciplinary venture" [in "Heuristics and the Law", 2006)

"How is it that an ant colony can organize itself to carry out the complex tasks of food gathering and nest building and at the same time exhibit an enormous degree of resilience if disrupted and forced to adapt to changing situations? Natural systems are able not only to survive, but also to adapt and become better suited to their environment, in effect optimizing their behavior over time. They seemingly exhibit collective intelligence, or swarm intelligence as it is called, even without the existence of or the direction provided by a central authority." (Michael J North & Charles M Macal, "Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation", 2007)

"Swarm intelligence can be effective when applied to highly complicated problems with many nonlinear factors, although it is often less effective than the genetic algorithm approach [...]. Swarm intelligence is related to swarm optimization […]. As with swarm intelligence, there is some evidence that at least some of the time swarm optimization can produce solutions that are more robust than genetic algorithms. Robustness here is defined as a solution’s resistance to performance degradation when the underlying variables are changed. (Michael J North & Charles M Macal, Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation, 2007)

07 February 2025

On Entropy: Definitions

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." ("G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"In the physics [entropy is the] rate of system's messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)

05 February 2025

Out of Context: On Patterns (Definitions)

"From the smallest scales to the largest, many of nature's patterns are a result of broken symmetry; […]" (Ian Stewart & Martin Golubitsky, "Fearful Symmetry: Is God a Geometer?", 1992)

"By using mathematics to organize and systematize our ideas about patterns, we have discovered a great secret: nature's patterns are not just there to be admired, they are vital clues to the rules that govern natural processes." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"When someone shows you a pattern, no matter how impressive the person’s credentials, consider the possibility that the pattern is just a coincidence. Ask why, not what. No matter what the pattern, the question is: Why should we expect to find this pattern?" (Gary Smith, "Standard Deviations", 2014)

"Don’t be fooled into thinking that a pattern is proof. We need a logical, persuasive explanation and we need to test the explanation with fresh data." (Gary Smith, "Standard Deviations", 2014)

"A pattern is a design or model that helps grasp something. Patterns help connect things that may not appear to be connected. Patterns help cut through complexity and reveal simpler understandable trends." (Anil K Maheshwari, "Business Intelligence and Data Mining", 2015)

"By using mathematics to organize and systematize our ideas about patterns, we have discovered a great secret: nature's patterns are not just there to be admired, they are vital clues to the rules that govern natural processes." (Ian Stewart, "Nature's Numbers: The unreal reality of mathematics", 1995)

"Thanks to their flexibility, the most complex models available to us can fit any patterns that appear in the data, but this means that they will also do so even when those patterns are mere phantoms and mirages in the noise." (Brian Christian & Thomas L Griffiths, "Algorithms to Live By: The Computer Science of Human Decisions", 2016)

04 February 2025

Out of Context: On Diagrams (Definitions)

 "Diagrams are of great utility for illustrating certain questions of vital statistics by conveying ideas on the subject through the eye, which cannot be so readily grasped when contained in figures." (Florence Nightingale, "Mortality of the British Army", 1857)

"Diagrams are sometimes used, not merely to convey several pieces of information such as several time series on one chart, but also to provide visual evidence of relationships between the series." (Alfred R Ilersic, "Statistics", 1959)

"Diagrams, whether representational or symbolic, are meaningless unless attached to some body of theory. On the other hand theories are in no need of diagrams save for psychological purposes. Let us then keep theoretical models apart from visual analogues."  (Mario Bunge, "Philosophy of Physics", 1973)

"Schematic diagrams are more abstract than pictorial drawings, showing symbolic elements and their interconnection to make clear the configuration and/or operation of a system." (Ernest O Doebelin, "Engineering experimentation: planning, execution, reporting", 1995)

"[...] (4) Diagrams are psychologically useful, but prove nothing; (5) Diagrams can even be misleading [...]" (James R Brown,"Philosophy of Mathematics", 1999)

"A model diagram declares some sets and binary relations, and imposes some basic constraints on them. A diagram is a good way to convey the outline of a model, but diagrams aren’t expressive enough to include detailed constraints." (Daniel Jackson, "Software Abstractions", 2006) 

"[...] diagrams are models, graphical in nature, that are used to illustrate structure (e.g., how components are physically interconnected); they do not capture functional behavior of a system. "  (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)

On Diagrams: Definitions

"A diagram is a representamen [representation] which is predominantly an icon of relations and is aided to be so by conventions. Indices are also more or less used. It should be carried out upon a perfectly consistent system of representation, founded upon a simple and easily intelligible basic idea." (Charles S Peirce, 1903)

"A diagram is an icon or schematic image embodying the meaning of a general predicate; and from the observation of this icon we are supposed to construct a new general predicate." (Charles S Peirce, "New Elements" ["Kaina stoiceia"], 1904)

"[The diagram] is only an heuristic to prompt certain trains of inference; [...] it is dispensable as a proof-theoretic device; indeed, [...] it has no proper place in the proof as such. For the proof is a syntactic object consisting only of sentences arranged in a finite and inspectable array." (Neil Tennant, "The withering away of formal semantics", Mind and Language Vol. 1 (4), 1986)

"Diagrams are a means of communication and explanation, and they facilitate brainstorming. They serve these ends best if they are minimal. Comprehensive diagrams of the entire object model fail to communicate or explain; they overwhelm the reader with detail and they lack meaning." (Eric Evans, "Domain-Driven Design: Tackling complexity in the heart of software", 2003)

"A diagram is a graphic shorthand. Though it is an ideogram, it is not necessarily an abstraction. It is a representation of something in that it is not the thing itself. In this sense, it cannot help but be embodied. It can never be free of value or meaning, even when it attempts to express relationships of formation and their processes. At the same time, a diagram is neither a structure nor an abstraction of structure." (Peter Eisenman, "Written Into the Void: Selected Writings", 1990-2004, 2007)

"Diagrams are information graphics that are made up primarily of geometric shapes, such as rectangles, circles, diamonds, or triangles, that are typically (but not always) interconnected by lines or arrows. One of the major purposes of a diagram is to show how things, people, ideas, activities, etc. interrelate and interconnect. Unlike quantitative charts and graphs, diagrams are used to show interrelationships in a qualitative way." (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)

"[...] diagrams are models, graphical in nature, that are used to illustrate structure (e.g., how components are physically interconnected); they do not capture functional behavior of a system. "  (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)



20 January 2025

On Chance: Definitions

"Chance is necessity hidden behind a veil." (Marie von Ebner-Eschenbach, Aphorisms, 1880/1893)

"Chance is only the measure of our ignorance." (Henri Poincaré," The Foundations of Science", 1913)

"[...] the conception of chance enters in the very first steps of scientific activity in virtue of the fact that no observation is absolutely correct. I think chance is a more fundamental conception that causality; for whether in a concrete case, a cause-effect relation holds or not can only be judged by applying the laws of chance to the observation." (Max Born, 1949)

"Chance is just as real as causation; both are modes of becoming. The way to model a random process is to enrich the mathematical theory of probability with a model of a random mechanism. In the sciences, probabilities are never made up or 'elicited' by observing the choices people make, or the bets they are willing to place.  The reason is that, in science and technology, interpreted probability exactifies objective chance, not gut feeling or intuition. No randomness, no probability." (Mario Bunge, "Chasing Reality: Strife over Realism", 2006)

"Chance is as relentless as necessity." (Simon Blackburn, Think, 1999) 

"Whether we shuffle cards or roll dice, chance is only a result of our human lack of deftness: we don't have enough control to immobilize a die at will or to individually direct the cards in a deck. The comparison is an important one nonetheless, and highlights the limits of this method of creating chance - it doesn't matter who rolls the dice, but we wouldn't let just anyone shuffle the cards." (Ivar Ekeland, "The Broken Dice, and Other Mathematical Tales of Chance", 1993)

"That randomness gives rise to innovation and diversity in nature is echoed by the notion that chance is also the source of invention in the arts and everyday affairs in which naturally occurring processes are balanced between tight organization, where redundancy is paramount, and volatility, in which little order is possible. One can argue that there is a difference in kind between the unconscious, and sometimes conscious, choices made by a writer or artist in creating a string of words or musical notes and the accidental succession of events taking place in the natural world. However, it is the perception of ambiguity in a string that matters, and not the process that generated it, whether it be man-made or from nature at large." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

Out of Context: On Chance (Definitions)

"Chance is a world void of sense; nothing can exist without a cause." (Voltaire, A Philosophical Dictionary, 1764)

"Our conception of chance is one of law and order in large numbers; it is not that idea of chaotic incidence which vexed the mediaeval mind." (Karl Pearson, "The Chances of Death", 1895)

"Chance is only the measure of our ignorance." (Henri Poincaré, "The Foundations of Science", 1913)

"Can there be laws of chance? The answer, it would seem should be negative, since chance is in fact defined as the characteristic of the phenomena which follow no law, phenomena whose causes are too complex to permit prediction." (Félix E Borel, "Probabilities and Life", 1943)

"Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not 'corrected' as a chance process unfolds, they are merely diluted." (Amos Tversky & Daniel Kahneman, "Judgment Under Uncertainty: Heuristics and Biases", Science Vol. 185 (4157), 1974)

"Quantum chance is absolute. […] Quantum chance is not a measure of ignorance but an inherent property. […] Chance in quantum theory is absolute and irreducible." (F David Peat, "From Certainty to Uncertainty", 2002)

On Chance: Gamblers III

"Behavioural research shows that we tend to use simplifying heuristics when making judgements about uncertain events. These are prone to biases and systematic errors, such as stereotyping, disregard of sample size, disregard for regression to the mean, deriving estimates based on the ease of retrieving instances of the event, anchoring to the initial frame, the gambler’s fallacy, and wishful thinking, which are all affected by our inability to consider more than a few aspects or dimensions of any phenomenon or situation at the same time." (Hans G Daellenbach & Donald C McNickle, "Management Science: Decision making through systems thinking", 2005)

"People sometimes appeal to the ‘law of averages’ to justify their faith in the gambler’s fallacy. They may reason that, since all outcomes are equally likely, in the long run they will come out roughly equal in frequency. However, the next throw is very much in the short run and the coin, die or roulette wheel has no memory of what went before." (Alan Graham, "Developing Thinking in Statistics", 2006)

"Another kind of error possibly related to the use of the representativeness heuristic is the gambler’s fallacy, otherwise known as the law of averages. If you are playing roulette and the last four spins of the wheel have led to the ball’s landing on black, you may think that the next ball is more likely than otherwise to land on red. This cannot be. The roulette wheel has no memory. The chance of black is just what it always is. The reason people tend to think otherwise may be that they expect the sequence of events to be representative of random sequences, and the typical random sequence at roulette does not have five blacks in a row." (Jonathan Baron, "Thinking and Deciding" 4th Ed, 2008)

"The theory of randomness is fundamentally a codification of common sense. But it is also a field of subtlety, a field in which great experts have been famously wrong and expert gamblers infamously correct. What it takes to understand randomness and overcome our misconceptions is both experience and a lot of careful thinking." (Leonard Mlodinow, "The Drunkard’s Walk: How Randomness Rules Our Lives", 2008)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"[…] many gamblers believe in the fallacious law of averages because they are eager to find a profitable pattern in the chaos created by random chance." (Gary Smith, "Standard Deviations", 2014)

17 January 2025

On Algorithms: Definitions

"Algorithms are a set of procedures to generate the answer to a problem." (Stuart Kauffman, "At Home in the Universe: The Search for Laws of Complexity", 1995)

"An algorithm is a simple rule, or elementary task, that is repeated over and over again. In this way algorithms can produce structures of astounding complexity." (F David Peat, "From Certainty to Uncertainty", 2002)

"An algorithm refers to a successive and finite procedure by which it is possible to solve a certain problem. Algorithms are the operational base for most computer programs. They consist of a series of instructions that, thanks to programmers’ prior knowledge about the essential characteristics of a problem that must be solved, allow a step-by-step path to the solution." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009) 

"[...] algorithms, which are abstract or idealized process descriptions that ignore details and practicalities. An algorithm is a precise and unambiguous recipe. It’s expressed in terms of a fixed set of basic operations whose meanings are completely known and specified. It spells out a sequence of steps using those operations, with all possible situations covered, and it’s guaranteed to stop eventually." (Brian W Kernighan, "Understanding the Digital World", 2017)

"An algorithm is the computer science version of a careful, precise, unambiguous recipe or tax form, a sequence of steps that is guaranteed to compute a result correctly." (Brian W Kernighan, "Understanding the Digital World", 2017)

"Algorithms describe the solution to a problem in terms of the data needed to represent the  problem instance and a set of steps necessary to produce the intended result." (Bradley N Miller et al, "Python Programming in Context", 2019)

"An algorithm, meanwhile, is a step-by-step recipe for performing a series of actions, and in most cases 'algorithm' means simply 'computer program'." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)
