
21 March 2025

On Optimization I

"The real problem is that programmers have spent far too much time worrying about efficiency in the wrong places and at the wrong times; premature optimization is the root of all evil (or at least most of it) in programming." (Donald E Knuth, "Computer Programming as an Art", 1968)

"In most engineering problems, particularly when solving optimization problems, one must have the opportunity of comparing different variants quantitatively. It is therefore important to be able to state a clear-cut quantitative criterion." (Yakov Khurgin, "Did You Say Mathematics?", 1974)

"Linear programming is viewed as a revolutionary development giving man the ability to state general objectives and to find, by means of the simplex method, optimal policy decisions for a broad class of practical decision problems of great complexity. In the real world, planning tends to be ad hoc because of the many special-interest groups with their multiple objectives." (George Dantzig, "Reminiscences about the origins of linear programming", Mathematical programming: the state of the art", 1983) 

"It remains an unhappy fact that there is no best method for finding the solution to general nonlinear optimization problems. About the best general procedure yet devised is one that relies upon imbedding the original problem within a family of problems, and then developing relations linking one member of the family to another. If this can be done adroitly so that one family member is easily solvable, then these relations can be used to step forward from the solution of the easy problem to that of the original problem. This is the key idea underlying dynamic programming, the most flexible and powerful of all optimization methods." (John L Casti, "Five Golden Rules", 1995)

"Heuristic methods may aim at local optimization rather than at global optimization, that is, the algorithm optimizes the solution stepwise, finding the best solution at each small step of the solution process and 'hoping' that the global solution, which comprises the local ones, would be satisfactory." (Nikola K Kasabov, "Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering", 1996)

"Mathematical programming (or optimization theory) is that branch of mathematics dealing with techniques for maximizing or minimizing an objective function subject to linear, nonlinear, and integer constraints on the variables." (George B Dantzig & Mukund N Thapa, "Linear Programming" Vol I, 1997)

"A heuristic is ecologically rational to the degree that it is adapted to the structure of an environment. Thus, simple heuristics and environmental structure can both work hand in hand to provide a realistic alternative to the ideal of optimization, whether unbounded or constrained." (Gerd Gigerenzer & Peter M Todd, "Fast and Frugal Heuristics: The Adaptive Toolbox" [in "Simple Heuristics That Make Us Smart"], 1999)

On Optimization II

"A model is an imitation of reality and a mathematical model is a particular form of representation. We should never forget this and get so distracted by the model that we forget the real application which is driving the modelling. In the process of model building we are translating our real world problem into an equivalent mathematical problem which we solve and then attempt to interpret. We do this to gain insight into the original real world situation or to use the model for control, optimization or possibly safety studies." (Ian T Cameron & Katalin Hangos, "Process Modelling and Model Analysis", 2001)

"Heuristics are needed in situations where the world does not permit optimization. For many real-world problems (as opposed to optimization-tuned textbook problems), optimal solutions are unknown because the problems are computationally intractable or poorly defined." (Christoph Engel & Gerd Gigerenzer, "Law and Heuristics: An interdisciplinary venture" [in "Heuristics and the Law", 2006) 

"It remains an unhappy fact that there is no best method for finding the solution to general nonlinear optimization problems. About the best general procedure yet devised is one that relies upon imbedding the original problem within a family of problems, and then developing relations linking one member of the family to another. If this can be done adroitly so that one family member is easily solvable, then these relations can be used to step forward from the solution of the easy problem to that of the original problem. This is the key idea underlying dynamic programming, the most flexible and powerful of all optimization methods." (John L Casti, "Five Golden Rules", 1995)

"Mathematical programming (or optimization theory) is that branch of mathematics dealing with techniques for maximizing or minimizing an objective function subject to linear, nonlinear, and integer constraints on the variables."  (George B Dantzig & Mukund N Thapa, "Linear Programming" Vol I, 1997)

"A heuristic is ecologically rational to the degree that it is adapted to the structure of an environment. Thus, simple heuristics and environmental structure can both work hand in hand to provide a realistic alternative to the ideal of optimization, whether unbounded or constrained." (Gerd Gigerenzer & Peter M Todd, "Fast and Frugal Heuristics: The Adaptive Toolbox" [in "Simple Heuristics That Make Us Smart"], 1999)

"Optimization by individual agents, often used to derive competitive equilibria, are unnecessary for an actual economy to approximately attain such equilibria. From the failure of humans to optimize in complex tasks, one need not conclude that the equilibria derived from the competitive model are descriptively irrelevant. We show that even in complex economic systems, such equilibria can be attained under a range of surprisingly weak assumptions about agent behavior." (Antoni Bosch-Domènech & Shyam Sunder, "Tracking the Invisible Hand", 2000)

"[...] a general-purpose universal optimization strategy is theoretically impossible, and the only way one strategy can outperform another is if it is specialized to the specific problem under consideration." Yu-Chi Ho & David L Pepyne, "Simple explanation of the no-free-lunch theorem and its implications", Journal of Optimization Theory and Applications 115, 2002)

"Optimization of one variable may cause the whole system to work less efficiently. Why? The performance of most systems is constrained by the performance of its weakest link. A variable that limits the system from achieving its goal or optimum performance. […] When trying to improve the performance of a system, first find out the system's key contraint(s)- which may be physical (capacity, material, the market) or non-physical (policies, rules, measurements) -and its cause and effect relationship with the system. Maybe the constraint is based on faulty assumptions that can be corrected. Then try to "strengthen" or change the weakest link. Watch out for other effects - wanted or unwanted - that pop up as a consequence. Always consider the effects on the whole system." (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Try to optimize the whole and not a system's individual parts. Think through what other variables may change when we alter a factor in a system. Trace out the short and long-term consequences in numbers and effects of a proposed action to see if the net result agrees with our ultimate goal."  (Peter Bevelin, "Seeking Wisdom: From Darwin to Munger", 2003)

"Heuristics are needed in situations where the world does not permit optimization. For many real-world problems (as opposed to optimization-tuned textbook problems), optimal solutions are unknown because the problems are computationally intractable or poorly defined." (Christoph Engel & Gerd Gigerenzer, "Law and Heuristics: An interdisciplinary venture" [in "Heuristics and the Law", 2006)

"How is it that an ant colony can organize itself to carry out the complex tasks of food gathering and nest building and at the same time exhibit an enormous degree of resilience if disrupted and forced to adapt to changing situations? Natural systems are able not only to survive, but also to adapt and become better suited to their environment, in effect optimizing their behavior over time. They seemingly exhibit collective intelligence, or swarm intelligence as it is called, even without the existence of or the direction provided by a central authority." (Michael J North & Charles M Macal, "Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation", 2007)

"Swarm intelligence can be effective when applied to highly complicated problems with many nonlinear factors, although it is often less effective than the genetic algorithm approach [...]. Swarm intelligence is related to swarm optimization […]. As with swarm intelligence, there is some evidence that at least some of the time swarm optimization can produce solutions that are more robust than genetic algorithms. Robustness here is defined as a solution’s resistance to performance degradation when the underlying variables are changed. (Michael J North & Charles M Macal, Managing Business Complexity: Discovering Strategic Solutions with Agent-Based Modeling and Simulation, 2007)

06 July 2021

On Nonlinearity VI

"Up until now most economists have concerned themselves with linear systems, not because of any belief that the facts were so simple, but rather because of the mathematical difficulties involved in nonlinear systems [... Linear systems are] mathematically simple, and exact solutions are known. But a high price is paid for this simplicity in terms of special assumptions which must be made." (Paul A Samuelson, "Foundations of Economic Analysis", 1966)

"Linear relationships are easy to think about: the more the merrier. Linear equations are solvable, which makes them suitable for textbooks. Linear systems have an important modular virtue: you can take them apart and put them together again - the pieces add up. Nonlinear systems generally cannot be solved and cannot be added together. [...] Nonlinearity means that the act of playing the game has a way of changing the rules. [...] That twisted changeability makes nonlinearity hard to calculate, but it also creates rich kinds of behavior that never occur in linear systems." (James Gleick, "Chaos: Making a New Science", 1987)

"Never in the annals of science and engineering has there been a phenomenon so ubiquitous‚ a paradigm so universal‚ or a discipline so multidisciplinary as that of chaos. Yet chaos represents only the tip of an awesome iceberg‚ for beneath it lies a much finer structure of immense complexity‚ a geometric labyrinth of endless convolutions‚ and a surreal landscape of enchanting beauty. The bedrock which anchors these local and global bifurcation terrains is the omnipresent nonlinearity that was once wantonly linearized by the engineers and applied scientists of yore‚ thereby forfeiting their only chance to grapple with reality." (Leon O Chua, "Editorial", International Journal of Bifurcation and Chaos, Vol. l (1), 1991) 

"It remains an unhappy fact that there is no best method for finding the solution to general nonlinear optimization problems. About the best general procedure yet devised is one that relies upon imbedding the original problem within a family of problems, and then developing relations linking one member of the family to another. If this can be done adroitly so that one family member is easily solvable, then these relations can be used to step forward from the solution of the easy problem to that of the original problem. This is the key idea underlying dynamic programming, the most flexible and powerful of all optimization methods." (John L Casti, "Five Golden Rules", 1995)

"When it comes to modeling processes that are manifestly governed by nonlinear relationships among the system components, we can appeal to the same general idea. Calculus tells us that we should expect most systems to be 'locally' flat; that is, locally linear. So a conservative modeler would try to extend the word 'local' to hold for the region of interest and would take this extension seriously until it was shown to be no longer valid." (John L Casti, "Five Golden Rules", 1995)

"A system at a bifurcation point, when pushed slightly, may begin to oscillate. Or the system may flutter around for a time and then revert to its normal, stable behavior. Or, alternatively it may move into chaos. Knowing a system within one range of circumstances may offer no clue as to how it will react in others. Nonlinear systems always hold surprises." (F David Peat, "From Certainty to Uncertainty", 2002)

"In a linear system a tiny push produces a small effect, so that cause and effect are always proportional to each other. If one plotted on a graph the cause against the effect, the result would be a straight line. In nonlinear systems, however, a small push may produce a small effect, a slightly larger push produces a proportionately larger effect, but increase that push by a hair’s breadth and suddenly the system does something radically different." (F David Peat, "From Certainty to Uncertainty", 2002)

"Complex systems are full of interdependencies - hard to detect - and nonlinear responses." (Nassim N Taleb, "Antifragile: Things That Gain from Disorder", 2012)

