26 February 2024

George E Forsythe - Collected Quotes

"The use of practically any computing technique itself raises a number of mathematical problems. There is thus a very considerable impact of computation on mathematics itself, and this may be expected to influence mathematical research to an increasing degree." (George E Forsythe, 1958) 

"I consider computer science to be the art and science of exploiting automatic digital computers, and of creating the technology necessary to understand their use. It deals with such related problems as the design of better machines using known components:, the design and implementation of adequate software systems for communication between man and machine, and the design and analysis of methods of representing information by abstract symbols and of processes for manipulating these symbols." (George E Forsythe, "Stanford University's Program in Computer Science", 1965)

"To a modern mathematician, design seems to be a second-rate intellectual activity." (George E Forsythe, 1966)

"Computer science is at once abstract and pragmatic. The focus on actual computers introduces the pragmatic component: our central questions are economic ones like the relations among speed, accuracy, and cost of a proposed computation, and the hardware and software organization required. The (often) better understood questions of existence and theoretical computability - however fundamental - remain in the background. On the other hand, the medium of computer science - information - is an abstract one. The meaning of symbols and numbers may change from application to application, either in mathematics or in computer science. Like mathematics, one goal of computer science is to create a basic structure in terms of inherently defined concepts that is independent of any particular application." (George E Forsythe, "What to do till the computer scientist comes", 1968)

"The most valuable acquisitions in a scientific or technical education are the general-purpose mental tools which remain serviceable for a lifetime. I rate natural language and mathematics as the most important of these tools, and computer science as a third." (George E Forsythe, "What to do till the computer scientist comes", 1968)

"Most of known computer science must be considered as design technique, not theory." (George E Forsythe, "What to do till the computer scientist comes", 1968)

"People have said you don’t understand something until you’ve taught it in a class. The truth is you don’t understand something until you’ve taught it to a computer, until you’ve been able to program it." (George E Forsythe)

24 February 2024

On Problem Solving XVIII: Practice

"The insights gained and garnered by the mind in its wanderings among basic concepts are benefits that theory can provide. Theory cannot equip the mind with formulas for solving problems, nor can it mark the narrow path on which the sole solution is supposed to lie by planting a hedge of principles on either side. But it can give the mind insight into the great mass of phenomena and of their relationships, then leave it free to rise into the higher realms of action." (Carl von Clausewitz, "On War", 1832)

"One of the most important tasks of the teacher is to help his students. This task is not quite easy; it demands time, practice, devotion, and sound principles." (George Pólya, "How to Solve It", 1945)

"We acquire any practical skill by imitation and practice. […] Trying to solve problems, you have to observe and to imitate what other people do when solving problems and, finally, you learn to do problems by doing them." (George Pólya, "How to Solve It", 1945)

"The trouble with mathematics is that, however diligent you are in going to lectures and in learning the theory, you may still not be able to do the problems." (Maurice Wilkes, "Memoirs of a Computer Pioneer", 1985)

"Design thinking taps into capacities we all have but that are overlooked by more conventional problem-solving practices. It is not only human-centered; it is deeply human in and of itself. Design thinking relies on our ability to be intuitive, to recognize patterns, to construct ideas that have emotional meaning as well as functionality, to express ourselves in media other than words or symbols." (Tim Brown, "Change by Design: How Design Thinking Transforms Organizations and Inspires Innovation", 2009)

"Solving problems is a practical skill like, let us say, swimming. We acquire any practical skill by imitation and practice." (George Pólya)

Previous <<||>> Next

On Problem Solving XX: Life

"The difficult problems in life always start off being simple. Great affairs always start off being small." (Lao Tzu, cca 400 BC)

"Man was not born to solve the problems of the universe, but rather to seek to lay bare the heart of the problem and then confine himself within the limits of what is amenable to understanding." (Johann Wolfgang von Goethe, 1825)

"The greatest art, both in teaching and in life itself, consists in transforming the problem into a postulate." (Johann Wolfgang von Goethe, 1928)

"The meaning and design of a problem seem not to lie in its solution, but in our working at it incessantly." (Carl G Jung, "Modern Man in Search of a Soul", 1933)

"The field of consciousness is tiny. It accepts only one problem at a time. Get into a fist fight, put your mind on the strategy of the fight, and you will not feel the other fellow's punches." (Antoine de Saint-Exupéry, "Flight to Arras", 1942)

"Life ultimately means taking the responsibility to find the right answer to its problems and to fulfill the tasks which it constantly sets for each individual." (Viktor E Frankl, "Man's Search for Meaning", 1946)

"We are built to conquer environment, solve problems, achieve goals, and we find no real satisfaction or happiness in life without obstacles to conquer and goals to achieve." (Maxwell Maltz, "Psycho-Cybernetics", 1960)

"The mystery of life isn't a problem to solve, but a reality to experience." (Frank Herbert, "Dune", 1965)

"Sometimes the situation is only a problem because it is looked at in a certain way. Looked at in another way, the right course of action may be so obvious that the problem no longer exists." (Edward de Bono, "The use of lateral thinking", 1967)

"There are problems in this universe for which there are no answers." (Frank Herbert, "Dune Messiah", 1969)

"The easiest way to solve a problem is to deny it exists." (Isaac Asimov, "The Gods Themselves", 1972)

"When a decision is made to cope with the symptoms of a problem, it is generally assumed that the corrective measures will solve the problem itself. They seldom do." (Masanobu Fukuoka, "The One-Straw Revolution", 1975)

"If you go through the world looking for excellence, you will find excellence. If you go through the world looking for problems you will find problems." (Joseph O'Connor & John Seymour, "Introducing Neuro-Linguistic Programming: Psychological Skills for Understanding and Influencing People", 1990)

"Every problem has a solution, although it may not be the outcome that was originally hoped for or expected."  (Alice Hoffman, "Practical Magic", 1995)

"Pain is a relatively objective, physical phenomenon; suffering is our psychological resistance to what happens. Events may create physical pain, but they do not in themselves create suffering. Resistance creates suffering. Stress happens when your mind resists what is... The only problem in your life is your mind's resistance to life as it unfolds." (Dan Millman, "Everyday Enlightenment: The Twelve Gateways to Personal Growth", 1998)

"[...] all problems can be reperceived as challenges, or 'opportunities' to change, grow or learn." (Robert B Dilts, "Sleight of Mouth: The Magic of Conversational Belief Change", 1999)

"All problems are illusions of the mind." (Eckhart Tolle, "Practicing the Power of Now: Essential Teachings, Meditations, and Exercises", 2001)

"We humans have two great problems: the first is knowing when to begin; the second is knowing when to stop." (Paulo Coelho, "The Zahir: A Novel of Obsession", 2005)

"Most problems we face in life, as I have said already, happen in our minds. Furthermore, problems generally exist in our concept of the past and the future. The past and the future don’t exist except in our minds." (Richard Bandler, "Get the Life You Want: The Secrets to Quick and Lasting Life Change with Neuro-Linguistic Programming", 2008)

"One of the most important aspects of what human beings do is build beliefs. Beliefs are what trap most people in their problems. Unless you believe you can get over something, get through something, or get to something, there is little likelihood you will be able to do it. Your beliefs refer to your sense of certainty on some of your thoughts." (Richard Bandler, "Get the Life You Want: The Secrets to Quick and Lasting Life Change with Neuro-Linguistic Programming", 2008)

"When you can take on board new, positive suggestions and disbelieve the old, limiting suggestions, you will be ready to tackle the rest of your problems, especially your fears." (Richard Bandler, "Get the Life You Want: The Secrets to Quick and Lasting Life Change with Neuro-Linguistic Programming", 2008)

"Our most important problems cannot be solved; they must be outgrown." (Wayne Dyer, "Excuses Begone!: How to Change Lifelong, Self-Defeating Thinking Habits", 2009)

"A problem is a difference between things as desired and things as perceived. […] Seen in this way, the problem could be solved either by changing desires or changing perceptions." (Donald C Gause & Gerald M Weinberg, "Are Your Lights On?", 2011)

"As a practical matter, it is impossible to define natural, day-to-day problems in a single, unique, totally unambiguous fashion. On the other hand, without some common understanding of the problem, a solution will almost invariably be to the wrong problem." (Donald C Gause & Gerald M Weinberg, "Are Your Lights On?", 2011)

"The really important thing in dealing with problems is to know that the question is never answered, but that it doesn't matter, as long as you keep asking. It's only when you fool yourself into thinking you have the final problem definition - the final, true answer - that you can be fooled into thinking you have the final solution. And if you think that, you're always wrong, because there is no such thing as a 'final solution'." (Donald C Gause & Gerald M Weinberg, "Are Your Lights On?", 2011)

On Numbers: Binary Numbers

"[The information of a message can] be defined as the 'minimum number of binary decisions which enable the receiver to construct the message, on the basis of the data already available to him.' These data comprise both the convention regarding the symbols and the language used, and the knowledge available at the moment when the message started." (Dennis Gabor, "Optical transmission" in Information Theory : Papers Read at a Symposium on Information Theory, 1952)

"[...] there can be such a thing as a simple probabilistic system. For example, consider the tossing of a penny. Here is a perfectly simple system, but one which is notoriously unpredictable. It maybe described in terms of a binary decision process, with a built-in even probability between the two possible outcomes." (Stafford Beer, "Cybernetics and Management", 1959)

"Bivalence trades accuracy for simplicity. Binary outcomes of yes and no, white and black, true and false simplify math and computer processing. You can work with strings of 0s and 1s more easily than you can work with fractions. But bivalence requires some force fitting and rounding off [...] Bivalence holds at cube corners. Multivalence holds everywhere else." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Fuzziness has a formal name in science: multivalence. The opposite of fuzziness is bivalence or two-valuedness, two ways to answer each question, true or false, 1 or 0. Fuzziness means multivalence. It means three or more options, perhaps an infinite spectrum of options, instead of just two extremes. It means analog instead of binary, infinite shades of gray between black and white." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"Somewhere in the process wishful thinking seems to take over. Scientists start to believe they do math when they do science. This holds to greatest degree in an advanced science like physics or at the theoretical frontier of any science where the claims come as math claims. The first victim is truth. What was inaccurate or fuzzy truth all along gets bumped up a letter grade to the all-or-none status of binary logic. Most scientists draw the line at giving up the tentative status of science. They will concede that it can all go otherwise in the next experiment. But most have crossed the bivalent line by this point and believe that in the next experiment a statement or hypothesis or theory may jump from TRUE to FALSE, from 1 to 0." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"The binary logic of modern computers often falls short when describing the vagueness of the real world. Fuzzy logic offers more graceful alternatives." (Bart Kosko & Satoru Isaka, "Fuzzy Logic,” Scientific American Vol. 269, 1993)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)

"Why should anyone know or care about binary numbers? One reason is that working with numbers in an unfamiliar base is an example of quantitative reasoning that might even improve understanding of how numbers work in good old base ten. Beyond that, it’s also important because the number of bits is usually related in some way to how much space, time or complexity is involved. And fundamentally, computers are worth understanding, and binary is central to their operation." (Brian W Kernighan, "Understanding the Digital World", 2017)

Previous Post <<||>> Next Post

29 December 2023

On Homogeneity

"The power of differential calculus is that it linearizes all problems by going back to the 'infinitesimally small', but this process can be used only on smooth manifolds. Thus our distinction between the two senses of rotation on a smooth manifold rests on the fact that a continuously differentiable coordinate transformation leaving the origin fixed can be approximated by a linear transformation at О and one separates the (nondegenerate) homogeneous linear transformations into positive and negative according to the sign of their determinants. Also the invariance of the dimension for a smooth manifold follows simply from the fact that a linear substitution which has an inverse preserves the number of variables." (Hermann Weyl, "The Concept of a Riemann Surface", 1913)

"An 'empty world', i. e., a homogeneous manifold at all points at which equations (1) are satisfied, has, according to the theory, a constant Riemann curvature, and any deviation from this fundamental solution is to be directly attributed to the influence of matter or energy." (Howard P Robertson, "On Relativistic Cosmology", 1928)

"When the statistician looks at the outside world, he cannot, for example, rely on finding errors that are independently and identically distributed in approximately normal distributions. In particular, most economic and business data are collected serially and can be expected, therefore, to be heavily serially dependent. So is much of the data collected from the automatic instruments which are becoming so common in laboratories these days. Analysis of such data, using procedures such as standard regression analysis which assume independence, can lead to gross error. Furthermore, the possibility of contamination of the error distribution by outliers is always present and has recently received much attention. More generally, real data sets, especially if they are long, usually show inhomogeneity in the mean, the variance, or both, and it is not always possible to randomize." (George E P Box, "Some Problems of Statistics and Everyday Life", Journal of the American Statistical Association, Vol. 74 (365), 1979)

"[…] homogeneous functions have an interesting scaling property: they reproduce themselves upon rescaling. This scaling invariance can shed light into some of the darker corners of physics, biology, and other sciences, and even illuminate our appreciation of music." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"In contrast to gravitation, interatomic forces are typically modeled as inhomogeneous power laws with at least two different exponents. Such laws (and exponential laws, too) are not scale-free; they necessarily introduce a characteristic length, related to the size of the atoms. Power laws also govern the power spectra of all kinds of noises, most intriguing among them the ubiquitous (but sometimes difficult to explain)." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"Scaling invariance results from the fact that homogeneous power laws lack natural scales; they do not harbor a characteristic unit (such as a unit length, a unit time, or a unit mass). Such laws are therefore also said to be scale-free or, somewhat paradoxically, 'true on all scales'. Of course, this is strictly true only for our mathematical models. A real spring will not expand linearly on all scales; it will eventually break, at some characteristic dilation length. And even Newton's law of gravitation, once properly quantized, will no doubt sprout a characteristic length." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"Fitting data means finding mathematical descriptions of structure in the data. An additive shift is a structural property of univariate data in which distributions differ only in location and not in spread or shape. […] The process of identifying a structure in data and then fitting the structure to produce residuals that have the same distribution lies at the heart of statistical analysis. Such homogeneous residuals can be pooled, which increases the power of the description of the variation in the data." (William S Cleveland, "Visualizing Data", 1993)

"When the distributions of two or more groups of univariate data are skewed, it is common to have the spread increase monotonically with location. This behavior is monotone spread. Strictly speaking, monotone spread includes the case where the spread decreases monotonically with location, but such a decrease is much less common for raw data. Monotone spread, as with skewness, adds to the difficulty of data analysis. For example, it means that we cannot fit just location estimates to produce homogeneous residuals; we must fit spread estimates as well. Furthermore, the distributions cannot be compared by a number of standard methods of probabilistic inference that are based on an assumption of equal spreads; the standard t-test is one example. Fortunately, remedies for skewness can cure monotone spread as well." (William S Cleveland, "Visualizing Data", 1993)

"Descriptive statistics are built on the assumption that we can use a single value to characterize a single property for a single universe. […] Probability theory is focused on what happens to samples drawn from a known universe. If the data happen to come from different sources, then there are multiple universes with different probability models. If you cannot answer the homogeneity question, then you will not know if you have one probability model or many. [...] Statistical inference assumes that you have a sample that is known to have come from one universe." (Donald J Wheeler, "Myths About Data Analysis", International Lean & Six Sigma Conference, 2012)

"The four questions of data analysis are the questions of description, probability, inference, and homogeneity. [...] Descriptive statistics are built on the assumption that we can use a single value to characterize a single property for a single universe. […] Probability theory is focused on what happens to samples drawn from a known universe. If the data happen to come from different sources, then there are multiple universes with different probability models.  [...] Statistical inference assumes that you have a sample that is known to have come from one universe." (Donald J Wheeler," Myths About Data Analysis", International Lean & Six Sigma Conference, 2012)

"The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there." (Steven Pinker, "The Second Law of Thermodynamics", 2017) [source

On Homogeneity: Trivia

"For thought raised on specialization the most potent objection to the possibility of a universal organizational science is precisely its universality. Is it ever possible that the same laws be applicable to the combination of astronomic worlds and those of biological cells, of living people and the waves of the ether, of scientific ideas and quanta of energy? [...] Mathematics provide a resolute and irrefutable answer: yes, it is undoubtedly possible, for such is indeed the case. Two and two homogenous separate elements amount to four such elements, be they astronomic systems or mental images, electrons or workers; numerical structures are indifferent to any element, there is no place here for specificity." (Alexander Bogdanov, "Tektology: The Universal Organizational Science" Vol. I, 1913)

"Economics is a science of thinking in terms of models joined to the art of choosing models which are relevant to the contemporary world. It is compelled to be this, because, unlike the typical natural science, the material to which it is applied is, in too many respects, not homogeneous through time. The object of a model is to segregate the semi-permanent or relatively constant factors from those which are transitory or fluctuating so as to develop a logical way of thinking about the latter, and of understanding the time sequences to which they give rise in particular cases." (John M Keynes, [letter to Roy Harrod] 1938)

"On the most usual assumption, the universe is homogeneous on the large scale, i. e. down to regions containing each an appreciable number of nebulae. The homogeneity assumption may then be put in the form: An observer situated in a nebula and moving with the nebula will observe the same properties of the universe as any other similarly situated observer at any time." (Sir Hermann Bondi, "Review of Cosmology," Monthly Notices of the Royal Astronomical Society, 1948)

"The plane is the mainstay of all graphic representation. It is so familiar that its properties seem self-evident, but the most familiar things are often the most poorly understood. The plane is homogeneous and has two dimensions. The visual consequences of these properties must be fully explored." (Jacques Bertin, Semiology of graphics [Semiologie Graphique], 1967)

"The sciences have started to swell. Their philosophical basis has never been very strong. Starting as modest probing operations to unravel the works of God in the world, to follow its traces in nature, they were driven gradually to ever more gigantic generalizations. Since the pieces of the giant puzzle never seemed to fit together perfectly, subsets of smaller, more homogeneous puzzles had to be constructed, in each of which the fit was better." (Erwin Chargaff, "Voices in the Labyrinth", 1975)

"Cybernetics is a homogenous and coherent scientific complex, a science resulting from the blending of at least two sciences - psychology and technology; it is a general and integrative science, a crossroads of sciences, involving both animal and car psychology. It is not just a discipline, circumscribed in a narrow and strictly defined field, but a complex of disciplines born of psychology and centered on it, branched out as branches of a tree in its stem. It is a stepwise synthesis, a suite of multiple, often reciprocal, modeling; syntheses and modeling in which, as a priority, and as a great importance, the modeling of psychology on the technique and then the modeling of the technique on psychology. Cybernetics is an intellectual symphony, a symphony of ideas and sciences." (Stefan Odobleja, "Psihologia consonantista ?i cibernetica" ["Consonatist and Cybernetic Psychology"], 1978)

"The standard process of organizing knowledge into departments, and subderpartments, and further breaking it up into separate courses, tends to conceal the homogeneity of knowledge, and at the same time to omit much which falls between the courses." (Richard W Hamming, "The Art of Probability for Scientists and Engineers", 1991)

"Cellular automata (henceforth: CA) are discrete, abstract computational systems that have proved useful both as general models of complexity and as more specific representations of non-linear dynamics in a variety of scientific fields. Firstly, CA are (typically) spatially and temporally discrete: they are composed of a finite or denumerable set of homogenous, simple units, the atoms or cells. [...] Secondly, CA are abstract: they can be specified in purely mathematical terms and physical structures can implement them. Thirdly, CA are computational systems: they can compute functions and solve algorithmic problems." (Francesco Berto & Jacopo Tagliabue, "Cellular Automata", Stanford Encyclopedia of Philosophy, 2012) [source]

"A significant factor missing from any form of artificial intelligence is the inability of machines to learn based on real life experience. Diversity of life experience is the single most powerful characteristic of being human and enhances how we think, how we learn, our ideas and our ability to innovate. Machines exist in a homogeneous ecosystem, which is ok for solving known challenges, however even Artificial General Intelligence will never challenge humanity in being able to acquire the knowledge, creativity and foresight needed to meet the challenges of the unknown." (Tom Golway, 2021)

27 December 2023

On Scale-free Networks

"In contrast to gravitation, interatomic forces are typically modeled as inhomogeneous power laws with at least two different exponents. Such laws (and exponential laws, too) are not scale-free; they necessarily introduce a characteristic length, related to the size of the atoms. Power laws also govern the power spectra of all kinds of noises, most intriguing among them the ubiquitous (but sometimes difficult to explain)." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"In physics, there are numerous phenomena that are said to be 'true on all scales', such as the Heisenberg uncertainty relation, to which no exception has been found over vast ranges of the variables involved (such as energy versus time, or momentum versus position). But even when the size ranges are limited, as in galaxy clusters (by the size of the universe) or the magnetic domains in a piece of iron near the transition point to ferromagnetism (by the size of the magnet), the concept true on all scales is an important postulate in analyzing otherwise often obscure observations." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"Scaling invariance results from the fact that homogeneous power laws lack natural scales; they do not harbor a characteristic unit (such as a unit length, a unit time, or a unit mass). Such laws are therefore also said to be scale-free or, somewhat paradoxically, 'true on all scales'. Of course, this is strictly true only for our mathematical models. A real spring will not expand linearly on all scales; it will eventually break, at some characteristic dilation length. And even Newton's law of gravitation, once properly quantized, will no doubt sprout a characteristic length." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"In networks belonging to the second category, the winner takes all, meaning that the fittest node grabs all links, leaving very little for the rest of the nodes. Such networks develop a star topology, in which all nodes are connected to a central hub. In such a hub-and-spokes network there is a huge gap between the lonely hub and everybody else in the system. Thus a winner-takes-all network is very different from the scale-free networks we encountered earlier, where there is a hierarchy of hubs whose size distribution follows a power law. A winner-takes-all network is not scale-free. Instead there is a single hub and many tiny nodes. This is a very important distinction." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"Networks are not en route from a random to an ordered state. Neither are they at the edge of randomness and chaos. Rather, the scale-free topology is evidence of organizing principles acting at each stage of the network formation process." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"[…] networks are the prerequisite for describing any complex system, indicating that complexity theory must inevitably stand on the shoulders of network theory. It is tempting to step in the footsteps of some of my predecessors and predict whether and when we will tame complexity. If nothing else, such a prediction could serve as a benchmark to be disproven. Looking back at the speed with which we disentangled the networks around us after the discovery of scale-free networks, one thing is sure: Once we stumble across the right vision of complexity, it will take little to bring it to fruition. When that will happen is one of the mysteries that keeps many of us going." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"The first category includes all networks in which, despite the fierce competition for links, the scale-free topology survives. These networks display a fit-get-rich behavior, meaning that the fittest node will inevitably grow to become the biggest hub. The winner's lead is never significant, however. The largest hub is closely followed by a smaller one, which acquires almost as many links as the fittest node. At any moment we have a hierarchy of nodes whose degree distribution follows a power law. In most complex networks, the power law and the fight for links thus are not antagonistic but can coexist peacefully."(Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"At an anatomical level - the level of pure, abstract connectivity - we seem to have stumbled upon a universal pattern of complexity. Disparate networks show the same three tendencies: short chains, high clustering, and scale-free link distributions. The coincidences are eerie, and baffling to interpret." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"In a random network the loss of a small number of nodes can cause the overall network to become incoherent - that is, to break into disconnected subnetworks. In a scale-free network, such an event usually won’t disrupt the overall network because most nodes don’t have many links. But there’s a big caveat to this general principle: if a scale-free network loses a hub, it can be disastrous, because many other nodes depend on that hub." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

"Scale-free networks are particularly vulnerable to intentional attack: if someone wants to wreck the whole network, he simply needs to identify and destroy some of its hubs. And here we see how our world’s increasing connectivity really matters. Scientists have found that as a scale-free network like the Internet or our food-distribution system grows- as it adds more nodes - the new nodes tend to hook up with already highly connected hubs." (Thomas Homer-Dixon, "The Upside of Down: Catastrophe, Creativity, and the Renewal of Civilization", 2006)

On Periodicity III

"Since the ellipse is a closed curve it has a total length, λ say, and therefore f(l + λ) = f(l). The elliptic function f is periodic, with 'period' λ, just as the sine function is periodic with period 2π. However, as Gauss discovered in 1797, elliptic functions are even more interesting than this: they have a second, complex period. This discovery completely changed the face of calculus, by showing that some functions should be viewed as functions on the plane of complex numbers. And just as periodic functions on the line can be regarded as functions on a periodic line - that is, on the circle - elliptic functions can be regarded as functions on a doubly periodic plane - that is, on a 2-torus." (John Stillwell, "Yearning for the impossible: the surpnsing truths of mathematics", 2006)

"A typical control goal when controlling chaotic systems is to transform a chaotic trajectory into a periodic one. In terms of control theory it means stabilization of an unstable periodic orbit or equilibrium. A specific feature of this problem is the possibility of achieving the goal by means of an arbitrarily small control action. Other control goals like synchronization and chaotization can also be achieved by small control in many cases." (Alexander L Fradkov, "Cybernetical Physics: From Control of Chaos to Quantum Control", 2007)

"In parametrized dynamical systems a bifurcation occurs when a qualitative change is invoked by a change of parameters. In models such a qualitative change corresponds to transition between dynamical regimes. In the generic theory a finite list of cases is obtained, containing elements like ‘saddle-node’, ‘period doubling’, ‘Hopf bifurcation’ and many others." (Henk W Broer & Heinz Hanssmann, "Hamiltonian Perturbation Theory (and Transition to Chaos)", 2009)

"In fact, contrary to intuition, some of the most complicated dynamics arise from the simplest equations, while complicated equations often produce very simple and uninteresting dynamics. It is nearly impossible to look at a nonlinear equation and predict whether the solution will be chaotic or otherwise complicated. Small variations of a parameter can change a chaotic system into a periodic one, and vice versa." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"The main defining feature of chaos is the sensitive dependence on initial conditions. Two nearby initial conditions on the attractor or in the chaotic sea separate by a distance that grows exponentially in time when averaged along the trajectory, leading to long-term unpredictability. The Lyapunov exponent is the average rate of growth of this distance, with a positive value signifying sensitive dependence (chaos), a zero value signifying periodicity (or quasiperiodicity), and a negative value signifying a stable equilibrium." (Julien C Sprott, "Elegant Chaos: Algebraically Simple Chaotic Flows", 2010)

"In dynamical systems, a bifurcation occurs when a small smooth change made to the parameter values (the bifurcation parameters) of a system causes a sudden 'qualitative' or topological change in its behaviour. Generally, at a bifurcation, the local stability properties of equilibria, periodic orbits or other invariant sets changes." (Gregory Faye, "An introduction to bifurcation theory", 2011)

"Chaos is just one phenomenon out of many that are encountered in the study of dynamical systems. In addition to behaving chaotically, systems may show fixed equilibria, simple periodic cycles, and more complicated behaviors that defy easy categorization. The study of dynamical systems holds many surprises and shows that the relationships between order and disorder, simplicity and complexity, can be subtle, and counterintuitive." (David P Feldman, "Chaos and Fractals: An Elementary Introduction", 2012)

"A limit cycle is an isolated closed trajectory. Isolated means that neighboring trajectories are not closed; they spiral either toward or away from the limit cycle. If all neighboring trajectories approach the limit cycle, we say the limit cycle is stable or attracting. Otherwise the limit cycle is unstable, or in exceptional cases, half-stable. Stable limit cycles are very important scientifically - they model systems that exhibit self-sustained oscillations. In other words, these systems oscillate even in the absence of external periodic forcing." (Steven H Strogatz, "Nonlinear Dynamics and Chaos: With Applications to Physics, Biology, Chemistry, and Engineering", 2015)

"The significance of Fourier’s theorem to music cannot be overstated: since every periodic vibration produces a musical sound (provided, of course, that it lies within the audible frequency range), it can be broken down into its harmonic components, and this decomposition is unique; that is, every tone has one, and only one, acoustic spectrum, its harmonic fingerprint. The overtones comprising a musical tone thus play a role somewhat similar to that of the prime numbers in number theory: they are the elementary building blocks from which all sound is made." (Eli Maor, "Music by the Numbers: From Pythagoras to Schoenberg", 2018)

"It is particularly helpful to use complex numbers to model periodic phenomena, especially to operate with phase differences. Mathematically, one can treat a physical quantity as being complex, but address physical meaning only to its real part. Another possibility is to treat the real and imaginary parts of a complex number as two related (real) physical quantities. In both cases, the structure of complex numbers is useful to make calculations more easily, but no physical meaning is actually attached to complex variables." (Ricardo Karam, "Why are complex numbers needed in quantum mechanics? Some answers for the introductory level", American Journal of Physics Vol. 88 (1), 2020)

On Periodicity II

"Engineers have sought to minimize the effects of noise in electronic circuits and communication systems. But recent research has established that noise can play a constructive role in the detection of weak periodic signals." (Kurt Wiesenfeld & Frank Moss, "Stochastic Resonance and the Benefits of Noise: From Ice Ages to Crayfish and SQUIDs", Nature vol. 373, 1995)

"In addition to dimensionality requirements, chaos can occur only in nonlinear situations. In multidimensional settings, this means that at least one term in one equation must be nonlinear while also involving several of the variables. With all linear models, solutions can be expressed as combinations of regular and linear periodic processes, but nonlinearities in a model allow for instabilities in such periodic solutions within certain value ranges for some of the parameters." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)

"Chaos appears in both dissipative and conservative systems, but there is a difference in its structure in the two types of systems. Conservative systems have no attractors. Initial conditions can give rise to periodic, quasiperiodic, or chaotic motion, but the chaotic motion, unlike that associated with dissipative systems, is not self-similar. In other words, if you magnify it, it does not give smaller copies of itself. A system that does exhibit self-similarity is called fractal. [...] The chaotic orbits in conservative systems are not fractal; they visit all regions of certain small sections of the phase space, and completely avoid other regions. If you magnify a region of the space, it is not self-similar." (Barry R Parker, "Chaos in the Cosmos: The stunning complexity of the universe", 1996)

"The double periodicity of the torus is fairly obvious: the circle that goes around the torus in the 'long' direction around the rim, together with the circle that goes around it through the hole in the center. And just as periodic functions can be defined on a circle, doubly periodic functions can be defined on a torus." (John L Casti, "Mathematical Mountaintops: The Five Most Famous Problems of All Time", 2001

"In the nonmathematical sense, symmetry is associated with regularity in form, pleasing proportions, periodicity, or a harmonious arrangement; thus it is frequently associated with a sense of beauty. In the geometric sense, symmetry may be more precisely analyzed. We may have, for example, an axis of symmetry, a center of symmetry, or a plane of symmetry, which define respectively the line, point, or plane about which a figure or body is symmetrical. The presence of these symmetry elements, usually in combinations, is responsible for giving form to many compositions; the reproduction of a motif by application of symmetry operations can produce a pattern that is pleasing to the senses." (Hans H Jaffé & Milton Orchin, "Symmetry in Chemistry", 2002)

"In colloquial usage, chaos means a state of total disorder. In its technical sense, however, chaos refers to a state that only appears random, but is actually generated by nonrandom laws. As such, it occupies an unfamiliar middle ground between order and disorder. It looks erratic superficially, yet it contains cryptic patterns and is governed by rigid rules. It's predictable in the short run but unpredictable in the long run. And it never repeats itself: Its behavior is nonperiodic." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"Just as a circle is the shape of periodicity, a strange attractor is the shape of chaos. It lives in an abstract mathematical space called state space, whose axes represent all the different variables in a physical system." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The existence of equilibria or steady periodic solutions is not sufficient to determine if a system will actually behave that way. The stability of these solutions must also be checked. As parameters are changed, a stable motion can become unstable and new solutions may appear. The study of the changes in the dynamic behavior of systems as parameters are varied is the subject of bifurcation theory. Values of the parameters at which the qualitative or topological nature of the motion changes are known as critical or bifurcation values." (Francis C Moona, "Nonlinear Dynamics", 2003)

"A moderate amount of noise leads to enhanced order in excitable systems, manifesting itself in a nearly periodic spiking of single excitable systems, enhancement of synchronized oscillations in coupled systems, and noise-induced stability of spatial pattens in reaction-diffusion systems." (Benjamin Lindner et al, "Effects of Noise in Excitable Systems", Physical Reports. vol. 392, 2004)

"Double periodicity is more interesting than single periodicity, because it is more varied. There is really only one periodic line, since all circles are the same up to a scale factor. However, there are infinitely many doubly periodic planes, even if we ignore scale. This is because the angle between the two periodic axes can vary, and so can the ratio of period lengths. The general picture of a doubly periodic plane is given by a lattice in the plane of complex numbers: a set of points of the form mA + nB, where A and B are nonzero complex numbers in different directions from O, and m and n run through all the integers. A and B are said to generate the lattice because it consists of all their sums and differences. […] The shape of the lattice of points mA + nB can therefore be represented by the complex number A/B. It is not hard to see that any nonzero complex number represents a lattice shape, so in some sense there is whole plane of lattice shapes. Even more interesting: the plane of lattice shapes is a periodic plane, because different numbers represent the same lattice." (John Stillwell, "Yearning for the Impossible: The Surprising Truths of Mathematics", 2006)

On Periodicity I

"Since a given system can never of its own accord go over into another equally probable state but into a more probable one, it is likewise impossible to construct a system of bodies that after traversing various states returns periodically to its original state, that is a perpetual motion machine." (Ludwig E Boltzmann, "The Second Law of Thermodynamics", [Address to a Formal meeting of the Imperial Academy of Science], 1886)

"Science works by the slow method of the classification of data, arranging the detail patiently in a periodic system into groups of facts, in series like the strata of the rocks. For each series there must be a vocabulary of special words which do not always make good sense when used in another series. But the laws of periodicity seem to hold throughout, among the elements and in every sphere of thought, and we must learn to co-ordinate the whole through our new conception of the reign of relativity." (William H Pallister, "Poems of Science", 1931)

"Finite systems of deterministic ordinary nonlinear differential equations may be designed to represent forced dissipative hydrodynamic flow. Solutions of these equations can be identified with trajectories in phase space. For those systems with bounded solutions, it is found that nonperiodic solutions are ordinarily unstable with respect to small modifications, so that slightly differing initial states can evolve into considerably different states. Systems with bounded solutions are shown to possess bounded numerical solutions. (Edward N Lorenz, "Deterministic Nonperiodic Flow", Journal of the Atmospheric Science 20, 1963)

"Now, the main problem with a quasiperiodic theory of turbulence (putting several oscillators together) is the following: when there is a nonlinear coupling between the oscillators, it very often happens that the time evolution does not remain quasiperiodic. As a matter of fact, in this latter situation, one can observe the appearance of a feature which makes the motion completely different from a quasiperiodic one. This feature is called sensitive dependence on initial conditions and turns out to be the conceptual key to reformulating the problem of turbulence." (David Ruelle, "Chaotic Evolution and Strange Attractors: The statistical analysis of time series for deterministic nonlinear systems", 1989)

"All physical objects that are 'self-similar' have limited self-similarity - just as there are no perfectly periodic functions, in the mathematical sense, in the real world: most oscillations have a beginning and an end (with the possible exception of our universe, if it is closed and begins a new life cycle after every 'big crunch' […]. Nevertheless, self-similarity is a useful  abstraction, just as periodicity is one of the most useful concepts in the sciences, any finite extent notwithstanding." (Manfred Schroeder, "Fractals, Chaos, Power Laws Minutes from an Infinite Paradise", 1990)

"The digits of pi march to infinity in a predestined yet unfathomable code: they do not repeat periodically, seeming to pop up by blind chance, lacking any perceivable order, rule, reason, or design - ‘random’ integers, ad infinitum." (Richard Preston, "The Mountains of Pi", The New Yorker, March 2, 1992)

"Clearly, however, a zero probability is not the same thing as an impossibility; […] In systems that are now called chaotic, most initial states are followed by nonperiodic behavior, and only a special few lead to periodicity. […] In limited chaos, encountering nonperiodic behavior is analogous to striking a point on the diagonal of the square; although it is possible, its probability is zero. In full chaos, the probability of encountering periodic behavior is zero." (Edward N Lorenz, "The Essence of Chaos", 1993)

"The description of the evolutionary trajectory of dynamical systems as irreversible, periodically chaotic, and strongly nonlinear fits certain features of the historical development of human societies. But the description of evolutionary processes, whether in nature or in history, has additional elements. These elements include such factors as the convergence of existing systems on progressively higher organizational levels, the increasingly efficient exploitation by systems of the sources of free energy in their environment, and the complexification of systems structure in states progressively further removed from thermodynamic equilibrium." (Ervin László et al, "The Evolution of Cognitive Maps: New Paradigms for the Twenty-first Century", 1993) 

"There is no question but that the chains of events through which chaos can develop out of regularity, or regularity out of chaos, are essential aspects of families of dynamical systems [...]  Sometimes [...] a nearly imperceptible change in a constant will produce a qualitative change in the system’s behaviour: from steady to periodic, from steady or periodic to almost periodic, or from steady, periodic, or almost periodic to chaotic. Even chaos can change abruptly to more complicated chaos, and, of course, each of these changes can proceed in the opposite direction. Such changes are called bifurcations." (Edward Lorenz, "The Essence of Chaos", 1993)

"As with subtle bifurcations, catastrophes also involve a control parameter. When the value of that parameter is below a bifurcation point, the system is dominated by one attractor. When the value of that parameter is above the bifurcation point, another attractor dominates. Thus the fundamental characteristic of a catastrophe is the sudden disappearance of one attractor and its basin, combined with the dominant emergence of another attractor. Any type of attractor static, periodic, or chaotic can be involved in this. Elementary catastrophe theory involves static attractors, such as points. Because multidimensional surfaces can also attract (together with attracting points on these surfaces), we refer to them more generally as attracting hypersurfaces, limit sets, or simply attractors." (Courtney Brown, "Chaos and Catastrophe Theories", 1995)


