09 January 2023

John R Pierce - Collected Quotes

"A valid scientific theory seldom if ever offers the solution to the pressing problems which we repeatedly state. It seldom supplies a sensible answer to our multitudinous questions. Rather than rationalizing our ideas, it discards them entirely, or, rather, it leaves them as they were. It tells us in a fresh and new way what aspects of our experience can profitably be related and simply understood." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Communication theory deals with certain important but abstract aspects of communication. Communication theory proceeds from clear and definite assumptions to theorems concerning information sources and communication channels. In this it is essentially mathematical, and in order to understand it we must understand the idea of a theorem as a statement which must be proved, that is, which must be shown to be the necessary consequence of a set of initial assumptions. This is an idea which is the very heart of mathematics as mathematicians understand it." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Communication theory tells us how many bits of information can be sent per second over perfect and imperfect communication channels in terms of rather abstract descriptions of the properties of these channels. Communication theory tells us how to measure the rate at which a message source, such as a speaker or a writer, generates information. Communication theory tells us how to represent, or encode, messages from a particular message source efficiently for transmission over a particular sort of channel, such as an electrical circuit, and it tells us when we can avoid errors in transmission." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"However, it turns out that a one-to-one mapping of the points in a square into the points on a line cannot be continuous. As we move smoothly along a curve through the square, the points on the line which represent the successive points on the square necessarily jump around erratically, not only for the mapping described above but for any one-to-one mapping whatever. Any one-to-one mapping of the square onto the line is discontinuous." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"In communication theory we consider a message source, such as a writer or a speaker, which may produce on a given occasion any one of many possible messages. The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Mathematics is a way of finding out, step by step, facts which are inherent in the statement of the problem but which are not immediately obvious. Usually, in applying mathematics one must first hit on the facts and then verify them by proof. Here we come upon a knotty problem, for the proofs which satisfied mathematicians of an earlier day do not satisfy modem mathematicians." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Mathematicians start out with certain assumptions and definitions, and then by means of mathematical arguments or proofs they are able to show that certain statements or theorems are true." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"One of these is that many of the most general and powerful discoveries of science have arisen, not through the study of phenomena as they occur in nature, but, rather, through the study of phenomena in man-made devices, in products of technology, if you will. This is because the phenomena in man’s machines are simplified and ordered in comparison with those occurring naturally, and it is these simplified phenomena that man understands most easily." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Ordinarily, while mathematicians may suspect or conjecture the truth of certain statements, they have to prove theorems in order to be certain." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"The ideas and assumptions of a theory determine the generalityof the theory, that is, to how wide a range of phenomena the theory applies." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"The fact that network theory evolved from the study of idealized electrical systems rather than from the study of idealized mechanical systems is a matter of history, not of necessity." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Theories are strongly physical when they describe very completely some range of physical phenomena, which in practice is always limited. Theories become more mathematical or abstract when they deal with an idealized class of phenomena or with only certain aspects of phenomena." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Thus, in physics, entropy is associated with the possibility of converting thermal energy into mechanical energy. If the entropy does not change during a process, the process is reversible. If the entropy increases, the available energy decreases. Statistical mechanics interprets an increase of entropy as a decrease in order or, if we wish, as a decrease in our knowledge." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Thus, information is sometimes associated with the idea of knowledge through its popular use rather than with uncertainty and the resolution of uncertainty, as it is in communication theory." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

07 August 2022

Edwin T Jaynes - Collected Quotes

"In conventional statistical mechanics the energy plays a preferred role among all dynamical quantities because it is conserved both in the time development of isolated systems and in the interaction of different systems. Since, however, the principles of maximum-entropy inference are independent of any physical properties, it appears that in subjective statistical mechanics all measurable quantities may be treated on the same basis, subject to certain precautions." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"Just as in applied statistics the crux of a problem is often the devising of some method of sampling that avoids bias, our problem is that of finding a probability assignment which avoids bias, while agreeing with whatever information is given. The great advance provided by information theory lies in the discovery that there is a unique, unambiguous criterion for the 'amount of uncertainty' represented by a discrete probability distribution, which agrees with our intuitive notions that a broad distribution represents more uncertainty than does a sharply peaked one, and satisfies all other conditions which make it reasonable." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"On the other hand, the 'subjective' school of thought, regards probabilities as expressions of human ignorance; the probability of an event is merely a formal expression of our expectation that the event will or did occur, based on whatever information is available. To the subjectivist, the purpose of probability theory is to help us in forming plausible conclusions in cases where there is not enough information available to lead to certain conclusions; thus detailed verification is not expected. The test of a good subjective probability distribution is does it correctly represent our state of knowledge as to the value of x?" (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"The mere fact that the same mathematical expression -Σ pi log(pi) [i is index], occurs both in statistical mechanics and in information theory does not in itself establish any connection between these fields. This can be done only by finding new viewpoints from which thermodynamic entropy and information-theory entropy appear as the same concept." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"[...] thermodynamics knows of no such notion as the 'entropy of a physical system'. Thermodynamics does have the concept of the entropy of a thermodynamic system; but a given physical system corresponds to many different thermodynamic systems." (Edwin T Jaynes, "Gibbs vs Boltzmann Entropies", 1964)

"In particular, the uncertainty principle has stood for a generation, barring the way to more detailed descriptions of nature; and yet, with the lesson of parity still fresh in our minds, how can anyone be quite so sure of its universal validity when we note that, to this day, it has never been subjected to even one direct experimental test?" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"'You cannot base a general mathematical theory on imprecisely defined concepts. You can make some progress that way; but sooner or later the theory is bound to dissolve in ambiguities which prevent you from extending it further.' Failure to recognize this fact has another unfortunate consequence which is, in a practical sense, even more disastrous: 'Unless the conceptual problems of a field have been clearly resolved, you cannot say which mathematical problems are the relevant ones worth working on; and your efforts are more than likely to be wasted.'" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"In decision theory, mathematical analysis shows that once the sampling distribution, loss function, and sample are specified, the only remaining basis for a choice among different admissible decisions lies in the prior probabilities. Therefore, the logical foundations of decision theory cannot be put in fully satisfactory form until the old problem of arbitrariness (sometimes called 'subjectiveness') in assigning prior probabilities is resolved." (Edwin T Jaynes, "Prior Probabilities", 1978)

"It appears to be a quite general principle that, whenever there is a randomized way of doing something, then there is a nonrandomized way that delivers better performance but requires more thought." (Edwin T Jaynes, "Probability Theory: The Logic of Science", 1979)

"The semiliterate on the next bar stool will tell you with absolute, arrogant assurance just how to solve the world's problems; while the scholar who has spent a lifetime studying their causes is not at all sure how to do this." (Edwin T Jaynes, "Probability Theory: The Logic of Science", 1979)

"The difference is that energy is a property of the microstates, and so all observers, whatever macroscopic variables they may choose to define their thermodynamic states, must ascribe the same energy to a system in a given microstate. But they will ascribe different entropies to that microstate, because entropy is not a property of the microstate, but rather of the reference class in which it is embedded. As we learned from Boltzmann, Planck, and Einstein, the entropy of a thermodynamic state is a measure of the number of microstates compatible with the macroscopic quantities that you or I use to define the thermodynamic state." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

"There is no end to this search for the ultimate ‘true’ entropy until we have reached the point where we control the location of each atom independently. But just at that point the notion of entropy collapses, and we are no longer talking thermodynamics." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

09 May 2022

Claude E Shannon - Collected Quotes

"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages." (Claude E Shannon, "A mathematical theory of communication", Bell Systems Technical Journal 27, 1948)

"Almost every problem that you come across is befuddled with all kinds of extraneous data of one sort or another; and if you can bring this problem down into the main issues, you can see more clearly what you’re trying to do." (Claude E Shannon, "Creative Thinking", 1952)

"Another approach for a given problem is to try to restate it in just as many different forms as you can. Change the words. Change the viewpoint. Look at it from every possible angle. After you’ve done that, you can try to look at it from several angles at the same time and perhaps you can get an insight into the real basic issues of the problem, so that you can correlate the important factors and come out with the solution." (Claude E Shannon, "Creative Thinking", 1952)

"Electronic computers are normally used for the solution of numerical problems arising in science or industry. The fundamental design of these computers, however, is so flexible and so universal in conception that they maybe programmed to perform many operations which do not involve numbers at all - operations such as the translation of language, the analysis of a logical situation or the playing of games. The same orders which are used in constructing a numerical program maybe used to symbolize operations on abstract entities such as the words of a language or the positions in a chess game." (Claude E Shannon, "Game Playing Machines, 1955) 

"This duality can be pursued further and is related to a duality between past and future and the notions of control and knowledge. Thus we may have knowledge of the past but cannot control it; we may control the future but have no knowledge of it." (Claude E Shannon, "Coding theorems for a discrete source with a fidelity criterion", IRE International Convention Records Vol. 7, 1959)

"It is very difficult to estimate how well a computer can be made to play with ideal programming. I tend to agree that it would be very difficult to reach the caliber of world champions or even most chess masters but I do not regard this as unthinkable. The machines do have certain very strong advantages of accuracy, speed, etc., and our present techniques of programming are bound to improve enormously in the future." (Claude E Shannon)

"It seems to be much easier to make two small jumps than the one big jump in any kind of mental thinking." (Claude E Shannon)

"Many proofs in mathematics have been actually found by extremely roundabout processes. A man starts to prove this theorem and he finds that he wanders all over the map. He starts off and prove a good many results which don’t seem to be leading anywhere and then eventually ends up by the back door on the solution of the given problem." (Claude E Shannon)

"Suppose that you are given a problem to solve, I don’t care what kind of a problem - a machine to design, or a physical theory to develop, or a mathematical theorem to prove, or something of that kind - probably a very powerful approach to this is to attempt to eliminate everything from the problem except the essentials; that is, cut it down to size." (Claude E Shannon)

"The chief weakness of the machine is that it will not learn by its mistakes. The only way to improve its play is by improving the program. Some thought has been given to designing a program that would develop its own improvements in strategy with increasing experience in play. Although it appears to be theoretically possible, the methods thought of so far do not seem to be very practical. One possibility is to devise a program that would change the terms and coefficients involved in the evaluation function on the basis of the results of games the machine had already played. Small variations might be introduced in these terms, and the values would be selected to give the greatest percentage of wins." (Claude E Shannon)

"The idea of a machine thinking is by no means repugnant to all of us. In fact, I find the converse idea, that the human brain may itself be a machine which could be possibly duplicated functionally with inanimate objects, quite attractive. Until clearly disproved, this hypothesis concerning the brain seems the natural scientific one in line with the principle of parsimony, etc., rather than hypothecating intangible and unreachable “vital forces,” “souls” and the like." (Claude E Shannon)

"The redundancy of a language is related to the existence of crossword puzzles. If the redundancy is zero any sequence of letters is a reasonable text in the language and any two dimensional array of letters forms a crossword puzzle. If the redundancy is too high the language imposes too many constraints for large crossword puzzles to be possible." (Claude E Shannon)

"There is a vast, explored sea of nature just waiting for things to be discovered in it, and the science and technology needed are progressing at an exponential rate. You see, it all feeds back into itself. Someone discovers a new principle or a new theory and it’s not only new knowledge but a new instrument for seeking more knowledge." (Claude E Shannon)

21 August 2021

Out of Context: On Information (Definitions)

"Information is a set of marks that have meaning." (Edmund C Berkeley & Lawrence Wainwright, Computers: Their Operation and Applications", 1956)

"Information is carried by physical entities, such as books or sound waves or brains, but it is not itself material. Information in a living system is a feature of the order and arrangement of its parts, which arrangement provides the signs that constitute a ‘code’ or ‘language’." (John Z Young, "Programs of the Brain", 1978)

"Information is recorded in vast interconnecting networks. Each idea or image has hundreds, perhaps thousands, of associations and is connected to numerous other points in the mental network." (Peter Russell, "The Brain Book: Know Your Own Mind and How to Use it", 1979)

"Neither noise nor information is predictable." (Ray Kurzweil, "The Age of Spiritual Machines: When Computers Exceed Human Intelligence", 1999)

"Most of the information is fuzzy and linguistic in form." (Timothy J Ross & W Jerry Parkinson, "Fuzzy Set Theory, Fuzzy Logic, and Fuzzy Systems", 2002)

"Information is assimilated to the slots of a mental model in the form of ‘frames’ which are understood here as ‘chunks’ of knowledge with a well-defined meaning anchored in a given body of shared knowledge." (Jürgen Renn, "Before the Riemann Tensor: The Emergence of Einstein’s Double Strategy", "The Universe of General Relativity" Ed. by A.J. Kox & Jean Eisenstaedt, 2005)

"One advantage of the use of fuzzy models is the fact that their complexity can be gradually increased as more information is gathered. This increase in complexity can be done automatically or manually by a careful commission of the new operating point." (Jairo Espinosa et al, "Fuzzy Logic, Identification and Predictive Control", 2005)

"In a physical system, information is the opposite of entropy, as it involves uncommon and highly correlated configurations that are difficult to arrive at." (César A Hidalgo, "Why Information Grows: The Evolution of Order, from Atoms to Economies", 2015)

John M Ziman - Collected Quotes

"Many philosophers have now sadly come to the conclusion that there is no ultimate procedure which will wring the last drops of uncertainty from what scientists call their knowledge." (John M Ziman, "Public Knowledge: An Essay Concerning the Social Dimension of Science", 1968)

"Although the best and most famous scientific discoveries seem to open whole new windows of the mind, a typical scientific paper has never pretended to be more than another piece in a large jig-saw not significant in itself but as an element in a grander scheme. This technique, of soliciting many modest contributions to the vast store of human knowledge, has been the secret of western science since the seventeenth century, for it achieves a corporate collective power that is far greater than any one individual can exert. Primary scientific papers are not meant to be final statements of indisputable truths; each is merely a tiny tentative step forward, through the jungle of ignorance." (John M Zimer, Vol. 224, 1969)

"It is not enough to observe, experiment, theorize, calculate and communicate; we must also argue, criticize, debate, expound, summarize, and otherwise transform the information that we have obtained individually into reliable, well established, public knowledge." (John M Ziman, "Information, Communication, Knowledge", Nature Vol. 224 (5217), 1969)

"The sooner we all face up to the fact that theory and practice are indissoluble, and that there is no contradiction between the qualities of usefulness and beauty, the better." (John M Ziman, "Growth and Spread of Science", Nature Vol. 221 (5180), 1969)

"A significant fraction of the ordinary scientific literature in any field is concerned with essentially irrational theories put forward by a few well-established scholars who have lost touch with reality." (John M Ziman, "Some Pathologies of the Scientific Life", Nature Vol. 227, 1970)

"The communication of modern science to the ordinary citizen, necessary, important, desirable as it is, cannot be considered an easy task. The prime obstacle is lack of education. [...] There is also the difficulty of making scientific discoveries interesting and exciting without completely degrading them intellectually. [...] It is a weakness of modern science that the scientist shrinks from this sort of publicity, and thus gives an impression of arrogant mystagoguery." (John M Ziman,"The Force of Knowledge: The Scientific Dimension of Society", 1976)

"Physics defines itself as the science devoted to discovering, developing and refining those aspects of reality that are amenable to mathematical analysis." (John M Ziman, "Reliable Knowledge: An Exploration of the Grounds for Belief in Science", 1978)

"The most astonishing achievements of science, intellectually and practically, have been in physics, which many people take to be the ideal type of scientific knowledge. In fact, physics is a very special type of science, in which the subject matter is deliberately chosen so as to be amenable to quantitative analysis." (John M Ziman, "Reliable Knowledge: An Exploration of the Grounds for Belief in Science", 1978)

"'Disorder' is not mere chaos; it implies defective order." (John M Ziman, "Models of Disorder", 1979)

"A philosopher is a person who knows less and less about more and more, until he knows nothing about everything. […] A scientist is a person who knows more and more about less and less, until he knows everything about nothing." (John M Ziman, "Knowing Everything about Nothing: Specialization and Change in Scientific Careers", 1987)

"Any research organization requires generous measures of the following: (1) Social space for personal initiative and creativity; (2) Time for ideas to grow to maturity; (3) Openness to debate and criticism; (4) Hospitality towards novelty; and (5) Respect for specialized expertise." (John M Ziman, "Prometheus Bound", 1994)

"Theoretical physicists are like pure mathematicians, in that they are often interested in the hypothetical behaviour of entirely imaginary objects, such as parallel universes, or particles traveling faster than light, whose actual existence is not being seriously proposed at all." (John M Ziman, "Real Science: What it Is, and what it Means", 2000)

06 July 2021

On Algorithms I

"Mathematics is an aspect of culture as well as a collection of algorithms." (Carl B Boyer, "The History of the Calculus and Its Conceptual Development", 1959)

"An algorithm must be seen to be believed, and the best way to learn what an algorithm is all about is to try it." (Donald E Knuth, The Art of Computer Programming Vol. I, 1968)

"Scientific laws give algorithms, or procedures, for determining how systems behave. The computer program is a medium in which the algorithms can be expressed and applied. Physical objects and mathematical structures can be represented as numbers and symbols in a computer, and a program can be written to manipulate them according to the algorithms. When the computer program is executed, it causes the numbers and symbols to be modified in the way specified by the scientific laws. It thereby allows the consequences of the laws to be deduced." (Stephen Wolfram, "Computer Software in Science and Mathematics", 1984)

"Algorithmic complexity theory and nonlinear dynamics together establish the fact that determinism reigns only over a quite finite domain; outside this small haven of order lies a largely uncharted, vast wasteland of chaos." (Joseph Ford, "Progress in Chaotic Dynamics: Essays in Honor of Joseph Ford's 60th Birthday", 1988)

"On this view, we recognize science to be the search for algorithmic compressions. We list sequences of observed data. We try to formulate algorithms that compactly represent the information content of those sequences. Then we test the correctness of our hypothetical abbreviations by using them to predict the next terms in the string. These predictions can then be compared with the future direction of the data sequence. Without the development of algorithmic compressions of data all science would be replaced by mindless stamp collecting - the indiscriminate accumulation of every available fact. Science is predicated upon the belief that the Universe is algorithmically compressible and the modern search for a Theory of Everything is the ultimate expression of that belief, a belief that there is an abbreviated representation of the logic behind the Universe's properties that can be written down in finite form by human beings." (John D Barrow, New Theories of Everything", 1991)

"Algorithms are a set of procedures to generate the answer to a problem." (Stuart Kauffman, "At Home in the Universe: The Search for Laws of Complexity", 1995)

"Let us regard a proof of an assertion as a purely mechanical procedure using precise rules of inference starting with a few unassailable axioms. This means that an algorithm can be devised for testing the validity of an alleged proof simply by checking the successive steps of the argument; the rules of inference constitute an algorithm for generating all the statements that can be deduced in a finite number of steps from the axioms." (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"Heuristics are rules of thumb that help constrain the problem in certain ways (in other words they help you to avoid falling back on blind trial and error), but they don't guarantee that you will find a solution. Heuristics are often contrasted with algorithms that will guarantee that you find a solution - it may take forever, but if the problem is algorithmic you will get there. However, heuristics are also algorithms." (S Ian Robertson, "Problem Solving", 2001)

"An algorithm is a simple rule, or elementary task, that is repeated over and over again. In this way algorithms can produce structures of astounding complexity." (F David Peat, "From Certainty to Uncertainty", 2002)

"Many people have strong intuitions about whether they would rather have a vital decision about them made by algorithms or humans. Some people are touchingly impressed by the capabilities of the algorithms; others have far too much faith in human judgment. The truth is that sometimes the algorithms will do better than the humans, and sometimes they won’t. If we want to avoid the problems and unlock the promise of big data, we’re going to need to assess the performance of the algorithms on a case-by-case basis. All too often, this is much harder than it should be. […] So the problem is not the algorithms, or the big datasets. The problem is a lack of scrutiny, transparency, and debate." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

On Algorithms II

"The vast majority of information that we have on most processes tends to be nonnumeric and nonalgorithmic. Most of the information is fuzzy and linguistic in form." (Timothy J Ross & W Jerry Parkinson, "Fuzzy Set Theory, Fuzzy Logic, and Fuzzy Systems", 2002)

"Knowledge is encoded in models. Models are synthetic sets of rules, and pictures, and algorithms providing us with useful representations of the world of our perceptions and of their patterns." (Didier Sornette, "Why Stock Markets Crash - Critical Events in Complex Systems", 2003)

"Swarm Intelligence can be defined more precisely as: Any attempt to design algorithms or distributed problem-solving methods inspired by the collective behavior of the social insect colonies or other animal societies. The main properties of such systems are flexibility, robustness, decentralization and self-organization." ("Swarm Intelligence in Data Mining", Ed. Ajith Abraham et al, 2006)

"The burgeoning field of computer science has shifted our view of the physical world from that of a collection of interacting material particles to one of a seething network of information. In this way of looking at nature, the laws of physics are a form of software, or algorithm, while the material world - the hardware - plays the role of a gigantic computer." (Paul C W Davies, "Laying Down the Laws", New Scientist, 2007)

"An algorithm refers to a successive and finite procedure by which it is possible to solve a certain problem. Algorithms are the operational base for most computer programs. They consist of a series of instructions that, thanks to programmers’ prior knowledge about the essential characteristics of a problem that must be solved, allow a step-by-step path to the solution." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Programming is a science dressed up as art, because most of us don’t understand the physics of software and it’s rarely, if ever, taught. The physics of software is not algorithms, data structures, languages, and abstractions. These are just tools we make, use, and throw away. The real physics of software is the physics of people. Specifically, it’s about our limitations when it comes to complexity and our desire to work together to solve large problems in pieces. This is the science of programming: make building blocks that people can understand and use easily, and people will work together to solve the very largest problems." (Pieter Hintjens, "ZeroMQ: Messaging for Many Applications", 2012)

"These nature-inspired algorithms gradually became more and more attractive and popular among the evolutionary computation research community, and together they were named swarm intelligence, which became the little brother of the major four evolutionary computation algorithms." (Yuhui Shi, "Emerging Research on Swarm Intelligence and Algorithm Optimization", Information Science Reference, 2014)

"Again, classical statistics only summarizes data, so it does not provide even a language for asking [a counterfactual] question. Causal inference provides a notation and, more importantly, offers a solution. As with predicting the effect of interventions [...], in many cases we can emulate human retrospective thinking with an algorithm that takes what we know about the observed world and produces an answer about the counterfactual world." (Judea Pearl & Dana Mackenzie, "The Book of Why: The new science of cause and effect", 2018)

"An algorithm, meanwhile, is a step-by-step recipe for performing a series of actions, and in most cases 'algorithm' means simply 'computer program'." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Big data is revolutionizing the world around us, and it is easy to feel alienated by tales of computers handing down decisions made in ways we don’t understand. I think we’re right to be concerned. Modern data analytics can produce some miraculous results, but big data is often less trustworthy than small data. Small data can typically be scrutinized; big data tends to be locked away in the vaults of Silicon Valley. The simple statistical tools used to analyze small datasets are usually easy to check; pattern-recognizing algorithms can all too easily be mysterious and commercially sensitive black boxes." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

"Each of us is sweating data, and those data are being mopped up and wrung out into oceans of information. Algorithms and large datasets are being used for everything from finding us love to deciding whether, if we are accused of a crime, we go to prison before the trial or are instead allowed to post bail. We all need to understand what these data are and how they can be exploited." (Tim Harford, "The Data Detective: Ten easy rules to make sense of statistics", 2020)

27 June 2021

Herbert A Simon - Collected Quotes

"All behavior involves conscious or unconscious selection of particular actions out of all those which are physically possible to the actor and to those persons over whom he exercises influence and authority." (Herbert A Simon, "Administrative Behavior: A Study of Decision-making Processes in Administrative Organization", 1947)

"Decision making processes are aimed at finding courses of action that are feasible or satisfactory in the light of multiple goals and constraints." (Herbert A Simon, "Administrative Behavior: A Study of Decision-making Processes in Administrative Organization", 1947)

"In the process of decision those alternatives are chosen which are considered to be appropriate means of reaching desired ends. Ends themselves, however, are often merely instrumental to more final objectives. We are thus led to the conception of a series, or hierarchy, of ends. Rationality has to do with the construction of means-ends chains of this kind." (Herbert A Simon, "Administrative Behavior", 1947)

"It is impossible for the behavior of a single, isolated individual to reach a high degree of rationality. The number of alternatives he must explore is so great, the information he would need to evaluate them so vast that even an approximation to objective rationality is hard to conceive. Individual choice takes place in rationality is hard to conceive. [...] Actual behavior falls short in at least three ways, of objective rationality." (Herbert A Simon, "Administrative Behavior", 1947)

"Many individuals and organization units contribute to every large decision, and the very problem of centralization and decentralization is a problem of arranging the complex system into an effective scheme." (Herbert A Simon, "Administrative Behavior: A Study of Decision-making Processes in Administrative Organization", 1947)

"Rationality requires a choice among all possible alternative behaviors. In actual behavior, only a very few of all these possible alternatives come to mind." (Herbert A Simon, "Administrative Behavior", 1947)

"Rationality requires a complete knowledge and anticipation of the consequences that will follow on each choice. In fact, knowledge of consequences is always fragmentary." (Herbert A Simon, "Administrative Behavior", 1947)

"Roughly speaking, rationality is concerned with the selection of preferred behavior alternatives in terms of some system of values, whereby the consequences of behavior can be evaluated." (Herbert A Simon, "Administrative Behavior", 1947)

"The function of knowledge in the decision-making process is to determine which consequences follow upon which of the alternative strategies. It is the task of knowledge to select from the whole class of possible consequences a more limited subclass, or even (ideally) a single set of consequences correlated with each strategy." (Herbert A Simon, "Administrative Behavior: A Study of Decision-making Processes in Administrative Organization", 1947)

"The principle of bounded rationality [is] the capacity of the human mind for formulating and solving complex problems is very small compared with the size of the problems whose solution is required for objectively rational behavior in the real world - or even for a reasonable approximation to such objective rationality." (Herbert A Simon, "Administrative Behavior", 1947)

"The first consequence of the principle of bounded rationality is that the intended rationality of an actor requires him to construct a simplified model of the real situation in order to deal with it. He behaves rationally with respect to this model, and such behavior is not even approximately optimal with respect to the real world. To predict his behavior we must understand the way in which this simplified model is constructed, and its construction will certainly be related to his psychological properties as a perceiving, thinking, and learning animal." (Herbert A Simon, "Models of Man", 1957)

"The mathematical and computing techniques for making programmed decisions replace man but they do not generally simulate him." (Herbert A Simon, "Management and Corporations 1985", 1960)

"Programs do not merely substitute brute force for human cunning. Increasingly, they imitate-and in some cases improve upon-human cunning." (Herbert A Simon, "Management and Corporations 1985", 1960)

"Roughly, by a complex system I mean one made up of a large number of parts that interact in a nonsimple way. In such systems, the whole is more than the sum of the parts, not in an ultimate, metaphysical sense, but in the important pragmatic sense that, given the properties of the parts and the laws of their interaction, it is not a trivial matter to infer the properties of the whole." (Herbert A Simon, "The Architecture of Complexity", Proceedings of the American Philosophical Society, Vol. 106 (6), 1962)

"Thus, the central theme that runs through my remarks is that complexity frequently takes the form of hierarchy, and that hierarchic systems have some common properties that are independent of their specific content. Hierarchy, I shall argue, is one of the central structural schemes that the architect of complexity uses." (Herbert A Simon, "The Architecture of Complexity", Proceedings of the American Philosophical Society Vol. 106 (6), 1962)

"A mathematical proof, as usually written down, is a sequence of expressions in the state space. But we may also think of the proof as consisting of the sequence of justifications of consecutive proof steps - i.e., the references to axioms, previously-proved theorems, and rules of inference that legitimize the writing down of the proof steps. From this point of view, the proof is a sequence of actions (applications of rules of inference) that, operating initially on the axioms, transform them into the desired theorem." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)

"[...] a problem of design exists when (1) there is a language for naming actions and a language for naming states of the world, (2) there is a need to find an action that will produce a specified state of the world or a specified change in the state of the world, and (3) there is no non-trivial process for translating changes in the state of the world into their corresponding actions." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)

"A problem will be difficult if there are no procedures for generating possible solutions that are guaranteed (or at least likely) to generate the actual solution rather early in the game. But for such a procedure to exist, there must be some kind of structural relation, at least approximate, between the possible solutions as named by the solution-generating process and these same solutions as named in the language of the problem statement." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)

"An adaptive organism is connected with its environment by two kinds of channels. Afferent channels give it information about the state of the environment; efferent channels cause action on the environment. Problem statements define solutions in terms of afferent information to the organism; the organism's task is to discover a set of efferent signals which, changing the state of the environment, will produce the appropriate afferent. But, ab initio, the mapping of efferents on afferents is entirely arbitrary; the relations can only be discovered by experiment, by acting and observing the consequences of action." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)

"Design problems - generating or discovering alternatives - are complex largely because they involve two spaces, an action space and a state space, that generally have completely different structures. To find a design requires mapping the former of these on the latter. For many, if not most, design problems in the real world systematic algorithms are not known that guarantee solutions with reasonable amounts of computing effort. Design uses a wide range of heuristic devices - like means-end analysis, satisficing, and the other procedures that have been outlined - that have been found by experience to enhance the efficiency of search. Much remains to be learned about the nature and effectiveness of these devices." (Herbert A Simon, "The Logic of Heuristic Decision Making", [in "The Logic of Decision and Action"], 1966)

"Every problem-solving effort must begin with creating a representation for the problem - a problem space in which the search for the solution can take place. Of course, for most of the problems we encounter in our daily personal or professional lives, we simply retrieve from memory a representation that we have already stored and used on previous occasions. Sometimes, we have to adapt the representation a bit to the new situation, but that is usually a rather simple matter." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"Natural science is knowledge about natural objects and phenomena." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"Learning is any change in a system that produces a more or less permanent change in its capacity for adapting to its environment. Understanding systems, especially systems capable of understanding problems in new task domains, are learning systems." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"Making discoveries belongs to the class of ill-structured problem-solving tasks that have relatively ill-defined goals." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"Solving a problem simply means representing it so as to make the solution transparent." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"The central task of a natural science is to make the wonderful commonplace: to show that complexity, correctly viewed, is only a mask for simplicity; to find pattern hidden in apparent chaos. […] This is the task of natural science: to show that the wonderful is not incomprehensible, to show how it can be comprehended - but not to destroy wonder. For when we have explained the wonderful, unmasked the hidden pattern, a new wonder arises at how complexity was woven out of simplicity. The aesthetics of natural science and mathematics is at one with the aesthetics of music and painting - both inhere in the discovery of a partially concealed pattern." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"The more we are willing to abstract from the detail of a set of phenomena, the easier it becomes to simulate the phenomena. Moreover we do not have to know, or guess at, all the internal structure of the system but only that part of it that is crucial to the abstraction." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"[...] in an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it." (Herbert Simon, "Designing Organizations for an Information-Rich World", 1971)

"But the answers provided by the theory of games are sometimes very puzzling and ambiguous. In many situations, no single course of action dominates all the others; instead, a whole set of possible solutions are all equally consistent with the postulates of rationality." (Herbert A Simon et al, "Decision Making and Problem Solving", Interfaces Vol. 17 (5), 1987)

"[...] problem solving generally proceeds by selective search through large sets of possibilities, using rules of thumb (heuristics) to guide the search. Because the possibilities in realistic problem situations are generally multitudinous, trial-and-error search would simply not work; the search must be highly selective." (Herbert A Simon et al, "Decision Making and Problem Solving", Interfaces Vol. 17 (5), 1987)

"The way in which an uncertain possibility is presented may have a substantial effect on how people respond to it." (Herbert A Simon et al, "Decision Making and Problem Solving", Interfaces Vol. 17 (5), 1987)

17 June 2021

On Knowledge (1990-1999)

"[By understanding] I mean simply a sufficient grasp of concepts, principles, or skills so that one can bring them to bear on new problems and situations, deciding in which ways one’s present competencies can suffice and in which ways one may require new skills or knowledge." (Howard Gardner, "The Unschooled Mind", 1991)

"The worst, i.e., most dangerous, feature of 'accepting the null hypothesis' is the giving up of explicit uncertainty. [...] Mathematics can sometimes be put in such black-and-white terms, but our knowledge or belief about the external world never can." (John Tukey, "The Philosophy of Multiple Comparisons", Statistical Science Vol. 6 (1), 1991)

"We live on an island surrounded by a sea of ignorance. As our island of knowledge grows, so does the shore of our ignorance." (John A Wheeler, Scientific American Vol. 267, 1992)

"Indeed, knowledge that one will be judged on some criterion of ‘creativeness’ or ‘originality’ tends to narrow the scope of what one can produce (leading to products that are then judged as relatively conventional); in contrast, the absence of an evaluations seems to liberate creativity." (Howard Gardner, "Creating Minds", 1993)

"Knowledge is theory. We should be thankful if action of management is based on theory. Knowledge has temporal spread. Information is not knowledge. The world is drowning in information but is slow in acquisition of knowledge. There is no substitute for knowledge." (William E Deming, "The New Economics for Industry, Government, Education", 1993)

"Worldviews are social constructions, and they channel the search for facts. But facts are found and knowledge progresses, however fitfully. Fact and theory are intertwined, and all great scientists understand the interaction." (Stephen J Gould, "Shields of Expectation - and Actuality", 1993)

"At the very least (there is certainly more), cybernetics implies a new philosophy about (1) what we can know, (2) about what it means for something to exist, and (3) about how to get things done. Cybernetics implies that knowledge is to be built up through effective goal-seeking processes, and perhaps not necessarily in uncovering timeless, absolute, attributes of things, irrespective of our purposes and needs." (Jeff Dooley, "Thoughts on the Question: What is Cybernetics", 1995)

"Crude complexity is ‘the length of the shortest message that will describe a system, at a given level of coarse graining, to someone at a distance, employing language, knowledge, and understanding that both parties share (and know they share) beforehand." (Murray Gell-Mann, "What is Complexity?" Complexity Vol. 1 (1), 1995)

"Humans may crave absolute certainty; they may aspire to it; they may pretend, as partisans of certain religions do, to have attained it. But the history of science - by far the most successful claim to knowledge accessible to humans - teaches that the most we can hope for is successive improvement in our understanding, learning from our mistakes, an asymptotic approach to the Universe, but with the proviso that absolute certainty will always elude us. We will always be mired in error. The most each generation can hope for is to reduce the error bars a little, and to add to the body of data to which error bars apply." (Carl Sagan, "The Demon-Haunted World: Science as a Candle in the Dark", 1995)

"The amount of understanding produced by a theory is determined by how well it meets the criteria of adequacy - testability, fruitfulness, scope, simplicity, conservatism - because these criteria indicate the extent to which a theory systematizes and unifies our knowledge." (Theodore Schick Jr., "How to Think about Weird Things: Critical Thinking for a New Age", 1995)

"The representational nature of maps, however, is often ignored - what we see when looking at a map is not the word, but an abstract representation that we find convenient to use in place of the world. When we build these abstract representations we are not revealing knowledge as much as are creating it." (Alan MacEachren, "How Maps Work: Representation, Visualization, and Design", 1995)

"The term mental model refers to knowledge structures utilized in the solving of problems. Mental models are causal and thus may be functionally defined in the sense that they allow a problem solver to engage in description, explanation, and prediction. Mental models may also be defined in a structural sense as consisting of objects, states that those objects exist in, and processes that are responsible for those objects’ changing states." (Robert Hafner & Jim Stewart, "Revising Explanatory Models to Accommodate Anomalous Genetic Phenomena: Problem Solving in the ‘Context of Discovery’", Science Education 79 (2), 1995) 

"Generalization is the process of matching new, unknown input data with the problem knowledge in order to obtain the best possible solution, or one close to it. Generalization means reacting properly to new situations, for example, recognizing new images, or classifying new objects and situations. Generalization can also be described as a transition from a particular object description to a general concept description. This is a major characteristic of all intelligent systems." (Nikola K Kasabov, "Foundations of Neural Networks, Fuzzy Systems, and Knowledge Engineering", 1996)

"In the new systems thinking, the metaphor of knowledge as a building is being replaced by that of the network. As we perceive reality as a network of relationships, our descriptions, too, form an interconnected network of concepts and models in which there are no foundations. For most scientists such a view of knowledge as a network with no firm foundations is extremely unsettling, and today it is by no means generally accepted. But as the network approach expands throughout the scientific community, the idea of knowledge as a network will undoubtedly find increasing acceptance." (Fritjof Capra," The Web of Life: a new scientific understanding of living systems", 1996)

"Discourses are ways of referring to or constructing knowledge about a particular topic of practice: a cluster (or formation) of ideas, images and practices, which provide ways of talking about, forms of knowledge and conduct associated with, a particular topic, social activity or institutional site in society. These discursive formations, as they are known, define what is and is not appropriate in our formulation of, and our practices in relation to, a particular subject or site of social activity." (Stuart Hall, "Representation: Cultural Representations and Signifying Practices", 1997)

"Data is discrimination between physical states of things (black, white, etc.) that may convey or not convey information to an agent. Whether it does so or not depends on the agent's prior stock of knowledge." (Max Boisot, "Knowledge Assets", 1998)

"The social constructivist thesis is that mathematics is a social construction, a cultural product, fallible like any other branch of knowledge." (Paul Ernest, "Social Constructivism as a Philosophy of Mathematics", 1998)

"An individual understands a concept, skill, theory, or domain of knowledge to the extent that he or she can apply it appropriately in a new situation." (Howard Gardner, "The Disciplined Mind", 1999)

"Analysis of a system reveals its structure and how it works. It provides the knowledge required to make it work efficiently and to repair it when it stops working. Its product is know-how, knowledge, not understanding. To enable a system to perform effectively we must understand it - we must be able to explain its behavior—and this requires being aware of its functions in the larger systems of which it is a part." (Russell L Ackoff, "Re-Creating the Corporation", 1999)

On Knowledge (1980-1989)

"Definitions, like questions and metaphors, are instruments for thinking. Their authority rests entirely on their usefulness, not their correctness. We use definitions in order to delineate problems we wish to investigate, or to further interests we wish to promote. In other words, we invent definitions and discard them as suits our purposes." (Neil Postman, "Language Education in a Knowledge Context", 1980)

"A schema, then is a data structure for representing the generic concepts stored in memory. There are schemata representing our knowledge about all concepts; those underlying objects, situations, events, sequences of events, actions and sequences of actions. A schema contains, as part of its specification, the network of interrelations that is believed to normally hold among the constituents of the concept in question. A schema theory embodies a prototype theory of meaning. That is, inasmuch as a schema underlying a concept stored in memory corresponds to the meaning of that concept, meanings are encoded in terms of the typical or normal situations or events that instantiate that concept." (David E Rumelhart, "Schemata: The building blocks of cognition", 1980)

“Analogies, metaphors, and emblems are the threads by which the mind holds on to the world even when, absentmindedly, it has lost direct contact with it, and they guarantee the unity of human experience. Moreover, in the thinking process itself they serve as models to give us our bearings lest we stagger blindly among experiences that our bodily senses with their relative certainty of knowledge cannot guide us through.” (Hannah Arendt, “The Life of the Mind”, 1981)

"Knowledge specialists may ascribe a degree of certainty to their models of the world that baffles and offends managers. Often the complexity of the world cannot be reduced to mathematical abstractions that make sense to a manager. Managers who expect complete, one-to-one correspondence between the real world and each element in a model are disappointed and skeptical." (Dale E Zand, "Information, Organization, and Power", 1981)

"The thinking person goes over the same ground many times. He looks at it from varying points of view - his own, his arch-enemy’s, others’. He diagrams it, verbalizes it, formulates equations, constructs visual images of the whole problem, or of troublesome parts, or of what is clearly known. But he does not keep a detailed record of all this mental work, indeed could not. […] Deep understanding of a domain of knowledge requires knowing it in various ways. This multiplicity of perspectives grows slowly through hard work and sets the state for the re-cognition we experience as a new insight." (Howard E Gruber, "Darwin on Man", 1981)

"Definitions are temporary verbalizations of concepts, and concepts - particularly difficult concepts - are usually revised repeatedly as our knowledge and understanding grows." (Ernst Mayr, "The Growth of Biological Thought", 1982)

"We are drowning in information but starved for knowledge." (John Naisbitt, "Megatrends: Ten New Directions Transforming Our Lives", 1982)

"We define a semantic network as 'the collection of all the relationships that concepts have to other concepts, to percepts, to procedures, and to motor mechanisms' of the knowledge." (John F Sowa, "Conceptual Structures", 1984)

"Contrary to the impression students acquire in school, mathematics is not just a series of techniques. Mathematics tells us what we have never known or even suspected about notable phenomena and in some instances even contradicts perception. It is the essence of our knowledge of the physical world. It not only transcends perception but outclasses it." (Morris Kline, "Mathematics and the Search for Knowledge", 1985)

"Knowledge is the appropriate collection of information, such that it's intent is to be useful. Knowledge is a deterministic process. When someone 'memorizes' information (as less-aspiring test-bound students often do), then they have amassed knowledge. This knowledge has useful meaning to them, but it does not provide for, in and of itself, an integration such as would infer further knowledge." (Russell L Ackoff, "Towards a Systems Theory of Organization", 1985)

"Although science literally means ‘knowledge’, the scientific attitude is concerned much more with rational perception through the mind and with testing such perceptions against actual fact, in the form of experiments and observations." (David Bohm & F David Peat, "Science, Order, and Creativity", 1987)

"There is no coherent knowledge, i.e. no uniform comprehensive account of the world and the events in it. There is no comprehensive truth that goes beyond an enumeration of details, but there are many pieces of information, obtained in different ways from different sources and collected for the benefit of the curious. The best way of presenting such knowledge is the list - and the oldest scientific works were indeed lists of facts, parts, coincidences, problems in several specialized domains." (Paul K Feyerabend, "Farewell to Reason", 1987)

"We admit knowledge whenever we observe an effective (or adequate) behavior in a given context, i.e., in a realm or domain which we define by a question (explicit or implicit)." (Humberto Maturana & Francisco J Varela, "The Tree of Knowledge", 1987)

"In the Information Age, the first step to sanity is FILTERING. Filter the information: extract for knowledge. Filter first for substance. Filter second for significance. […] Filter third for reliability. […] Filter fourth for completeness." (Marc Stiegler, "David’s Sling", 1988)

"Probabilities are summaries of knowledge that is left behind when information is transferred to a higher level of abstraction." (Judea Pearl, "Probabilistic Reasoning in Intelligent Systems: Network of Plausible, Inference", 1988)

"Science doesn't purvey absolute truth. Science is a mechanism. It's a way of trying to improve your knowledge of nature. It's a system for testing your thoughts against the universe and seeing whether they match. And this works, not just for the ordinary aspects of science, but for all of life. I should think people would want to know that what they know is truly what the universe is like, or at least as close as they can get to it." (Isaac Asimov, [Interview by Bill Moyers] 1988)

"A discovery in science, or a new theory, even where it appears most unitary and most all-embracing, deals with some immediate element of novelty or paradox within the framework of far vaster, unanalyzed, unarticulated reserves of knowledge, experience, faith, and presupposition. Our progress is narrow: it takes a vast world unchallenged and for granted." (James R Oppenheimer, "Atom and Void", 1989)

On Knowledge (1960-1969)

"Any pattern of activity in a network, regarded as consistent by some observer, is a system, Certain groups of observers, who share a common body of knowledge, and subscribe to a particular discipline, like 'physics' or 'biology' (in terms of which they pose hypotheses about the network), will pick out substantially the same systems. On the other hand, observers belonging to different groups will not agree about the activity which is a system." (Gordon Pask, "The Natural History of Networks", 1960)

"The most important maxim for data analysis to heed, and one which many statisticians seem to have shunned is this: ‘Far better an approximate answer to the right question, which is often vague, than an exact answer to the wrong question, which can always be made precise.’ Data analysis must progress by approximate answers, at best, since its knowledge of what the problem really is will at best be approximate." (John W Tukey, "The Future of Data Analysis", Annals of Mathematical Statistics, Vol. 33, No. 1, 1962)

"Incomplete knowledge must be considered as perfectly normal in probability theory; we might even say that, if we knew all the circumstances of a phenomenon, there would be no place for probability, and we would know the outcome with certainty." (Félix E Borel, Probability and Certainty", 1963)

"When a science approaches the frontiers of its knowledge, it seeks refuge in allegory or in analogy." (Erwin Chargaff, "Essays on Nucleic Acids", 1963)

"In its efforts to learn as much as possible about nature, modem physics has found that certain things can never be ‘known’ with certainty. Much of our knowledge must always remain uncertain. The most we can know is in terms of probabilities." (Richard P Feynman, "The Feynman Lectures on Physics", 1964)

"A model is a useful (and often indispensable) framework on which to organize our knowledge about a phenomenon. […] It must not be overlooked that the quantitative consequences of any model can be no more reliable than the a priori agreement between the assumptions of the model and the known facts about the real phenomenon. When the model is known to diverge significantly from the facts, it is self-deceiving to claim quantitative usefulness for it by appeal to agreement between a prediction of the model and observation." (John R Philip, 1966)

"It is a commonplace of modern technology that there is a high measure of certainty that problems have solutions before there is knowledge of how they are to be solved." (John K Galbraith, "The New Industrial State", 1967)

"The aim of science is not so much to search for truth, or even truths, as to classify our knowledge and to establish relations between observable phenomena in order to be able to predict the future in a certain measure and to explain the sequence of phenomena in relation to ourselves." (Pierre L du Noüy, "Between Knowing and Believing", 1967)

"It [knowledge] is clearly related to information, which we can now measure; and an economist especially is tempted to regard knowledge as a kind of capital structure, corresponding to information as an income flow. Knowledge, that is to say, is some kind of improbable structure or stock made up essentially of patterns - that is, improbable arrangements, and the more improbable the arrangements, we might suppose, the more knowledge there is." (Kenneth E Boulding, "Beyond Economics: Essays on Society", 1968)

"Knowing reality means constructing systems of transformations that correspond, more or less adequately, to reality. They are more or less isomorphic to transformations of reality. The transformational structures of which knowledge consists are not copies of the transformations in reality; they are simply possible isomorphic models among which experience can enable us to choose. Knowledge, then, is a system of transformations that become progressively adequate." (Jean Piaget, "Genetic Epistemology", 1968)

"Scientific knowledge is not created solely by the piecemeal mining of discrete facts by uniformly accurate and reliable individual scientific investigations. The process of criticism and evaluation, of analysis and synthesis, are essential to the whole system. It is impossible for each one of us to be continually aware of all that is going on around us, so that we can immediately decide the significance of every new paper that is published. The job of making such judgments must therefore be delegated to the best and wisest among us, who speak, not with their own personal voices, but on behalf of the whole community of Science. […] It is impossible for the consensus - public knowledge - to be voiced at all, unless it is channeled through the minds of selected persons, and restated in their words for all to hear." (John M Ziman, "Public Knowledge: An Essay Concerning the Social Dimension of Science", 1968)

"The idea of knowledge as an improbable structure is still a good place to start. Knowledge, however, has a dimension which goes beyond that of mere information or improbability. This is a dimension of significance which is very hard to reduce to quantitative form. Two knowledge structures might be equally improbable but one might be much more significant than the other." (Kenneth E Boulding, "Beyond Economics: Essays on Society", 1968)

"Discovery always carries an honorific connotation. It is the stamp of approval on a finding of lasting value. Many laws and theories have come and gone in the history of science, but they are not spoken of as discoveries. […] Theories are especially precarious, as this century profoundly testifies. World views can and do often change. Despite these difficulties, it is still true that to count as a discovery a finding must be of at least relatively permanent value, as shown by its inclusion in the generally accepted body of scientific knowledge." (Richard J. Blackwell, "Discovery in the Physical Sciences", 1969)

"It is not enough to observe, experiment, theorize, calculate and communicate; we must also argue, criticize, debate, expound, summarize, and otherwise transform the information that we have obtained individually into reliable, well established, public knowledge." (John M Ziman, "Information, Communication, Knowledge", Nature Vol. 224 (5217), 1969)

"Models constitute a framework or a skeleton and the flesh and blood will have to be added by a lot of common sense and knowledge of details."(Jan Tinbergen, "The Use of Models: Experience," 1969)

"The 'flow of information' through human communication channels is enormous. So far no theory exists, to our knowledge, which attributes any sort of unambiguous measure to this 'flow'." (Anatol Rapoport, "Modern Systems Research for the Behavioral Scientist", 1969)

08 June 2021

On Patterns (1960-1969)

"Any pattern of activity in a network, regarded as consistent by some observer, is a system, Certain groups of observers, who share a common body of knowledge, and subscribe to a particular discipline, like 'physics' or 'biology' (in terms of which they pose hypotheses about the network), will pick out substantially the same systems. On the other hand, observers belonging to different groups will not agree about the activity which is a system." (Gordon Pask, The Natural History of Networks, 1960)

"It is of our very nature to see the universe as a place that we can talk about. In particular, you will remember, the brain tends to compute by organizing all of its input into certain general patterns. It is natural for us, therefore, to try to make these grand abstractions, to seek for one formula, one model, one God, around which we can organize all our communication and the whole business of living." (John Z Young, "Doubt and Certainty in Science: A Biologist’s Reflections on the Brain", 1960)

"How can a modern anthropologist embark upon a generalization with any hope of arriving at a satisfactory conclusion? By thinking of the organizational ideas that are present in any society as a mathematical pattern." (Edmund R Leach, "Rethinking Anthropology", 1961)

"Mathematics is a creation of the mind. To begin with, there is a collection of things, which exist only in the mind, assumed to be distinguishable from one another; and there is a collection of statements about these things, which are taken for granted. Starting with the assumed statements concerning these invented or imagined things, the mathematician discovers other statements, called theorems, and proves them as necessary consequences. This, in brief, is the pattern of mathematics. The mathematician is an artist whose medium is the mind and whose creations are ideas." (Hubert S Wall, "Creative Mathematics", 1963)

"The mark of our time is its revulsion against imposed patterns." (Marshall McLuhan, "Understanding Media", 1964)

"The 'message' of any medium or technology is the change of scale or pace or pattern that it introduces into human affairs." (Marshall McLuhan, "Understanding Media", 1964)

"Without the hard little bits of marble which are called 'facts' or 'data' one cannot compose a mosaic; what matters, however, are not so much the individual bits, but the successive patterns into which you arrange them, then break them up and rearrange them." (Arthur Koestler, "The Act of Creation", 1964)

"The notion of a fuzzy set provides a convenient point of departure for the construction of a conceptual framework which parallels in many respects the framework used in the case of ordinary sets, but is more general than the latter and, potentially, may prove to have a much wider scope of applicability, particularly in the fields of pattern classification and information processing. Essentially, such a framework provides a natural way of dealing with problems in which the source of imprecision is the absence of sharply denned criteria of class membership rather than the presence of random variables." (Lotfi A Zadeh, "Fuzzy Sets", 1965)

"As perceivers we select from all the stimuli falling on our senses only those which interest us, and our interests are governed by a pattern-making tendency, sometimes called a schema. In a chaos of shifting impressions each of us constructs a stable world in which objects have recognisable shapes, are located in depth and have permanence." (Mary Douglas, "Purity and Danger", 1966)

"System theory is basically concerned with problems of relationships, of structure, and of interdependence rather than with the constant attributes of objects. In general approach it resembles field theory except that its dynamics deal with temporal as well as spatial patterns. Older formulations of system constructs dealt with the closed systems of the physical sciences, in which relatively self-contained structures could be treated successfully as if they were independent of external forces. But living systems, whether biological organisms or social organizations, are acutely dependent on their external environment and so must be conceived of as open systems." (Daniel Katz, "The Social Psychology of Organizations", 1966)

"[…] there is perhaps a difference between the ideas which are associated in the sense of their patterns being tired to the original one and available in connexion with it, and being actually associated or aroused. Our mental modelling of the outer world may imitate it and its sequences from moment to moment, but only that which is fairly frequent, or fits into other patterns, will remain for long, and of that only a portion will arise in response to other ideas. " (Kenneth J W Craik, "The Nature of Psychology", 1966)

"It [knowledge] is clearly related to information, which we can now measure; and an economist especially is tempted to regard knowledge as a kind of capital structure, corresponding to information as an income flow. Knowledge, that is to say, is some kind of improbable structure or stock made up essentially of patterns - that is, improbable arrangements, and the more improbable the arrangements, we might suppose, the more knowledge there is." (Kenneth Boulding, "Beyond Economics: Essays on Society", 1968)

"The central task of a natural science is to make the wonderful commonplace: to show that complexity, correctly viewed, is only a mask for simplicity; to find pattern hidden in apparent chaos. […] This is the task of natural science: to show that the wonderful is not incomprehensible, to show how it can be comprehended - but not to destroy wonder. For when we have explained the wonderful, unmasked the hidden pattern, a new wonder arises at how complexity was woven out of simplicity. The aesthetics of natural science and mathematics is at one with the aesthetics of music and painting - both inhere in the discovery of a partially concealed pattern." (Herbert A Simon, "The Sciences of the Artificial", 1968)

"Faced with information overload, we have no alternative but pattern-recognition."(Marshall McLuhan, "Counterblast", 1969) 

"The central task of a natural science is to make the wonderful commonplace: to show that complexity, correctly viewed, is only a mask for simplicity; to find pattern hidden in apparent chaos." (Herbert A Simon, "The Sciences of the Artificial", 1969)

"Visual thinking calls, more broadly, for the ability to see visual shapes as images of the patterns of forces that underlie our existence - the functioning of minds, of bodies or machines, the structure of societies or ideas." (Rudolf Arnheim, "Visual Thinking", 1969)

On Patterns (2010-2019)

"Because the question for me was always whether that shape we see in our lives was there from the beginning or whether these random events are only called a pattern after the fact. Because otherwise we are nothing." (Cormac McCarthy, "All the Pretty Horses", 2010)

"The human mind delights in finding pattern - so much so that we often mistake coincidence or forced analogy for profound meaning. No other habit of thought lies so deeply within the soul of a small creature trying to make sense of a complex world not constructed for it." (Stephen J Gould, "The Flamingo's Smile: Reflections in Natural History", 2010)

"What advantages do diagrams have over verbal descriptions in promoting system understanding? First, by providing a diagram, massive amounts of information can be presented more efficiently. A diagram can strip down informational complexity to its core - in this sense, it can result in a parsimonious, minimalist description of a system. Second, a diagram can help us see patterns in information and data that may appear disordered otherwise. For example, a diagram can help us see mechanisms of cause and effect or can illustrate sequence and flow in a complex system. Third, a diagram can result in a less ambiguous description than a verbal description because it forces one to come up with a more structured description." (Robbie T Nakatsu, "Diagrammatic Reasoning in AI", 2010)

"A surprising proportion of mathematicians are accomplished musicians. Is it because music and mathematics share patterns that are beautiful?" (Martin Gardner, "The Dover Math and Science Newsletter", 2011)

"It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern." (Daniel Kahneman, "Thinking, Fast and Slow", 2011)

"Knowing the importance of luck, you should be particularly suspicious when highly consistent patterns emerge from the comparison of successful and less successful firms. In the presence of randomness, regular patterns can only be mirages." (Daniel Kahneman, "Thinking, Fast and Slow", 2011)

"Once a myth becomes established, it forms part of our mental model of the world and alters our perception, the way our brains interpret the fleeting patterns our eyes pick up." (Jeremy Wade, "River Monsters: True Stories of the Ones that Didn't Get Away", 2011)

"Randomness might be defined in terms of order - its absence, that is. […] Everything we care about lies somewhere in the middle, where pattern and randomness interlace." (James Gleick, "The Information: A History, a Theory, a Flood", 2011)

"Equations have hidden powers. They reveal the innermost secrets of nature. […] The power of equations lies in the philosophically difficult correspondence between mathematics, a collective creation of human minds, and an external physical reality. Equations model deep patterns in the outside world. By learning to value equations, and to read the stories they tell, we can uncover vital features of the world around us." (Ian Stewart, "In Pursuit of the Unknown", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"Mathematical intuition is the mind’s ability to sense form and structure, to detect patterns that we cannot consciously perceive. Intuition lacks the crystal clarity of conscious logic, but it makes up for that by drawing attention to things we would never have consciously considered." (Ian Stewart, "Visions of Infinity", 2013)

"Proof, in fact, is the requirement that makes great problems problematic. Anyone moderately competent can carry out a few calculations, spot an apparent pattern, and distil its essence into a pithy statement. Mathematicians demand more evidence than that: they insist on a complete, logically impeccable proof. Or, if the answer turns out to be negative, a disproof. It isn’t really possible to appreciate the seductive allure of a great problem without appreciating the vital role of proof in the mathematical enterprise. Anyone can make an educated guess. What’s hard is to prove it’s right. Or wrong." (Ian Stewart, "Visions of Infinity", 2013)

"Swarm intelligence illustrates the complex and holistic way in which the world operates. Order is created from chaos; patterns are revealed; and systems are free to work out their errors and problems at their own level. What natural systems can teach humanity is truly amazing." (Lawrence K Samuels, "Defense of Chaos: The Chaology of Politics, Economics and Human Action", 2013)

"To put it simply, we communicate when we display a convincing pattern, and we discover when we observe deviations from our expectations. These may be explicit in terms of a mathematical model or implicit in terms of a conceptual model. How a reader interprets a graphic will depend on their expectations. If they have a lot of background knowledge, they will view the graphic differently than if they rely only on the graphic and its surrounding text." (Andrew Gelman & Antony Unwin, "Infovis and Statistical Graphics: Different Goals, Different Looks", Journal of Computational and Graphical Statistics Vol. 22(1), 2013)

"Another way to secure statistical significance is to use the data to discover a theory. Statistical tests assume that the researcher starts with a theory, collects data to test the theory, and reports the results - whether statistically significant or not. Many people work in the other direction, scrutinizing the data until they find a pattern and then making up a theory that fits the pattern." (Gary Smith, "Standard Deviations", 2014)

"Intersections of lines, for example, remain intersections, and the hole in a torus (doughnut) cannot be transformed away. Thus a doughnut may be transformed topologically into a coffee cup (the hole turning into a handle) but never into a pancake. Topology, then, is really a mathematics of relationships, of unchangeable, or 'invariant', patterns." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"One of the remarkable features of these complex systems created by replicator dynamics is that infinitesimal differences in starting positions create vastly different patterns. This sensitive dependence on initial conditions is often called the butterfly-effect aspect of complex systems - small changes in the replicator dynamics or in the starting point can lead to enormous differences in outcome, and they change one’s view of how robust the current reality is. If it is complex, one small change could have led to a reality that is quite different." (David Colander & Roland Kupers, "Complexity and the art of public policy : solving society’s problems from the bottom up", 2014)

"[…] regard it in fact as the great advantage of the mathematical technique that it allows us to describe, by means of algebraic equations, the general character of a pattern even where we are ignorant of the numerical values which will determine its particular manifestation." (Friedrich A von Hayek, "The Market and Other Orders", 2014)

"We are genetically predisposed to look for patterns and to believe that the patterns we observe are meaningful. […] Don’t be fooled into thinking that a pattern is proof. We need a logical, persuasive explanation and we need to test the explanation with fresh data." (Gary Smith, "Standard Deviations", 2014)

"We are hardwired to make sense of the world around us - to notice patterns and invent theories to explain these patterns. We underestimate how easily pat - terns can be created by inexplicable random events - by good luck and bad luck." (Gary Smith, "Standard Deviations", 2014)

"A pattern is a design or model that helps grasp something. Patterns help connect things that may not appear to be connected. Patterns help cut through complexity and reveal simpler understandable trends. […] Patterns can be temporal, which is something that regularly occurs over time. Patterns can also be spatial, such as things being organized in a certain way. Patterns can be functional, in that doing certain things leads to certain effects. Good patterns are often symmetric. They echo basic structures and patterns that we are already aware of." (Anil K Maheshwari, "Business Intelligence and Data Mining", 2015)

"The human mind builds up theories by recognising familiar patterns and glossing over details that are well understood, so that it can concentrate on the new material. In fact it is limited by the amount of new information it can hold at any one time, and the suppression of familiar detail is often essential for a grasp of the total picture. In a written proof, the step-by-step logical deduction is therefore foreshortened where it is already a part of the reader's basic technique, so that they can comprehend the overall structure more easily." (Ian Stewart & David Tall, "The Foundations of Mathematics" 2nd Ed., 2015)

"Why do mathematicians care so much about pi? Is it some kind of weird circle fixation? Hardly. The beauty of pi, in part, is that it puts infinity within reach. Even young children get this. The digits of pi never end and never show a pattern. They go on forever, seemingly at random - except that they can’t possibly be random, because they embody the order inherent in a perfect circle. This tension between order and randomness is one of the most tantalizing aspects of pi." (Steven Strogatz, "Why PI Matters" 2015)

"Without chaos there would be no creation, no structure and no existence. After all, order is merely the repetition of patterns; chaos is the process that establishes those patterns. Without this creative self-organizing force, the universe would be devoid of biological life, the birth of stars and galaxies - everything we have come to know. (Lawrence K Samuels, "Chaos Gets a Bad Rap: Importance of Chaology to Liberty", 2015)

"A mental representation is a mental structure that corresponds to an object, an idea, a collection of information, or anything else, concrete or abstract, that the brain is thinking about. […] Because the details of mental representations can differ dramatically from field to field, it’s hard to offer an overarching definition that is not too vague, but in essence these representations are preexisting patterns of information - facts, images, rules, relationships, and so on - that are held in long-term memory and that can be used to respond quickly and effectively in certain types of situations." (Anders Ericsson & Robert Pool," Peak: Secrets from  the  New  Science  of  Expertise", 2016)

"String theory today looks almost fractal. The more closely people explore any one corner, the more structure they find. Some dig deep into particular crevices; others zoom out to try to make sense of grander patterns. The upshot is that string theory today includes much that no longer seems stringy. Those tiny loops of string whose harmonics were thought to breathe form into every particle and force known to nature (including elusive gravity) hardly even appear anymore on chalkboards at conferences." (K C Cole, "The Strange Second Life of String Theory", Quanta Magazine", 2016)

"The relationship of math to the real world has been a conundrum for philosophers for centuries, but it is also an inspiration for poets. The patterns of mathematics inhabit a liminal space - they were initially derived from the natural world and yet seem to exist in a separate, self-contained system standing apart from that world. This makes them a source of potential metaphor: mapping back and forth between the world of personal experience and the world of mathematical patterns opens the door to novel connections." (Alice Major, "Mapping from e to Metaphor", 2018)

"Apart from the technical challenge of working with the data itself, visualization in big data is different because showing the individual observations is just not an option. But visualization is essential here: for analysis to work well, we have to be assured that patterns and errors in the data have been spotted and understood. That is only possible by visualization with big data, because nobody can look over the data in a table or spreadsheet." (Robert Grant, "Data Visualization: Charts, Maps and Interactive Graphics", 2019)

02 June 2021

On Structure: Structure in Mathematics II

"[…] the major mathematical research acquires an organization and orientation similar to the poetical function which, adjusting by means of metaphor disjunctive elements, displays a structure identical to the sensitive universe. Similarly, by means of its axiomatic or theoretical foundation, mathematics assimilates various doctrines and serves the instructive purpose, the one set up by the unifying moral universe of concepts." (Dan Barbilian, "The Autobiography of the Scientist", 1940)

"Mathematicians deal with possible worlds, with an infinite number of logically consistent systems. Observers explore the one particular world we inhabit. Between the two stands the theorist. He studies possible worlds but only those which are compatible with the information furnished by observers. In other words, theory attempts to segregate the minimum number of possible worlds which must include the actual world we inhabit. Then the observer, with new factual information, attempts to reduce the list further. And so it goes, observation and theory advancing together toward the common goal of science, knowledge of the structure and observation of the universe." (Edwin P Hubble, "The Problem of the Expanding Universe", 1941)

"To say that mathematics in general has been reduced to logic hints at some new firming up of mathematics at its foundations. This is misleading. Set theory is less settled and more conjectural than the classical mathematical superstructure than can be founded upon it." (Willard van Orman Quine, "Elementary Logic", 1941)

"One expects a mathematical theorem or a mathematical theory not only to describe and to classify in a simple and elegant way numerous and a priori disparate special cases. One also expects ‘elegance’ in its ‘architectural’ structural makeup." (John von Neumann, "The Mathematician" [in "Works of the Mind" Vol. I (1), 1947])

"The constructions of the mathematical mind are at the same time free and necessary. The individual mathematician feels free to define his notions and set up his axioms as he pleases. But the question is will he get his fellow-mathematician interested in the constructs of his imagination. We cannot help the feeling that certain mathematical structures which have evolved through the combined efforts of the mathematical community bear the stamp of a necessity not affected by the accidents of their historical birth. Everybody who looks at the spectacle of modern algebra will be struck by this complementarity of freedom and necessity." (Hermann Weyl, "A Half-Century of Mathematics", The American Mathematical Monthly, 1951)

"Mathematicians create by acts of insight and intuition. Logic then sanctions the conquests of intuition. It is the hygiene that mathematics practice to keep its ideas healthy and strong. Moreover, the whole structure rests fundamentally on uncertain ground, the intuitions of man." (Morris Kline, "Mathematics in Western Culture", 1953)

"Mathematics is not only the model along the lines of which the exact sciences are striving to design their structure; mathematics is the cement which holds the structure together." (Tobias Dantzig, "Number: The Language of Science" 4th Ed, 1954)

"Mathematics, springing from the soil of basic human experience with numbers and data and space and motion, builds up a far-flung architectural structure composed of theorems which reveal insights into the reasons behind appearances and of concepts which relate totally disparate concrete ideas." (Saunders MacLane, "Of Course and Courses"The American Mathematical Monthly, Vol 61, No 3, 1954)

"Chess combines the beauty of mathematical structure with the recreational delights of a competitive game." (Martin Gardner, "Mathematics, Magic, and Mystery", 1956)

"Probability is a mathematical discipline with aims akin to those, for example, of geometry or analytical mechanics. In each field we must carefully distinguish three aspects of the theory: (a) the formal logical content, (b) the intuitive background, (c) the applications. The character, and the charm, of the whole structure cannot be appreciated without considering all three aspects in their proper relation." (William Feller, "An Introduction to Probability Theory and Its Applications", 1957)

"If the system exhibits a structure which can be represented by a mathematical equivalent, called a mathematical model, and if the objective can be also so quantified, then some computational method may be evolved for choosing the best schedule of actions among alternatives. Such use of mathematical models is termed mathematical programming."  (George Dantzig, "Linear Programming and Extensions", 1959)

Complexity (From Fiction to Science-Fiction)

"When distant and unfamiliar and complex things are communicated to great masses of people, the truth suffers a considerable and often a radical distortion. The complex is made over into the simple, the hypothetical into the dogmatic, and the relative into an absolute." (Walter Lippmann, "The Public Philosophy", 1955)

"The more complex a civilization, the more vital to its existence is the maintenance of the flow of information; hence the more vulnerable it becomes to any disturbance in that flow." (Stanislaw Lem, "Memoirs Found in a Bathtub", 1961)

"[Human] communication is rendered more complex by the use of differing sets of sound-symbols, called languages and by the fact that a given set of symbols tends to change with the passage of years to become an entirely new language." (Howard L Myers, "The Creatures of Man", 1968)

"Isn’t the measure of complexity the measure of the eternal joy?" (Ursula K Le Guin, "Vaster Than Empires and More Slow", 1971)

"Time is no longer a line along which history, past or future, lies neatly arranged, but a field of great mystery and complexity, in the contemplation of which the mind perceives an immense terror, and an indestructible hope." (Ursula K Le Guin, 1977)

"Any information system of sufficient complexity will inevitably become infected with viruses - viruses generated from within itself." (Neal Stephenson, "Snow Crash", 1992)

"The universe is driven by the complex interaction between three ingredients: matter, energy, and enlightened self-interest." (Marc S Zicree, "Survivors" [episode of Babylon 5], 1994)

