"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)
"Progress imposes not only new possibilities for the future but new restrictions. It seems almost as if progress itself and our fight against the increase of entropy intrinsically must end in the downhill path from which we are trying to escape." (Norbert Wiener, "The Human Use of Human Beings", 1950)
"There is no concept in the whole field of physics which is more difficult to understand than is the concept of entropy, nor is there one which is more fundamental." (Francis W Sears, "Mechanics, heat and sound", 1950)
"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)
"Reversible processes are not, in fact, processes at all, they are sequences of states of equilibrium. The processes which we encounter in real life are always irreversible processes." (Arnold Sommerfeld, "Thermodynamics and Statistical Mechanics", Lectures on Theoretical - Physics Vol. V, 1956)
"The mere fact that the same mathematical expression -Σ pi log(pi) [i is index], occurs both in statistical mechanics and in information theory does not in itself establish any connection between these fields. This can be done only by finding new viewpoints from which thermodynamic entropy and information-theory entropy appear as the same concept." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)
"The second law of thermodynamics provides a more modem" (and a more discouraging) example of the maximum principle: the entropy" (disorder) of the universe tends toward a maximum." (James R Newman, "The World of Mathematics" Vol. II, 1956)
"There is no logical necessity for the existence of a unique direction of total time; whether there is only one time direction, or whether time directions alternate, depends on the shape of the entropy curve plotted by the universe." (Hans Reichenbach, "The Direction of Time", 1956)
"When, for instance, I see a symmetrical object, I feel its pleasurable quality, but do not need to assert explicitly to myself, ‘How symmetrical!’. This characteristic feature may be explained as follows. In the course of individual experience it is found generally that symmetrical objects possess exceptional and desirable qualities. Thus our own bodies are not regarded as perfectly formed unless they are symmetrical. Furthermore, the visual and tactual technique by which we perceive the symmetry of various objects is uniform, highly developed, and almost instantaneously applied. It is this technique which forms the associative 'pointer.' In consequence of it, the perception of any symmetrical object is accompanied by an intuitive aesthetic feeling of positive tone." (George D Birkhoff, "Mathematics of Aesthetics", 1956)
"But in addition to what we decide to do by way of transformation, there are certain tendencies in the way systems behave of their own volition when left to their own devices. The convenient analogy for one of these processes is found in the second law of thermodynamics: an 'ordering' process goes on, for which the name is entropy. This can be explained without technicalities as the tendency of a system to settle down to a uniform distribution of its energy. The dissipation of local pockets of high energy is measured by an increase in entropy, until at maximum entropy all is uniform. According to this model, order is more 'natural' than chaos. This is the reason why it is convenient to discuss cybernetic systems, with their self-regulating tendency to attain stability or orderliness, in terms of entropy - a term which has been taken over to name a key tool of cybernetics." (Stafford Beer, "Cybernetics and Management", 1959)
"[…] to the unpreoccupied mind, complex numbers are far from natural or simple and they cannot be suggested by physical observations. Furthermore, the use of complex numbers is in this case not a calculational trick of applied mathematics but comes close to being a necessity in the formulation of quantum mechanics." (Eugene Wigner,"The Unreasonable Effectiveness of Mathematics in the Natural Sciences", 1960)
"[...] sciences do not try to explain, they hardly even try to interpret, they mainly make models. By a model is meant a mathematical construct which, with the addition of certain verbal interpretations, describes observed phenomena. The justification of such a mathematical construct is solely and precisely that it is expected to work - that is, correctly to describe phenomena from a reasonably wide area. Furthermore, it must satisfy certain aesthetic criteria - that is, in relation to how much it describes, it must be rather simple." (John von Neumann, Method in the physical sciences", 1961)
"[...] thermodynamics knows of no such notion as the 'entropy of a physical system'. Thermodynamics does have the concept of the entropy of a thermodynamic system; but a given physical system corresponds to many different thermodynamic systems." (Edwin T Jaynes, "Gibbs vs Boltzmann Entropies", 1964)
"As mechanics is the science of motions and forces, so thermodynamics is the science of forces and entropy. What is entropy? Heads have split for a century trying to define entropy in terms of other things. Entropy, like force, is an undefined object, and if you try to define it, you will suffer the same fate as the force definers of the seventeenth and eighteenth centuries: Either you will get something too special or you will run around in a circle." (Clifford Truesdell, "Six Lectures on Modern Natural Philosophy", 1966)
"Despite two centuries of study, the integrals of general dynamical systems remain covered with darkness. To save the classical thermostatics, the practical success of which is shown by the wide use to which it has been put, we must find a way out. That is, we must find some mathematical connection between time averages of the functions of physical interest and the corresponding simple canonical averages." (Clifford Truesdell, "Six Lectures on Modern Natural Philosophy", 1966)
"The mere fact that the same mathematical expression -Σ pi log(pi) [i is index], occurs both in statistical mechanics and in information theory does not in itself establish any connection between these fields. This can be done only by finding new viewpoints from which thermodynamic entropy and information-theory entropy appear as the same concept." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)
"Thermostatics, which even now is usually called thermodynamics, has an unfortunate history and a cancerous tradition. It arose in a chaos of metaphysical and indeed irrational controversy, the traces of which drip their poison even today. As compared with the older science of mechanics and the younger science of electromagnetism, its mathematical structure is meager. Though claims for its breadth of application are often extravagant, the examples from which its principles usually are inferred are most special, and extensive mathematical developments based on fundamental equations, such as typify mechanics and electromagnetism, are wanting. The logical standards acceptable in thermostatics fail to meet the criteria of other exact sciences [...]." (Clifford Truesdell, "Six Lectures on Modern Natural Philosophy", 1966)
"Conventional physics deals only with closed systems, i.e. systems which are considered to be isolated from their environment. [...] However, we find systems which by their very nature and definition are not closed systems. Every living organism is essentially an open system. It maintains itself in a continuous inflow and outflow, a building up and breaking down of components, never being, so long as it is alive, in a state of chemical and thermodynamic equilibrium but maintained in a so-called steady state which is distinct from the latter." (Ludwig von Bertalanffy, "General System Theory", 1968)"
"My analysis of living systems uses concepts of thermodynamics, information theory, cybernetics, and systems engineering, as well as the classical concepts appropriate to each level. The purpose is to produce a description of living structure and process in terms of input and output, flows through systems, steady states, and feedbacks, which will clarify and unify the facts of life." (James G Miller, "Living Systems: Basic Concepts", 1969)
"In general, one might define a complex of semantic components connected by logical constants as a concept. The dictionary of a language is then a system of concepts in which a phonological form and certain syntactic and morphological characteristics are assigned to each concept. This system of concepts is structured by several types of relations. It is supplemented, furthermore, by redundancy or implicational rules […] representing general properties of the whole system of concepts. […] At least a relevant part of these general rules is not bound to particular languages, but represents presumably universal structures of natural languages. They are not learned, but are rather a part of the human ability to acquire an arbitrary natural language." (Manfred Bierwisch, "Semantics", 1970)
"Self-organization can be defined as the spontaneous creation of a globally coherent pattern out of local interactions. Because of its distributed character, this organization tends to be robust, resisting perturbations. The dynamics of a self-organizing system is typically non-linear, because of circular or feedback relations between the components. Positive feedback leads to an explosive growth, which ends when all components have been absorbed into the new configuration, leaving the system in a stable, negative feedback state. Non-linear systems have in general several stable states, and this number tends to increase" (bifurcate) as an increasing input of energy pushes the system farther from its thermodynamic equilibrium." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)
"Self-organization can be defined as the spontaneous creation of a globally coherent pattern out of local interactions. Because of its distributed character, this organization tends to be robust, resisting perturbations. The dynamics of a self-organizing system is typically non-linear, because of circular or feedback relations between the components. Positive feedback leads to an explosive growth, which ends when all components have been absorbed into the new configuration, leaving the system in a stable, negative feedback state. Non-linear systems have in general several stable states, and this number tends to increase" (bifurcate) as an increasing input of energy pushes the system farther from its thermodynamic equilibrium. To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the" (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)
"Mental models are fuzzy, incomplete, and imprecisely stated. Furthermore, within a single individual, mental models change with time, even during the flow of a single conversation. The human mind assembles a few relationships to fit the context of a discussion. As debate shifts, so do the mental models. Even when only a single topic is being discussed, each participant in a conversation employs a different mental model to interpret the subject. Fundamental assumptions differ but are never brought into the open. […] A mental model may be correct in structure and assumptions but, even so, the human mind - either individually or as a group consensus - is apt to draw the wrong implications for the future." (Jay W Forrester,"Counterintuitive Behaviour of Social Systems", Technology Review, 1971)
"Mental models are fuzzy, incomplete, and imprecisely stated. Furthermore, within a single individual, mental models change with time, even during the flow of a single conversation. The human mind assembles a few relationships to fit the context of a discussion. As debate shifts, so do the mental models. Even when only a single topic is being discussed, each participant in a conversation employs a different mental model to interpret the subject. Fundamental assumptions differ but are never brought into the open. […] A mental model may be correct in structure and assumptions but, even so, the human mind - either individually or as a group consensus - is apt to draw the wrong implications for the future." (Jay W Forrester, "Counterintuitive Behaviour of Social Systems", Technology Review, 1971)
"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)
"The evolution of a physicochemical system leads to an equilibrium state of maximum disorder." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)
"The functional order maintained within living systems seems to defy the Second Law; nonequilibrium thermodynamics describes how such systems come to terms with entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)
"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime become unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Micolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today 25" (11), 1972)
"There is nothing supernatural about the process of self-organization to states of higher entropy; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)
"when matter is becoming disturbed by non-equilibrium conditions it organizes itself, it wakes up. It happens that our world is a non-equilibrium system." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)
"When the phenomena of the universe are seen as linked together by cause-and-effect and energy transfer, the resulting picture is of complexly branching and interconnecting chains of causation. In certain regions of this universe" (notably organisms in environments, ecosystems, thermostats, steam engines with governors, societies, computers, and the like), these chains of causation form circuits which are closed in the sense that causal interconnection can be traced around the circuit and back through whatever position was" (arbitrarily) chosen as the starting point of the description. In such a circuit, evidently, events at any position in the circuit may be expected to have effect at all positions on the circuit at later times." (Gregory Bateson, "Steps to an Ecology of Mind", 1972)
"It has been suggested that thermodynamic irreversibility is due to cosmological expansion." (Peter T Landsberg, "Thermodynamics, Cosmology, and the Physical Constants", 1973)