10 January 2020

On Entropy (1970-1979)

"There is a kind of second law of cultural dynamics which states simply that when anything has been done, it cannot be done again. In other words, we start off any system with a potential for novelty which is gradually exhausted. We see this in every field of human life, in the arts as well as the sciences. Once Beethoven has written the Ninth Symphony, nobody else can do it. Consequently, we find that in any evolutionary process, even in the arts, the search for novelty becomes corrupting. The 'entropy trap' is perhaps the most subtle and the most fundamental of the obstacles toward realising the developed society." (Kenneth Boulding, "The Science Revelation", Bulletin of the Atomic Scientists Vol. 26 (7), 1970)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972) 

"The functional order maintained within living systems seems to defy the Second Law; nonequilibrium thermodynamics describes how such systems come to terms with entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972) 

"An item of information which leads to the exclusion of certain of the possible outcomes causes a decrease in entropy: this decrease is called the amount of information, and, like the entropy, is measured in bits (it is, in fact, the same thing with the opposite sign: some even call it negative entropy)." (Bruno de Finetti, "Theory of Probability", 1974)

"Entropy theory, on the other hand, is not concerned with the probability of succession in a series of items but with the overall distribution of kinds of items in a given arrangement." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974)

"Life, this anti-entropy, ceaselessly reloaded with energy, is a climbing force, toward order amidst chaos, toward light, among the darkness of the indefinite, toward the mystic dream of Love, between the fire which devours itself and the silence of the Cold. Such a Nature does not accept abdication, nor skepticism." (Albert Claude, [Nobel lecture for award received] 1974) 

"There is nothing supernatural about the process of self-organization to states of higher entropy; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)
"Entropy theory is indeed a first attempt to deal with global form; but it has not been dealing with structure. All it says is that a large sum of elements may have properties not found in a smaller sample of them." (Rudolf Arnheim, "Entropy and Art: An Essay on Disorder and Order", 1974) 

"If entropy must constantly and continuously increase, then the universe is remorselessly running down, thus setting a limit (a long one, to be sure) on the existence of humanity. To some human beings, this ultimate end poses itself almost as a threat to their personal immortality, or as a denial of the omnipotence of God. There is, therefore, a strong emotional urge to deny that entropy must increase." (Isaac Asimov," Asimov on Physics", 1976) 

"The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979)

"The interaction between parts of mind is triggered by difference, and difference is a nonsubstantial phenomenon not located in space or time; difference is related to negentropy and entropy rather than energy." (Gregory Bateson, "Mind and Nature: A Necessary Unity", 1979)

"Thus, an increase in entropy means a decrease in our ability to change thermal energy, the energy of heat, into mechanical energy. An increase of entropy means a decrease of available energy."  (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979)

"You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage." (John von Neumann) [Suggesting to Claude Shannon a name for his new uncertainty function, see Scientific American Vol. 225 (3), 1971]
