07 December 2020

On Entropy (1950-1959)

"But in no case is there any question of time flowing backward, and in fact the concept of backward flow of time seems absolutely meaningless. […] If it were found that the entropy of the universe were decreasing, would one say that time was flowing backward, or would one say that it was a law of nature that entropy decreases with time?" (Percy W Bridgman, "Reflections of a Physicist", 1950)

"It is my thesis that the physical functioning of the living individual and the operation of some of the newer communication machines are precisely parallel in their analogous attempts to control entropy through feedback. Both of them have sensory receptors as one stage of their cycle of operation: that is, in both of them there exists a special apparatus for collecting information from the outer world at low energy levels, and for making it available in the operation of the individual or of the machine. In both cases these external messages are not taken neat, but through the internal transforming powers of the apparatus, whether it be alive or dead. The information is then turned into a new form available for the further stages of performance. In both the animal and the machine this performance is made to be effective on the outer world. In both of them, their performed action on the outer world, and not merely their intended action, is reported back to the central regulatory apparatus." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Progress imposes not only new possibilities for the future but new restrictions. It seems almost as if progress itself and our fight against the increase of entropy intrinsically must end in the downhill path from which we are trying to escape." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"There is no concept in the whole field of physics which is more difficult to understand than is the concept of entropy, nor is there one which is more fundamental." (Francis W Sears, "Mechanics, heat and sound", 1950)

"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"The mere fact that the same mathematical expression -Σ pi log(pi) [i is index], occurs both in statistical mechanics and in information theory does not in itself establish any connection between these fields. This can be done only by finding new viewpoints from which thermodynamic entropy and information-theory entropy appear as the same concept." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"The second law of thermodynamics provides a more modem (and a more discouraging) example of the maximum principle: the entropy (disorder) of the universe tends toward a maximum." (James R Newman, "The World of Mathematics" Vol. II, 1956)

"There is no logical necessity for the existence of a unique direction of total time; whether there is only one time direction, or whether time directions alternate, depends on the shape of the entropy curve plotted by the universe." (Hans Reichenbach, "The Direction of Time", 1956)

"But in addition to what we decide to do by way of transformation, there are certain tendencies in the way systems behave of their own volition when left to their own devices. The convenient analogy for one of these processes is found in the second law of thermodynamics: an 'ordering' process goes on, for which the name is entropy. This can be explained without technicalities as the tendency of a system to settle down to a uniform distribution of its energy. The dissipation of local pockets of high energy is measured by an increase in entropy, until at maximum entropy all is uniform. According to this model, order is more 'natural' than chaos. This is the reason why it is convenient to discuss cybernetic systems, with their self-regulating tendency to attain stability or orderliness, in terms of entropy - a term which has been taken over to name a key tool of cybernetics." (Stafford Beer, "Cybernetics and Management", 1959)
