21 November 2025

On Thermodynamics (1950-1959)

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Progress imposes not only new possibilities for the future but new restrictions. It seems almost as if progress itself and our fight against the increase of entropy intrinsically must end in the downhill path from which we are trying to escape." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"There is no concept in the whole field of physics which is more difficult to understand than is the concept of entropy, nor is there one which is more fundamental." (Francis W Sears, "Mechanics, heat and sound", 1950)

"[…] the characteristic tendency of entropy is to increase. As entropy increases, the universe, and all closed systems in the universe, tend naturally to deteriorate and lose their distinctiveness, to move from the least to the most probable state, from a state of organization and differentiation in which distinctions and forms exist, to a state of chaos and sameness." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Reversible processes are not, in fact, processes at all, they are sequences of states of equilibrium. The processes which we encounter in real life are always irreversible processes." (Arnold Sommerfeld, "Thermodynamics and Statistical Mechanics", Lectures on Theoretical - Physics Vol. V, 1956)

"The mere fact that the same mathematical expression -Σ pi log(pi) [i is index], occurs both in statistical mechanics and in information theory does not in itself establish any connection between these fields. This can be done only by finding new viewpoints from which thermodynamic entropy and information-theory entropy appear as the same concept." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"The second law of thermodynamics provides a more modem" (and a more discouraging) example of the maximum principle: the entropy" (disorder) of the universe tends toward a maximum." (James R Newman, "The World of Mathematics" Vol. II, 1956)

"There is no logical necessity for the existence of a unique direction of total time; whether there is only one time direction, or whether time directions alternate, depends on the shape of the entropy curve plotted by the universe." (Hans Reichenbach, "The Direction of Time", 1956)

"When, for instance, I see a symmetrical object, I feel its pleasurable quality, but do not need to assert explicitly to myself, ‘How symmetrical!’. This characteristic feature may be explained as follows. In the course of individual experience it is found generally that symmetrical objects possess exceptional and desirable qualities. Thus our own bodies are not regarded as perfectly formed unless they are symmetrical. Furthermore, the visual and tactual technique by which we perceive the symmetry of various objects is uniform, highly developed, and almost instantaneously applied. It is this technique which forms the associative 'pointer.' In consequence of it, the perception of any symmetrical object is accompanied by an intuitive aesthetic feeling of positive tone." (George D Birkhoff, "Mathematics of Aesthetics", 1956)

"But in addition to what we decide to do by way of transformation, there are certain tendencies in the way systems behave of their own volition when left to their own devices. The convenient analogy for one of these processes is found in the second law of thermodynamics: an 'ordering' process goes on, for which the name is entropy. This can be explained without technicalities as the tendency of a system to settle down to a uniform distribution of its energy. The dissipation of local pockets of high energy is measured by an increase in entropy, until at maximum entropy all is uniform. According to this model, order is more 'natural' than chaos. This is the reason why it is convenient to discuss cybernetic systems, with their self-regulating tendency to attain stability or orderliness, in terms of entropy - a term which has been taken over to name a key tool of cybernetics." (Stafford Beer, "Cybernetics and Management", 1959)

