21 November 2025

On Thermodynamics (1980-1989)

"It is a remarkable empirical fact that mathematics can be based on set theory. More precisely, all mathematical objects can be coded as sets" (in the cumulative hierarchy built by transfinitely iterating the power set operation, starting with the empty set). And all their crucial properties can be proved from the axioms of set theory." (. . . ) At first sight, category theory seems to be an exception to this general phenomenon. It deals with objects, like the categories of sets, of groups etc. that are as big as the whole universe of sets and that therefore do not admit any evident coding as sets. Furthermore, category theory involves constructions, like the functor category, that lead from these large categories to even larger ones. Thus, category theory is not just another field whose set-theoretic foundation can be left as an exercise. An interaction between category theory and set theory arises because there is a real question: What is the appropriate set-theoretic foundation for category theory?" (Andreas Blass, "The interaction between category theory and set theory", 1983) 

"The difference is that energy is a property of the microstates, and so all observers, whatever macroscopic variables they may choose to define their thermodynamic states, must ascribe the same energy to a system in a given microstate. But they will ascribe different entropies to that microstate, because entropy is not a property of the microstate, but rather of the reference class in which it is embedded. As we learned from Boltzmann, Planck, and Einstein, the entropy of a thermodynamic state is a measure of the number of microstates compatible with the macroscopic quantities that you or I use to define the thermodynamic state." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

"There is no end to this search for the ultimate ‘true’ entropy until we have reached the point where we control the location of each atom independently. But just at that point the notion of entropy collapses, and we are no longer talking thermodynamics." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

"The increase of disorder or entropy with time is one example of what is called an arrow of time something that gives a direction to time and distinguishes the past from the future. There are at least three different directions of time. First, there is the thermodynamic arrow of time - the direction of time in which disorder or entropy increases. Second, there is the psychological arrow of time. This is the direction in which we feel time passes - the direction of time in which we remember the past, but not the future. Third, there is the cosmological arrow of time. This is the direction of time in which the universe is expanding rather than contracting." (Stephen W. Hawking, "The Direction of Time", New Scientist 46, 1987)

"Just like a computer, we must remember things in the order in which entropy increases. This makes the second law of thermodynamics almost trivial. Disorder increases with time because we measure time in the direction in which disorder increases. " (Stephen Hawking, "A Brief History of Time", 1988)

"The concept of entropy relates to the tendency of things to move toward greater disorder, or disorganization. […] The second law of thermodynamics expresses precisely the same concept. This states that heat dissipates from a central source and the energy becomes degraded, although total energy remains constant" (the first law of thermodynamics). Entropy suggests that organisms, organizations, societies, machines, and so on, will rapidly deteriorate into disorder and death." The reason they do not is because animate things can self-organize and inanimate things may be serviced by man. These are negentropic activities which require energy. Energy, however, can be made available only by further degradation. Ultimately, therefore, entropy wins the day and the attempts to create order can seem rather a daunting task in the entropic scheme of things. Holding back entropy, however, is another of the challenging tasks for the systems scientist." (Robert L Flood & Ewart R Carson, "Dealing with Complexity: An introduction to the theory and application of systems", 1988)

"Life is nature's solution to the problem of preserving information despite the second law of thermodynamics." (Howard L Resnikoff, "The Illusion of Reality", 1989)
