21 November 2025

On Thermodynamics (2000-2009)

"Defined from a societal standpoint, information may be seen as an entity which reduces maladjustment between system and environment. In order to survive as a thermodynamic entity, all social systems are dependent upon an information flow. This explanation is derived from the parallel between entropy and information where the latter is regarded as negative entropy (negentropy). In more common terms, information is a form of processed data or facts about objects, events or persons, which are meaningful for the receiver, inasmuch as an increase in knowledge reduces uncertainty." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Statistical mechanics is the science of predicting the observable properties of a many-body system by studying the statistics of the behaviour of its individual constituents, be they atoms, molecules, photons etc. It provides the link between macroscopic and microscopic states. […] classical thermodynamics. This is a subject dealing with the very large. It describes the world that we all see in our daily lives, knows nothing about atoms and molecules and other very small particles, but instead treats the universe as if it were made up of large-scale continua. […] quantum mechanics. This is the other end of the spectrum from thermodynamics; it deals with the very small. It recognises that the universe is made up of particles: atoms, electrons, protons and so on. One of the key features of quantum mechanics, however, is that particle behaviour is not precisely determined (if it were, it would be possible to compute, at least in principle, all past and future behaviour of particles, such as might be expected in a classical view). Instead, the behaviour is described through the language of probabilities." (A Mike Glazer & Justin S Wark, "Statistical Mechanics: A Survival Guide", 2001)

"The function of living matter is apparently to expand the organization of the universe. Here, locally decreased entropy as a result of biological order in existing life is invalidating the effects of the second law of thermodynamics, although at the expense of increased entropy in the whole system. It is the running down of the universe that made the sun and the earth possible. It is the running down of the sun that made life and us possible." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The second law of thermodynamics states that all energy in the universe degrades irreversibly. Thus, differences between energy forms must decrease over time. Everything is spread! (The principle of degradation of energy with regard to quality.) Translated to the area of systems, the law tells us that the entropy of an isolated system always increases. Another consequence is that when two systems are joined together, the entropy of the united system is greater than the sum of the entropies of the individual systems." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)
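The last consequence Skyttner mentions can be made concrete with a small calculation (an illustration, not from his text): when two distinguishable ideal gases at the same temperature and pressure are allowed to mix, the entropy of the united system exceeds the sum of the parts by the classical entropy of mixing.

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def mixing_entropy(n1, n2):
    """Entropy gained by mixing two distinguishable ideal gases
    (n1, n2 moles) at equal temperature and pressure."""
    n = n1 + n2
    x1, x2 = n1 / n, n2 / n
    return -n * R * (x1 * math.log(x1) + x2 * math.log(x2))

dS = mixing_entropy(1.0, 1.0)
print(round(dS, 2))  # about 11.53 J/K for one mole of each gas
```

The result is strictly positive for any split, which is exactly the "greater than the sum" behaviour the quote describes.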

"More generally, thermodynamics shows that there is an irreversible flow of time. Rather than there being time symmetry and indeed a reversibility of time as postulated in classical physics, a clear distinction is drawn between the past and future. An arrow of time results within open systems in the loss of organization and an increase in randomness or disorder over time. This accumulation of disorder or positive entropy results from the Second Law of Thermodynamics." (John Urry, "Global Complexity", 2003)

"Scientists have long been baffled by the existence of spontaneous order in the universe. The laws of thermodynamics seem to dictate the opposite, that nature should inexorably degenerate toward a state of greater disorder, greater entropy." (Steven Strogatz, "Sync: The Emerging Science of Spontaneous Order", 2003)

"The principle of maximum entropy is employed for estimating unknown probabilities (which cannot be derived deductively) on the basis of the available information. According to this principle, the estimated probability distribution should be such that its entropy reaches maximum within the constraints of the situation, i.e., constraints that represent the available information. This principle thus guarantees that no more information is used in estimating the probabilities than available." (George J Klir & Doug Elias, "Architecture of Systems Problem Solving" 2nd Ed, 2003)
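Klir and Elias's principle can be illustrated with a short sketch (not from their text): when the only available information is that probabilities must sum to one, the maximum-entropy estimate is the uniform distribution; any skewed distribution has lower entropy and therefore smuggles in information that was never given.

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# No constraints beyond normalization: uniform is the max-entropy choice.
uniform = [0.25] * 4
skewed = [0.7, 0.1, 0.1, 0.1]

print(entropy(uniform))  # 2.0 bits, the maximum for four outcomes
print(entropy(skewed))   # lower: it implicitly assumes unstated information
```

With additional constraints (say, a known mean), the same maximization yields an exponential-family distribution instead of the uniform one, but the logic is identical: maximize entropy subject to what is actually known.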

"In the physics of complex systems we can introduce a statistical concept, a measure of randomness, called entropy. In a quiet equilibrium, like hot onion soup sitting in a thermos bottle with no escaping heat, the entropy remains constant in time. However, in violent nonequilibrium processes, like shattering glass or explosions, the entropy always increases. Essentially, entropy, as a measure of randomness, will always increase when a very ordered initial condition leads to a very disordered final state through the normal laws of physics. The fact that entropy at best stays the same in equilibrium, or increases in all other processes, is called the second law of thermodynamics." (Leon M Lederman & Christopher T Hill, "Symmetry and the Beautiful Universe", 2004)

"At the foundation of classical thermodynamics are the first and second laws. The first law formulates that the total energy of a system is conserved, while the second law states that the entropy of an isolated system can only increase. The second law implies that the free energy of an isolated system is successively degraded by diabatic processes over time, leading to entropy production. This eventually results in an equilibrium state of maximum entropy. In its statistical interpretation, the direction towards higher entropy can be interpreted as a transition to more probable states." (Axel Kleidon & Ralph D Lorenz, "Entropy Production by Earth System Processes" [in "Non-equilibrium Thermodynamics and the Production of Entropy"], 2005)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce 'dissipative structures' which maintain themselves far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)
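The entropy bookkeeping behind Gershenson's point can be sketched schematically (the rates below are made up for illustration): an open system's entropy change splits into internal production, which is always non-negative, and exchange with the environment, which can be negative. A dissipative structure persists when the export of entropy balances the internal production.

```python
# Schematic entropy balance for an open, self-organizing system:
#   dS_system = sigma (internal production, >= 0) + exchange (with environment)
sigma = 0.5       # internal entropy production rate; second law: never negative
exchange = -0.5   # entropy exported to the environment as heat ('dissipation')

dS_system = sigma + exchange
print(dS_system)  # 0.0: a steady state maintained far from equilibrium
```

An isolated system has `exchange = 0`, so `dS_system = sigma >= 0` and the system drifts to equilibrium; only the open system can hold its entropy constant while still producing entropy internally.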

"The total disorder in the universe, as measured by the quantity that physicists call entropy, increases steadily as we go from past to future. On the other hand, the total order in the universe, as measured by the complexity and permanence of organized structures, also increases steadily as we go from past to future." (Freeman J Dyson, "A Many-Colored Glass: Reflections on the Place of Life in the Universe", 2007)

"Thermodynamics is about those properties of systems that are true independent of their mechanism. This is why there is a fundamental asymmetry in the relationship between mechanistic descriptions of systems and thermodynamic descriptions of systems. From the mechanistic information we can deduce all the thermodynamic properties of that system. However, given only thermodynamic information we can deduce nothing about mechanism. This is in spite of the fact that thermodynamics makes it possible for us to reject classes of models such as perpetual motion machines." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

