17 December 2020

On Entropy (1990-1999)

"The inflationary period of expansion does not smooth out irregularity by entropy-producing processes like those explored by the cosmologies of the seventies. Rather it sweeps the irregularity out beyond the Horizon of our visible Universe, where we cannot see it . The entire universe of stars and galaxies on view to us. […] on this hypothesis, is but the reflection of a minute, perhaps infinitesimal, portion of the universe's initial conditions, whose ultimate extent and structure must remain forever unknowable to us. A theory of everything does not help here. The information contained in the observable part of the universe derives from the evolution of a tiny part of the initial conditions for the entire universe. The sum total of all the observations we could possibly make can only tell us about a minuscule portion of the whole." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"Three laws governing black hole changes were thus found, but it was soon noticed that something unusual was going on. If one merely replaced the words 'surface area' by 'entropy' and 'gravitational field' by 'temperature', then the laws of black hole changes became merely statements of the laws of thermodynamics. The rule that the horizon surface areas can never decrease in physical processes becomes the second law of thermodynamics that the entropy can never decrease; the constancy of the gravitational field around the horizon is the so-called zeroth law of thermodynamics that the temperature must be the same everywhere in a state of thermal equilibrium. The rule linking allowed changes in the defining quantities of the black hole just becomes the first law of thermodynamics, which is more commonly known as the conservation of energy." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"The new information technologies can be seen to drive societies toward increasingly dynamic high-energy regions further and further from thermodynamical equilibrium, characterized by decreasing specific entropy and increasingly dense free-energy flows, accessed and processed by more and more complex social, economic, and political structures." (Ervin László, "Information Technology and Social Change: An Evolutionary Systems Analysis", Behavioral Science 37, 1992) 

"Fuzzy entropy measures the fuzziness of a fuzzy set. It answers the question 'How fuzzy is a fuzzy set?' And it is a matter of degree. Some fuzzy sets are fuzzier than others. Entropy means the uncertainty or disorder in a system. A set describes a system or collection of things. When the set is fuzzy, when elements belong to it to some degree, the set is uncertain or vague to some degree. Fuzzy entropy measures this degree. And it is simple enough that you can see it in a picture of a cube." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"The Law of Entropy Nonconservation required that life be lived forward, from birth to death. […] To wish for the reverse was to wish for the entropy of the universe to diminish with time, which was impossible. One might as well wish for autumn leaves to assemble themselves in neat stacks just as soon as they had fallen from trees or for water to freeze whenever it was heated." (Michael Guillen, "Five Equations That Changed the World", 1995)

"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"Contrary to what happens at equilibrium, or near equilibrium, systems far from equilibrium do not conform to any minimum principle that is valid for functions of free energy or entropy production." (Ilya Prigogine, "The End of Certainty: Time, Chaos, and the New Laws of Nature", 1996)

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"No one has yet succeeded in deriving the second law from any other law of nature. It stands on its own feet. It is the only law in our everyday world that gives a direction to time, which tells us that the universe is moving toward equilibrium and which gives us a criteria for that state, namely, the point of maximum entropy, of maximum probability. The second law involves no new forces. On the contrary, it says nothing about forces whatsoever." (Brian L Silver, "The Ascent of Science", 1998)

"Physical systems are subject to the force of entropy, which increases until eventually the entire system fails. The tendency toward maximum entropy is a movement to disorder, complete lack of resource transformation, and death." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)
