09 May 2021

On Randomness XXIV (Entropy)

"Let us draw an arrow arbitrarily. If as we follow the arrow we find more and more of the random element in the world, then the arrow is pointing towards the future; if the random element decreases the arrow points towards the past. [...] I shall use the phrase 'time's arrow' to express this one-way property of time which has no analogue in space. (Arthur Eddington, "The Nature of the Physical World", 1928

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010)

"And don’t ever make the mistake of thinking that things you didn’t intend or plan don’t matter. It’s a big, disorganised multiverse out there – an accident of stars. Almost nothing ever works out like we want it to, and when it does, there’s guaranteed to be unexpected consequences. Randomness is what separates life from entropy, but it’s also what makes it fun." (Foz Meadows, "An Accident of Stars", 2016)

"Only highly ordered and structured systems can display complex creative and unpredictable behaviour, and then only if they have the capacity to act with a degree of freedom and randomness. Systems which lack structure and organisation usually fail to produce anything much, they just tend to drift down the entropy gradient. This applies both to people and to organisations." (Peter J Carroll)
