
07 February 2025

On Entropy: Definitions

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization. In fact, it is possible to interpret the information carried by a message as essentially the negative of its entropy, and the negative logarithm of its probability. That is, the more probable the message, the less information it gives. Clichés, for example, are less illuminating than great poems." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." ("G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"In the physics [entropy is the] rate of system's messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)

02 November 2023

Carlo Rovelli - Collected Quotes

"Boltzmann has shown that entropy exists because we describe the world in a blurred fashion. He has demonstrated that entropy is precisely the quantity that counts how many are the different configurations that our blurred vision does not distinguish between. Heat, entropy, and the lower entropy of the past are notions that belong to an approximate, statistical description of nature. The difference between past and future is deeply linked to this blurring." (Carlo Rovelli, "The Order of Time", 2018)

"Continuity is only a mathematical technique for approximating very finely grained things. The world is subtly discrete, not continuous." (Carlo Rovelli, "The Order of Time", 2018)

"For a moving object, time contracts. Not only is there no single time for different places - there is not even a single time for any particular place. A duration can be associated only with the movement of something, with a given trajectory." (Carlo Rovelli, "The Order of Time", 2018)

"Granularity is ubiquitous in nature: light is made of photons, the particles of light. The energy of electrons in atoms can acquire only certain values and not others. The purest air is granular, and so, too, is the densest matter. Once it is understood that Newton’s space and time are physical entities like all others, it is natural to suppose that they are also granular. Theory confirms this idea: loop quantum gravity predicts that elementary temporal leaps are small, but finite." (Carlo Rovelli, "The Order of Time", 2018)

"If one section of the molecules is still, it becomes stirred up by the frenzy of neighboring ones that set them in motion, too: the agitation spreads, the molecules bump into and shove each other. In this way, cold things are heated in contact with hot ones: their molecules become jostled by hot ones and pushed into ferment. That is, they heat up." (Carlo Rovelli, "The Order of Time", 2018)

"In the elementary equations of the world, the arrow of time appears only where there is heat. The link between time and heat is therefore fundamental: every time a difference is manifested between the past and the future, heat is involved. In every sequence of events that becomes absurd if projected backward, there is something that is heating up." (Carlo Rovelli, "The Order of Time", 2018)

"It is not possible to think of duration as continuous. We must think of it as discontinuous: not as something that flows uniformly but as something that in a certain sense jumps, kangaroo-like, from one value to another. In other words, a minimum interval of time exists. Below this, the notion of time does not exist - even in its most basic meaning." (Carlo Rovelli, "The Order of Time", 2018)

"Nature, for its part, is what it is - and we discover it very gradually. If our grammar and our intuition do not readily adapt to what we discover, well, too bad: we must seek to adapt them." (Carlo Rovelli, "The Order of Time", 2018)

"Nothing is valid always and everywhere. Sooner or later, we always come across something that is new." (Carlo Rovelli, "The Order of Time", 2018)

"[...] our vision of the world is blurred because the physical interactions between the part of the world to which we belong and the rest are blind to many variables. This blurring is at the heart of Boltzmann's theory. From this blurring, the concepts of heat and entropy are born - and these are linked to the phenomena that characterize the flow of time. The entropy of a system depends explicitly on blurring. It depends on what I do not register, because it depends on the number of indistinguishable configurations. The same microscopic configuration may be of high entropy with regard to one blurring and low in relation to another." (Carlo Rovelli, "The Order of Time", 2018)

"Physics does not describe how things evolve 'in time' but how things evolve in their own times, and how 'times' evolve relative to each other." (Carlo Rovelli, "The Order of Time", 2018)

"Spacetime is a physical object like an electron. It, too, fluctuates. It, too, can be in a 'superposition' of different configurations." (Carlo Rovelli, "The Order of Time", 2018)

"The basic units in terms of which we comprehend the world are not located in some specific point in space. They are - if they are at all - in a where but also in a when. They are spatially but also temporally delimited: they are events." (Carlo Rovelli, "The Order of Time", 2018)

"The entire evolution of science would suggest that the best grammar for thinking about the world is that of change, not of permanence. Not of being, but of becoming." (Carlo Rovelli, "The Order of Time", 2018) 

"The entropy of the world does not depend only on the configuration of the world; it also depends on the way in which we are blurring the world, and this depends on what the variables of the world are that we interact with. That is to say, on the variables with which our part of the world interacts. [...] The difference between past and future does not lie in the elementary laws of motion; it does not reside in the deep grammar of nature. It is the natural disordering that leads to gradually less particular, less special situations."" (Carlo Rovelli, "The Order of Time", 2018)

"The entropy of the world in the far past appears very low to us. But this might not reflect the exact state of the world: it might regard the subset of the world's variables with which we, as physical systems, have interacted. It is with respect to the dramatic blurring produced by our interactions with the world, caused by the small set of macroscopic variables in terms of which we describe the world, that the entropy of the universe was low." (Carlo Rovelli, "The Order of Time", 2018)

"The relation of 'temporal precedence' is a partial order made of cones. Special relativity is the discovery that the temporal structure of the universe is like the one established by filiation: it defines an order between the events of the universe that is partial, not complete. The expanded present is the set of events that are neither past nor future: it exists, just as there are human beings who are neither our descendants nor our forebears. […] Every event has its past, its future, and a part of the universe that is neither past nor future, just as every person has forebears, descendants, and others who are neither forebears nor descendants. Light travels along the oblique lines that delimit these cones." (Carlo Rovelli, "The Order of Time", 2018)

"The world without a time variable is not a complicated one. It’s a net of interconnected events, where the variables in play adhere to probabilistic rules that, incredibly, we know for a good part how to write. And it’s a clear world, windswept and full of beauty as the crests of mountains; aridly beautiful as the cracked lips of the adolescent you loved." (Carlo Rovelli, "The Order of Time", 2018)

"Thermal agitation is like a continual shuffling of a pack of cards: if the cards are in order, the shuffling disorders them. In this way, heat passes from hot to cold, and not vice versa: by shuffling, by the natural disordering of everything. The growth of entropy is nothing other than the ubiquitous and familiar natural increase of disorder." (Carlo Rovelli, "The Order of Time", 2018)

"We cannot draw a complete map, a complete geometry, of everything that happens in the world, because such happenings - including among them the passage of time - are always triggered only by an interaction with, and with respect to, a physical system involved in the interaction. The world is like a collection of interrelated points of view. To speak of the world “seen from outside” makes no sense, because there is no “outside” to the world." (Carlo Rovelli, "The Order of Time", 2018)

"We often say that causes precede effects and yet, in the elementary grammar of things, there is no distinction between 'cause' and 'effect'. There are regularities, represented by what we call physical laws, that link events of different times, but they are symmetric between future and past. In a microscopic description, there can be no sense in which the past is different from the future."(Carlo Rovelli, "The Order of Time", 2018)

29 October 2023

Out of Context: On Entropy (Just the Quotes)

"If for the entire universe we conceive the same magnitude to be determined, consistently and with due regard to all circumstances, which for a single body I have called entropy, and if at the same time we introduce the other and simpler conception of energy, we may express in the following manner the fundamental laws of the universe which correspond to the two fundamental theorems of the mechanical theory of heat: (1) The energy of the universe is constant. (2) The entropy of the universe tends to a maximum." (Rudolf Clausius, "The Mechanical Theory of Heat - With its Applications to the Steam Engine and to Physical Properties of Bodies", 1867)

"The Entropy of a system is the mechanical work it can perform without communication of heat, or alteration of its total volume, all transference of heat being performed by reversible engines. When the pressure and temperature of the system have become uniform the entropy is exhausted." (James C Maxwell, "Theory of Heat", 1899)

"The second law of thermodynamics appears solely as a law of probability, entropy as a measure of the probability, and the increase of entropy is equivalent to a statement that more probable events follow less probable ones." (Max Planck, "A Survey of Physics", 1923)

"Entropy is the measure of randomness." (Lincoln Barnett, "The Universe and Dr. Einstein", 1948)

"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization." (Norbert Wiener, "The Human Use of Human Beings", 1950)

"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)

"You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage." (John von Neumann) [Suggesting to Claude Shannon a name for his new uncertainty function, see Scientific American Vol. 225 (3), 1971]

"Thus, in physics, entropy is associated with the possibility of converting thermal energy into mechanical energy. If the entropy does not change during a process, the process is reversible. If the entropy increases, the available energy decreases. Statistical mechanics interprets an increase of entropy as a decrease in order or, if we wish, as a decrease in our knowledge." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"Entropy is the crisp scientific name for waste, chaos, and disorder." (Kevin Kelly, "What Technology Wants", 2010)

"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)

"In the physics [entropy is the] rate of system's messiness or disorder in a physical system. In the social systems theory - social entropy is a sociological theory that evaluates social behaviors using a method based on the second law of thermodynamics." (Justína Mikulášková et al, "Spiral Management: New Concept of the Social Systems Management", 2020)

10 September 2023

On Entropy (2010-2019)

"Entropy is the crisp scientific name for waste, chaos, and disorder. As far as we know, the sole law of physics with no known exceptions anywhere in the universe is this: All creation is headed to the basement. Everything in the universe is steadily sliding down the slope toward the supreme equality of wasted heat and maximum entropy." (Kevin Kelly, "What Technology Wants", 2010)

"If everything in the universe evolves toward increasing disorder, it must have started out in an exquisitely ordered arrangement. This whole chain of logic, purporting to explain why you can't turn an omelet into an egg, apparently rests on a deep assumption about the very beginning of the universe. It was in a state of very low entropy, very high order. Why did our part of the universe pass though a period of such low entropy?" (Sean Carroll, "From Eternity to Here: The Quest for the Ultimate Theory of Time", 2010)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010) 

"Information, defined intuitively and informally, might be something like 'uncertainty's antidote'. This turns out also to be the formal definition - the amount of information comes from the amount by which something reduces uncertainty. [...] The higher the [information] entropy, the more information there is. It turns out to be a value capable of measuring a startling array of things- from the flip of a coin to a telephone call, to a Joyce novel, to a first date, to last words, to a Turing test. [...] Entropy suggests that we gain the most insight on a question when we take it to the friend, colleague, or mentor of whose reaction and response we're least certain. And it suggests, perhaps, reversing the equation, that if we want to gain the most insight into a person, we should ask the question of whose answer we're least certain. [...] Pleasantries are low entropy, biased so far that they stop being an earnest inquiry and become ritual. Ritual has its virtues, of course, and I don't quibble with them in the slightest. But if we really want to start fathoming someone, we need to get them speaking in sentences we can't finish." (Brian Christian, "The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive", 2011)

"The laws of thermodynamics tell us something quite different. Economic activity is merely borrowing low-entropy energy inputs from the environment and transforming them into temporary products and services of value. In the transformation process, often more energy is expended and lost to the environment than is embedded in the particular good or service being produced." (Jeremy Rifkin, "The Third Industrial Revolution", 2011)

"The psychic entropy peculiar to the human condition involves seeing more to do than one can actually accomplish and feeling able to accomplish more than what conditions allow."(Mihaly Csikszentmihalyi, "Flow: The Psychology of Happiness", 2013)

"In a physical system, information is the opposite of entropy, as it involves uncommon and highly correlated configurations that are difficult to arrive at." (César A Hidalgo, "Why Information Grows: The Evolution of Order, from Atoms to Economies", 2015)

"The passage of time and the action of entropy bring about ever-greater complexity - a branching, blossoming tree of possibilities. Blossoming disorder (things getting worse), now unfolding within the constraints of the physics of our universe, creates novel opportunities for spontaneous ordered complexity to arise." (D J MacLennan, "Frozen to Life", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"The natural effect of processes going on in the Universe is to move from a state of order to a state of disorder, unless there is an input of energy from outside." (John R Gribbin, "The Time Illusion", 2016) 

"The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there." (Steven Pinker, "The Second Law of Thermodynamics", 2017) [source

"Our greatest enemies are ultimately not our political adversaries but entropy, evolution (in the form of pestilence and the flaws in human nature), and most of all ignorance - a shortfall of knowledge of how best to solve our problems." (Steven Pinker, "Enlightenment Now: The Case for Reason, Science, Humanism, and Progress", 2018)

"[...] our vision of the world is blurred because the physical interactions between the part of the world to which we belong and the rest are blind to many variables. This blurring is at the heart of Boltzmann's theory. From this blurring, the concepts of heat and entropy are born - and these are linked to the phenomena that characterize the flow of time. The entropy of a system depends explicitly on blurring. It depends on what I do not register, because it depends on the number of indistinguishable configurations. The same microscopic configuration may be of high entropy with regard to one blurring and low in relation to another." (Carlo Rovelli, "The Order of Time", 2018)

"The entropy of the world does not depend only on the configuration of the world; it also depends on the way in which we are blurring the world, and this depends on what the variables of the world are that we interact with. That is to say, on the variables with which our part of the world interacts." (Carlo Rovelli, "The Order of Time", 2018)

"The entropy of the world in the far past appears very low to us. But this might not reflect the exact state of the world: it might regard the subset of the world's variables with which we, as physical systems, have interacted. It is with respect to the dramatic blurring produced by our interactions with the world, caused by the small set of macroscopic variables in terms of which we describe the world, that the entropy of the universe was low." (Carlo Rovelli, "The Order of Time", 2018)

"Disorder is a collective property of large assemblages; it makes no sense to say a single molecule is disordered or random. Thermodynamic quantities like entropy and heat energy are defined by reference to enormous numbers of particles - for example, molecules of gas careering about - and averaging across them without considering the details of individual particles. (Such averaging is sometimes called a ‘coarse-grained view’.) Thus, the temperature of a gas is related to the average energy of motion of the gas molecules. The point is that whenever one takes an average some information is thrown away, that is, we accept some ignorance. The average height of a Londoner tells us nothing about the height of a specific person. Likewise, the temperature of a gas tells us nothing about the speed of a specific molecule. In a nutshell: information is about what you know, and entropy is about what you don’t know." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019)

20 July 2021

Misquoted: Cicero on Probability is the Very Guide of Life

The "probability is the very guide of life" adage is attributed to Joseph Butler (1692-1752) and surprisingly (see [6]) Marcus Tullius Cicero (106-43 BC). Surprisingly because at first glance it is hard to believe that a concept relatively new as the one of probability was known to the antics, even if philosophical texts on divinity, fate, divination, causality or similar topics approached notions like chance or plausibility. 

The concept of probability entered common usage only in the 14th century, via French, the term being derived directly from the Latin probabilitatem/probabilitas, which belongs to the family of words derived from 'probabilis', translated as 'probable' or 'plausible'.

Cicero did indeed use 'probable' (Latin: probabilis) close to its current meaning and provided a definition for it in "De Inventione" (written between cca. 91 and 88 BC), one of his earliest works, considered by historians one of the main works on rhetoric:

"That is probable which for the most part usually comes to pass, or which is a part of the ordinary beliefs of mankind, or which contains in itself some resemblance to these qualities, whether such resemblance be true or false." [1]

Cicero also used 'probable' in "De Natura Deorum" (cca. 45 BC) with the same meaning:

"For we are not people who believe that there is nothing whatever which is true; but we say that some falsehoods are so blended with all truths, and have so great a resemblance to them, that there is no certain rule for judging of, or assenting to propositions; from which this maxim also follows, that many things are probable, which, though they are not evident to the senses, have still so persuasive and beautiful an aspect, that a wise man chooses to direct his conduct by them." [2]

"But, as it is the peculiar property of the Academy to interpose no personal judgment of its own, but to admit those opinions which appear most probable, to compare arguments, and to set forth all that may be reasonably stated in favour of each proposition; and so, without putting forth any authority of its own, to leave the judgment of the hearers free and unprejudiced; we will retain this custom, which has been handed down from Socrates; and this method, dear brother Quintus, if you please, we will adopt as often as possible in all our dialogues together." [2]

However, from 'probable' to 'probability' (Latin: probabilitas) there is an important leap of meaning. The use of 'probability' could be explained by the translator's choice of it over terms like 'chance', 'odds' or 'possible', though searches in the online texts of the book, in the translations of Charles D Yonge (1878) and Francis Brooks (1896), turned up no proximate occurrences of the adage.

Even more surprisingly, a similar form of the adage appears in Sextus Empiricus' "Outlines of Pyrrhonism" (cca. 3rd century):

"Furthermore, as regards the End (the aim of life) we differ from the New Academy; for whereas the men who profess to conform to its doctrine use probability as the guide of life; we live in an undogmatic way by following the laws, customs, and natural affections." [3]

Do we have here another situation in which the translator made a choice of words, or did the original text indeed refer to probability? Unfortunately, the available translation used as the source of the quote is made from the Greek.

The adage in its quoted form can be found in Joseph Butler's "The Analogy of Religion" (1736):

"Probable evidence, in its very nature, affords but an imperfect kind of information, and is to be considered as relative only to beings of limited capacities. For nothing which is the possible object of knowledge, whether past, present, or future, can be probable to an infinite Intelligence; since it cannot but be discerned absolutely as it is in itself, certainly true, or certainly false. To us, probability is the very guide of life." [4]

According to Butler, we use probability to guide us in life when we deal with incomplete (imperfect) information, when we can't discern whether things are true or false, or when nuances of grey exist in between. For Cicero, the things that are more probable tend to happen even if the senses can't discern which they are, the wisdom of a person lying in the ability to identify and evaluate what is probable. One can recognize in Cicero's definition an early glimpse of the statistical intuition behind entropy: the movement toward more probable states.

Despite the deep role probabilities play in life, we can still question the adage's generalization, that is, the degree to which we use probabilities to guide us in life. We do occasionally think in terms of the probability of an event happening; we do tend to believe that what is more probable will happen. Probably, the more we are caught up in scientific endeavors, the more likely we are to use probabilities in decision making. Still, there is a limit to it, a limit associated with the degree to which we are able to understand and use probabilities.

I'd like to believe that Cicero's thoughts were in the proximate range of meaning associated with the early concept of probability, though a deeper analysis of the original text is needed, and even then we can only advance suppositions.


References:
[1] Marcus Tullius Cicero (cca. 91 and 88 BC) "De inventione", ["On Invention"]
[2] Marcus Tullius Cicero (45 BC) "De Natura Deorum" ["On the Nature of the Gods"]
[3] Sextus Empiricus (cca. 3rd century) "Outlines of Pyrrhonism"
[4] Joseph Butler (1736) "The Analogy of Religion, Natural and Revealed, to the Constitution and Course of Nature"
[5] Stanford Encyclopedia of Philosophy (2014) "Probability in Medieval and Renaissance Philosophy" [source]
[6] Kate L Roberts (1922) "Hoyt's New Cyclopedia of Practical Quotations"

04 July 2021

Thermodynamics IV

"It is impossible by means of inanimate material agency, to derive mechanical effect from any portion of matter by cooling it below the temperature of the coldest of the surrounding objects. [Footnote: ] If this axiom be denied for all temperatures, it would have to be admitted that a self-acting machine might be set to work and produce mechanical effect by cooling the sea or earth, with no limit but the total loss of heat from the earth and sea, or in reality, from the whole material world." (William Thomson, "On the Dynamical Theory of Heat with Numerical Results Deduced from Mr Joule's Equivalent of a Thermal Unit and M. Regnault's Observations on Steam", Transactions of the Royal Society of Edinburgh, 1851)

"Though the ultimate state of the universe may be its vital and psychical extinction, there is nothing in physics to interfere with the hypothesis that the penultimate state might be the millennium - in other words a state in which a minimum of difference of energy - level might have its exchanges so skillfully canalises that a maximum of happy and virtuous consciousness would be the only result." (William James, [Letter to Henry Adams] 1910)" (William James, [Letter to Henry Adams] 1910)

"Organic evolution has its physical analogue in the universal law that the world tends, in all its parts and particles, to pass from certain less probable to certain more probable configurations or states. This is the second law of thermodynamics." (D'Arcy Wentworth Thompson, "On Growth and Form", 1917)

"In classical physics, most of the fundamental laws of nature were concerned either with the stability of certain configurations of bodies, e.g. the solar system, or else with the conservation of certain properties of matter, e.g. mass, energy, angular momentum or spin. The outstanding exception was the famous Second Law of Thermodynamics, discovered by Clausius in 1850. This law, as usually stated, refers to an abstract concept called entropy, which for any enclosed or thermally isolated system tends to increase continually with lapse of time. In practice, the most familiar example of this law occurs when two bodies are in contact: in general, heat tends to flow from the hotter body to the cooler. Thus, while the First Law of Thermodynamics, viz. the conservation of energy, is concerned only with time as mere duration, the Second Law involves the idea of trend." (Gerald J Whitrow, "The Structure of the Universe: An Introduction to Cosmology", 1949)

"The second law of thermodynamics provides a more modem (and a more discouraging) example of the maximum principle: the entropy (disorder) of the universe tends toward a maximum." (James R Newman, "The World of Mathematics" Vol. II, 1956)

"[...] thermodynamics knows of no such notion as the 'entropy of a physical system'. Thermodynamics does have the concept of the entropy of a thermodynamic system; but a given physical system corresponds to many different thermodynamic systems." (Edwin T Jaynes, "Gibbs vs Boltzmann Entropies", 1964)

"'You cannot base a general mathematical theory on imprecisely defined concepts. You can make some progress that way; but sooner or later the theory is bound to dissolve in ambiguities which prevent you from extending it further.' Failure to recognize this fact has another unfortunate consequence which is, in a practical sense, even more disastrous: 'Unless the conceptual problems of a field have been clearly resolved, you cannot say which mathematical problems are the relevant ones worth working on; and your efforts are more than likely to be wasted.'" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"There is no end to this search for the ultimate ‘true’ entropy until we have reached the point where we control the location of each atom independently. But just at that point the notion of entropy collapses, and we are no longer talking thermodynamics." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

"No one has yet succeeded in deriving the second law from any other law of nature. It stands on its own feet. It is the only law in our everyday world that gives a direction to time, which tells us that the universe is moving toward equilibrium and which gives us a criteria for that state, namely, the point of maximum entropy, of maximum probability. The second law involves no new forces. On the contrary, it says nothing about forces whatsoever." (Brian L Silver, "The Ascent of Science", 1998)

09 June 2021

On Entropy (2000-2009)

"All systems have a tendency toward maximum entropy, disorder, and death. Importing resources from the environment is key to long-term viability; closed systems move toward this disorganization faster than open systems." (Stephen G Haines, "The Systems Thinking Approach to Strategic Planning and Management", 2000)

"Defined from a societal standpoint, information may be seen as an entity which reduces maladjustment between system and environment. In order to survive as a thermodynamic entity, all social systems are dependent upon an information flow. This explanation is derived from the parallel between entropy and information where the latter is regarded as negative entropy (negentropy). In more common terms information is a form of processed data or facts about objects, events or persons, which are meaningful for the receiver, inasmuch as an increase in knowledge reduces uncertainty." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Expressed in terms of entropy, open systems are negentropic, that is, tend toward a more elaborate structure. As open systems, organisms which are in equilibrium are capable of working for a long time by use of the constant input of matter and energy. Closed systems, however, increase their entropy, tend to run down and can therefore be called ’dying systems’. When reaching a steady state the closed system is not capable of performing any work." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re- coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The function of living matter is apparently to expand the organization of the universe. Here, locally decreased entropy as a result of biological order in existing life is invalidating the effects of the second law of thermodynamics, although at the expense of increased entropy in the whole system. It is the running down of the universe that made the sun and the earth possible. It is the running down of the sun that made life and us possible." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Nature normally hates power laws. In ordinary systems all quantities follow bell curves, and correlations decay rapidly, obeying exponential laws. But all that changes if the system is forced to undergo a phase transition. Then power laws emerge-nature's unmistakable sign that chaos is departing in favor of order. The theory of phase transitions told us loud and clear that the road from disorder to order is maintained by the powerful forces of self-organization and is paved by power laws. It told us that power laws are not just another way of characterizing a system's behavior. They are the patent signatures of self-organization in complex systems." (Albert-László Barabási, "Linked: How Everything Is Connected to Everything Else and What It Means for Business, Science, and Everyday Life", 2002)

"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003) 

"The principle of maximum entropy is employed for estimating unknown probabilities (which cannot be derived deductively) on the basis of the available information. According to this principle, the estimated probability distribution should be such that its entropy reaches maximum within the constraints of the situation, i.e., constraints that represent the available information. This principle thus guarantees that no more information is used in estimating the probabilities than available." (George J Klir & Doug Elias, "Architecture of Systems Problem Solving" 2nd Ed, 2003) 

"The principle of minimum entropy is employed in the formulation of resolution forms and related problems. According to this principle, the entropy of the estimated probability distribution, conditioned by a particular classification of the given events (e.g., states of the variable involved), is minimum subject to the constraints of the situation. This principle thus guarantees that all available information is used, as much as possible within the given constraints (e.g., required number of states), in the estimation of the unknown probabilities." (George J Klir & Doug Elias, "Architecture of Systems Problem Solving" 2nd Ed, 2003)

"The total disorder in the universe, as measured by the quantity that physicists call entropy, increases steadily steadily as we go from past to future. On the other hand, the total order in the universe, as measured by the complexity and permanence of organized structures, also increases steadily as we go from past to future." (Freeman J Dyson, [Page-Barbour lecture], 2004)

"At the foundation of classical thermodynamics are the first and second laws. The first law formulates that the total energy of a system is conserved, while the second law states that the entropy of an isolated system can only increase. The second law implies that the free energy of an isolated system is successively degraded by diabatic processes over time, leading to entropy production. This eventually results in an equilibrium state of maximum entropy. In its statistical interpretation, the direction towards higher entropy can be interpreted as a transition to more probable states." (Axel Kleidon & Ralph D Lorenz, "Entropy Production by Earth System Processes" [in "Non- quilibrium Thermodynamics and the Production of Entropy"], 2005)

"However, the law of accelerating returns pertains to evolution, which is not a closed system. It takes place amid great chaos and indeed depends on the disorder in its midst, from which it draws its options for diversity. And from these options, an evolutionary process continually prunes its choices to create ever greater order."  (Ray Kurzweil, "The Singularity is Near", 2005)

"A bit involves both probability and an experiment that decides a binary or yes-no question. Consider flipping a coin. One bit of in-formation is what we learn from the flip of a fair coin. With an unfair or biased coin the odds are other than even because either heads or tails is more likely to appear after the flip. We learn less from flipping the biased coin because there is less surprise in the outcome on average. Shannon's bit-based concept of entropy is just the average information of the experiment. What we gain in information from the coin flip we lose in uncertainty or entropy." (Bart Kosko, "Noise", 2006)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"The total disorder in the universe, as measured by the quantity that physicists call entropy, increases steadily as we go from past to future. On the other hand, the total order in the universe, as measured by the complexity and permanence of organized structures, also increases steadily as we go from past to future." (Freeman J Dyson, "A Many-Colored Glass: Reflections on the Place of Life in the Universe", 2007)

"We have to be aware that even in mathematical and physical models of self-organizing systems, it is the observer who ascribes properties, aspects, states, and probabilities; and therefore entropy or order to the system. But organization is more than low entropy: it is structure that has a function or purpose." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"In fact, H [entropy] measures the amount of uncertainty that exists in the phenomenon. If there were only one event, its probability would be equal to 1, and H would be equal to 0 - that is, there is no uncertainty about what will happen in a phenomenon with a single event because we always know what is going to occur. The more events that a phenomenon possesses, the more uncertainty there is about the state of the phenomenon. In other words, the more entropy, the more information." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Writing and (particularly) maintaining software is a continual battle against entropy. Keeping on top of quality is tough, requiring high levels of discipline. This discipline is difficult enough to maintain under the best of circumstances, let alone when faced with concrete evidence that the software is uncared for, such as a long-unfixed bug. As soon as discipline slips, quality can go into a self-reinforcing downward spiral, and you’re in real trouble." (Paul Butcher, "Debug It! Find, Repair, and Prevent Bugs in Your Code", 2009)

07 June 2021

On Continuity XI (Life)

"[…] to the scientific mind the living and the non-living form one continuous series of systems of differing degrees of complexity […], while to the philosophic mind the whole universe, itself perhaps an organism, is composed of a vast number of interlacing organisms of all sizes." (James G Needham, "Developments in Philosophy of Biology", Quarterly Review of Biology Vol. 3 (1), 1928)

"[A living organism] feeds upon negative entropy […] Thus, the device by which an organism maintains itself stationary at a fairly high level of orderliness really consists in continually sucking orderliness from its environment." (Erwin Schrodinger, "What is Life? The Physical Aspect of the Living Cell", 1944)

"Every process, event, happening – call it what you will; in a word, everything that is going on in Nature means an increase of the entropy of the part of the world where it is going on. Thus a living organism continually increases its entropy – or, as you may say, produces positive entropy – and thus tends to approach the dangerous state of maximum entropy, which is death. It can only keep aloof from it, i.e. alive, by continually drawing from its environment negative entropy – which is something very positive as we shall immediately see. What an organism feeds upon is negative entropy. Or, to put it less paradoxically, the essential thing in metabolism is that the organism succeeds in freeing itself from all the entropy it cannot help producing while alive." (Erwin Schrödinger, "What is Life?", 1944)

"Hence the awkward expression ‘negative entropy’ can be replaced by a better one: entropy, taken with the negative sign, is itself a measure of order. Thus the device by which an organism maintains itself stationary at a fairly high level of orderliness ( = fairly low level of entropy) really consists in continually sucking orderliness from its environment." (Erwin Schrödinger, "What is Life?", 1944)

"All nature is a continuum. The endless complexity of life is organized into patterns which repeat themselves - theme and variations - at each level of system. These similarities and differences are proper concerns for science. From the ceaseless streaming of protoplasm to the many-vectored activities of supranational systems, there are continuous flows through living systems as they maintain their highly organized steady states." (James G Miller, "Living Systems", 1978)

"All living organisms must feed on continual flows of matter and energy: from their environment to stay alive, and all living organisms continually produce waste. However, an ecosystem generates no net waste, one species' waste being another species' food. Thus, matter cycles continually through the web of life." (Fritjof Capra, "The Hidden Connections", 2002)

09 May 2021

On Randomness XXIV (Entropy)

"Let us draw an arrow arbitrarily. If as we follow the arrow we find more and more of the random element in the world, then the arrow is pointing towards the future; if the random element decreases the arrow points towards the past. [...] I shall use the phrase 'time's arrow' to express this one-way property of time which has no analogue in space. (Arthur Eddington, "The Nature of the Physical World", 1928

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010)

"And don’t ever make the mistake of thinking that things you didn’t intend or plan don’t matter. It’s a big, disorganised multiverse out there – an accident of stars. Almost nothing ever works out like we want it to, and when it does, there’s guaranteed to be unexpected consequences. Randomness is what separates life from entropy, but it’s also what makes it fun." (Foz Meadows, "An Accident of Stars", 2016)

"Only highly ordered and structured systems can display complex creative and unpredictable behaviour, and then only if they have the capacity to act with a degree of freedom and randomness. Systems which lack structure and organisation usually fail to produce anything much, they just tend to drift down the entropy gradient. This applies both to people and to organisations." (Peter J Carroll)

24 January 2021

Yevgeny Zamiatin - Collected Quotes

"All truths are erroneous. This is the very essence of the dialectical process: today’s truths become errors tomorrow." (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"Heretics are the only (bitter) remedy against the entropy of human thought." (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"No revolution, no heresy is comfortable or easy. For it is a leap, it is a break in the smooth evolutionary curve, and a break is a wound, a pain. But the wound is necessary; most of mankind suffers from hereditary sleeping sickness, and victims of this sickness (entropy) must not be allowed to sleep, or it will be their final sleep, death."  (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"Revolution is everywhere, in everything. It is infinite. There is no final revolution, no final number. The social revolution is only one of an infinite number of numbers; the law of revolution is not a social law, but an immeasurably greater one. It is a cosmic, universal law - like the laws of the conservation of energy and of the dissipation of energy (entropy)." (Yevgeny Zamiatin, "On Literature, Revolution, Entropy, and Other Matters", 1923)

"Everything in human society is being continually perfected - and should be." (Yevgeny Zamiatin, "We", 1924)

"The cruelest thing is to make a person doubt his own reality."  (Yevgeny Zamiatin, "We", 1924)

"The function of man’s highest faculty, his reason, consists precisely of the continuous limitation of infinity, the breaking up of infinity into convenient, easily digestible portions - differentials. This is precisely what lends my field, mathematics, its divine beauty." (Yevgeny Zamiatin, "We", 1924)

"The inevitable mark of truth is - its cruelty." (Yevgeny Zamiatin, "We", 1924)

23 January 2021

On Entropy (Unsourced)

"Just as the constant increase of entropy is the basic law of the universe, so it is the basic law of life to be ever more highly structured and to struggle against entropy." (Václav Havel, [Letter to Gustáv Husák]) 

"Only highly ordered and structured systems can display complex creative and unpredictable behaviour, and then only if they have the capacity to act with a degree of freedom and randomness. Systems which lack structure and organisation usually fail to produce anything much, they just tend to drift down the entropy gradient. This applies both to people and to organisations." (Peter J Carroll)

"Some of the science literature on entropy erroneously assumes that, as one goes forward in time, one moves from order to disorder. Naturally this is erroneous for the simple reason that there is no such thing as disorder; there is only complexity within simplicity." (Wald Wassermann)

"You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage." (John von Neumann) [Suggesting to Claude Shannon a name for his new uncertainty function, see Scientific American Vol. 225 (3), 1971] 

22 January 2021

Thermodynamics I

"The second fundamental theorem [the second law of thermodynamics], in the form which I have given to it, asserts that all transformations occurring in nature may take place in a certain direction, which I have assumed as positive, by themselves, that is, without compensation […] the entire condition of the universe must always continue to change in that first direction, and the universe must consequently approach incessantly a limiting condition. […] For every body two magnitudes have thereby presented themselves - the transformation value of its thermal content [the amount of inputted energy that is converted to 'work'], and its disgregation [separation or disintegration]; the sum of which constitutes its entropy." (Rudolf Clausius, "The Mechanical Theory of Heat", 1867)

"Since a given system can never of its own accord go over into another equally probable state but into a more probable one, it is likewise impossible to construct a system of bodies that after traversing various states returns periodically to its original state, that is a perpetual motion machine." (Ludwig E Boltzmann, "The Second Law of Thermodynamics", [Address to a Formal meeting of the Imperial Academy of Science], 1886)

"The second law of thermodynamics appears solely as a law of probability, entropy as a measure of the probability, and the increase of entropy is equivalent to a statement that more probable events follow less probable ones." (Max Planck, "A Survey of Physics", 1923)

"It was not easy for a person brought up in the ways of classical thermodynamics to come around to the idea that gain of entropy eventually is nothing more nor less than loss of information." (Gilbert N Lewis, [Letter to Irving Langmuir] 1930)

"True equilibria can occur only in closed systems and that, in open systems, disequilibria called ‘steady states’, or ‘flow equilibria’ are the predominant and characteristic feature. According to the second law of thermodynamics a closed system must eventually attain a time-independent equilibrium state, with maximum entropy and minimum free energy. An open system may, under certain conditions, attain a time-independent state where the system remains constant as a whole and in its phases, though there is a continuous flow of component materials. This is called a steady state. Steady states are irreversible as a whole. […] A closed system in equilibrium does not need energy for its preservation, nor can energy be obtained from it. In order to perform work, a system must be in disequilibrium, tending toward equilibrium and maintaining a steady state, Therefore the character of an open system is the necessary condition for the continuous working capacity of the organism." (Ludwig on Bertalanffy, "Theoretische Biologie: Band 1: Allgemeine Theorie, Physikochemie, Aufbau und Entwicklung des Organismus", 1932)

"When a transfer of matter to or from a system is also possible, the system may be called an open system." (Frank H MacDougall, "Thermodynamics and chemistry", ?1939)

"A theory is the more impressive the greater the simplicity of its premises is, the more different kinds of things it relates, and the more extended is its area of applicability. Therefore the deep impression which classical thermodynamics made upon me. It is the only physical theory of universal content concerning which I am convinced that, within the framework of the applicability of its basic concepts, it will never be overthrown (for the special attention of those who are skeptics on principle)." (Albert Einstein, "Autobiographical Notes", 1949)

"Reversible processes are not, in fact, processes at all, they are sequences of states of equilibrium. The processes which we encounter in real life are always irreversible processes." (Arnold Sommerfeld, "Thermodynamics and Statistical Mechanics", Lectures on Theoretical - Physics Vol. V, 1956)

Thermodynamics III

"My analysis of living systems uses concepts of thermodynamics, information theory, cybernetics, and systems engineering, as well as the classical concepts appropriate to each level. The purpose is to produce a description of living structure and process in terms of input and output, flows through systems, steady states, and feedbacks, which will clarify and unify the facts of life." (James G Miller, "Living Systems: Basic Concepts", 1969)

"In an isolated system, which cannot exchange energy and matter with the surroundings, this tendency is expressed in terms of a function of the macroscopic state of the system: the entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)

"[The] system may evolve through a whole succession of transitions leading to a hierarchy of more and more complex and organized states. Such transitions can arise in nonlinear systems that are maintained far from equilibrium: that is, beyond a certain critical threshold the steady-state regime become unstable and the system evolves into a new configuration." (Ilya Prigogine, Gregoire Micolis & Agnes Babloyantz, "Thermodynamics of Evolution", Physics Today 25 (11), 1972)

"The evolution of a physicochemical system leads to an equilibrium state of maximum disorder." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)

"The functional order maintained within living systems seems to defy the Second Law; nonequilibrium thermodynamics describes how such systems come to terms with entropy." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)

"When matter is becoming disturbed by non-equilibrium conditions it organizes itself, it wakes up. It happens that our world is a non-equilibrium system." (Ilya Prigogine, "Thermodynamics of Evolution", 1972)

"There is nothing supernatural about the process of self-organization to states of higher entropy; it is a general property of systems, regardless of their materials and origin. It does not violate the Second Law of thermodynamics since the decrease in entropy within an open system is always offset by the increase of entropy in its surroundings." (Ervin László, "Introduction to Systems Philosophy", 1972)

"Concepts form the basis for any science. These are ideas, usually somewhat vague (especially when first encountered), which often defy really adequate definition. The meaning of a new concept can seldom be grasped from reading a one-paragraph discussion. There must be time to become accustomed to the concept, to investigate it with prior knowledge, and to associate it with personal experience. Inability to work with details of a new subject can often be traced to inadequate understanding of its basic concepts." (William C Reynolds & Harry C Perkins, "Engineering Thermodynamics", 1977)

"Just like a computer, we must remember things in the order in which entropy increases. This makes the second law of thermodynamics almost trivial. Disorder increases with time because we measure time in the direction in which disorder increases."  (Stephen Hawking, "A Brief History of Time", 1988)

"Life is nature's solution to the problem of preserving information despite the second law of thermodynamics." (Howard L Resnikoff, "The Illusion of Reality", 1989)

Thermodynamics II

"Everywhere […] in the Universe, we discern that closed physical systems evolve in the same sense from ordered states towards a state of complete disorder called thermal equilibrium. This cannot be a consequence of known laws of change, since […] these laws are time symmetric- they permit […] time-reverse. […] The initial conditions play a decisive role in endowing the world with its sense of temporal direction. […] some prescription for initial conditions is crucial if we are to understand […]" (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"Three laws governing black hole changes were thus found, but it was soon noticed that something unusual was going on. If one merely replaced the words 'surface area' by 'entropy' and 'gravitational field' by 'temperature', then the laws of black hole changes became merely statements of the laws of thermodynamics. The rule that the horizon surface areas can never decrease in physical processes becomes the second law of thermodynamics that the entropy can never decrease; the constancy of the gravitational field around the horizon is the so-called zeroth law of thermodynamics that the temperature must be the same everywhere in a state of thermal equilibrium. The rule linking allowed changes in the defining quantities of the black hole just becomes the first law of thermodynamics, which is more commonly known as the conservation of energy." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"The second law of thermodynamics states that in an isolated system, entropy can only increase, not decrease. Such systems evolve to their state of maximum entropy, or thermodynamic equilibrium. Therefore, physical self-organizing systems cannot be isolated: they require a constant input of matter or energy with low entropy, getting rid of the internally generated entropy through the output of heat ('dissipation'). This allows them to produce ‘dissipative structures’ which maintain far from thermodynamic equilibrium. Life is a clear example of order far from thermodynamic equilibrium." (Carlos Gershenson, "Design and Control of Self-organizing Systems", 2007)

"Thermodynamics is about those properties of systems that are true independent of their mechanism. This is why there is a fundamental asymmetry in the relationship between mechanistic descriptions of systems and thermodynamic descriptions of systems. From the mechanistic information we can deduce all the thermodynamic properties of that system. However, given only thermodynamic information we can deduce nothing about mechanism. This is in spite of the fact that thermodynamics makes it possible for us to reject classes of models such as perpetual motion machines." (Carlos Gershenson, “Design and Control of Self-organizing Systems”, 2007)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010)

"The laws of thermodynamics tell us something quite different. Economic activity is merely borrowing low-entropy energy inputs from the environment and transforming them into temporary products and services of value. In the transformation process, often more energy is expended and lost to the environment than is embedded in the particular good or service being produced." (Jeremy Rifkin, "The Third Industrial Revolution", 2011)

"The reactions that break down large molecules into small ones do not require an input of energy, but the reactions that build up large molecules require and input of energy. This is consistent with the laws of thermodynamics, which say that large, orderly molecules tend to break down into small, disorderly molecules." (Stanley A Rice, "Life of Earth: Portrait of a Beautiful, Middle-aged Stressed-out World", 2011)

"The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there." (Steven Pinker, "The Second Law of Thermodynamics", 2017)

21 January 2021

Complex Systems III

"Complexity must be grown from simple systems that already work." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Even though these complex systems differ in detail, the question of coherence under change is the central enigma for each." (John H Holland," Hidden Order: How Adaptation Builds Complexity", 1995)

"By irreducibly complex I mean a single system composed of several well-matched, interacting parts that contribute to the basic function, wherein the removal of any one of the parts causes the system to effectively cease functioning. An irreducibly complex system cannot be produced directly (that is, by continuously improving the initial function, which continues to work by the same mechanism) by slight, successive modification of a precursor, system, because any precursors to an irreducibly complex system that is missing a part is by definition nonfunctional." (Michael Behe, "Darwin’s Black Box", 1996)

"A dictionary definition of the word ‘complex’ is: ‘consisting of interconnected or interwoven parts’ […] Loosely speaking, the complexity of a system is the amount of information needed in order to describe it. The complexity depends on the level of detail required in the description. A more formal definition can be understood in a simple way. If we have a system that could have many possible states, but we would like to specify which state it is actually in, then the number of binary digits (bits) we need to specify this particular state is related to the number of states that are possible." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"When the behavior of the system depends on the behavior of the parts, the complexity of the whole must involve a description of the parts, thus it is large. The smaller the parts that must be described to describe the behavior of the whole, the larger the complexity of the entire system. […] A complex system is a system formed out of many components whose behavior is emergent, that is, the behavior of the system cannot be simply inferred from the behavior of its components." (Yaneer Bar-Yamm, "Dynamics of Complexity", 1997)

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"There is no over-arching theory of complexity that allows us to ignore the contingent aspects of complex systems. If something really is complex, it cannot by adequately described by means of a simple theory. Engaging with complexity entails engaging with specific complex systems. Despite this we can, at a very basic level, make general remarks concerning the conditions for complex behaviour and the dynamics of complex systems. Furthermore, I suggest that complex systems can be modelled." (Paul Cilliers," Complexity and Postmodernism", 1998)

"The self-similarity of fractal structures implies that there is some redundancy because of the repetition of details at all scales. Even though some of these structures may appear to teeter on the edge of randomness, they actually represent complex systems at the interface of order and disorder."  (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

20 December 2020

On Noise III

"Economists should study financial markets as they actually operate, not as they assume them to operate - observing the way in which information is actually processed, observing the serial correlations, bonanzas, and sudden stops, not assuming these away as noise around the edges of efficient and rational markets." (Adair Turner, "Economics after the Crisis: Objectives and means", 2012)

"Finding patterns is easy in any kind of data-rich environment; that's what mediocre gamblers do. The key is in determining whether the patterns represent signal or noise." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"The signal is the truth. The noise is what distracts us from the truth." (Nate Silver, "The Signal and the Noise: Why So Many Predictions Fail-but Some Don't", 2012)

"When some systems are stuck in a dangerous impasse, randomness and only randomness can unlock them and set them free. You can see here that absence of randomness equals guaranteed death. The idea of injecting random noise into a system to improve its functioning has been applied across fields. By a mechanism called stochastic resonance, adding random noise to the background makes you hear the sounds (say, music) with more accuracy." (Nassim N Taleb, "Antifragile: Things that gain from disorder", 2012)

"A signal is a useful message that resides in data. Data that isn’t useful is noise. […] When data is expressed visually, noise can exist not only as data that doesn’t inform but also as meaningless non-data elements of the display (e.g. irrelevant attributes, such as a third dimension of depth in bars, color variation that has no significance, and artificial light and shadow effects)." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)

"Data contain descriptions. Some are true, some are not. Some are useful, most are not. Skillful use of data requires that we learn to pick out the pieces that are true and useful. [...] To find signals in data, we must learn to reduce the noise - not just the noise that resides in the data, but also the noise that resides in us. It is nearly impossible for noisy minds to perceive anything but noise in data." (Stephen Few, "Signal: Understanding What Matters in a World of Noise", 2015)

"When we find data quality issues due to valid data during data exploration, we should note these issues in a data quality plan for potential handling later in the project. The most common issues in this regard are missing values and outliers, which are both examples of noise in the data." (John D Kelleher et al, "Fundamentals of Machine Learning for Predictive Data Analytics: Algorithms, worked examples, and case studies", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"Repeated observations of the same phenomenon do not always produce the same results, due to random noise or error. Sampling errors result when our observations capture unrepresentative circumstances, like measuring rush hour traffic on weekends as well as during the work week. Measurement errors reflect the limits of precision inherent in any sensing device. The notion of signal to noise ratio captures the degree to which a series of observations reflects a quantity of interest as opposed to data variance. As data scientists, we care about changes in the signal instead of the noise, and such variance often makes this problem surprisingly difficult." (Steven S Skiena, "The Data Science Design Manual", 2017)

"Using noise (the uncorrelated variables) to fit noise (the residual left from a simple model on the genuinely correlated variables) is asking for trouble." (Steven S Skiena, "The Data Science Design Manual", 2017)

On Noise I

"Noise is the most impertinent of all forms of interruption. It is not only an interruption, but also a disruption of thought." (Arthur Schopenhauer, "Parerga and Paralipomena", 1851)

"Mathematics is the predominant science of our time; its conquests grow daily, though without noise; he who does not employ it for himself, will some day find it employed against himself." (Johann F Herbart, Werke, 1890)

"Life pushes its way through this fatalistically determined world like a river flowing upstream. It is a system of utterly improbable order, a message in a world of noise." (Joseph H Rush, "The Dawn of Life", 1957)

"Higher, directed forms of energy (e.g., mechanical, electric, chemical) are dissipated, that is, progressively converted into the lowest form of energy, i.e., undirected heat movement of molecules; chemical systems tend toward equilibria with maximum entropy; machines wear out owing to friction; in communication channels, information can only be lost by conversion of messages into noise but not vice versa, and so forth." (Ludwig von Bertalanffy, "Robots, Men and Minds", 1967)

"To adapt to a changing environment, the system needs a variety of stable states that is large enough to react to all perturbations but not so large as to make its evolution uncontrollably chaotic. The most adequate states are selected according to their fitness, either directly by the environment, or by subsystems that have adapted to the environment at an earlier stage. Formally, the basic mechanism underlying self-organization is the (often noise-driven) variation which explores different regions in the system’s state space until it enters an attractor. This precludes further variation outside the attractor, and thus restricts the freedom of the system’s components to behave independently. This is equivalent to the increase of coherence, or decrease of statistical entropy, that defines self-organization." (Francis Heylighen, "The Science Of Self-Organization And Adaptivity", 1970)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of 'nois' is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"An essential element of dynamics systems is a positive feedback that self-enhances the initial deviation from the mean. The avalanche is proverbial. Cities grow since they attract more people, and in the universe, a local accumulation of dust may attract more dust, eventually leading to the birth of a star. Earlier or later, self-enhancing processes evoke an antagonistic reaction. A collapsing stock market stimulates the purchase of shares at a low price, thereby stabilizing the market. The increasing noise, dirt, crime and traffic jams may discourage people from moving into a big city." (Hans Meinhardt, "The Algorithmic Beauty of Sea Shells", 1995)

"Rather mathematicians like to look for patterns, and the primes probably offer the ultimate challenge. When you look at a list of them stretching off to infinity, they look chaotic, like weeds growing through an expanse of grass representing all numbers. For centuries mathematicians have striven to find rhyme and reason amongst this jumble. Is there any music that we can hear in this random noise? Is there a fast way to spot that a particular number is prime? Once you have one prime, how much further must you count before you find the next one on the list? These are the sort of questions that have tantalized generations." (Marcus du Sautoy, "The Music of the Primes", 1998)

"Data are collected as a basis for action. Yet before anyone can use data as a basis for action the data have to be interpreted. The proper interpretation of data will require that the data be presented in context, and that the analysis technique used will filter out the noise."  (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

"Data are generally collected as a basis for action. However, unless potential signals are separated from probable noise, the actions taken may be totally inconsistent with the data. Thus, the proper use of data requires that you have simple and effective methods of analysis which will properly separate potential signals from probable noise." (Donald J Wheeler, "Understanding Variation: The Key to Managing Chaos" 2nd Ed., 2000)

19 December 2020

On Randomness V (Systems I)

"Is a random outcome completely determined, and random only by virtue of our ignorance of the most minute contributing factors? Or are the contributing factors unknowable, and therefore render as random an outcome that can never be determined? Are seemingly random events merely the result of fluctuations superimposed on a determinate system, masking its predictability, or is there some disorderliness built into the system itself?” (Deborah J Bennett, "Randomness", 1998)

"The self-similarity of fractal structures implies that there is some redundancy because of the repetition of details at all scales. Even though some of these structures may appear to teeter on the edge of randomness, they actually represent complex systems at the interface of order and disorder."  (Edward Beltrami, "What is Random?: Chaos and Order in Mathematics and Life", 1999)

"Emergent self-organization in multi-agent systems appears to contradict the second law of thermodynamics. This paradox has been explained in terms of a coupling between the macro level that hosts self-organization (and an apparent reduction in entropy), and the micro level (where random processes greatly increase entropy). Metaphorically, the micro level serves as an entropy 'sink', permitting overall system entropy to increase while sequestering this increase from the interactions where self-organization is desired." (H Van Dyke Parunak & Sven Brueckner, "Entropy and Self-Organization in Multi-Agent Systems", Proceedings of the International Conference on Autonomous Agents, 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"If a network is solely composed of neighborhood connections, information must traverse a large number of connections to get from place to place. In a small-world network, however, information can be transmitted between any two nodes using, typically, only a small number of connections. In fact, just a small percentage of random, long-distance connections is required to induce such connectivity. This type of network behavior allows the generation of 'six degrees of separation' type results, whereby any agent can connect to any other agent in the system via a path consisting of only a few intermediate nodes." (John H Miller & Scott E Page, "Complex Adaptive Systems", 2007)

"Although the potential for chaos resides in every system, chaos, when it emerges, frequently stays within the bounds of its attractor(s): No point or pattern of points is ever repeated, but some form of patterning emerges, rather than randomness. Life scientists in different areas have noticed that life seems able to balance order and chaos at a place of balance known as the edge of chaos. Observations from both nature and artificial life suggest that the edge of chaos favors evolutionary adaptation." (Terry Cooke-Davies et al, "Exploring the Complexity of Projects", 2009)

"Most systems in nature are inherently nonlinear and can only be described by nonlinear equations, which are difficult to solve in a closed form. Non-linear systems give rise to interesting phenomena such as chaos, complexity, emergence and self-organization. One of the characteristics of non-linear systems is that a small change in the initial conditions can give rise to complex and significant changes throughout the system. This property of a non-linear system such as the weather is known as the butterfly effect where it is purported that a butterfly flapping its wings in Japan can give rise to a tornado in Kansas. This unpredictable behaviour of nonlinear dynamical systems, i.e. its extreme sensitivity to initial conditions, seems to be random and is therefore referred to as chaos. This chaotic and seemingly random behaviour occurs for non-linear deterministic system in which effects can be linked to causes but cannot be predicted ahead of time." (Robert K Logan, "The Poetry of Physics and The Physics of Poetry", 2010)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010)

"[...] a high degree of unpredictability is associated with erratic trajectories. This not only because they look random but mostly because infinitesimally small uncertainties on the initial state of the system grow very quickly - actually exponentially fast. In real world, this error amplification translates into our inability to predict the system behavior from the unavoidable imperfect knowledge of its initial state." (Massimo Cencini, "Chaos: From Simple Models to Complex Systems", 2010)

"Although cascading failures may appear random and unpredictable, they follow reproducible laws that can be quantified and even predicted using the tools of network science. First, to avoid damaging cascades, we must understand the structure of the network on which the cascade propagates. Second, we must be able to model the dynamical processes taking place on these networks, like the flow of electricity. Finally, we need to uncover how the interplay between the network structure and dynamics affects the robustness of the whole system." (Albert-László Barabási, "Network Science", 2016)

17 December 2020

On Entropy (1990-1999)

"The inflationary period of expansion does not smooth out irregularity by entropy-producing processes like those explored by the cosmologies of the seventies. Rather it sweeps the irregularity out beyond the Horizon of our visible Universe, where we cannot see it . The entire universe of stars and galaxies on view to us. […] on this hypothesis, is but the reflection of a minute, perhaps infinitesimal, portion of the universe's initial conditions, whose ultimate extent and structure must remain forever unknowable to us. A theory of everything does not help here. The information contained in the observable part of the universe derives from the evolution of a tiny part of the initial conditions for the entire universe. The sum total of all the observations we could possibly make can only tell us about a minuscule portion of the whole." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"Three laws governing black hole changes were thus found, but it was soon noticed that something unusual was going on. If one merely replaced the words 'surface area' by 'entropy' and 'gravitational field' by 'temperature', then the laws of black hole changes became merely statements of the laws of thermodynamics. The rule that the horizon surface areas can never decrease in physical processes becomes the second law of thermodynamics that the entropy can never decrease; the constancy of the gravitational field around the horizon is the so-called zeroth law of thermodynamics that the temperature must be the same everywhere in a state of thermal equilibrium. The rule linking allowed changes in the defining quantities of the black hole just becomes the first law of thermodynamics, which is more commonly known as the conservation of energy." (John D Barrow, "Theories of Everything: The Quest for Ultimate Explanation", 1991)

"The new information technologies can be seen to drive societies toward increasingly dynamic high-energy regions further and further from thermodynamical equilibrium, characterized by decreasing specific entropy and increasingly dense free-energy flows, accessed and processed by more and more complex social, economic, and political structures." (Ervin László, "Information Technology and Social Change: An Evolutionary Systems Analysis", Behavioral Science 37, 1992) 

"Fuzzy entropy measures the fuzziness of a fuzzy set. It answers the question 'How fuzzy is a fuzzy set?' And it is a matter of degree. Some fuzzy sets are fuzzier than others. Entropy means the uncertainty or disorder in a system. A set describes a system or collection of things. When the set is fuzzy, when elements belong to it to some degree, the set is uncertain or vague to some degree. Fuzzy entropy measures this degree. And it is simple enough that you can see it in a picture of a cube." (Bart Kosko, "Fuzzy Thinking: The new science of fuzzy logic", 1993)

"The Law of Entropy Nonconservation required that life be lived forward, from birth to death. […] To wish for the reverse was to wish for the entropy of the universe to diminish with time, which was impossible. One might as well wish for autumn leaves to assemble themselves in neat stacks just as soon as they had fallen from trees or for water to freeze whenever it was heated." (Michael Guillen, "Five Equations That Changed the World", 1995)

"The second law of thermodynamics, which requires average entropy (or disorder) to increase, does not in any way forbid local order from arising through various mechanisms of self-organization, which can turn accidents into frozen ones producing extensive regularities. Again, such mechanisms are not restricted to complex adaptive systems." (Murray Gell-Mann, "What is Complexity?", Complexity Vol 1 (1), 1995)

"All systems evolve, although the rates of evolution may vary over time both between and within systems. The rate of evolution is a function of both the inherent stability of the system and changing environmental circumstances. But no system can be stabilized forever. For the universe as a whole, an isolated system, time’s arrow points toward greater and greater breakdown, leading to complete molecular chaos, maximum entropy, and heat death. For open systems, including the living systems that are of major interest to us and that interchange matter and energy with their external environments, time’s arrow points to evolution toward greater and greater complexity. Thus, the universe consists of islands of increasing order in a sea of decreasing order. Open systems evolve and maintain structure by exporting entropy to their external environments." (L Douglas Kiel, "Chaos Theory in the Social Sciences: Foundations and Applications", 1996)

"Contrary to what happens at equilibrium, or near equilibrium, systems far from equilibrium do not conform to any minimum principle that is valid for functions of free energy or entropy production." (Ilya Prigogine, "The End of Certainty: Time, Chaos, and the New Laws of Nature", 1996)

"Complex systems operate under conditions far from equilibrium. Complex systems need a constant flow of energy to change, evolve and survive as complex entities. Equilibrium, symmetry and complete stability mean death. Just as the flow, of energy is necessary to fight entropy and maintain the complex structure of the system, society can only survive as a process. It is defined not by its origins or its goals, but by what it is doing." (Paul Cilliers,"Complexity and Postmodernism: Understanding Complex Systems", 1998)

"In a closed system, the change in entropy must always be 'positive', meaning toward death. However, in open biological or social systems, entropy can be arrested and may even be transformed into negative entropy - a process of more complete organization and enhanced ability to transform resources. Why? Because the system imports energy and resources from its environment, leading to renewal. This is why education and learning are so important, as they provide new and stimulating input (termed neg-entropy) that can transform each of us." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

"No one has yet succeeded in deriving the second law from any other law of nature. It stands on its own feet. It is the only law in our everyday world that gives a direction to time, which tells us that the universe is moving toward equilibrium and which gives us a criteria for that state, namely, the point of maximum entropy, of maximum probability. The second law involves no new forces. On the contrary, it says nothing about forces whatsoever." (Brian L Silver, "The Ascent of Science", 1998)

"Physical systems are subject to the force of entropy, which increases until eventually the entire system fails. The tendency toward maximum entropy is a movement to disorder, complete lack of resource transformation, and death." (Stephen G Haines, "The Managers Pocket Guide to Systems Thinking & Learning", 1998)

08 December 2020

On Entropy (From Fiction to Science-Fiction)

"One thinks one’s something unique and wonderful at the center of the universe. But in fact one’s merely a slight delay in the ongoing march of entropy." (Aldous Huxley, "Island", 1962)

"No structure, even an artificial one, enjoys the process of entropy. It is the ultimate fate of everything, and everything resists it." (Philip K Dick, "Galactic Pot-Healer", 1969)

"When things don't change any longer, that's the end result of entropy, the heat-death of the universe. The more things go on moving, interrelating, conflicting, changing, the less balance there is - and the more life." (Ursula K Le Guin, "The Lathe of Heaven", 1971)

"In the wastes of nonbeing it is born, flickers out, is born again and holds together, swells and spreads. In lifelessness it lives, against the gray tide of entropy it strives, improbably persists, gathering itself into ever richer complexities until it grows as a swelling wave. (James Tiptree Jr., "SheWaits for All Men Born", 1976)

"Her dance spoke of nothing more and nothing less than the tragedy of being alive, and being human. It spoke, most eloquently, of pain. It spoke, most knowingly, of despair. It spoke of the cruel humor of limitless ambition yoked to limited ability, of eternal hope invested in an ephemeral lifetime, of the driving need to try and create an inexorably predetermined future. It spoke of fear, and of hunger, and, most clearly, of the basic loneliness and alienation of the human animal. It described the universe through the eyes of man: a hostile environment, the embodiment of entropy, into which we are all thrown alone, forbidden by our nature to touch another mind save secondhand, by proxy. It spoke of the blind perversity which forces man to strive hugely for a peace which, once attained, becomes boredom. And it spoke of folly, of the terrible paradox by which man is simultaneously capable of reason and unreason, forever unable to cooperate even with himself." Spider Robinson and Jeanne Robinson, "Stardance", 1977)

"We see the universe as it is, Father Damien, and these naked truths are cruel ones. We who believe in life, and treasure it, will die. Afterward there will be nothing, eternal emptiness, blackness, nonexistence. In our living there has been no purpose, no poetry, no meaning. Nor do our deaths possess these qualities. When we are gone, the universe will not long remember us, and shortly it will be as if we had never lived at all. Our worlds and our universe will not long outlive us. Ultimately entropy will consume all, and our puny efforts cannot stay that awful end." (George R R Martin, "The Way of Cross and Dragon", 1979)

"But no longer were they always obedient to the mandates of their creators; like all material things, they were not immune to the corruptions of Time and its patient, unsleeping servant, Entropy." (Arthur C Clark, "3001: The Final Odyssey", 1997)

"Out of twinkling stardust all came, into dark matter all will fall. Death mocks us as we laugh defiance at entropy, yet ignorance birthed mortals sail forth upon time’s cruel sea." (Peter F Hamilton, "The Temporal Void", 2008)

"Yet, in the end, entropy will always emerge victorious, snuffing out the very last glimmer of heat and light. After that there is only darkness. When that state is reached even eternity will cease to exist, for one moment will be like every other and nothingness will claim the universe." (Peter F Hamilton, "The Temporal Void", 2008)

"Nothing up there tonight but entropy, and the same imaginary shapes that people had been imposing on nature since they’d first thought to wonder at the heavens." (Peter Watts, "Echopraxia", 2014)

"The process of thinking itself requires us to view the universe in the direction of entropy, since an abstraction always involves information loss, since symbols 'abstract' complexity from observed objects." (John C Wright, "Awake in the Night Land", 2014)

"And don’t ever make the mistake of thinking that things you didn’t intend or plan don’t matter. It’s a big, disorganised multiverse out there - an accident of stars. Almost nothing ever works out like we want it to, and when it does, there’s guaranteed to be unexpected consequences. Randomness is what separates life from entropy, but it’s also what makes it fun." (Foz Meadows, "An Accident of Stars", 2016)

"Entropy is just a fancy way of saying: things fall apart." (Dan Brown, "Origin", 2017)
