10 September 2023

On Entropy (2010-2019)

"Entropy is the crisp scientific name for waste, chaos, and disorder. As far as we know, the sole law of physics with no known exceptions anywhere in the universe is this: All creation is headed to the basement. Everything in the universe is steadily sliding down the slope toward the supreme equality of wasted heat and maximum entropy." (Kevin Kelly, "What Technology Wants", 2010)

"If everything in the universe evolves toward increasing disorder, it must have started out in an exquisitely ordered arrangement. This whole chain of logic, purporting to explain why you can't turn an omelet into an egg, apparently rests on a deep assumption about the very beginning of the universe. It was in a state of very low entropy, very high order. Why did our part of the universe pass though a period of such low entropy?" (Sean Carroll, "From Eternity to Here: The Quest for the Ultimate Theory of Time", 2010)

"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010) 

"Information, defined intuitively and informally, might be something like 'uncertainty's antidote'. This turns out also to be the formal definition - the amount of information comes from the amount by which something reduces uncertainty. [...] The higher the [information] entropy, the more information there is. It turns out to be a value capable of measuring a startling array of things- from the flip of a coin to a telephone call, to a Joyce novel, to a first date, to last words, to a Turing test. [...] Entropy suggests that we gain the most insight on a question when we take it to the friend, colleague, or mentor of whose reaction and response we're least certain. And it suggests, perhaps, reversing the equation, that if we want to gain the most insight into a person, we should ask the question of whose answer we're least certain. [...] Pleasantries are low entropy, biased so far that they stop being an earnest inquiry and become ritual. Ritual has its virtues, of course, and I don't quibble with them in the slightest. But if we really want to start fathoming someone, we need to get them speaking in sentences we can't finish." (Brian Christian, "The Most Human Human: What Talking with Computers Teaches Us About What It Means to Be Alive", 2011)

"The laws of thermodynamics tell us something quite different. Economic activity is merely borrowing low-entropy energy inputs from the environment and transforming them into temporary products and services of value. In the transformation process, often more energy is expended and lost to the environment than is embedded in the particular good or service being produced." (Jeremy Rifkin, "The Third Industrial Revolution", 2011)

"The psychic entropy peculiar to the human condition involves seeing more to do than one can actually accomplish and feeling able to accomplish more than what conditions allow."(Mihaly Csikszentmihalyi, "Flow: The Psychology of Happiness", 2013)

"In a physical system, information is the opposite of entropy, as it involves uncommon and highly correlated configurations that are difficult to arrive at." (César A Hidalgo, "Why Information Grows: The Evolution of Order, from Atoms to Economies", 2015)

"The passage of time and the action of entropy bring about ever-greater complexity - a branching, blossoming tree of possibilities. Blossoming disorder (things getting worse), now unfolding within the constraints of the physics of our universe, creates novel opportunities for spontaneous ordered complexity to arise." (D J MacLennan, "Frozen to Life", 2015)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"The natural effect of processes going on in the Universe is to move from a state of order to a state of disorder, unless there is an input of energy from outside." (John R Gribbin, "The Time Illusion", 2016) 

"The Second Law of Thermodynamics states that in an isolated system (one that is not taking in energy), entropy never decreases. (The First Law is that energy is conserved; the Third, that a temperature of absolute zero is unreachable.) Closed systems inexorably become less structured, less organized, less able to accomplish interesting and useful outcomes, until they slide into an equilibrium of gray, tepid, homogeneous monotony and stay there." (Steven Pinker, "The Second Law of Thermodynamics", 2017) [source

"Our greatest enemies are ultimately not our political adversaries but entropy, evolution (in the form of pestilence and the flaws in human nature), and most of all ignorance - a shortfall of knowledge of how best to solve our problems." (Steven Pinker, "Enlightenment Now: The Case for Reason, Science, Humanism, and Progress", 2018)

"[...] our vision of the world is blurred because the physical interactions between the part of the world to which we belong and the rest are blind to many variables. This blurring is at the heart of Boltzmann's theory. From this blurring, the concepts of heat and entropy are born - and these are linked to the phenomena that characterize the flow of time. The entropy of a system depends explicitly on blurring. It depends on what I do not register, because it depends on the number of indistinguishable configurations. The same microscopic configuration may be of high entropy with regard to one blurring and low in relation to another." (Carlo Rovelli, "The Order of Time", 2018)

"The entropy of the world does not depend only on the configuration of the world; it also depends on the way in which we are blurring the world, and this depends on what the variables of the world are that we interact with. That is to say, on the variables with which our part of the world interacts." (Carlo Rovelli, "The Order of Time", 2018)

"The entropy of the world in the far past appears very low to us. But this might not reflect the exact state of the world: it might regard the subset of the world's variables with which we, as physical systems, have interacted. It is with respect to the dramatic blurring produced by our interactions with the world, caused by the small set of macroscopic variables in terms of which we describe the world, that the entropy of the universe was low." (Carlo Rovelli, "The Order of Time", 2018)

"Disorder is a collective property of large assemblages; it makes no sense to say a single molecule is disordered or random. Thermodynamic quantities like entropy and heat energy are defined by reference to enormous numbers of particles - for example, molecules of gas careering about - and averaging across them without considering the details of individual particles. (Such averaging is sometimes called a ‘coarse-grained view’.) Thus, the temperature of a gas is related to the average energy of motion of the gas molecules. The point is that whenever one takes an average some information is thrown away, that is, we accept some ignorance. The average height of a Londoner tells us nothing about the height of a specific person. Likewise, the temperature of a gas tells us nothing about the speed of a specific molecule. In a nutshell: information is about what you know, and entropy is about what you don’t know." (Paul Davies, "The Demon in the Machine: How Hidden Webs of Information Are Solving the Mystery of Life", 2019)
