"If for the entire universe we conceive the same magnitude to be determined, consistently and with due regard to all circumstances, which for a single body I have called entropy, and if at the same time we introduce the other and simpler conception of energy, we may express in the following manner the fundamental laws of the universe which correspond to the two fundamental theorems of the mechanical theory of heat: (1) The energy of the universe is constant. (2) The entropy of the universe tends to a maximum." (Rudolf Clausius, "The Mechanical Theory of Heat - With its Applications to the Steam Engine and to Physical Properties of Bodies", 1867)
"The Entropy of a system is the mechanical work it can perform without communication of heat, or alteration of its total volume, all transference of heat being performed by reversible engines. When the pressure and temperature of the system have become uniform the entropy is exhausted." (James C Maxwell, "Theory of Heat", 1899)
"The second law of thermodynamics appears solely as a law of probability, entropy as a measure of the probability, and the increase of entropy is equivalent to a statement that more probable events follow less probable ones." (Max Planck, "A Survey of Physics", 1923)
"Entropy is the measure of randomness." (Lincoln Barnett, "The Universe and Dr. Einstein", 1948)
"Just as entropy is a measure of disorganization, the information carried by a set of messages is a measure of organization." (Norbert Wiener, "The Human Use of Human Beings", 1950)
"Entropy is a measure of the heat energy in a substance that has been lost and is no longer available for work. It is a measure of the deterioration of a system." (William B. Sill & Norman Hoss (Eds.), "Popular Science Encyclopedia of the Sciences", 1963)
"You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, no one really knows what entropy really is, so in a debate you will always have the advantage." (John von Neumann) [Suggesting to Claude Shannon a name for his new uncertainty function, see Scientific American Vol. 225 (3), 1971]
"Thus, in physics, entropy is associated with the possibility of converting thermal energy into mechanical energy. If the entropy does not change during a process, the process is reversible. If the entropy increases, the available energy decreases. Statistical mechanics interprets an increase of entropy as a decrease in order or, if we wish, as a decrease in our knowledge." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals & Noise" 2nd Ed., 1980)
"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)
"Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information." (Hans C von Baeyer," Information, The New Language of Science", 2003)
"Heat is the energy of random chaotic motion, and entropy is the amount of hidden microscopic information." (Leonard Susskind, "The Black Hole War", 2008)
"Entropy is the crisp scientific name for waste, chaos, and disorder." (Kevin Kelly, "What Technology Wants", 2010)
"Entropy is a measure of amount of uncertainty or disorder present in the system within the possible probability distribution. The entropy and amount of unpredictability are directly proportional to each other." (G Suseela & Y Asnath V Phamila, "Security Framework for Smart Visual Sensor Networks", 2019)