29 January 2026

On Measures (1975-1999)

 "The amount of information conveyed by the message increases as the amount of uncertainty as to what message actually will be produced becomes greater. A message which is one out of ten possible messages conveys a smaller amount of information than a message which is one out of a million possible messages. The entropy of communication theory is a measure of this uncertainty and the uncertainty, or entropy, is taken as the measure of the amount of information conveyed by a message from a source. The more we know about what message the source will produce, the less uncertainty, the less the entropy, and the less the information." (John R Pierce, "An Introduction to Information Theory: Symbols, Signals and Noise", 1979)

"The term closed loop-learning process refers to the idea that one learns by determining what s desired and comparing what is actually taking place as measured at the process and feedback for comparison. The difference between what is desired and what is taking place provides an error indication which is used to develop a signal to the process being controlled." (Harold Chestnut, 1984)

"Just like a computer, we must remember things in the order in which entropy increases. This makes the second law of thermodynamics almost trivial. Disorder increases with time because we measure time in the direction in which disorder increases."  (Stephen Hawking, "A Brief History of Time", 1988)

"Engineers, always looking for optimal values for the measures of magnitudes which interest them, think of mathematicians as custodians of a fund of formulae, to be supplied to them on demand." (Jean Dieudonné, "Mathematics - The Music of Reason", 1992)

"It has long been appreciated by science that large numbers behave differently than small numbers. Mobs breed a requisite measure of complexity for emergent entities. The total number of possible interactions between two or more members accumulates exponentially as the number of members increases. At a high level of connectivity, and a high number of members, the dynamics of mobs takes hold. " (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations." (Carl Sagan, "The Demon-Haunted World: Science as a Candle in the Dark", 1995)

"Yet everything has a beginning, everything comes to an end, and if the universe actually began in some dense explosion, thus creating time and space, so time and space are themselves destined to disappear, the measure vanishing with the measured, until with another ripple running through the primordial quantum field, something new arises from nothingness once again." (David Berlinski, "A Tour of the Calculus", 1995)

"Probabilities aren't just numbers, and they aren't just frequencies-on-average. They are also rather like a substance that flows, dividing according to the likelihood of various outcomes, subdividing when several trials are performed in succession, and adding together when several outcomes are combined to give an event. This is a metaphor, but an accurate and powerful one. It is, in a sense, the metaphor that mathematicians formalise when they offer a definition of probability. In this sense, probability behaves like volume, mass, or area. The technical term is 'measure'. The technical definition of probability is 'a measure such that various nice things happen'. Probability is a quantity that flows through the conceptual maze of possible events, and it behaves just like water flowing through pipes." (Ian Stewart, "The Magical Maze: Seeing the World Through Mathematical Eyes", 1997)
