07 August 2022

Edwin T Jaynes - Collected Quotes

"In conventional statistical mechanics the energy plays a preferred role among all dynamical quantities because it is conserved both in the time development of isolated systems and in the interaction of different systems. Since, however, the principles of maximum-entropy inference are independent of any physical properties, it appears that in subjective statistical mechanics all measurable quantities may be treated on the same basis, subject to certain precautions." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"Just as in applied statistics the crux of a problem is often the devising of some method of sampling that avoids bias, our problem is that of finding a probability assignment which avoids bias, while agreeing with whatever information is given. The great advance provided by information theory lies in the discovery that there is a unique, unambiguous criterion for the 'amount of uncertainty' represented by a discrete probability distribution, which agrees with our intuitive notions that a broad distribution represents more uncertainty than does a sharply peaked one, and satisfies all other conditions which make it reasonable." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"On the other hand, the 'subjective' school of thought, regards probabilities as expressions of human ignorance; the probability of an event is merely a formal expression of our expectation that the event will or did occur, based on whatever information is available. To the subjectivist, the purpose of probability theory is to help us in forming plausible conclusions in cases where there is not enough information available to lead to certain conclusions; thus detailed verification is not expected. The test of a good subjective probability distribution is does it correctly represent our state of knowledge as to the value of x?" (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"The mere fact that the same mathematical expression -Σ pi log(pi) [i is index], occurs both in statistical mechanics and in information theory does not in itself establish any connection between these fields. This can be done only by finding new viewpoints from which thermodynamic entropy and information-theory entropy appear as the same concept." (Edwin T Jaynes, "Information Theory and Statistical Mechanics" I, 1956)

"[...] thermodynamics knows of no such notion as the 'entropy of a physical system'. Thermodynamics does have the concept of the entropy of a thermodynamic system; but a given physical system corresponds to many different thermodynamic systems." (Edwin T Jaynes, "Gibbs vs Boltzmann Entropies", 1964)

"In particular, the uncertainty principle has stood for a generation, barring the way to more detailed descriptions of nature; and yet, with the lesson of parity still fresh in our minds, how can anyone be quite so sure of its universal validity when we note that, to this day, it has never been subjected to even one direct experimental test?" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"'You cannot base a general mathematical theory on imprecisely defined concepts. You can make some progress that way; but sooner or later the theory is bound to dissolve in ambiguities which prevent you from extending it further.' Failure to recognize this fact has another unfortunate consequence which is, in a practical sense, even more disastrous: 'Unless the conceptual problems of a field have been clearly resolved, you cannot say which mathematical problems are the relevant ones worth working on; and your efforts are more than likely to be wasted.'" (Edwin T Jaynes, "Foundations of Probability Theory and Statistical Mechanics", 1967)

"In decision theory, mathematical analysis shows that once the sampling distribution, loss function, and sample are specified, the only remaining basis for a choice among different admissible decisions lies in the prior probabilities. Therefore, the logical foundations of decision theory cannot be put in fully satisfactory form until the old problem of arbitrariness (sometimes called 'subjectiveness') in assigning prior probabilities is resolved." (Edwin T Jaynes, "Prior Probabilities", 1978)

"It appears to be a quite general principle that, whenever there is a randomized way of doing something, then there is a nonrandomized way that delivers better performance but requires more thought." (Edwin T Jaynes, "Probability Theory: The Logic of Science", 1979)

"The semiliterate on the next bar stool will tell you with absolute, arrogant assurance just how to solve the world's problems; while the scholar who has spent a lifetime studying their causes is not at all sure how to do this." (Edwin T Jaynes, "Probability Theory: The Logic of Science", 1979)

"The difference is that energy is a property of the microstates, and so all observers, whatever macroscopic variables they may choose to define their thermodynamic states, must ascribe the same energy to a system in a given microstate. But they will ascribe different entropies to that microstate, because entropy is not a property of the microstate, but rather of the reference class in which it is embedded. As we learned from Boltzmann, Planck, and Einstein, the entropy of a thermodynamic state is a measure of the number of microstates compatible with the macroscopic quantities that you or I use to define the thermodynamic state." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)

"There is no end to this search for the ultimate ‘true’ entropy until we have reached the point where we control the location of each atom independently. But just at that point the notion of entropy collapses, and we are no longer talking thermodynamics." (Edwin T Jaynes, "Papers on Probability, Statistics, and Statistical Physics", 1983)
