"[...] according to the quantum theory, randomness is a basic trait of reality, whereas in classical physics it is a derivative property, though an equally objective one. Note, however, that this conclusion follows only under the realist interpretation of probability as the measure of possibility. If, by contrast, one adopts the subjectivist or Bayesian conception of probability as the measure of subjective uncertainty, then randomness is only in the eye of the beholder." (Mario Bunge, "Matter and Mind: A Philosophical Inquiry", 2010)
"Second Law of thermodynamics is not an equality, but an inequality, asserting merely that a certain quantity referred to as the entropy of an isolated system - which is a measure of the system’s disorder, or ‘randomness’ - is greater (or at least not smaller) at later times than it was at earlier times." (Roger Penrose, "Cycles of Time: An Extraordinary New View of the Universe", 2010)
"In an information economy, entrepreneurs master the science of information in order to overcome the laws of the purely physical sciences. They can succeed because of the surprising power of the laws of information, which are conducive to human creativity. The central concept of information theory is a measure of freedom of choice. The principle of matter, on the other hand, is not liberty but limitation - it has weight and occupies space." (George Gilder, "Knowledge and Power: The Information Theory of Capitalism and How it is Revolutionizing our World", 2013)
"In negative feedback regulation the organism has set points to which different parameters (temperature, volume, pressure, etc.) have to be adapted to maintain the normal state and stability of the body. The momentary value refers to the values at the time the parameters have been measured. When a parameter changes it has to be turned back to its set point. Oscillations are characteristic to negative feedback regulation […]" (Gaspar Banfalvi, "Homeostasis - Tumor – Metastasis", 2014)
"Fuzzy thinking can never be proven wrong. And only when we are proven wrong so clearly that we can no longer deny it to ourselves will we adjust our mental models of the world - producing a clearer picture of reality. Forecast, measure, revise: it is the surest path to seeing better." (Philip E Tetlock, "Superforecasting: The Art and Science of Prediction", 2015)
"The proper measure of a philosophical system or a scientific theory is not the degree to which it anticipated modern thought, but its degree of success in treating the philosophical and scientific problems of its own day." (Steven Weinberg, "To Explain the World: The Discovery of Modern Science", 2015)
"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)
"Statistics is the science of collecting, organizing, summarizing, and analyzing information to draw conclusions or answer questions. In addition, statistics is about providing a measure of confidence in any conclusions." (Michael Sullivan, "Statistics: Informed Decisions Using Data", 5th Ed., 2017)