08 November 2025

On Information Theory II

"If quantum communication and quantum computation are to flourish, a new information theory will have to be developed." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In fact, an information theory that leaves out the issue of noise turns out to have no content." (Hans Christian von Baeyer, "Information, The New Language of Science", 2003)

"In an information economy, entrepreneurs master the science of information in order to overcome the laws of the purely physical sciences. They can succeed because of the surprising power of the laws of information, which are conducive to human creativity. The central concept of information theory is a measure of freedom of choice. The principle of matter, on the other hand, is not liberty but limitation - it has weight and occupies space." (George Gilder, "Knowledge and Power: The Information Theory of Capitalism and How it is Revolutionizing our World", 2013)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"The mathematical theory of information [...] is solely concerned with the ‘surprise’ aspect of information. There are two reasons for this. Firstly, information theory was originally developed within the context of communication engineering, where it was only the surprise factor that was of relevance. Secondly, ‘meaning’ has so far proved too difficult a concept to develop mathematically. The consequence of this is that we should be aware that ‘information’ [...] has the restricted technical meaning of ‘measure of surprise’." (David Applebaum," Probability and Information: An Integrated Approach", 2008)

"Cognitive psychology has followed a different direction, considering intelligence as a set of mental representations and a series of processes that operate on these representations that allows the individual to adapt to the changing conditions of the environment. This type of approach is connected with information theory. The intelligent mind operates by processing information that it collects from the environment, and the better and faster this information is processed, the more intelligence is demonstrated." (Diego Rasskin-Gutman, "Chess Metaphors: Artificial Intelligence and the Human Mind", 2009)

"Information Theory is a mathematical treatment of what is left after the meanings have been removed from a Communication." (John Gall, "The Systems Bible: The Beginner's Guide to Systems Large and Small"[Systematics 3rd Ed.], 2011)

"In an information economy, entrepreneurs master the science of information in order to overcome the laws of the purely physical sciences. They can succeed because of the surprising power of the laws of information, which are conducive to human creativity. The central concept of information theory is a measure of freedom of choice. The principle of matter, on the other hand, is not liberty but limitation - it has weight and occupies space." (George Gilder, "Knowledge and Power: The Information Theory of Capitalism and How it is Revolutionizing our World", 2013)

"[…] the term 'information', as used in information theory, has nothing to do with meaning. It is a measure of the order, or nonrandomness, of a signal; and the main concern of information theory is the problem of how to get a message, coded as a signal, through a noisy channel." (Fritjof Capra, "The Systems View of Life: A Unifying Vision", 2014)

"Information theory leads to the quantification of the information content of the source, as denoted by entropy, the characterization of the information-bearing capacity of the communication channel, as related to its noise characteristics, and consequently the establishment of the relationship between the information content of the source and the capacity of the channel. In short, information theory provides a quantitative measure of the information contained in message signals and help determine the capacity of a communication system to transfer this information from source to sink over a noisy channel in a reliable fashion." (Ali Grami, "Information Theory", 2016)

"In information theory this notion, introduced by Claude Shannon, is used to express unpredictability of information content. For instance, if a data set containing n items was divided into k groups each comprising n i items, then the entropy of such a partition is H = p 1 log( p 1 ) + … + p k log( p k ), where p i = n i / n . In case of two alternative partitions, the mutual information is a measure of the mutual dependence between these partitions." (Slawomir T Wierzchon, "Ensemble Clustering Data Mining and Databases", 2018) 
