05 December 2020

On Information Theory I

"[…] information theory is characterised essentially by its dealing always with a set of possibilities; both its primary data and its final statements are almost always about the set as such, and not about some individual element in the set." (W Ross Ashby, "An Introduction to Cybernetics", 1956)

"Just as in applied statistics the crux of a problem is often the devising of some method of sampling that avoids bias, our problem is that of finding a probability assignment which avoids bias, while agreeing with whatever information is given. The great advance provided by information theory lies in the discovery that there is a unique, unambiguous criterion for the 'amount of uncertainty' represented by a discrete probability distribution, which agrees with our intuitive notions that a broad distribution represents more uncertainty than does a sharply peaked one, and satisfies all other conditions which make it reasonable." (Edwin T Jaynes, "Information Theory and Statistical Mechanics I", 1956)
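The unique criterion Jaynes refers to is the Shannon entropy, H = -Σ pᵢ log pᵢ. A minimal sketch (plain Python; the function name and example distributions are my own, for illustration only) showing that a broad distribution carries more uncertainty than a sharply peaked one:

```python
import math

def entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p) of a discrete distribution, in bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

broad = [0.25, 0.25, 0.25, 0.25]   # uniform over 4 outcomes: maximal uncertainty
peaked = [0.97, 0.01, 0.01, 0.01]  # sharply peaked: little uncertainty

print(entropy(broad))   # 2.0 bits, the maximum log2(4) for 4 outcomes
print(entropy(peaked))  # roughly 0.24 bits, far less uncertainty
```

The uniform distribution attains the maximum log₂(n) for n outcomes; any peaking of the probability mass lowers H, matching the intuition the quote describes.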

"Cybernetics is concerned primarily with the construction of theories and models in science, without making a hard and fast distinction between the physical and the biological sciences. The theories and models occur both in symbols and in hardware, and by 'hardware' we shall mean a machine or computer built in terms of physical or chemical, or indeed any handleable parts. Most usually we shall think of hardware as meaning electronic parts such as valves and relays. Cybernetics insists, also, on a further and rather special condition that distinguishes it from ordinary scientific theorizing: it demands a certain standard of effectiveness. In this respect it has acquired some of the same motive power that has driven research on modern logic, and this is especially true in the construction and application of artificial languages and the use of operational definitions. Always the search is for precision and effectiveness, and we must now discuss the question of effectiveness in some detail. It should be noted that when we talk in these terms we are giving pride of place to the theory of automata at the expense, at least to some extent, of feedback and information theory." (Frank H George, "The Brain As A Computer", 1962)

"Now we are looking for another basic outlook on the world - the world as organization. Such a conception - if it can be substantiated - would indeed change the basic categories upon which scientific thought rests, and profoundly influence practical attitudes. This trend is marked by the emergence of a bundle of new disciplines such as cybernetics, information theory, general system theory, theories of games, of decisions, of queuing and others; in practical applications, systems analysis, systems engineering, operations research, etc. They are different in basic assumptions, mathematical techniques and aims, and they are often unsatisfactory and sometimes contradictory. They agree, however, in being concerned, in one way or another, with 'systems', 'wholes' or 'organizations'; and in their totality, they herald a new approach." (Ludwig von Bertalanffy, "General System Theory", 1968)

"The general notion in communication theory is that of information. In many cases, the flow of information corresponds to a flow of energy, e.g. if light waves emitted by some objects reach the eye or a photoelectric cell, elicit some reaction of the organism or some machinery, and thus convey information." (Ludwig von Bertalanffy, "General System Theory", 1968)

"My analysis of living systems uses concepts of thermodynamics, information theory, cybernetics, and systems engineering, as well as the classical concepts appropriate to each level. The purpose is to produce a description of living structure and process in terms of input and output, flows through systems, steady states, and feedbacks, which will clarify and unify the facts of life." (James G Miller, "Living Systems: Basic Concepts", 1969)

"The 'flow of information' through human communication channels is enormous. So far no theory exists, to our knowledge, which attributes any sort of unambiguous measure to this 'flow'." (Anatol Rapoport, "Modern Systems Research for the Behavioral Scientist", 1969)

"Probability plays a central role in many fields, from quantum mechanics to information theory, and even older fields use probability now that the presence of 'noise' is officially admitted. The newer aspects of many fields start with the admission of uncertainty." (Richard Hamming, "Methods of Mathematics Applied to Calculus, Probability, and Statistics", 1985)

"The field of 'information theory' began by using the old hardware paradigm of transportation of data from point to point." (Marshall McLuhan & Eric McLuhan, "Laws of Media: The New Science", 1988)

"Without an understanding of causality there can be no theory of communication. What passes as information theory today is not communication at all, but merely transportation." (Marshall McLuhan & Eric McLuhan, "Laws of Media: The New Science", 1988)

"The cybernetics phase of cognitive science produced an amazing array of concrete results, in addition to its long-term (often underground) influence: the use of mathematical logic to understand the operation of the nervous system; the invention of information processing machines (as digital computers), thus laying the basis for artificial intelligence; the establishment of the metadiscipline of system theory, which has had an imprint in many branches of science, such as engineering (systems analysis, control theory), biology (regulatory physiology, ecology), social sciences (family therapy, structural anthropology, management, urban studies), and economics (game theory); information theory as a statistical theory of signal and communication channels; the first examples of self-organizing systems. This list is impressive: we tend to consider many of these notions and tools an integrative part of our life […]" (Francisco J Varela, "The Embodied Mind", 1991)

"[...] the mean information of a message is defined as the amount of chance (or randomness) present in a set of possible messages. To see that this is a natural definition, note that by choosing a message, one destroys the randomness present in the variety of possible messages. Information theory is thus concerned, as is statistical mechanics, with measuring amounts of randomness. The two theories are therefore closely related." (David Ruelle, "Chance and Chaos", 1991)
