17 December 2020

Lars Skyttner - Collected Quotes

"A mathematical model uses mathematical symbols to describe and explain the represented system. Normally used to predict and control, these models provide a high degree of abstraction but also of precision in their application." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"A symbol is a mental representation regarding the internal reality referring to its object by a convention and produced by the conscious interpretation of a sign. In contrast to signals, symbols may be used every time if the receiver has the corresponding representation. Symbols also relate to feelings and thus give access not only to information but also to the communicator’s motivational and emotional state. The use of symbols makes it possible for the organism using it to evoke in the receiver the same response it evokes in himself. To communicate with symbols is to use a language." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"As a meta-discipline, systems science will transfer its content from discipline to discipline and address problems beyond conventional reductionist boundaries. Generalists, qualified to manage today’s problems better than the specialist, could be fostered. With these intentions, systems thinking and systems science should not replace but add, complement and integrate those aspects that seem not to be adequately treated by traditional science." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Defined from a societal standpoint, information may be seen as an entity which reduces maladjustment between system and environment. In order to survive as a thermodynamic entity, all social systems are dependent upon an information flow. This explanation is derived from the parallel between entropy and information where the latter is regarded as negative entropy (negentropy). In more common terms information is a form of processed data or facts about objects, events or persons, which are meaningful for the receiver, inasmuch as an increase in knowledge reduces uncertainty." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Entropy [...] is the amount of disorder or randomness present in any system. All non-living systems tend toward disorder; left alone they will eventually lose all motion and degenerate into an inert mass. When this permanent stage is reached and no events occur, maximum entropy is attained. A living system can, for a finite time, avert this unalterable process by importing energy from its environment. It is then said to create negentropy, something which is characteristic of all kinds of life." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Expressed in terms of entropy, open systems are negentropic, that is, tend toward a more elaborate structure. As open systems, organisms which are in equilibrium are capable of working for a long time by use of the constant input of matter and energy. Closed systems, however, increase their entropy, tend to run down and can therefore be called ’dying systems’. When reaching a steady state the closed system is not capable of performing any work." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Heisenberg’s principle must be considered a special case of the complementarity principle […]. This states that an experiment on one aspect of a system (of atomic dimensions) destroys the possibility of learning about a complementary aspect of the same system. Together these principles have shocking consequences for the comprehension of entropy and determinism." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"In the definition of meaning, it is assumed that both the source and receiver have previously coded (and stored) signals of the same or similar referents, such that the messages may have meaning and relate to behaviour. That is, the used symbols must have the same signification for both sender and receiver. If not, the receiver will create a different mental picture than intended by the transmitter. Meaning is generated by individuals in a process of social interaction with a more or less common environment. It is a relation subsisting within a field of experience and appears as an emergent property of a symbolic representation when used in culturally accepted interaction. The relation between the symbolic representation and its meaning is random. Of this, however, the mathematical theory has nothing to say. If human links in the chain of communication are missing, of course no questions of meaning will arise." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Information is neither matter nor energy, it is rather an abstract concept of the same kind as entropy, which must be considered a conceptual relative. 'Amount of information' is a metaphorical term and has in fact no numerical properties."  (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Information entropy has its own special interpretation and is defined as the degree of unexpectedness in a message. The more unexpected words or phrases, the higher the entropy. It may be calculated with the regular binary logarithm on the number of existing alternatives in a given repertoire. A repertoire of 16 alternatives therefore gives a maximum entropy of 4 bits. Maximum entropy presupposes that all probabilities are equal and independent of each other. Minimum entropy exists when only one possibility is expected to be chosen. When uncertainty, variety or entropy decreases it is thus reasonable to speak of a corresponding increase in information." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)
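The binary-logarithm calculation in the quote above can be checked with a short sketch (the function name `shannon_entropy` is my own, not Skyttner's):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Maximum entropy: a repertoire of 16 equally probable, independent
# alternatives gives log2(16) = 4 bits, as in the quote.
uniform = [1 / 16] * 16
print(shannon_entropy(uniform))  # 4.0

# Minimum entropy: only one possibility expected to be chosen -> 0 bits.
certain = [1.0]
print(shannon_entropy(certain))  # 0.0
```

Any departure from equal probabilities lowers the entropy below the 4-bit maximum, which is the sense in which reduced uncertainty corresponds to a gain in information.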

"Living systems in general are energy transducers which use information to perform more efficiently, converting one form of energy into another, and converting energy into information. Living species have developed a genius system to overcome entropy by their procreative faculty. […] Storing the surplus energy in order to survive is to reverse the entropic process or to create negentropy. A living being can only resist the degradation of its own structure. The entropic process influencing the structure and environment of the whole system is beyond individual control." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Potential energy is organized energy, heat is disorganized energy and entropy therefore results in dissolution and disorder. The sum of all the quantities of heat lost in the course of all the activities that have taken place in the universe equals the total accumulation of entropy. A popular analogy of entropy is that it is not possible to warm oneself on something which is colder than oneself. […] Note also that maximum entropy is maximum randomization." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Reductionism argues that from scientific theories which explain phenomena on one level, explanations for a higher level can be deduced. Reality and our experience can be reduced to a number of indivisible basic elements. Also qualitative properties are possible to reduce to quantitative ones." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Systems science, with such an ambition and with its basic Systems Theory, provides a general language with which to tie together various areas in interdisciplinary communication. As such it automatically strives towards a universal science, i.e. to join together the many splintered disciplines with a 'law of laws', applicable to them all and integrating all scientific knowledge." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Systems thinking expands the focus of the observer, whereas analytical thinking reduces it. In other words, analysis looks into things, synthesis looks out of them. This attitude of systems thinking is often called expansionism, an alternative to classic reductionism. Whereas analytical thinking concentrates on static and structural properties, systems thinking concentrates on the function and behaviour of whole systems. Analysis gives description and knowledge; systems thinking gives explanation and understanding." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re-coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundancy decreases, which renders the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The function of living matter is apparently to expand the organization of the universe. Here, locally decreased entropy as a result of biological order in existing life is invalidating the effects of the second law of thermodynamics, although at the expense of increased entropy in the whole system. It is the running down of the universe that made the sun and the earth possible. It is the running down of the sun that made life and us possible." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The organizing principle of purpose can generally have two directions: one towards the system itself and one towards the environment. Directed towards the system, the aim is to maintain homeostasis. Directed towards the environment, the aim is often to modify it to resemble a desired state or, if this is not possible, to bypass or override the disturbances." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"The second law of thermodynamics states that all energy in the universe degrades irreversibly. Thus, differences between energy forms must decrease over time. Everything is spread! (The principle of degradation of energy with regard to quality.) Translated to the area of systems the law tells us that the entropy of an isolated system always increases. Another consequence is that when two systems are joined together, the entropy of the united system is greater than the sum of the entropies of the individual systems." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"When defining living systems, the term dynamic equilibrium is essential. It does not imply something which is steady or stable. On the contrary, it is a floating state characterized by invisible movements and preparedness for change. To be in dynamic equilibrium is adapting adjustment to balance. Homeostasis stands for the sum of all control functions creating the state of dynamic equilibrium in a healthy organism. It is the ability of the body to maintain a narrow range of internal conditions in spite of environmental changes." (Lars Skyttner, "General Systems Theory: Problems, Perspective, Practice", 2005)

