28 July 2022

On Simultaneity V: Machines

"The view that machines cannot give rise to surprises is due, I believe, to a fallacy to which philosophers and mathematicians are particularly subject. This is the assumption that as soon as a fact is presented to a mind all consequences of that fact spring into the mind simultaneously with it. It is a very useful assumption under many circumstances, but one too easily forgets that it is false. A natural consequence of doing so is that one then assumes that there is no virtue in the mere working out of consequences from data and general principles. (Alan M Turing, "Computing Machinery and Intelligence", Mind Vol. 59, 1950)

"Instead of having a single control unit sequencing the operations of the machine in series (except for certain subsidiary operations as certain input and output functions) as is now done, the idea is to decentralize control with several different control units capable of directing various simultaneous operations and interrelating them when appropriate." (John F Nash, "Parallel Control", 1954)

"At the other far extreme, we find many systems ordered as a patchwork of parallel operations, very much as in the neural network of a brain or in a colony of ants. Action in these systems proceeds in a messy cascade of interdependent events. Instead of the discrete ticks of cause and effect that run a clock, a thousand clock springs try to simultaneously run a parallel system. Since there is no chain of command, the particular action of any single spring diffuses into the whole, making it easier for the sum of the whole to overwhelm the parts of the whole. What emerges from the collective is not a series of critical individual actions but a multitude of simultaneous actions whose collective pattern is far more important. This is the swarm model." (Kevin Kelly, "Out of Control: The New Biology of Machines, Social Systems and the Economic World", 1995)

"The acquisition of information is a flow from noise to order - a process converting entropy to redundancy. During this process, the amount of information decreases but is compensated by constant re- coding. In the recoding the amount of information per unit increases by means of a new symbol which represents the total amount of the old. The maturing thus implies information condensation. Simultaneously, the redundance decreases, which render the information more difficult to interpret." (Lars Skyttner, "General Systems Theory: Ideas and Applications", 2001)

"Machines can pool their resources, intelligence, and memories. Two machines - or one million machines - can join together to become one and then become separate again. Multiple machines can do both at the same time: become one and separate simultaneously. Humans call this falling in love, but our biological ability to do this is fleeting and unreliable." (Ray Kurzweil, "The Singularity is Near", 2005)

"When a machine manages to be simultaneously meaningful and surprising in the same rich way, it too compels a mentalistic interpretation. Of course, somewhere behind the scenes, there are programmers who, in principle, have a mechanical interpretation. But even for them, that interpretation loses its grip as the working program fills its memory with details too voluminous for them to grasp." (Ray Kurzweil, "The Singularity is Near", 2005)
