"Neural computing is the study of cellular networks that have a natural property for storing experimental knowledge. Such systems bear a resemblance to the brain in the sense that knowledge is acquired through training rather than programming and is retained due to changes in node functions. The knowledge takes the form of stable states or cycles of states in the operation of the net. A central property of such nets is to recall these states or cycles in response to the presentation of cues." (Igor Aleksander & Helen Morton, "Neural computing architectures: the design of brain-like machines", 1989)
"A neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects: 1. Knowledge is acquired by the network through a learning process. 2. Interneuron connection strengths known as synaptic weights are used to store the knowledge." (Igor Aleksander & Helen Morton, "An Introduction to Neural Computing", 1990)
"Neural Computing is the study of networks of adaptable nodes which through a process of learning from task examples, store experiential knowledge and make it available for use." (Igor Aleksander & Helen Morton, "An Introduction to Neural Computing", 1990)
"For a machine, the mark of consciousness is the ability (possessed by organisms) to know in some detail where it currently is, to understand where it comes from, and to have its own drives to make decisions. It must therefore have a detailed representation of its current position in its world, some knowledge of its own makeup, and a great deal of knowledge about how it might interact with humans." (Igor Aleksander, "How to Build a Mind: toward machines with imagination", 2001)
"One of the factors that distinguishes engineering from science is that the engineer builds complex systems from simple bits, whereas the scientist breaks complex systems into hopefully comprehensible components. The first is called understanding by synthesis and the second is understanding by analysis." (Igor Aleksander, "How to Build a Mind: toward machines with imagination", 2001)
"People talk far too glibly about 'recognizing' things and then build machines that simply label patterns. There is a vast difference between recognizing patterns by labeling them correctly and knowing the objects that are perceived. Such knowledge is a happy resonance between imagination and perception, possessed neither by WISARD nor by the many neural pattern-recognition machines built over the last fifteen or so years. Something extra is required: yes, inner states are necessary, but they cannot be just any old inner states." (Igor Aleksander, "How to Build a Mind: toward machines with imagination", 2001)
"Yes, learning and adaptation seem to constitute one of the dividing lines between list processing and brains. Another seems to be that the brain is a highly structured piece of engineering in which most of what happens is determined by its specialized structure. The engineering of a computer is such as to be as general as possible to let the programmer write his list-processing programs: so, the hardware of the brain does matter in letting it do what it does. In the brain it creates specific overall aptitudes, but in computers it is carefully made neutral so as to keep them as general as possible." (Igor Aleksander, "How to Build a Mind: toward machines with imagination", 2001)
"Machine consciousness refers to attempts by those who design
and analyse informational machines to apply their methods to various ways of
understanding consciousness and to examine the possible role of consciousness
in informational machines." (Igor Aleksander, "Machine consciousness", Scholarpedia, 3(2), 2008)