Work on artificial neural networks, commonly referred to as neural networks, has been motivated right from its inception by the recognition that the brain computes in an entirely different way from the conventional digital computer. The struggle to understand the brain owes much to the pioneering work of Ramón y Cajal, who introduced the idea of neurons as structural constituents of the brain. Typically, neurons are five to six orders of magnitude slower than silicon logic gates; events in a silicon chip happen in the nanosecond range, whereas neural events happen in the millisecond range.
The brain is a highly complex, nonlinear, and parallel computer (information-processing system). It has the capability of organizing neurons so as to perform certain computations (e.g., pattern recognition, perception, and motor control) many times faster than the fastest digital computer in existence today. Consider, for example, human vision, which is an information-processing task. It is the function of the visual system to provide a representation of the environment around us and, more important, to supply the information we need to interact with the environment. To be specific, the brain routinely accomplishes perceptual recognition tasks in something of the order of 100-200 ms, whereas tasks of much lesser complexity can take days on a huge conventional computer.
A neural network is a massively parallel distributed processor that has a natural propensity for storing experiential knowledge and making it available for use. It resembles the brain in two respects:
1. Knowledge is acquired by the network through a learning process.
2. Interneuron connection strengths known as synaptic weights are used to store the knowledge.
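These two properties can be made concrete with a minimal single-neuron sketch. The example below uses the classic perceptron learning rule, with an AND-gate task chosen purely for illustration; the function names and parameter values are assumptions for this sketch, not something prescribed by the text.

```python
# Minimal sketch: a single neuron whose "synaptic weights" store the
# knowledge it acquires through a learning process (perceptron rule).

def train_perceptron(samples, epochs=20, lr=0.1):
    """Learn weights and a bias from (inputs, target) pairs."""
    n = len(samples[0][0])
    w = [0.0] * n          # synaptic weights: where knowledge is stored
    b = 0.0                # bias term
    for _ in range(epochs):
        for x, target in samples:
            # weighted sum of inputs followed by a threshold activation
            y = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = target - y
            # adjust connection strengths in proportion to the error
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Usage: acquire the logical AND function from examples.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w, b = train_perceptron(data)
```

After training, the learned behavior resides entirely in `w` and `b`, mirroring the point that interneuron connection strengths are the network's memory.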
Neural networks are also referred to in the literature as neurocomputers, connectionist networks, parallel distributed processors, etc. Throughout the book we use the term neural networks; occasionally, the term “neurocomputer” or “connectionist network” is used.