The basic ideas in neural networks

Citation metadata

Authors: David E. Rumelhart, Bernard Widrow, and Michael A. Lehr
Date: Mar. 1, 1994
From: Communications of the ACM (Vol. 37, Issue 3)
Publisher: Association for Computing Machinery, Inc.
Document Type: Article
Length: 4,239 words
Abstract:

Neural network research has focused on brain-style computation, connectionist architectures, parallel distributed-processing systems, neuromorphic computation, and artificial neural systems. The brain serves as the model of a parallel computational device, a considerable departure from the traditional serial computer design. Researchers have attempted to develop simplified mathematical models of brain-like systems and to study how those models can be used to solve computational problems. Most approaches take the neuron as the basic processing unit; each unit is characterized by an activity level, an output value, a set of input connections, a bias value, and a set of output connections. All of these aspects are represented mathematically by real numbers; in particular, each connection has an associated weight, or synaptic strength, that determines the effect of incoming input on the unit's activity level. Neural network learning and the backpropagation learning procedure are examined.
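The unit described in the abstract is straightforward to state in code. The following is a minimal sketch in Python, not taken from the article: a single sigmoid unit whose activity level is the weighted sum of its inputs plus a bias, together with the gradient-descent weight update that backpropagation reduces to in the single-unit case. The class name, method names, choice of sigmoid, and learning rate are all illustrative assumptions.

    import math

    class Neuron:
        """A single unit as described in the abstract: weighted input
        connections, a bias value, an activity level, and an output value."""

        def __init__(self, weights, bias):
            self.weights = list(weights)  # one weight (synaptic strength) per input connection
            self.bias = bias

        def forward(self, inputs):
            # Activity level: weighted sum of the incoming inputs plus the bias.
            self.inputs = list(inputs)
            self.activity = sum(w * x for w, x in zip(self.weights, self.inputs)) + self.bias
            # Output value: the activity squashed through a sigmoid nonlinearity.
            self.output = 1.0 / (1.0 + math.exp(-self.activity))
            return self.output

        def update(self, target, lr=0.5):
            # Gradient-descent step for squared error on one unit; in the
            # single-unit case this is all backpropagation amounts to.
            delta = (target - self.output) * self.output * (1.0 - self.output)
            self.weights = [w + lr * delta * x for w, x in zip(self.weights, self.inputs)]
            self.bias += lr * delta

    # Usage: train the unit to output 1.0 for a fixed input pattern.
    unit = Neuron(weights=[0.1, -0.2], bias=0.0)
    for _ in range(200):
        unit.forward([1.0, 0.5])
        unit.update(target=1.0)
    print(round(unit.forward([1.0, 0.5]), 3))  # approaches 1.0

In a multilayer network, backpropagation generalizes this update by propagating each unit's delta backward through its output connections to compute deltas for the hidden units.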

Source Citation (MLA 8th Edition)
Rumelhart, David E., et al. "The basic ideas in neural networks." Communications of the ACM, Mar. 1994, p. 87+. Gale Academic OneFile, Accessed 15 Oct. 2019.

Gale Document Number: GALE|A15061351