
doi: 10.1007/bf00452996
pmid: 7824685
One biological principle that is often overlooked in the design of artificial neural networks (ANNs) is redundancy. Redundancy is the replication of processes within the brain. This paper examines the effects of redundancy on learning in ANNs given either a function-approximation task or a pattern-classification task. The function-approximation task simulated a robotic arm reaching toward an object in two-dimensional space, and the pattern-classification task was detecting parity. Results indicated that redundant ANNs learned the pattern-classification problem much faster and converged on a solution 100% of the time, whereas standard ANNs sometimes failed to learn the problem. Furthermore, when overall network error is considered, redundant ANNs were significantly more accurate than standard ANNs in performing the function-approximation task. These results are discussed in terms of the relevance of redundancy to the performance of ANNs in general, and to redundancy in biological systems in particular.
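The paper's exact architecture is not given in the abstract, but the core intuition behind redundancy can be illustrated with a minimal sketch: an unreliable unit computing parity, versus a redundant system that takes a majority vote over several independent replicas of that unit. All names, the failure probability, and the replica count below are assumptions for illustration, not the authors' method.

```python
import random

def noisy_parity(bits, p_fail, rng):
    """One unreliable unit: returns the parity of bits,
    flipped with probability p_fail (assumed failure model)."""
    out = sum(bits) % 2
    if rng.random() < p_fail:
        out ^= 1
    return out

def redundant_parity(bits, p_fail, replicas, rng):
    """Redundant system: majority vote over independent replicated units."""
    votes = [noisy_parity(bits, p_fail, rng) for _ in range(replicas)]
    return int(sum(votes) > replicas / 2)

def error_rate(predict, trials=10000, n_bits=4, p_fail=0.2, seed=0):
    """Empirical error rate of a parity predictor on random inputs."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        bits = [rng.randint(0, 1) for _ in range(n_bits)]
        if predict(bits, p_fail, rng) != sum(bits) % 2:
            errors += 1
    return errors / trials

single = error_rate(noisy_parity)
fivefold = error_rate(lambda b, p, r: redundant_parity(b, p, 5, r))
print(f"single unit error: {single:.3f}, 5-replica majority error: {fivefold:.3f}")
```

With a 20% per-unit failure rate, the five-replica majority fails only when three or more replicas fail at once, so its error rate drops well below that of a single unit. The same logic underlies why replicated subnetworks can converge more reliably than a single standard network.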
Orientation, Mental Recall, Brain, Humans, Neural Networks, Computer, Robotics, Algorithms, Problem Solving, Psychomotor Performance
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources. | 10 |
| Popularity | Reflects the "current" impact/attention (the "hype") of the article in the research community, based on the underlying citation network. | Top 10% |
| Influence | Reflects the overall/total impact of the article in the research community, based on the underlying citation network (diachronically). | Top 10% |
| Impulse | Reflects the initial momentum of the article directly after its publication, based on the underlying citation network. | Average |
