
Learning long-term dependencies with gradient descent is difficult

Y. Bengio, P. Simard, P. Frasconi
  • Published: 01 Mar 1994
  • Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Abstract
Recurrent neural networks can be used to map input sequences to output sequences, such as for recognition, production or prediction problems. However, practical difficulties have been reported in training recurrent neural networks to perform tasks in which the temporal contingencies present in the input/output sequences span long intervals. We show why gradient based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases. These results expose a trade-off between efficient learning by gradient descent and latching of information for long periods. Based on an understanding of this problem, alternatives to standard gradient descent are considered.
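The abstract's central claim is that error gradients propagated back through many time steps shrink or grow exponentially with the number of steps, so long-range credit assignment becomes vanishingly weak. Below is a minimal sketch, not code from the paper: the hidden size, weight scaling, and sequence length are arbitrary choices of ours. It tracks the spectral norm of the Jacobian dh_T/dh_0 in a vanilla tanh RNN with contractive recurrent weights.

```python
import numpy as np

# Minimal sketch (hypothetical setup, not from the paper): measure how
# the norm of dh_T/dh_0 behaves in a vanilla tanh RNN as T grows.
rng = np.random.default_rng(0)
n = 50                                               # hidden units (arbitrary)
W = 0.9 * rng.standard_normal((n, n)) / np.sqrt(n)   # contractive recurrent weights

h = rng.standard_normal(n)   # initial hidden state h_0
J = np.eye(n)                # accumulated Jacobian dh_t/dh_0
for t in range(1, 101):
    h = np.tanh(W @ h)
    # One-step Jacobian of h_t w.r.t. h_{t-1} is diag(1 - tanh^2) @ W;
    # chain it onto the running product.
    J = np.diag(1.0 - h**2) @ W @ J
    if t % 20 == 0:
        print(f"T={t:3d}  ||dh_T/dh_0||_2 = {np.linalg.norm(J, 2):.3e}")
```

With these contractive weights the norm falls by orders of magnitude well before T = 100, which is the difficulty the abstract describes; rescaling W upward instead makes the same product explode rather than vanish.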
Subjects
free text keywords: Artificial intelligence, Recurrent neural network, Gradient method, Vanishing gradient problem, Gradient descent, Computer science, Machine learning, Pattern recognition, Dynamical systems theory, Artificial neural network, Numerical analysis