
The Effects of Adding Noise During Backpropagation Training on a Generalization Performance

Guozhong An
  • Published: 01 Apr 1996 · Journal: Neural Computation, volume 8, pages 643–674 (ISSN: 0899-7667, eISSN: 1530-888X)
  • Publisher: MIT Press - Journals
Abstract
We study the effects of adding noise to the inputs, outputs, weight connections, and weight changes of multilayer feedforward neural networks during backpropagation training. We rigorously derive and analyze the objective functions that are minimized by the noise-affected training processes. We show that input noise and weight noise encourage the neural-network output to be a smooth function of the input or its weights, respectively. In the weak-noise limit, noise added to the output of the neural networks only changes the objective function by a constant. Hence, it cannot improve generalization. Input noise introduces penalty terms in the objective fun...
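The first scheme the abstract analyzes, adding fresh noise to the inputs on every training pass, can be sketched as follows. This is not code from the paper; it is a minimal NumPy illustration in which the network size, learning rate, and noise level `sigma` are assumed values chosen for the toy problem.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task: fit y = sin(x) on [-3, 3].
X = np.linspace(-3, 3, 200).reshape(-1, 1)
y = np.sin(X)

# One hidden tanh layer (illustrative sizes, not from the paper).
W1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
lr, sigma = 0.05, 0.1  # sigma: std of the Gaussian input noise (assumption)

def mse_on_clean_inputs():
    """Evaluate on the noise-free inputs, as generalization is measured."""
    out = np.tanh(X @ W1 + b1) @ W2 + b2
    return np.mean((out - y) ** 2)

mse0 = mse_on_clean_inputs()

for epoch in range(2000):
    # Input noise: a fresh perturbation of the inputs on every pass.
    Xn = X + rng.normal(0, sigma, X.shape)

    # Forward pass on the noisy inputs.
    H = np.tanh(Xn @ W1 + b1)
    out = H @ W2 + b2
    err = out - y  # gradient of 0.5 * squared error w.r.t. the output

    # Backpropagation through the two layers.
    gW2 = H.T @ err / len(X); gb2 = err.mean(0)
    dH = err @ W2.T * (1 - H ** 2)  # tanh'(z) = 1 - tanh(z)^2
    gW1 = Xn.T @ dH / len(X); gb1 = dH.mean(0)

    # Gradient-descent updates.
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

mse = mse_on_clean_inputs()
```

Averaged over the noise, this procedure effectively minimizes the clean objective plus a smoothness penalty on the network's input-output map, which is the mechanism the paper derives; the weight-noise variant would instead perturb `W1`/`W2` at each pass.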
Subjects
free text keywords: Arts and Humanities (miscellaneous), Cognitive Neuroscience, Artificial neural network, Regression, Algorithm, Feedforward neural network, Smoothness, Backpropagation, Mathematics, Stochastic process, Regularization (mathematics), Artificial intelligence