
Learning of layered neural networks is studied using the methods of statistical mechanics. Networks are trained from examples using the Gibbs algorithm. We focus on the generalization curve, i.e. the average generalization error as a function of the number of examples. We consider perceptron learning with a sigmoid transfer function. Ising perceptrons, with weights constrained to be discrete, exhibit sudden learning at low temperatures within the annealed approximation: there is a first-order transition from a state of poor generalization to a state of perfect generalization. When the transfer function is smooth, the first-order transition occurs only at low temperatures and becomes continuous at high temperatures. When the transfer function is steep, the first-order transition line extends to higher temperatures. The analytic results show good agreement with computer simulations.
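
The setting described above lends itself to a small numerical illustration. The sketch below assumes a teacher-student scenario with binary (Ising) weights and a sign transfer function (the steep limit of the sigmoid case), in which a student perceptron is trained by Metropolis dynamics at a temperature T as one concrete realization of Gibbs-algorithm learning, and the generalization error is then estimated on fresh random inputs. All parameter values (N, P, T, number of sweeps) are illustrative choices, not those of the paper.

```python
# Minimal teacher-student sketch of Gibbs-algorithm learning for an Ising perceptron.
# Hypothetical parameter choices; not the authors' simulation code.
import numpy as np

rng = np.random.default_rng(0)

N = 21            # number of input units (odd, so dot products of +/-1 vectors never vanish)
P = 40            # number of training examples
T = 0.5           # training temperature of the Gibbs/Metropolis dynamics
beta = 1.0 / T
steps = 20000     # Monte Carlo steps

# Teacher: an Ising perceptron with binary weights +/-1.
teacher = rng.choice([-1, 1], size=N)

# Training set: random +/-1 inputs labelled by the teacher through a sign transfer function.
X = rng.choice([-1, 1], size=(P, N))
labels = np.sign(X @ teacher)

def training_error(w):
    """Number of training examples the student misclassifies."""
    return int(np.sum(np.sign(X @ w) != labels))

# Student: another Ising perceptron, trained by sampling weights with Metropolis
# dynamics at temperature T from the Gibbs distribution over the training error.
student = rng.choice([-1, 1], size=N)
E = training_error(student)
for _ in range(steps):
    i = rng.integers(N)
    student[i] *= -1                  # propose a single weight flip
    E_new = training_error(student)
    if E_new <= E or rng.random() < np.exp(-beta * (E_new - E)):
        E = E_new                     # accept the flip
    else:
        student[i] *= -1              # reject: undo the flip

# Generalization error: probability of disagreeing with the teacher on a new input,
# estimated by Monte Carlo over fresh random inputs.
X_test = rng.choice([-1, 1], size=(10000, N))
eps_g = np.mean(np.sign(X_test @ student) != np.sign(X_test @ teacher))
print(f"training errors: {E}/{P},  estimated generalization error: {eps_g:.3f}")
```

Sweeping P (and hence the load alpha = P/N) in this sketch traces out an empirical generalization curve, which is the quantity the analysis above characterizes.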
EXAMPLES, NEURAL NETWORKS, STORAGE CAPACITY
