
doi: 10.1007/bfb0098198
A novel approach to estimating the worst-case generalisation error of the simple perceptron is introduced. It is well known that the generalisation error of the simple perceptron is of the form d/t, where d is an unknown constant that depends only on the input dimension and t is the number of learned examples. Based upon extreme value theory in statistics, we obtain an exact form of the generalisation error of the simple perceptron. The method introduced in this paper opens up new possibilities for analysing the generalisation errors of a class of neural networks.
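As a rough, hypothetical illustration of the d/t scaling mentioned in the abstract (not the paper's worst-case analysis, and not its method), the sketch below trains a simple perceptron in a teacher-student setup with random Gaussian inputs and checks that the average disagreement with the teacher decays roughly like a constant divided by the number of examples t; the dimension, trial count, and training schedule are arbitrary choices for the demo.

```python
# Hypothetical average-case illustration of the d/t decay of the
# perceptron's generalisation error; all parameter values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

def generalisation_error(w_student, w_teacher):
    # For isotropic Gaussian inputs, the probability that two linear-threshold
    # units disagree equals the angle between their weight vectors over pi.
    cos = np.dot(w_student, w_teacher) / (
        np.linalg.norm(w_student) * np.linalg.norm(w_teacher))
    return np.arccos(np.clip(cos, -1.0, 1.0)) / np.pi

def train_perceptron(X, y, epochs=20):
    # Plain perceptron learning rule, cycled over the training set.
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, label in zip(X, y):
            if np.sign(np.dot(w, x)) != label:
                w += label * x
    return w

dim = 20      # assumed input dimension for the demo
trials = 50   # random teacher/training-set pairs averaged per t

for t in (50, 100, 200, 400, 800):
    errs = []
    for _ in range(trials):
        w_teacher = rng.standard_normal(dim)
        X = rng.standard_normal((t, dim))
        y = np.sign(X @ w_teacher)
        w_student = train_perceptron(X, y)
        errs.append(generalisation_error(w_student, w_teacher))
    mean_err = np.mean(errs)
    # If the error scales like d/t, then t * mean_err stays roughly constant.
    print(f"t = {t:4d}   mean error = {mean_err:.4f}   t * error = {t * mean_err:.2f}")
```

Here the product t * error staying roughly flat across the listed values of t is what an error of the form d/t would predict, with the flat level serving as a crude empirical stand-in for the constant d.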
