
doi: 10.1109/72.286908
pmid: 18267804
This paper considers a least-squares approach to function approximation and generalization. The particular problem addressed is one in which the training data are noiseless and the requirement is to define a mapping that approximates the data and that generalizes to situations in which data samples are corrupted by noise in the input variables. The least-squares approach produces a generalizer that has the form of a radial basis function network for a finite number of training samples. The finite sample approximation is valid provided that the perturbations due to noise on the expected operating conditions are large compared to the sample spacing in the data space. In the other extreme of small noise perturbations, a particular parametric form must be assumed for the generalizer. It is shown that better generalization will occur if the error criterion used in training the generalizer is modified by the addition of a specific regularization term. This is illustrated by an approximator that has a feedforward architecture and is applied to the problem of point-source location using the outputs of an array of receivers in the focal-plane of a lens.
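The core idea, fitting a radial basis function network by least squares and adding a regularization term to improve generalization under input noise, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian basis, its width, the weight-decay form of the penalty, and all function names are illustrative assumptions.

```python
import numpy as np

def rbf_design(x, centers, width):
    # Gaussian radial basis functions evaluated at the inputs x
    # (the specific basis and width are illustrative choices)
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def fit_rbf(x, y, centers, width, lam=0.0):
    # Regularized least squares: minimize ||Phi w - y||^2 + lam ||w||^2.
    # lam = 0 recovers the plain least-squares fit to the noiseless data.
    phi = rbf_design(x, centers, width)
    A = phi.T @ phi + lam * np.eye(len(centers))
    return np.linalg.solve(A, phi.T @ y)

# Noiseless training data sampled from a smooth target function
rng = np.random.default_rng(0)
x_train = np.linspace(0.0, 1.0, 20)
y_train = np.sin(2 * np.pi * x_train)

centers = x_train.copy()  # one basis function per training sample
width = 0.05
w_plain = fit_rbf(x_train, y_train, centers, width, lam=0.0)
w_reg = fit_rbf(x_train, y_train, centers, width, lam=1e-3)

# Evaluate both fits when the *inputs* are corrupted by noise,
# the operating condition considered in the paper
x_noisy = x_train + rng.normal(0.0, 0.02, size=x_train.shape)
phi_noisy = rbf_design(x_noisy, centers, width)
err_plain = np.mean((phi_noisy @ w_plain - y_train) ** 2)
err_reg = np.mean((phi_noisy @ w_reg - y_train) ** 2)
```

The penalty used here is simple weight decay; the paper derives a specific regularization term matched to the input-noise statistics, which this sketch does not reproduce.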
| Indicator | Description | Value |
|---|---|---|
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 48 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Top 10% |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Top 10% |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Top 10% |
