
In this paper, least trimmed squares (LTS) based CPBUM neural networks are proposed to mitigate the outlier and noise problems of conventional neural networks. In real applications, the obtained training data may contain outliers and noise. Although CPBUM neural networks converge quickly, the model has difficulty handling training data sets that contain outliers and noise; hence, the robustness of the CPBUM neural networks must be enhanced. This paper therefore proposes an LTS computational architecture for the CPBUM neural networks: the LTS approach trims samples with large noise or outliers from the training data set. After the LTS step, a gradient-descent type of learning algorithm is used to adjust the weights of the CPBUM neural networks. It turns out that the LTS based CPBUM neural networks converge faster and are more robust against outliers and noise than conventional neural networks with robust mechanisms. Simulation results are provided to show the validity and applicability of the proposed neural networks.
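The core of the LTS step described above is to rank the training samples by squared residual and keep only the fraction with the smallest residuals before the gradient-descent phase. The following is a minimal sketch of that trimming idea, not the paper's actual algorithm; the function name `lts_trim`, the `keep_ratio` parameter, and the ordinary-least-squares initial fit used to obtain residuals are illustrative assumptions.

```python
import numpy as np

def lts_trim(X, y, predict, keep_ratio=0.8):
    """Hypothetical LTS-style trimming step: keep the samples with the
    smallest squared residuals under `predict`, discarding likely outliers."""
    residuals = (y - predict(X)) ** 2
    h = int(np.ceil(keep_ratio * len(y)))   # number of samples to retain
    keep = np.argsort(residuals)[:h]        # indices of smallest residuals
    return X[keep], y[keep]

# Toy demonstration: linear data with two gross outliers injected.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 20).reshape(-1, 1)
y = 3.0 * X.ravel() + rng.normal(0.0, 0.01, 20)
y[3] += 10.0    # outlier
y[15] -= 8.0    # outlier

# A crude initial ordinary-least-squares fit supplies residuals to rank.
w = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)[0]
predict = lambda X_: np.c_[X_, np.ones(len(X_))] @ w

X_t, y_t = lts_trim(X, y, predict, keep_ratio=0.8)
print(len(y_t))  # 16 of the 20 samples survive; the two outliers are trimmed
```

After trimming, any gradient-descent learning rule can be run on `(X_t, y_t)` as in the paper's two-stage scheme; the choice of `keep_ratio` trades robustness against the amount of clean data discarded.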
