
The Cerebellar Model Articulation Controller (CMAC) neural network is an associative memory inspired by the cerebellum. The standard CMAC uses the least mean squares (LMS) algorithm to train its weights. Recently, the recursive least squares (RLS) algorithm was proposed as a superior algorithm for training the CMAC online, as it can converge in a single epoch and does not require tuning of a learning rate. However, the RLS algorithm is computationally demanding: its complexity scales with the square of the number of weights, which can be very large for the CMAC. Here, we present a more efficient RLS algorithm that uses inverse QR decomposition and additionally provides a regularized solution, improving generalization. While the inverse QR decomposition based RLS algorithm reduces computation time significantly, it is still not fast enough for CMACs of more than two dimensions. To further improve efficiency, we show that by using kernel methods the CMAC's computational complexity can be made dependent on the number of unique training samples instead. Additionally, it is shown how modeling error can be reduced through the use of higher order basis functions.
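To make the complexity argument concrete, the following sketch shows a plain RLS update for a linear-in-the-weights model such as the CMAC. This is standard textbook RLS, not the paper's inverse QR variant, and the demo data are hypothetical; the point is that the covariance update costs O(n²) per sample in the number of weights n, which is the bottleneck described above.

```python
import numpy as np

def rls_update(w, P, x, d, lam=1.0):
    """One recursive least squares step (plain RLS, for illustration;
    the paper's variant uses inverse QR decomposition instead).
    w: weight vector, P: inverse-correlation matrix, x: activation
    vector, d: target, lam: forgetting factor."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # Kalman-style gain vector
    e = d - w @ x                    # a priori prediction error
    w = w + k * e                    # weight update
    P = (P - np.outer(k, Px)) / lam  # O(n^2) covariance update
    return w, P

# Hypothetical demo: recover a linear map from noiseless samples.
rng = np.random.default_rng(0)
n = 4
true_w = rng.normal(size=n)
w, P = np.zeros(n), 1e6 * np.eye(n)  # large initial P: weak regularization
for _ in range(50):
    x = rng.normal(size=n)
    w, P = rls_update(w, P, x, true_w @ x)
```

For a CMAC, x would be a sparse binary (or higher order basis) activation vector over the full weight table, so n, and hence the O(n²) per-step cost, grows quickly with input dimension.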
