
handle: 10037/16437, 10871/39989
Autoencoders learn data representations (codes) in such a way that the input is reproduced at the output of the network. However, it is not always clear which properties of the input data should be captured by the codes. Kernel machines have experienced great success by operating via inner products in a theoretically well-defined reproducing kernel Hilbert space, hence capturing topological properties of the input data. In this paper, we enhance the autoencoder's ability to learn effective data representations by aligning the inner products between codes with those defined by a kernel matrix. By doing so, the proposed kernelized autoencoder learns similarity-preserving embeddings of the input data, where the notion of similarity is explicitly controlled by the user and encoded in a positive semi-definite kernel matrix. Experiments evaluate both reconstruction and kernel-alignment performance on classification tasks and on the visualization of high-dimensional data. Additionally, we show that our method can emulate kernel principal component analysis on a denoising task, obtaining competitive results at a much lower computational cost.
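The core idea described in the abstract is to add a code-space alignment term to the usual reconstruction objective, so that inner products between codes match a user-supplied positive semi-definite kernel matrix. The sketch below illustrates one way such an objective could be assembled; the encoder/decoder architecture, the RBF kernel prior, the normalized Frobenius alignment, and the weighting parameter `lam` are illustrative assumptions, not the paper's exact formulation.

```python
# Hypothetical sketch of the kernel-alignment idea: an autoencoder trained with a
# reconstruction loss plus a term aligning the Gram matrix of the codes with a
# user-supplied kernel matrix K. Architecture and loss weighting are assumptions.
import torch
import torch.nn as nn

class KernelizedAE(nn.Module):
    def __init__(self, d_in, d_code):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(d_in, 64), nn.ReLU(), nn.Linear(64, d_code))
        self.dec = nn.Sequential(nn.Linear(d_code, 64), nn.ReLU(), nn.Linear(64, d_in))

    def forward(self, x):
        c = self.enc(x)          # codes
        return c, self.dec(c)    # codes and reconstruction

def alignment_loss(codes, K_prior):
    """Frobenius distance between the (normalized) code Gram matrix and K_prior."""
    C = codes @ codes.t()
    C = C / torch.norm(C)
    K = K_prior / torch.norm(K_prior)
    return torch.norm(C - K) ** 2

def training_step(model, x, K_prior, lam=0.5):
    codes, recon = model(x)
    return (1 - lam) * nn.functional.mse_loss(recon, x) + lam * alignment_loss(codes, K_prior)

# Example: a mini-batch of 32 samples with an RBF kernel prior computed on the inputs.
x = torch.randn(32, 20)
K = torch.exp(-torch.cdist(x, x) ** 2 / (2 * 1.0 ** 2))
model = KernelizedAE(d_in=20, d_code=8)
loss = training_step(model, x, K)
loss.backward()
```

In practice the prior kernel would encode the user's chosen notion of similarity (e.g., a label-informed or problem-specific kernel) rather than the input RBF kernel used here for illustration.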
This work extends the preliminary conference version of the paper (arXiv:1702.02526). Published in Applied Soft Computing, Elsevier, 2018.
Subjects: FOS: Computer and information sciences; VDP::Technology: 500::Information and communication technology: 550::Computer technology: 551; Computer Science - Machine Learning (cs.LG); Computer Science - Neural and Evolutionary Computing (cs.NE); Statistics - Machine Learning (stat.ML); Kernel methods; Deep learning; Autoencoders; Representation learning; 004; 620
| Indicator | Description | Value |
| --- | --- | --- |
| Selected citations | Citations derived from selected sources; an alternative to the "Influence" indicator, which reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 14 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Top 10% |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Top 10% |
