
arXiv: 1803.00001
In this article we study Hilbertian metrics and positive definite (pd) kernels on probability measures, which are of real interest in kernel methods. First, building on the Alpha-Beta-divergence, we obtain a Hilbertian metric by proposing an improvement of this divergence: we construct a symmetric version, the Alpha-Beta-Symmetric-divergence (ABS-divergence), study its properties, and propose the kernels associated with this divergence. Second, we carry out numerical studies incorporating all the proposed metrics/kernels into support vector machines (SVM). Finally, we present an algorithm for image classification using our divergence.
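The abstract's pipeline (symmetrize the Alpha-Beta-divergence, exponentiate it into a kernel, feed the Gram matrix to an SVM) can be sketched as below. This is a minimal illustration, not the paper's implementation: it uses the standard Alpha-Beta-divergence of Cichocki et al., symmetrizes it by simple averaging (the paper's ABS construction may differ), and the kernel form `exp(-t * D)`, the bandwidth `t`, and the helper names are assumptions.

```python
import numpy as np

def ab_divergence(p, q, alpha=1.0, beta=1.0):
    """Alpha-Beta divergence (requires alpha, beta, alpha+beta != 0)."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    s = alpha + beta
    return -np.sum(p**alpha * q**beta
                   - alpha / s * p**s
                   - beta / s * q**s) / (alpha * beta)

def abs_divergence(p, q, alpha=1.0, beta=1.0):
    """One plausible symmetrization of the AB-divergence
    (average of the two directed divergences); the paper's
    exact ABS-divergence may be constructed differently."""
    return 0.5 * (ab_divergence(p, q, alpha, beta)
                  + ab_divergence(q, p, alpha, beta))

def abs_gram_matrix(X, Y, alpha=1.0, beta=1.0, t=1.0):
    """Hypothetical kernel K[i, j] = exp(-t * D_ABS(X[i], Y[j])),
    usable as a precomputed kernel in an SVM."""
    return np.exp(-t * np.array([[abs_divergence(x, y, alpha, beta)
                                  for y in Y] for x in X]))
```

A Gram matrix built this way can be passed to `sklearn.svm.SVC(kernel="precomputed")`, training on `abs_gram_matrix(X_train, X_train)` and predicting with `abs_gram_matrix(X_test, X_train)`.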
10 pages, 11 figures
Methodology (stat.ME), FOS: Computer and information sciences, Computer Science - Machine Learning, 60-04, 60-08, 62-04, 62-07, Statistics - Methodology, Machine Learning (cs.LG)
