
We propose a generalization of modern representation learning objectives by reframing them as recursive divergence alignment processes over localized conditional distributions. While recent frameworks such as Information Contrastive Learning (I-Con) unify multiple learning paradigms through KL divergence between fixed neighborhood conditionals, we argue this view underplays a crucial recursive structure inherent in the learning process. We introduce Recursive KL Divergence Optimization (RKDO), a dynamic formalism in which representation learning is framed as the evolution of KL divergences across data neighborhoods. This formulation captures contrastive, clustering, and dimensionality reduction methods as static slices, while offering a new path to model stability and local adaptation. Our experiments demonstrate that RKDO offers dual efficiency advantages: approximately 30% lower loss values than static approaches across three different datasets, and a 60 to 80% reduction in the computational resources needed to achieve comparable results. This suggests that RKDO's recursive updating mechanism provides a fundamentally more efficient optimization landscape for representation learning, with significant implications for resource-constrained applications.
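The contrast between a static KL objective over fixed neighborhood conditionals and a recursively updated target can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the softmax-over-distances conditionals, the mixing weight `alpha`, and the convex-mixture update rule are all assumptions made for the example.

```python
import numpy as np

def neighborhood_conditionals(X, temp=0.5):
    """p(j|i): softmax over negative squared pairwise distances, self excluded."""
    d = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d, np.inf)          # exclude self-pairs
    w = np.exp(-d / temp)
    return w / w.sum(axis=1, keepdims=True)

def mean_kl(p, q, eps=1e-12):
    """Mean row-wise KL(p_i || q_i)."""
    return float((p * (np.log(p + eps) - np.log(q + eps))).sum(axis=1).mean())

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 8))                    # toy input data
Y = X[:, :2] + 0.1 * rng.normal(size=(30, 2))   # toy learned embedding

P = neighborhood_conditionals(X)   # fixed data-space conditionals
Q = neighborhood_conditionals(Y)   # embedding-space conditionals

# Static (I-Con-style) objective: align Q to the fixed target P.
static_loss = mean_kl(P, Q)

# RKDO-style recursive target: the target is not fixed but evolves as a
# convex mixture of the data conditionals and the current learned
# conditionals (hypothetical mixing rule; in training, Q would be
# re-optimized against the new target at every step).
alpha = 0.7
target = alpha * P + (1 - alpha) * Q
recursive_loss = mean_kl(target, Q)
```

Because KL divergence is convex in its first argument, the mixed target is provably no further from `Q` than the fixed target `P` is, so the recursive objective starts from a lower loss value at every step.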
FOS: Computer and information sciences, Computer Science - Machine Learning, KL Divergence, Computer Science - Artificial Intelligence, Computer Science - Information Theory, Computer Vision and Pattern Recognition (cs.CV), Information Theory (cs.IT), Computer Science - Computer Vision and Pattern Recognition, Computer Science - Neural and Evolutionary Computing, Machine Learning (cs.LG), Machine Learning, Artificial Intelligence (cs.AI), Machine Learning/classification, Neural and Evolutionary Computing (cs.NE), Unsupervised Machine Learning
| Indicator | Description | Value |
| --- | --- | --- |
| Citations | An alternative to the "Influence" indicator; also reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | 0 |
| Popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Average |
| Influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Average |
| Impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Average |
