
arXiv: 2311.08549
This paper aims to build the theoretical foundations for manifold learning algorithms in the space of absolutely continuous probability measures $\mathcal{P}_{\mathrm{a.c.}}(Ω)$, with $Ω$ a compact and convex subset of $\mathbb{R}^d$, metrized with the Wasserstein-2 distance $\mathbb{W}$. We begin by introducing a construction of submanifolds $Λ$ of $\mathcal{P}_{\mathrm{a.c.}}(Ω)$ equipped with the metric $\mathbb{W}_Λ$, the geodesic restriction of $\mathbb{W}$ to $Λ$. In contrast to other constructions, these submanifolds are not necessarily flat, but still allow for local linearizations in a similar fashion to Riemannian submanifolds of $\mathbb{R}^d$. We then show how the latent manifold structure of $(Λ,\mathbb{W}_Λ)$ can be learned from samples $\{λ_i\}_{i=1}^N$ of $Λ$ and pairwise extrinsic Wasserstein distances $\mathbb{W}$ on $\mathcal{P}_{\mathrm{a.c.}}(Ω)$ only. In particular, we show that the metric space $(Λ,\mathbb{W}_Λ)$ can be asymptotically recovered, in the sense of Gromov--Wasserstein, from a graph with nodes $\{λ_i\}_{i=1}^N$ and edge weights $\mathbb{W}(λ_i,λ_j)$. In addition, we demonstrate how the tangent space at a sample $λ$ can be asymptotically recovered via spectral analysis of a suitable ``covariance operator'' using optimal transport maps from $λ$ to sufficiently close and diverse samples $\{λ_i\}_{i=1}^N$. The paper closes with some explicit constructions of submanifolds $Λ$ and numerical examples on the recovery of tangent spaces through spectral analysis.
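The tangent-space recovery described above can be illustrated with a minimal numerical sketch. This is not the paper's algorithm, only a toy instance under simplifying assumptions: measures are equally weighted empirical measures on the line, where the optimal transport map between them is the monotone (sorted) matching, and the submanifold is a one-parameter family of translations, so the recovered tangent space should be one-dimensional. Tangent vectors $v_i = T_i - \mathrm{id}$ are assembled into an empirical covariance operator on $L^2(λ)$, whose dominant eigenvalues reveal the tangent dimension.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200  # support size of the discretized base measure lambda
N = 20   # number of nearby samples lambda_i on the toy "submanifold"

# Base measure lambda: empirical measure of n equally weighted points.
base = np.sort(rng.normal(size=n))

# A 1-parameter family of translations lambda_i = (id + s_i)_# lambda,
# so the tangent space at lambda is spanned by the constant map x -> 1.
shifts = rng.uniform(-0.1, 0.1, size=N)
samples = [np.sort(base + s) for s in shifts]

# In 1D the optimal transport map between equally weighted empirical
# measures is the monotone (sorted) matching, so the tangent vector
# v_i = T_i - id is the difference of sorted supports.
V = np.stack([s - base for s in samples])  # shape (N, n)

# Empirical covariance operator C = (1/N) sum_i v_i (x) v_i on L^2(lambda),
# represented as an n x n matrix with uniform weights 1/n on the support.
C = V.T @ V / (N * n)

# Spectral analysis: one dominant eigenvalue indicates tangent dimension 1.
eigvals = np.linalg.eigvalsh(C)[::-1]
print(eigvals[:3])
```

Replacing the translations with a richer family (e.g. combined shifts and dilations) makes the number of dominant eigenvalues grow accordingly, mirroring the dimension of the tangent space being recovered.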
Mathematics - Differential Geometry, FOS: Computer and information sciences, Computer Science - Machine Learning, Optimal transportation, Gromov-Wasserstein convergence, Machine Learning (stat.ML), Machine Learning (cs.LG), Abstract approximation theory (approximation in normed linear spaces and other abstract spaces), optimal transport, Differential Geometry (math.DG), tangent space recovery, Statistics - Machine Learning, manifold learning, Wasserstein spaces, FOS: Mathematics, Riemannian, Finsler and other geometric structures on infinite-dimensional manifolds, Geometric probability and stochastic geometry, 49Q22, 41A65, 58B20, 53Z50
