
arXiv: 2002.12653
This paper presents mathematical results in support of the methodology of probabilistic learning on manifolds (PLoM), recently introduced by the authors, which has been used successfully for analyzing complex engineering systems. PLoM considers a given initial dataset consisting of a small number of points in a Euclidean space, interpreted as independent realizations of a vector-valued random variable whose non-Gaussian probability measure is unknown but is, \textit{a priori}, concentrated in an unknown subset of the Euclidean space. The objective is to construct a learned dataset consisting of additional realizations that allow the evaluation of converged statistics. A transport of the probability measure estimated with the initial dataset is performed through a linear transformation constructed using a reduced-order diffusion-maps basis. In this paper, it is proven that this transported measure is a marginal distribution of the invariant measure of a reduced-order Itô stochastic differential equation that corresponds to a dissipative Hamiltonian dynamical system. This construction preserves the concentration of the probability measure. This property is shown by analyzing a distance between the random matrix constructed with the PLoM and the matrix representing the initial dataset, as a function of the dimension of the basis. It is further proven that this distance has a minimum for a dimension of the reduced-order diffusion-maps basis that is strictly smaller than the number of points in the initial dataset. Finally, a brief numerical application illustrates the mathematical results.
41 pages, 4 figures
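The reduced-order diffusion-maps basis mentioned in the abstract can be illustrated with a minimal sketch: build a Gaussian kernel on the initial dataset, normalize it into a Markov transition matrix, and keep the leading eigenvectors. The function name `diffusion_maps_basis`, the bandwidth parameter `epsilon`, and the specific normalization are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def diffusion_maps_basis(X, epsilon, m):
    """Sketch of a diffusion-maps basis for a dataset X of shape (N, n).

    Returns the m largest eigenvalues and the corresponding eigenvectors
    of the row-normalized Gaussian-kernel transition matrix.
    """
    # pairwise squared distances between the N data points
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    # Gaussian kernel with bandwidth epsilon (an assumed scaling convention)
    K = np.exp(-d2 / (4.0 * epsilon))
    # row normalization yields a Markov transition matrix P = D^{-1} K
    P = K / K.sum(axis=1, keepdims=True)
    # eigen-decomposition; the eigenvectors form the diffusion-maps basis
    w, V = np.linalg.eig(P)
    order = np.argsort(-w.real)[:m]
    return w.real[order], V.real[:, order]
```

A row-stochastic matrix always has leading eigenvalue 1 with a constant eigenvector; in PLoM-type constructions the dimension m of the retained basis is strictly smaller than the number of points N, consistent with the minimum-distance result stated in the abstract.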
[MATH.MATH-PR] Mathematics [math]/Probability [math.PR], Probabilistic learning, measure concentration, MCMC, data driven, Mathematics - Statistics Theory, Statistics Theory (math.ST), manifolds, [STAT.ML] Statistics [stat]/Machine Learning [stat.ML], [STAT] Statistics [stat], machine learning, dissipative Hamiltonian stochastic dynamics, FOS: Mathematics, unsupervised, 68Q32, 68T05, 62D05, 62G07, 62G09, 62H12, diffusion maps, sampling on manifolds, supervised
| Indicator | Description | Value |
| --- | --- | --- |
| selected citations | Citations derived from selected sources; an alternative to the "influence" indicator. | 25 |
| popularity | Reflects the "current" impact/attention (the "hype") of an article in the research community at large, based on the underlying citation network. | Top 10% |
| influence | Reflects the overall/total impact of an article in the research community at large, based on the underlying citation network (diachronically). | Top 10% |
| impulse | Reflects the initial momentum of an article directly after its publication, based on the underlying citation network. | Top 10% |
