
arXiv: 1808.00931
Data-driven discovery of "hidden physics" -- i.e., machine learning of differential equation models underlying observed data -- has recently been approached by embedding the discovery problem into a Gaussian Process regression of spatial data, treating unknown equation parameters as hyperparameters of a modified "physics-informed" Gaussian Process kernel. This kernel includes the parametrized differential operators applied to a prior covariance kernel. We extend this framework to linear space-fractional differential equations. The methodology is compatible with a wide variety of fractional operators in $\mathbb{R}^d$ and stationary covariance kernels, including the Matérn class, and can optimize the Matérn parameter during training. We provide a practical way to compute fractional derivatives of kernels, via a unified set of $d$-dimensional Fourier integral formulas amenable to generalized Gauss-Laguerre quadrature. The implementation of fractional derivatives has several benefits. First, it allows for discovering fractional-order PDEs for systems characterized by heavy tails or anomalous diffusion, bypassing the analytical difficulty of fractional calculus. Data sets exhibiting such features are increasingly prevalent in physical and financial domains. Second, a single fractional-order archetype allows a derivative of arbitrary order to be learned, with the order itself being a parameter in the regression. This is advantageous even when used for discovering integer-order equations; the user is not required to assume a "dictionary" of derivatives of various orders, and directly controls the parsimony of the models being discovered. We illustrate the method on several examples, including fractional-order interpolation of advection-diffusion and modeling relative stock performance in the S&P 500 with alpha-stable motion via a fractional diffusion equation.
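The fractional-derivative machinery described above can be illustrated with a minimal one-dimensional sketch (not the paper's code): applying the fractional Laplacian $(-\mathrm{d}^2/\mathrm{d}x^2)^{\alpha/2}$ to a Gaussian (squared-exponential) kernel by way of its Fourier multiplier $|\xi|^{\alpha}$, with the resulting half-line integral mapped onto generalized Gauss-Laguerre quadrature via the substitution $t = \xi^2/2$. The function name and node count are illustrative choices, not from the paper.

```python
import numpy as np
from scipy.special import roots_genlaguerre

def frac_deriv_gaussian_kernel(x, alpha, n=60):
    """Apply (-d^2/dx^2)^(alpha/2) to k(x) = exp(-x^2/2) by quadrature.

    k has Fourier transform k_hat(xi) = sqrt(2*pi) * exp(-xi^2/2), so the
    fractional Laplacian is the inverse transform of |xi|^alpha * k_hat(xi),
    which by evenness reduces to a cosine integral over [0, inf):
        (sqrt(2*pi)/pi) * int_0^inf xi^alpha e^{-xi^2/2} cos(xi*x) dxi.
    Substituting t = xi^2/2 yields a generalized Gauss-Laguerre integral
    with weight t^((alpha-1)/2) e^{-t}, evaluated exactly by the rule below
    up to the usual polynomial-degree accuracy.
    """
    beta = (alpha - 1.0) / 2.0          # Gauss-Laguerre weight exponent (> -1)
    t, w = roots_genlaguerre(n, beta)   # nodes/weights for t^beta e^{-t} dt
    xi = np.sqrt(2.0 * t)               # map quadrature nodes back to xi
    x = np.atleast_1d(np.asarray(x, dtype=float))
    integral = 2.0 ** beta * np.sum(w * np.cos(np.outer(x, xi)), axis=1)
    return np.sqrt(2.0 * np.pi) / np.pi * integral
```

As a sanity check, $\alpha = 2$ reproduces the ordinary negative second derivative $(1 - x^2)e^{-x^2/2}$, and $\alpha = 0$ recovers the kernel itself; non-integer $\alpha$ interpolates between these without any symbolic fractional calculus, which is what lets the order enter the regression as a free parameter.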
26 pages, 10 figures. In v2, a minor change to the formatting of a handful of references was made in the bibliography; the main text was unchanged. In v3, minor improvements were made to the exposition; more details about motivation, examples, optimization, and relation to previous works were given
FOS: Computer and information sciences, matérn kernel, Computer Science - Machine Learning, 35R11, 65N21, 62M10, 62F15, 60G15, 60G52, Numerical methods for inverse problems for boundary value problems involving PDEs, Bayesian inference, fractional diffusion, Gaussian processes, Machine Learning (stat.ML), Fractional partial differential equations, stable process, Machine Learning (cs.LG), Time series, auto-correlation, regression, etc. in statistics (GARCH), Stable stochastic processes, anomalous diffusion, Statistics - Machine Learning
selected citations: 29 | popularity: Top 10% | influence: Top 10% | impulse: Top 10%
