
This paper presents the machine learning-based ensemble conditional mean filter (ML-EnCMF) -- a filtering method based on the conditional mean filter (CMF) previously introduced in the literature. The updated mean of the CMF matches that of the posterior obtained by applying Bayes' rule to the filter's forecast distribution. Moreover, we show that the CMF's updated covariance coincides with the expected conditional covariance. Implementing the EnCMF requires computing the conditional mean (CM). A likelihood-based estimator is prone to significant errors for small ensemble sizes, causing filter divergence. We develop a systematic methodology for integrating machine learning into the EnCMF based on the CM's orthogonal projection property. First, we use a combination of an artificial neural network (ANN) and a linear function, derived from the ensemble Kalman filter (EnKF), to approximate the CM, enabling the ML-EnCMF to inherit the EnKF's advantages. Second, we apply a suitable variance-reduction technique to reduce statistical errors when estimating the loss function. Lastly, we propose a model-selection procedure for choosing the applied filter element-wise, i.e., either the EnKF or the ML-EnCMF, at each updating step. We demonstrate the performance of the ML-EnCMF using the Lorenz-63 and Lorenz-96 systems and show that the ML-EnCMF outperforms both the EnKF and the likelihood-based EnCMF.
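The update described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; the function names are hypothetical, and the conditional-mean map `cm` is passed in abstractly (in the ML-EnCMF it would be an ANN correction added to the EnKF's linear map, while here the usage example plugs in the purely linear EnKF case for concreteness). Each ensemble member is shifted by the CM evaluated at the observed data minus the CM at that member's predicted observation:

```python
import numpy as np

def enkf_gain(Xf, Yf):
    """Ensemble Kalman gain estimated from the forecast ensemble Xf
    (members in rows) and the predicted observations Yf."""
    n = Xf.shape[0]
    Xc = Xf - Xf.mean(axis=0)          # centered state ensemble
    Yc = Yf - Yf.mean(axis=0)          # centered observation ensemble
    Cxy = Xc.T @ Yc / (n - 1)          # state-observation covariance
    Cyy = Yc.T @ Yc / (n - 1)          # observation covariance
    return Cxy @ np.linalg.inv(Cyy)

def encmf_update(Xf, Yf, y_obs, cm):
    """Conditional-mean-filter analysis step: shift each forecast member
    by cm(y_obs) - cm(y_i), where cm approximates E[X | Y = y]."""
    return Xf + cm(y_obs) - np.array([cm(y) for y in Yf])

# Usage (illustrative): with a purely linear cm built from the EnKF gain,
# the update reduces to a Kalman-like ensemble shift.
rng = np.random.default_rng(0)
Xf = rng.normal(size=(50, 3))
Yf = Xf[:, :2] + 0.1 * rng.normal(size=(50, 2))
K = enkf_gain(Xf, Yf)
Xa = encmf_update(Xf, Yf, np.array([0.5, -0.2]), cm=lambda y: K @ y)
```

With the linear `cm`, `encmf_update` coincides with the stochastic-shift form of the EnKF; the ML-EnCMF would instead use `cm = lambda y: K @ y + ann(y)`, with the ANN trained via the orthogonal projection (mean-square) loss mentioned in the abstract.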
ddc:004, FOS: Computer and information sciences, Computer Science - Machine Learning, Learning and adaptive systems in artificial intelligence, Machine Learning (stat.ML), 62M45, 62M20, 65C20, 86-08, Statistics - Computation, Inference from stochastic processes and prediction, Machine Learning (cs.LG), Statistics - Machine Learning, conditional expectation, FOS: Mathematics, Mathematics - Numerical Analysis, Computation (stat.CO), DATA processing & computer science, weather forecast, deep learning, Monte Carlo methods, Numerical Analysis (math.NA), 620, 004, Filtering in stochastic control theory, Neural nets and related approaches to inference from stochastic processes, nonlinear filter, inverse problem, info:eu-repo/classification/ddc/004
