
Applied Mathematics Colloquium

Dr. Kody Law, Oak Ridge National Laboratory

Location

Mathematics/Psychology : 104

Date & Time

November 4, 2016, 2:00 pm – 3:00 pm

Description

Title: Data assimilation using multilevel Monte Carlo


Abstract: 
For half a century, computational scientists have been numerically simulating complex systems. Uncertainty has recently become a requisite consideration in complex applications that have classically been treated deterministically, and this has led to increasing interest in uncertainty quantification (UQ). Another recent trend is the explosion of available data. Bayesian inference provides a principled and well-defined approach to the integration of data into an a priori known distribution. The posterior distribution, however, is known only point-wise (possibly with an intractable likelihood) and only up to a normalizing constant. Inference becomes prohibitively expensive when the dimension of the state and/or parameters is high and/or the model is expensive, for example when it arises as the approximation of a partial differential equation (PDE). If the state evolves in time and observations arrive online, it becomes even more difficult to perform inference iteratively; this is known as filtering. Data assimilation [6] traditionally refers to the latter context; however, the static smoothing problem arises in so-called re-analysis, where one more carefully analyses a previously collected, fixed data set. In lieu of adhering to the full Bayesian probabilistic formulation, one often resorts in practice to more tractable tasks, such as tracking the underlying signal (and ideally also some measure of spread). Data assimilation will be introduced, and some connections will be drawn between these perspectives. The rest of the talk will focus on the development of new algorithms for probing the posterior distribution, or an approximation thereof.
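
As a point of reference (this notation is my own and is not taken from the abstract), the Bayesian setup alluded to above, with the posterior known only up to a normalizing constant, and the recursive update used in the filtering setting, can be sketched as:

% Static Bayesian inverse problem: prior pi_0, likelihood L, posterior pi
% known point-wise only up to the normalizing constant Z.
\[
  \pi(u \mid y) \;=\; \frac{L(y \mid u)\,\pi_0(u)}{Z},
  \qquad
  Z \;=\; \int L(y \mid u)\,\pi_0(u)\,\mathrm{d}u .
\]
% Filtering: the state evolves as u_n ~ f( . | u_{n-1}) and observations
% y_n ~ g( . | u_n) arrive online; the posterior is updated recursively by
% a prediction step followed by a Bayesian correction step.
\[
  \pi_n(u_n \mid y_{1:n}) \;\propto\;
  g(y_n \mid u_n) \int f(u_n \mid u_{n-1})\,
  \pi_{n-1}(u_{n-1} \mid y_{1:n-1})\,\mathrm{d}u_{n-1} .
\]

The normalizing constant Z (and its filtering analogue) is exactly the quantity that is intractable in general, which is why the sampling methods discussed below operate on the unnormalized density.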


A variety of Monte Carlo methods have been designed to sample posterior distributions: Markov chain Monte Carlo (MCMC) and sequential Monte Carlo (SMC) samplers for offline inference, and particle filters and ensemble Kalman filters (EnKF) for the online case. Recently, the multilevel Monte Carlo (MLMC) framework has been extended to some of these cases, so that approximation error can be optimally balanced against statistical sampling error, and ultimately the Bayesian inverse problem can be solved at the same asymptotic cost as solving the deterministic forward problem. This talk will concern the recent development of multilevel SMC (MLSMC) samplers [1] for offline inference, and the resulting estimators for standard quantities of interest as well as for normalizing constants [2]. The methods have been applied successfully to nonlocal equations [5], which are used to model anomalous diffusion and fractures in materials. Online examples will also be presented, including multilevel particle filters [4] and multilevel ensemble Kalman filters [3].
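
To make the multilevel idea concrete, the following minimal Python sketch implements a plain MLMC estimator (the telescoping sum over coupled coarse/fine discretization levels) for a toy stochastic differential equation. It illustrates only the forward-UQ principle of balancing discretization error against sampling error; it is not the MLSMC sampler, multilevel particle filter, or multilevel EnKF of [1]–[4], and the toy model, function names, and sample allocations are illustrative assumptions.

# Minimal multilevel Monte Carlo (MLMC) sketch: telescoping estimator
#   E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}]
# on a toy SDE, dX = X dW, discretized by Euler-Maruyama.
import numpy as np

rng = np.random.default_rng(0)

def euler_gbm_pair(level, n_samples, T=1.0, x0=1.0):
    """Simulate dX = X dW on the fine grid (2**level steps) and, for
    level > 0, on the coarse grid (2**(level-1) steps) driven by the SAME
    Brownian increments, so the correction P_l - P_{l-1} has small variance."""
    n_fine = 2 ** level
    dt = T / n_fine
    dW = rng.normal(0.0, np.sqrt(dt), size=(n_samples, n_fine))

    x_fine = np.full(n_samples, x0)
    for k in range(n_fine):
        x_fine = x_fine + x_fine * dW[:, k]

    if level == 0:
        return x_fine, np.zeros(n_samples)

    x_coarse = np.full(n_samples, x0)
    for k in range(n_fine // 2):
        # coarse increment = sum of the two corresponding fine increments
        x_coarse = x_coarse + x_coarse * (dW[:, 2 * k] + dW[:, 2 * k + 1])
    return x_fine, x_coarse

def mlmc_estimate(max_level, samples_per_level):
    """Telescoping-sum MLMC estimator of E[X_T]: many cheap samples on the
    coarse levels, few expensive samples on the fine levels."""
    estimate = 0.0
    for level, n in zip(range(max_level + 1), samples_per_level):
        fine, coarse = euler_gbm_pair(level, n)
        estimate += np.mean(fine - coarse)   # level-0 term has coarse = 0
    return estimate

if __name__ == "__main__":
    # Geometrically decaying sample counts across levels.
    n_per_level = [40000, 20000, 10000, 5000, 2500]
    print("MLMC estimate of E[X_T]:", mlmc_estimate(4, n_per_level))
    print("Exact value E[X_T] = x0 =", 1.0)

The point of coupling the coarse and fine paths through the same Brownian increments is that the variance of the level corrections P_l - P_{l-1} decays with the level, so only a few samples are needed on the expensive fine levels while most of the sampling effort is spent on the cheap coarse levels; the multilevel samplers and filters in the talk exploit the same mechanism inside SMC, particle filtering, and the EnKF.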

References
[1] Alexandros Beskos, Ajay Jasra, Kody Law, Raul Tempone, and Yan Zhou. Multilevel sequential Monte Carlo samplers. Stochastic Processes and their Applications, http://dx.doi.org/10.1016/j.spa.2016.08.004.
[2] Pierre Del Moral, Ajay Jasra, Kody Law, and Yan Zhou. Multilevel sequential Monte Carlo samplers for normalizing constants. arXiv preprint arXiv:1603.01136, 2016.
[3] Hakon Hoel, Kody J. H. Law, and Raul Tempone. Multilevel ensemble Kalman filtering. SIAM Journal on Numerical Analysis, 54(3):1813–1839, 2016.
[4] Ajay Jasra, Kengo Kamatani, Kody J. H. Law, and Yan Zhou. Multilevel particle filters. arXiv preprint arXiv:1510.04977, 2015.
[5] Ajay Jasra, Kody Law, and Yan Zhou. Forward and inverse uncertainty quantification using multilevel Monte Carlo algorithms for an elliptic nonlocal equation. arXiv preprint arXiv:1603.06381, 2016.
[6] Kody Law, Andrew Stuart, and Konstantinos Zygalakis. Data assimilation: a mathematical introduction, volume 62 of Texts in Applied Mathematics. Springer, 2015.