Optimal low-rank approximations of Bayesian linear inverse problems
Presenter
September 8, 2017
Keywords:
- inverse problems, Bayesian inference, low-rank approximation, covariance approximation, Förstner-Moonen metric, posterior mean approximation, Bayes risk, optimality
Abstract
Since the data in Bayesian inversion are often informative only about a low-dimensional subspace of the parameter space,
significant computational savings can be achieved by using this subspace to characterize and approximate the posterior distribution of the parameters.
We study approximations of the posterior covariance matrix defined as low-rank updates of the prior covariance matrix, and we
prove their optimality for a broad class of loss functions that includes the Förstner
metric for symmetric positive definite (SPD) matrices, as well as the Kullback-Leibler divergence and the Hellinger distance between the exact
and approximate posterior distributions.
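For a linear-Gaussian model Y = GX + e with noise e ~ N(0, Gamma_obs) and prior X ~ N(0, Gamma_pr), the optimal rank-r update is driven by the generalized eigenpairs of the pencil (H, Gamma_pr^{-1}), where H = G^T Gamma_obs^{-1} G is the Hessian of the data-misfit term. The following minimal sketch (an illustration of this construction, not the authors' code; all function and variable names are assumptions) computes the update by whitening the Hessian with a Cholesky factor of the prior:

    import numpy as np
    from scipy.linalg import cholesky, eigh

    def lowrank_posterior_cov(G, Gamma_obs, Gamma_pr, r):
        """Rank-r negative update of the prior covariance approximating
        the posterior covariance (illustrative sketch)."""
        # Hessian of the data-misfit term: H = G^T Gamma_obs^{-1} G
        H = G.T @ np.linalg.solve(Gamma_obs, G)
        # Factor the prior covariance: Gamma_pr = S S^T
        S = cholesky(Gamma_pr, lower=True)
        # Generalized eigenproblem for the pencil (H, Gamma_pr^{-1}) via the
        # prior-preconditioned Hessian S^T H S (eigh sorts ascending)
        delta2, V = eigh(S.T @ H @ S)
        delta2, V = delta2[::-1][:r], V[:, ::-1][:, :r]  # keep the r largest
        W = S @ V  # directions, orthonormal with respect to Gamma_pr^{-1}
        # Optimal update: Gamma_pr - sum_i delta_i^2/(1+delta_i^2) w_i w_i^T
        return Gamma_pr - W @ np.diag(delta2 / (1.0 + delta2)) @ W.T

The eigenvalues delta_i^2 measure how strongly the data inform each direction, so truncating at rank r retains exactly the data-informed subspace mentioned above.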
We also propose fast approximations of the posterior mean and prove
their optimality with respect to a weighted Bayes risk.
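One natural member of this class of mean approximations reuses the approximate covariance in the exact mean formula mu_pos = Gamma_pos G^T Gamma_obs^{-1} y. A sketch building on the routine above (again illustrative; it assumes a zero prior mean):

    def lowrank_posterior_mean(G, Gamma_obs, Gamma_pr, y, r):
        """Fast posterior-mean approximation that reuses the rank-r
        covariance update (sketch; assumes a zero prior mean)."""
        Gamma_hat = lowrank_posterior_cov(G, Gamma_obs, Gamma_pr, r)
        return Gamma_hat @ (G.T @ np.linalg.solve(Gamma_obs, y))

In practice one would apply the low-rank factors directly rather than forming Gamma_hat explicitly, so each evaluation costs only a few matrix-vector products.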
We conclude by providing analogous results for the goal-oriented case, where
inference focuses on functions of the parameters; the approximations are then
tailored to the particular function of interest.
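As a point of reference for this setting (the notation below is assumed, not taken from the talk), a common linear formulation takes the goal to be a linear function Z = OX of the parameters, so the object being approximated becomes

    \Gamma_{Z \mid y} = O \, \Gamma_{\mathrm{pos}} \, O^{\top},

approximated again by low-rank updates, now of the prior covariance of the goal, O \Gamma_{\mathrm{pr}} O^{\top}, concentrated on directions that are both informed by the data and relevant to Z.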