
Filtering and residual bounds for Anderson acceleration

Presenter
July 25, 2023
Abstract
Anderson acceleration (AA) has become increasingly popular in recent years due to its efficacy on a wide range of problems, including optimization, machine learning, and complex multiphysics simulations. In this talk, we will discuss recent innovations in the theory and implementation of the algorithm. AA requires the storage of a (usually) small number of solution and update vectors, and the solution of an optimization problem that is generally posed as a least-squares problem and solved efficiently by a thin QR decomposition. On any given problem, the success of AA depends on the details of its implementation, including how many and which of the solution and update vectors are used. We will introduce a filtered variant of the algorithm that improves both numerical stability and convergence by selectively removing columns from the least-squares matrix at each iteration. We will discuss the theory behind the introduced filtering strategy and connect it to one-step residual bounds for AA using standard tools and techniques from numerical linear algebra. We will demonstrate the method on discretized nonlinear PDEs.
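
For concreteness, the sketch below shows one way Anderson acceleration with a simple column-filtering step might look in Python/NumPy. The history depth m, the filter_tol threshold, and the QR-diagonal dropping heuristic are illustrative assumptions for this sketch and are not the filtering strategy presented in the talk.

import numpy as np

def anderson_accelerate(g, x0, m=5, tol=1e-10, max_iter=100, filter_tol=1e-3):
    # Anderson acceleration for the fixed-point problem x = g(x).
    # Filtering (illustrative heuristic, not the talk's exact criterion):
    # columns of the least-squares matrix that appear nearly linearly
    # dependent, judged by the diagonal of a thin QR factor, are dropped
    # before the least-squares solve.
    x = np.asarray(x0, dtype=float)
    w = g(x) - x                          # current update (residual) vector
    dX_hist, dW_hist = [], []             # differences of iterates / updates
    for _ in range(max_iter):
        if np.linalg.norm(w) < tol:
            break
        x_new = x + w                     # plain fixed-point step as fallback
        if dX_hist:
            dX = np.column_stack(dX_hist)
            dW = np.column_stack(dW_hist)     # least-squares matrix
            Q, R = np.linalg.qr(dW)           # thin QR (assumes len(x) >= m)
            keep = np.abs(np.diag(R)) > filter_tol * np.linalg.norm(dW, axis=0)
            dX, dW = dX[:, keep], dW[:, keep]
            if dW.shape[1] > 0:
                gamma, *_ = np.linalg.lstsq(dW, w, rcond=None)
                x_new = x + w - (dX + dW) @ gamma
        w_new = g(x_new) - x_new
        dX_hist.append(x_new - x)
        dW_hist.append(w_new - w)
        if len(dX_hist) > m:              # keep at most m history vectors
            dX_hist.pop(0)
            dW_hist.pop(0)
        x, w = x_new, w_new
    return x

# Example: accelerate the componentwise contraction x = cos(x).
x_star = anderson_accelerate(np.cos, np.zeros(4))

The example keeps only the most recent m difference vectors and falls back to a plain fixed-point step whenever filtering removes every column, which keeps the iteration well defined even when the history becomes degenerate.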