First Order Methods for Manifolds with Orthogonality Constraints and Connections to the SVD

Presenter
August 2, 2016
Abstract
It has been observed in a variety of contexts that gradient descent methods have great success in solving low-rank matrix factorization problems, despite the relevant problem formulations being non-convex. The Singular Value Decomposition (SVD) is itself the solution to a non-convex optimization problem, and there are several highly successful linear-algebraic algorithms for computing it. Unfortunately, these algorithms cannot easily be extended to problems with regularizers or missing data. In this talk I will discuss several extensions of the standard SVD problem and the natural first-order incremental gradient descent method for each, constraining the gradient method to the Grassmannian in order to incorporate orthogonality constraints. I will discuss these algorithms and theoretical results for the standard ℓ2-norm subspace learning problem with noisy or undersampled data. Additionally, I will present algorithms and empirical results for the robust PCA and sparse PCA problems. This talk will cover joint work with Rob Nowak, Ben Recht, Steven Wright, Jun He, Arthur Szlam, CJ Taylor, Ryan Kennedy, and my students Dejiao Zhang and Pengyu Xiao.
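
To make the incremental, Grassmannian-constrained gradient step concrete, below is a minimal Python sketch of subspace learning from undersampled vectors. It is an illustration only, not the presenter's exact algorithm: each update takes a gradient step on the least-squares fit over the observed entries and then retracts back to an orthonormal basis with a QR factorization rather than following a geodesic. The function name, step size, and toy data are assumptions made for the example.

import numpy as np

def incremental_grassmannian_step(U, v, omega, step):
    """One incremental gradient step for subspace learning from an
    undersampled vector, followed by a QR retraction back to an
    orthonormal basis (a sketch, not a specific published algorithm).

    U     : (n, d) current orthonormal basis for the subspace estimate
    v     : (n,) observed vector (only the entries in omega are used)
    omega : indices of observed entries
    step  : gradient step size
    """
    U_omega = U[omega, :]                                    # basis restricted to observed rows
    w, *_ = np.linalg.lstsq(U_omega, v[omega], rcond=None)   # best-fit weights on omega
    r = np.zeros(U.shape[0])
    r[omega] = v[omega] - U_omega @ w                        # residual, zero-filled off omega
    U = U + step * np.outer(r, w)                            # gradient step on the fit term
    Q, _ = np.linalg.qr(U)                                   # retract onto orthonormal bases
    return Q

# Toy usage: track a planted 2-dimensional subspace from vectors with 15 of 50 entries observed.
rng = np.random.default_rng(0)
n, d = 50, 2
U_true = np.linalg.qr(rng.standard_normal((n, d)))[0]
U = np.linalg.qr(rng.standard_normal((n, d)))[0]
for t in range(2000):
    v = U_true @ rng.standard_normal(d)
    omega = rng.choice(n, size=15, replace=False)
    U = incremental_grassmannian_step(U, v, omega, step=0.1)
# Report the norm of the component of U_true lying outside span(U).
err = np.linalg.norm(U_true - U @ (U.T @ U_true))
print(f"residual subspace error: {err:.3e}")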