Low distortion embeddings with bottom-up manifold learning
Presenter
June 29, 2023
Abstract
Manifold learning algorithms aim to map high-dimensional data into lower dimensions while preserving both local and global structure. In this talk, I present Low Distortion Local Eigenmaps (LDLE), a bottom-up manifold learning framework that constructs low-distortion local views of a dataset in lower dimensions and registers them to obtain a global embedding. Motivated by Jones, Maggioni, and Schul (2008), LDLE constructs the local views by selecting, for each neighborhood, a subset of the global eigenvectors of the graph Laplacian that is locally orthogonal. The global embedding is then obtained by rigidly aligning these local views, a problem we solve iteratively. Our global alignment formulation also enables tearing manifolds so that they can be embedded in their intrinsic dimension, including manifolds without boundary and non-orientable manifolds. To evaluate embeddings in low dimensions, we define strong and weak notions of global distortion, and we show that Riemannian gradient descent (RGD) converges to an embedding with guaranteed low global distortion. On real and synthetic datasets, we demonstrate that LDLE achieves lower local and global distortion than competing manifold learning and data visualization approaches.
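
To make the pipeline described above concrete, the following is a minimal Python sketch of its three ingredients: global eigenvectors of a graph Laplacian, local views built from a subset of those eigenvectors, and rigid (Procrustes) registration of overlapping views. It is not the authors' implementation: the per-neighborhood eigenvector selection is replaced by a naive placeholder (the first d non-trivial eigenvectors), whereas choosing a locally orthogonal subset for each neighborhood is the core of LDLE; the tearing mechanism, the iterative global alignment over all views, and the distortion guarantees are not reproduced.

# Illustrative sketch only; the eigenvector-subset selection and the single
# pairwise alignment below are simplified stand-ins for LDLE's actual steps.
import numpy as np
from scipy.linalg import orthogonal_procrustes
from scipy.sparse.csgraph import laplacian
from scipy.sparse.linalg import eigsh
from sklearn.neighbors import kneighbors_graph


def laplacian_eigenvectors(X, n_evecs=10, k=10):
    """Global eigenvectors of a kNN graph Laplacian (smallest eigenvalues first)."""
    W = kneighbors_graph(X, n_neighbors=k, mode="connectivity", include_self=False)
    W = 0.5 * (W + W.T)                      # symmetrize the kNN graph
    L = laplacian(W, normed=True)
    vals, vecs = eigsh(L.asfptype(), k=n_evecs + 1, which="SM")
    return vecs[:, 1:]                       # drop the trivial constant eigenvector


def local_view(evecs, neighborhood_idx, d=2):
    """Placeholder local view: restrict d fixed eigenvectors to one neighborhood.
    LDLE instead selects, per neighborhood, a subset that is locally orthogonal."""
    return evecs[neighborhood_idx, :d]


def rigid_align(src_view, dst_view, src_overlap, dst_overlap):
    """Rigidly map src_view onto dst_view using points shared by both views."""
    A, B = src_view[src_overlap], dst_view[dst_overlap]
    mu_A, mu_B = A.mean(axis=0), B.mean(axis=0)
    R, _ = orthogonal_procrustes(A - mu_A, B - mu_B)   # best orthogonal map
    return (src_view - mu_A) @ R + mu_B                # rotate/reflect + translate


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 3))                      # stand-in dataset
    evecs = laplacian_eigenvectors(X)
    idx_a, idx_b = np.arange(0, 60), np.arange(40, 100)   # overlapping neighborhoods
    view_a, view_b = local_view(evecs, idx_a), local_view(evecs, idx_b)
    shared = np.arange(40, 60)
    aligned_a = rigid_align(view_a, view_b,
                            src_overlap=np.nonzero(np.isin(idx_a, shared))[0],
                            dst_overlap=np.nonzero(np.isin(idx_b, shared))[0])
    # Full LDLE iterates such rigid alignments over all local views to produce
    # a single global embedding, tearing the manifold where necessary.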