Raghu Meka - Complexity of Sparse Linear Regression - IPAM at UCLA
March 1, 2024
Abstract
Recorded 01 March 2024. Raghu Meka of the University of California, Los Angeles, presents "Complexity of Sparse Linear Regression" at IPAM's EnCORE Workshop on Computational vs Statistical Gaps in Learning and Optimization.
Abstract: Sparse linear regression (SLR) is fundamental in signal processing, statistics, and machine learning. Despite its importance, a significant gap exists between its statistical sample complexity and the performance of efficient algorithms, especially with Gaussian covariates. Traditional methods such as Lasso and basis pursuit provably fail when the covariance matrix is ill-conditioned or the covariates are correlated.
In this talk, I'll highlight ways around these limitations by presenting important cases where new methods solve the problem even when the covariance matrix is ill-conditioned, by exploiting other structural properties of the covariance matrix: we obtain significantly more efficient algorithms when the covariance matrix has a low-treewidth dependency graph, has few bad correlations, or when the correlations arise from a few latent variables. We'll also discuss the limitations of broad classes of algorithms for the problem.
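To make the setup concrete, here is a minimal sketch of the SLR problem the abstract refers to: observations y = Xw* + noise with a k-sparse w*. The dimensions, noise level, and the ISTA solver for the Lasso objective are illustrative choices, not details from the talk; in this easy isotropic-covariate case Lasso succeeds, whereas the talk concerns the ill-conditioned regimes where it provably fails.

```python
import numpy as np

# SLR setup: y = X w* + noise, with k-sparse ground truth w*.
# All sizes and constants below are illustrative, not from the talk.
rng = np.random.default_rng(0)
n, d, k = 200, 50, 3             # samples, dimension, sparsity
X = rng.standard_normal((n, d))  # isotropic Gaussian covariates (the easy case)
w_star = np.zeros(d)
w_star[:k] = [2.0, -3.0, 1.5]    # k-sparse ground truth
y = X @ w_star + 0.1 * rng.standard_normal(n)

def ista_lasso(X, y, lam=0.1, steps=500):
    """Iterative soft-thresholding (ISTA) for the Lasso objective
    (1/2n)||y - Xw||^2 + lam * ||w||_1."""
    n, d = X.shape
    w = np.zeros(d)
    L = np.linalg.norm(X, 2) ** 2 / n  # Lipschitz constant of the gradient
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / n
        z = w - grad / L
        w = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return w

w_hat = ista_lasso(X, y)
support = np.flatnonzero(np.abs(w_hat) > 0.5)
print(support)  # with well-conditioned covariates, Lasso is expected to recover the support
```

With an ill-conditioned or highly correlated covariance, the same solver can require far more samples or miss the support entirely, which is the gap the talk addresses.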
Based on joint works with Jon Kelner, Frederic Koehler, and Dhruv Rohatgi.
Learn more online at: https://www.ipam.ucla.edu/programs/workshops/encore-workshop-on-computational-vs-statistical-gaps-in-learning-and-optimization/?tab=overview