Fanny Yang - Surprising phenomena of max-lp-margin classifiers in high dimensions - IPAM at UCLA

October 17, 2024
Abstract
Recorded 17 October 2024. Fanny Yang of ETH Zurich presents "Surprising phenomena of max-lp-margin classifiers in high dimensions" at IPAM's Theory and Practice of Deep Learning Workshop.

Abstract: In recent years, the analysis of max-lp-margin classifiers has gained attention from the theory community, not only due to the implicit bias of first-order methods but also due to the observation of harmless interpolation in neural networks. In this talk, I will discuss two results. First, we show that, surprisingly, in the noiseless case, while minimizing the l1-norm achieves optimal regression rates for hard-sparse ground truths, this adaptivity does not directly carry over to the classification analog, the max-l1-margin classifier. Second, for noisy observations, we prove that max-lp-margin classifiers can achieve rates of order 1/sqrt(n) for p slightly larger than one, while the max-l1-margin classifier only achieves rates of order 1/sqrt(log(d/n)).

Learn more online at: https://www.ipam.ucla.edu/programs/workshops/workshop-ii-theory-and-practice-of-deep-learning/?tab=overview
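As a concrete illustration of the object studied in the talk (this sketch is not from the talk itself): on linearly separable data, the max-l1-margin classifier can be computed as a linear program, since maximizing the margin under an l1-norm constraint is equivalent to minimizing ||w||_1 subject to y_i <x_i, w> >= 1. A minimal sketch with scipy on synthetic high-dimensional data (d > n) and a hypothetical hard-sparse ground truth:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, d = 20, 50  # high-dimensional regime: more features than samples

# Hypothetical hard-sparse ground truth: only 3 of d coordinates are active.
w_star = np.zeros(d)
w_star[:3] = 1.0
X = rng.standard_normal((n, d))
y = np.sign(X @ w_star)  # noiseless labels, so the data is linearly separable

# Max-l1-margin classifier via LP:
#   minimize ||w||_1  subject to  y_i <x_i, w> >= 1.
# Split w = u - v with u, v >= 0 so the objective is linear in z = [u; v].
c = np.ones(2 * d)
A_ub = -y[:, None] * np.hstack([X, -X])  # encodes -y_i <x_i, w> <= -1
b_ub = -np.ones(n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (2 * d))
w = res.x[:d] - res.x[d:]

# Normalized l1 margin of the resulting classifier.
margin = np.min(y * (X @ w)) / np.sum(np.abs(w))
print(f"l1 norm: {np.sum(np.abs(w)):.3f}, normalized l1 margin: {margin:.4f}")
```

Solvers return a w whose support tends to be sparse here, which is exactly the adaptivity to hard-sparse ground truths that, per the abstract, does not translate into optimal classification rates the way it does for l1-regularized regression.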