Vidya Muthukumar - Comparison and transfer between tasks in overparameterized learning

October 15, 2024
Abstract
Recorded 15 October 2024. Vidya Muthukumar of the Georgia Institute of Technology presents "Comparison and transfer between tasks in overparameterized learning" at IPAM's Theory and Practice of Deep Learning Workshop.

Abstract: We consider overparameterized linear (and, in some cases, kernel) models and study similarities and differences between classification and regression tasks. First, we uncover the existence of high-dimensional regimes in which regression would not generalize (i.e., would not satisfy statistical consistency), but the corresponding classification task would. Next, we consider the problem of task transfer: can we use model parameters that were trained on a classification task to generalize to the corresponding regression task? We show that, while such models never generalize in a "zero-shot" sense, they can be post-processed by a simple algorithm with "few-shot" access to regression labels to generalize successfully on the regression task. The main technical ingredient in both analyses is a fine-grained characterization of the individual parameters arising from minimum-norm interpolation on regression/classification tasks, which may be of independent interest.

Learn more online at: https://www.ipam.ucla.edu/programs/workshops/workshop-ii-theory-and-practice-of-deep-learning/?tab=overview
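The abstract's central object, the minimum-norm interpolator, can be made concrete with a small sketch (not code from the talk): in an overparameterized linear model with more features than samples, infinitely many parameter vectors fit the training data exactly, and the Moore-Penrose pseudoinverse selects the one of smallest Euclidean norm. The dimensions and random data below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 200  # n samples, d >> n features: the overparameterized regime

X = rng.standard_normal((n, d))
y_reg = rng.standard_normal(n)   # real-valued regression labels
y_cls = np.sign(y_reg)           # corresponding binary classification labels

# Among all theta satisfying X @ theta = y (an underdetermined system),
# the pseudoinverse solution has the smallest l2 norm.
theta_reg = np.linalg.pinv(X) @ y_reg
theta_cls = np.linalg.pinv(X) @ y_cls

# Both interpolators fit their own training labels exactly.
print(np.allclose(X @ theta_reg, y_reg))  # True
print(np.allclose(X @ theta_cls, y_cls))  # True
```

The talk's analyses concern when such interpolating solutions nonetheless generalize, and how a classification-trained vector like `theta_cls` can be post-processed with a few regression labels; those results are not reproduced here.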