
Misha Belkin - The elusive generalization and easy optimization, Pt. 2 of 2 - IPAM at UCLA

Presenter
Misha Belkin (University of California, San Diego)
September 12, 2024
Abstract
Recorded 12 September 2024. Misha Belkin of the University of California, San Diego, presents "The elusive generalization and easy optimization, Pt. 2 of 2" at IPAM's Mathematics of Intelligences Tutorials. Abstract: Generalization is the central topic of machine learning and data science. What patterns can be learned from observations, and how can we be sure that they extend to future, not-yet-seen data? In this tutorial I will outline the arc of recent developments in the current understanding (or lack thereof) of generalization in machine learning. These changes were driven largely by empirical findings in neural networks, which necessitated revisiting the theoretical foundations of generalization. I will also discuss the recent understanding of optimization by gradient descent and show why large non-convex systems are remarkably easy to optimize by local methods. Learn more online at: https://www.ipam.ucla.edu/programs/workshops/mathematics-of-intelligences-tutorials/?tab=schedule
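
As a purely illustrative aside (not material from the talk), the sketch below shows the over-parameterized regime the abstract alludes to: a random-features model with far more parameters than training points, fit by plain gradient descent. All sizes and constants are arbitrary choices for this example, and the trainable layer is linear, so this toy is convex rather than a large non-convex system; it is only meant to show training error being driven steadily toward interpolation by a local method.

```python
import numpy as np

# Illustrative sketch, not code from the talk: gradient descent on an
# over-parameterized random-features model (many more features than
# training samples). The training loss shrinks steadily toward the
# interpolating solution; whether that solution generalizes is a
# separate question.

rng = np.random.default_rng(0)

n, d, width = 50, 5, 2000                      # 50 samples, 2000 random features
X = rng.normal(size=(n, d))
y = np.sin(X @ rng.normal(size=d))             # a simple nonlinear target

W = rng.normal(size=(d, width)) / np.sqrt(d)   # fixed random first layer
Phi = np.tanh(X @ W)                           # random features
a = np.zeros(width)                            # trainable output weights

# Step size taken from the top eigenvalue of the empirical kernel so the
# gradient iteration is stable.
K = Phi @ Phi.T / n
lr = 1.0 / np.linalg.eigvalsh(K).max()

for step in range(10001):
    resid = Phi @ a - y
    a -= lr * (Phi.T @ resid / n)              # gradient of the mean squared error
    if step % 2000 == 0:
        print(f"step {step:5d}  training MSE = {np.mean(resid**2):.3e}")
```

The step size is derived from the empirical kernel so that the decrease in training error is guaranteed; why such interpolating solutions can still generalize is exactly the question the tutorial takes up.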