Dynamics and symmetries in neural network learning

Presenter
June 8, 2023
Abstract
This talk covers two topics in scientific machine learning: learning dynamics and symmetries. First, we look at training dynamics: recent results show that neural network training does not always converge to a fixed point in parameter space, and we investigate generalization in such settings. Taking a dynamical-systems perspective and defining a more general notion of algorithmic stability, we draw connections between training behavior, stability, and generalization. Second, in many applications, data and tasks have symmetries that imply desired invariances. We look at invariances of eigenvectors, which matter, for instance, when learning with graphs: an eigenvector is only determined up to sign, and eigenvectors of a repeated eigenvalue only up to a change of basis. We derive neural network architectures that encode these invariances and show their empirical and theoretical benefits.
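
For background on the first topic, here is the classical notion that a generalized algorithmic stability builds on (this definition is standard background, not taken from the talk): a learning algorithm A is β-uniformly stable if, for any two training sets S and S′ that differ in a single example and any test point z,

    |ℓ(A(S), z) − ℓ(A(S′), z)| ≤ β,

where ℓ is the loss. Stability of order β yields generalization bounds of roughly the same order, which motivates relaxing the definition so that it still applies when training never settles at a fixed point.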
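
As an illustration of the second topic, below is a minimal PyTorch sketch of one standard way to encode sign invariance of eigenvectors: a shared network phi is applied to both v and −v and the results are summed, so flipping the sign of any eigenvector leaves the output unchanged. The class name, network shapes, and dimensions are illustrative assumptions, not the talk's implementation.

# Illustrative sketch (not the talk's code): encode Laplacian eigenvectors
# so the output is invariant to per-eigenvector sign flips, f(v) == f(-v).
import torch
import torch.nn as nn

class SignInvariantEncoder(nn.Module):
    def __init__(self, num_eigvecs: int, hidden: int = 32, out: int = 16):
        super().__init__()
        # phi acts on each eigenvector entry independently (per node).
        self.phi = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        # rho mixes the per-eigenvector features into a node embedding.
        self.rho = nn.Linear(num_eigvecs * hidden, out)

    def forward(self, eigvecs: torch.Tensor) -> torch.Tensor:
        # eigvecs: (num_nodes, num_eigvecs); treat each entry as a scalar.
        v = eigvecs.unsqueeze(-1)            # (N, k, 1)
        h = self.phi(v) + self.phi(-v)       # sign-invariant by construction
        return self.rho(h.flatten(start_dim=1))  # (N, out)

# Quick check: flipping the sign of any eigenvector leaves the output unchanged.
torch.manual_seed(0)
enc = SignInvariantEncoder(num_eigvecs=4)
V = torch.randn(10, 4)
flipped = V * torch.tensor([1., -1., 1., -1.])
assert torch.allclose(enc(V), enc(flipped), atol=1e-6)

The assertion verifies the invariance directly: multiplying any subset of eigenvector columns by −1 produces the same node embeddings.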