The topology, geometry, and combinatorics of feedforward neural networks
Presenter
October 30, 2023
Abstract
Deep neural networks are a class of parameterized functions that have proven remarkably successful at making predictions about unseen data from finite labeled data sets. They do so even in settings where classical intuition suggests that they ought to be overfitting (i.e., memorizing) the data.
I will begin by describing the structure of neural networks and how they learn. I will then advertise one of the theoretical questions animating the field: how does the relationship between the number of parameters and the size of the data set affect the dynamics of learning? Along the way, I will emphasize the many ways in which topology, geometry, and combinatorics play a role in the field.