
Infinite-Width Bounded-Norm Networks: A View from Function Space

Presenter
Nathan Srebro, Toyota Technological Institute at Chicago (TTI-Chicago)
April 30, 2019
Abstract
There has been much research in the past four decades on understanding what functions can be captured, or approximated, by multi-layer neural networks, with the focus on how well a function can be approximated as a function of the number of units in the network. More recently, however, we have come to understand that in modern deep learning the magnitude of the weights, rather than the number of units, plays the more important role in complexity control, and that the models we are learning are perhaps better thought of as having an unbounded number of units but bounded overall norm. We take the first steps toward understanding what kind of functions can be captured by such bounded-norm infinite-width networks, and what type of complexity control in function space bounding the norm of the weights induces, by providing a detailed study of one-dimensional functions, with a surprisingly simple and satisfying answer.
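
As a rough illustration of the setting the abstract describes (this is not code from the talk, and the width, seed, and hyperparameters below are arbitrary assumptions), the sketch fits a one-dimensional function with a very wide two-layer ReLU network while penalizing the squared Euclidean norm of the weights via weight decay, so that the norm, rather than the number of units, acts as the complexity control:

# Minimal sketch: norm-controlled, very wide two-layer ReLU network on 1-D data.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Training data: a handful of one-dimensional points to interpolate.
x = torch.linspace(-1.0, 1.0, 8).unsqueeze(1)
y = torch.sin(3.0 * x)

width = 5000  # proxy for "infinite width": far more units than data points
model = nn.Sequential(nn.Linear(1, width), nn.ReLU(), nn.Linear(width, 1))

# weight_decay penalizes the sum of squared weights -- the norm-based
# complexity control discussed in the abstract.
opt = torch.optim.Adam(model.parameters(), lr=1e-3, weight_decay=1e-4)
loss_fn = nn.MSELoss()

for step in range(5000):
    opt.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    opt.step()

print(loss.item())

Because every unit is a ReLU, the learned function is piecewise linear; with the norm penalty dominating and the width effectively unconstrained, the interesting question, and the subject of the talk, is which one-dimensional functions such norm-bounded networks prefer in the limit.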