
Memory, Communication, and Statistical Queries

Presenter
May 17, 2016
MSC:
  • 91E40
Abstract
If a concept class can be represented with a certain amount of memory, can it be efficiently learned with the same amount of memory? What concepts can be efficiently learned by algorithms that extract only a few bits of information from each example? We introduce a formal framework for studying these questions, and investigate the relationship between the fundamental resources of memory or communication and the sample complexity of the learning task. We relate our memory-bounded and communication-bounded learning models to the well-studied statistical query model. This connection can be leveraged to obtain both upper and lower bounds: we show strong lower bounds on learning parity functions with bounded communication, as well as upper bounds on solving sparse linear regression problems with limited memory. We will conclude with a discussion of several intriguing open problems. This is joint work with Jacob Steinhardt and Stefan Wager.
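As a rough illustration of the statistical query model referenced in the abstract (not part of the talk itself), the Python sketch below simulates an SQ oracle that answers queries with expectations accurate to a tolerance tau, and a learner that recovers a halfspace direction using only such queries, never inspecting individual examples. The target function, oracle noise model, and variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sq_oracle(query, samples, tau):
    """Simulated statistical-query oracle: returns E[query(x, y)] over the
    sample, perturbed by noise of magnitude at most tau (an assumed noise model)."""
    values = np.array([query(x, y) for x, y in samples])
    return values.mean() + rng.uniform(-tau, tau)

# Hypothetical target: y = sign(<w*, x>) for a fixed unit vector w*.
d = 5
w_star = rng.standard_normal(d)
w_star /= np.linalg.norm(w_star)
X = rng.standard_normal((10_000, d))
y = np.sign(X @ w_star)
samples = list(zip(X, y))

# Each coordinate of E[y * x] is answered by one statistical query; for
# Gaussian x this expectation is proportional to w*, so normalizing the
# query answers recovers the direction of w*.
tau = 0.01
w_hat = np.array([sq_oracle(lambda x, yy, i=i: yy * x[i], samples, tau)
                  for i in range(d)])
w_hat /= np.linalg.norm(w_hat)
print("correlation with w*:", float(w_hat @ w_star))
```

The point of the sketch is only that the learner interacts with the data through a handful of noisy expectations rather than raw examples, which is the interface the talk relates to memory-bounded and communication-bounded learning.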