Paata Ivanisvili - Learning low degree functions in logarithmic number of random queries
February 10, 2022
Abstract
Recorded 10 February 2022. Paata Ivanisvili of the University of California, Irvine presents "Learning low degree functions in logarithmic number of random queries" at IPAM's Calculus of Variations in Probability and Geometry Workshop.
Abstract: Perhaps the most basic question one asks in learning theory is the following: we have an unknown function f on the hypercube {-1,1}^n, and we are allowed to query samples (X, f(X)), where X is uniformly distributed on {-1,1}^n. After obtaining the samples (X_1, f(X_1)), ..., (X_N, f(X_N)), we would like to construct a function h which approximates f up to an error epsilon (say, in L^2). Of course, h is a random function, as its construction involves the i.i.d. random variables X_1, ..., X_N. Therefore, we want to construct an h which approximates f with probability at least 1 - delta. So, given parameters epsilon, delta in (0,1), the goal is to minimize the number of random queries N. I will show that around log(n) random queries are sufficient (up to a multiplicative factor depending on d, epsilon, and delta) to learn bounded functions of degree at most d. Based on joint work with Alexandros Eskenazis.
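To make the setup concrete, the following is a minimal sketch of the naive sampling-based approach to this problem: estimate every low-degree Fourier coefficient hat{f}(S) = E[f(X) chi_S(X)] from the same N random samples and output the resulting degree-d polynomial as h. This illustrates only the query model from the abstract, not the log(n)-query algorithm of the talk (which needs far fewer samples); all function names and the example function f below are illustrative choices, not from the source.

```python
import itertools
import random

def chi(S, x):
    # Walsh character chi_S(x) = prod_{i in S} x_i on the hypercube {-1,1}^n.
    p = 1
    for i in S:
        p *= x[i]
    return p

def learn_low_degree(f, n, d, N, rng=random):
    # Draw N uniform samples X_1, ..., X_N from {-1,1}^n and query f on them.
    samples = [tuple(rng.choice((-1, 1)) for _ in range(n)) for _ in range(N)]
    values = [f(x) for x in samples]
    # Empirically estimate hat{f}(S) = E[f(X) chi_S(X)] for every |S| <= d.
    coeffs = {}
    for k in range(d + 1):
        for S in itertools.combinations(range(n), k):
            coeffs[S] = sum(v * chi(S, x) for x, v in zip(samples, values)) / N
    # The hypothesis h is the degree-d polynomial with the estimated coefficients.
    def h(x):
        return sum(c * chi(S, x) for S, c in coeffs.items())
    return h

# Illustrative example: learn the degree-2 function f(x) = x_0 * x_1 on {-1,1}^4.
rng = random.Random(0)
f = lambda x: x[0] * x[1]
h = learn_low_degree(f, n=4, d=2, N=2000, rng=rng)
# Exact L^2 error over the whole hypercube (feasible here since n is tiny).
err = sum((f(x) - h(x)) ** 2 for x in itertools.product((-1, 1), repeat=4)) / 16
```

Each coefficient estimate concentrates at rate about 1/sqrt(N), so this naive scheme needs N growing polynomially with the number of degree-at-most-d coefficients; the point of the talk is that for bounded degree-d functions one can do exponentially better, with N on the order of log(n).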