Optimization Taxonomy and Mathematical Foundations

Presenter
August 1, 2016
Keywords:
  • continuous optimization, linear programming, continuous nonlinear optimization, convex optimization, first-order methods, Newton's method
Abstract
In this series of lectures on continuous optimization, we introduce the area of optimizing over spaces of continuous variables, particularly vectors and matrices of real numbers. The talks have an algorithmic focus, developing the theory needed to describe the algorithms and understand their fundamental properties. After surveying the landscape of problem types and describing a few sample applications, we cover some mathematical foundations in real analysis, convex analysis, and linear algebra. We then introduce first-order methods, in which information about the gradient of the objective function is used to successively improve the approximate solution. Next we discuss algorithms that require even less information, such as a single element of the gradient or an unbiased estimate of it that can be obtained cheaply. We also discuss algorithms that require more information, in particular Newton's method. Turning to linear programming, a fundamental and widely used optimization paradigm, we outline duality theory and sketch the two major algorithms: simplex and interior-point. We then consider algorithms for nonlinear constrained optimization, outlining the theory and the major classes of methods. Finally, we discuss how all these tools are being used in one area of great current interest: machine learning and data analysis. Course material (slides, supporting notes, worksheets) can be found here: http://pages.cs.wisc.edu/~swright/nd2016/
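
To make the contrast between first-order and second-order methods concrete, the following is a minimal sketch, not taken from the course material, that applies gradient descent and Newton's method to a small least-squares problem f(x) = (1/2)||Ax - b||^2. The problem data, step-size choice, and iteration count are illustrative assumptions, not values from the lectures.

```python
import numpy as np

# Illustrative data (assumed, not from the lectures): minimize
# f(x) = 0.5 * ||A x - b||^2, with gradient A^T (A x - b)
# and Hessian A^T A.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)

def grad(x):
    return A.T @ (A @ x - b)

# First-order method: gradient descent with fixed step 1/L,
# where L = ||A^T A||_2 bounds the curvature of f.
L = np.linalg.norm(A.T @ A, 2)
x = np.zeros(5)
for _ in range(500):
    x = x - (1.0 / L) * grad(x)

# Second-order method: Newton's method solves H d = -grad(x) at each
# step; since f is quadratic here, one iteration reaches the minimizer.
H = A.T @ A
x_newton = np.zeros(5) - np.linalg.solve(H, grad(np.zeros(5)))

print(np.allclose(x, x_newton, atol=1e-6))  # both find the same solution
```

The sketch illustrates the trade-off the lectures develop: gradient descent uses only gradient evaluations but may need many iterations, while Newton's method converges in far fewer iterations at the cost of forming and solving a linear system with the Hessian.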