Class of ’27 Lecture: Fundamental Optimization Methods in Data Analysis

Optimization formulations and algorithms are vital tools for solving problems in data analysis. There has been particular interest in some fundamental, elementary optimization algorithms that were previously thought to have only niche appeal. Stochastic gradient, coordinate descent, and accelerated first-order methods are three examples. We outline applications in which these approaches are useful, discuss the basic properties of these methods, and survey some recent developments in the analysis of their convergence behavior.
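As a small illustration of the first of these methods, the sketch below applies stochastic gradient descent to a synthetic least-squares problem. It is not drawn from the lecture; NumPy, the random data, and the fixed step size (the variable `step`) are assumptions chosen only to make the example self-contained.

    import numpy as np

    # Stochastic gradient for least squares: minimize (1/2n) * ||A x - b||^2.
    # Each iteration samples one row i and steps along the negative gradient
    # of the single-term loss (1/2) * (a_i^T x - b_i)^2.

    rng = np.random.default_rng(0)
    n, d = 1000, 20
    A = rng.standard_normal((n, d))
    x_true = rng.standard_normal(d)
    b = A @ x_true + 0.01 * rng.standard_normal(n)

    x = np.zeros(d)
    step = 1e-3                      # fixed step size (assumed; decaying schedules are also common)
    for k in range(20000):
        i = rng.integers(n)          # pick one data point uniformly at random
        residual = A[i] @ x - b[i]
        x -= step * residual * A[i]  # gradient of the sampled term is (a_i^T x - b_i) a_i

    print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))

Each step touches only one row of A, which is the feature that makes methods of this kind attractive for large data-analysis problems.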
Date
Location: Amos Eaton 214
Speaker: Stephen Wright, University of Wisconsin