Class of ’27 Lecture: Coordinate Descent Methods

Coordinate descent is an approach for minimizing functions in which only a subset of the variables are allowed to change at each iteration, while the remaining variables are held fixed. This approach has been popular in applications since the earliest days of optimization, because it is intuitive and because the low-dimensional searches that take place at each iteration are inexpensive in many applications. In recent years, the popularity of coordinate descent methods has grown further because of their usefulness in data analysis. In this talk we describe situations in which coordinate descent methods are useful, and discuss several variants of these methods and their convergence properties. We describe recent analysis of the convergence of asynchronous parallel versions of these methods, which achieve high efficiency on multicore computers.
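As a small illustration of the idea described above (not material from the talk itself), here is a minimal sketch of cyclic coordinate descent applied to a strongly convex quadratic, where each coordinate subproblem can be minimized exactly while the other variables are held fixed. The function name and the toy problem are illustrative choices, not part of the lecture.

```python
import numpy as np

def coordinate_descent(A, b, x0, sweeps=100):
    """Cyclic coordinate descent for min_x 0.5 x^T A x - b^T x,
    with A symmetric positive definite (illustrative sketch only)."""
    x = x0.astype(float).copy()
    n = len(x)
    for _ in range(sweeps):
        for i in range(n):
            # Minimize exactly over coordinate i, all others fixed:
            # the i-th gradient component is (A x)_i - b_i, and the
            # exact coordinate minimizer shifts x_i by -g_i / A_ii.
            g_i = A[i] @ x - b[i]
            x[i] -= g_i / A[i, i]
    return x

# Toy problem: the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = coordinate_descent(A, b, np.zeros(2))
```

Each inner iteration touches a single variable and costs only one row of `A`, which is the inexpensive low-dimensional search the abstract refers to.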
Date
Location: Amos Eaton 214
Speaker: Stephen J. Wright, University of Wisconsin