First-Order Methods for Differentiable "Nonsmooth" Convex Optimization: A Tale of Frank-Wolfe and Multiplicative-Gradient

Abstract: The remarkable success of first-order methods on huge-scale problems has mostly been confined to problems where the objective function or its gradient satisfies a Lipschitz property. Here we consider important convex optimization problems where neither the objective nor its gradient is Lipschitz, and where the gradient typically blows up on part of the (relative) boundary of the feasible region. Such problems appear in a wide range of applications across numerous fields, including statistical machine learning, medical imaging, experimental design, and quantum physics. Unfortunately, the vast majority of existing first-order methods cannot be applied to these problems due to their seemingly pathological behavior. In this talk we present new structures underlying these problems that lead to two new methods (the Multiplicative-Gradient method and a version of the Frank-Wolfe method) that successfully exploit these structures, both theoretically and computationally. Our theory shows that these two methods have simple and elegant computational guarantees, and our numerical experiments demonstrate their remarkable efficiency and efficacy in practice.
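For concreteness, a canonical problem in this class (suggested by the abstract's mention of experimental design, though not spelled out there) is D-optimal design: minimize f(x) = -log det(sum_i x_i a_i a_i^T) over the simplex, whose gradient blows up as the matrix H(x) = sum_i x_i a_i a_i^T approaches singularity on the boundary, so Lipschitz-based first-order methods do not apply. The sketch below implements the classical multiplicative update of Titterington type for this problem, a simple relative of the multiplicative-gradient idea rather than the speaker's exact algorithm; the function name and parameters are illustrative choices, not from the talk.

    # A minimal sketch, assuming the D-optimal design instance above;
    # not the talk's Multiplicative-Gradient method itself.
    import numpy as np

    def multiplicative_update_dopt(A, n_iters=200):
        """A: (n, m) array whose rows a_i are the candidate design points."""
        n, m = A.shape
        x = np.full(n, 1.0 / n)          # start at the simplex barycenter
        for _ in range(n_iters):
            H = A.T @ (x[:, None] * A)   # H(x) = sum_i x_i a_i a_i^T
            # w_i = a_i^T H(x)^{-1} a_i  (assumes H(x) is nonsingular)
            w = np.einsum('ij,jk,ik->i', A, np.linalg.inv(H), A)
            # Multiplicative step; since sum_i x_i w_i = trace(I_m) = m,
            # the iterate stays on the simplex.
            x = x * w / m
        return x

    # Illustrative usage with random design points:
    # A = np.random.randn(50, 5); x = multiplicative_update_dopt(A)

The update keeps every coordinate strictly positive, which is exactly what lets it operate on problems whose gradients blow up at the boundary.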
Date
Location: Low 3051
Speaker: Renbo Zhao (MIT)