Solving real-world stochastic optimization problems presents two key challenges: real-world data are messy, often noisy, biased, or corrupted by outliers, distribution shifts, and even adversarial attacks; and many existing algorithms require laborious, time-intensive manual tuning of step sizes.
I will introduce a simple adaptive optimization framework that avoids manual step size tuning by adjusting the step size at each iteration based on the algorithm's progress. To address messy data, the framework assumes access to function-related information only through probabilistic oracles, whose outputs may be biased or corrupted.
This framework is very general, encompassing a wide range of algorithms for both unconstrained and constrained optimization. It applies to multiple problem settings, such as expected loss minimization in machine learning, simulation optimization, and derivative-free optimization. Under reasonable conditions on the oracles, we provide a meta-theorem that bounds the sample complexity of any algorithm in the framework.
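To illustrate the kind of scheme the framework covers, here is a minimal Python sketch of one adaptive step-size method: take a trial step, accept it and enlarge the step size when noisy oracle estimates report sufficient decrease, otherwise reject it and shrink the step size. The oracle signatures (grad_oracle, f_oracle) and the constants gamma and theta are illustrative assumptions, not the speaker's exact algorithm.

    import numpy as np

    def adaptive_step_search(grad_oracle, f_oracle, x0, alpha0=1.0,
                             gamma=2.0, theta=0.5, max_iters=100):
        """Sketch: adjust the step size each iteration based on progress,
        using possibly biased/noisy probabilistic oracles (hypothetical API)."""
        x = np.asarray(x0, dtype=float)
        alpha = alpha0
        for _ in range(max_iters):
            g = grad_oracle(x)               # noisy/biased gradient estimate
            x_trial = x - alpha * g
            # Sufficient-decrease test on noisy function-value estimates
            if f_oracle(x_trial) <= f_oracle(x) - theta * alpha * np.dot(g, g):
                x = x_trial                  # success: accept, enlarge step size
                alpha *= gamma
            else:
                alpha /= gamma               # failure: reject, shrink step size
        return x

    # Usage on a toy noisy quadratic (oracles are assumptions for illustration)
    rng = np.random.default_rng(0)
    f = lambda x: np.dot(x, x) + 0.01 * rng.normal()
    g = lambda x: 2 * x + 0.01 * rng.normal(size=x.shape)
    x_final = adaptive_step_search(g, f, x0=np.ones(5))

Because each iteration queries only the oracles, the same loop structure accommodates expected-loss minimization, simulation optimization, and derivative-free settings by swapping in the appropriate oracle.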
Date:
Location: Troy 2012
Speaker: Miaolan Xie, Cornell University