Optimization problems involving uncertainty are central to many domains, including modern statistics and data science. Despite their ubiquity, finding efficient algorithms that solve these problems remains a major challenge. Interestingly, many random optimization problems share a common feature, dubbed a statistical-computational gap: while the optimal value can be pinpointed non-constructively, known polynomial-time algorithms find only strictly sub-optimal solutions, and no better algorithm is known.
In this talk, I will discuss a theoretical framework for understanding the fundamental computational limits of such problems. This framework is based on the Overlap Gap Property (OGP), an intricate geometrical property known to be a barrier for large classes of algorithms. I will first focus on the symmetric binary perceptron (SBP), a model for classifying/storing random patterns as well as a random constraint satisfaction problem, widely studied in probability, computer science, and statistical physics. I will show how the OGP framework yields nearly sharp lower bounds against the classes of stable algorithms and online algorithms for the SBP; these classes capture the best known algorithms for the SBP and for many other random models. I will then demonstrate how the same framework extends to other models, including the random number balancing problem, which is closely related to the design of randomized controlled trials, as well as its planted counterpart.
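For concreteness, here is a standard formulation of the SBP as it commonly appears in the literature (the abstract itself does not fix notation, so this sketch is an assumption): given m i.i.d. standard Gaussian patterns X_1, ..., X_m in R^n and a margin parameter kappa > 0, the problem is to find

\[
  \sigma \in \{-1,+1\}^n
  \quad \text{such that} \quad
  \frac{|\langle X_i, \sigma \rangle|}{\sqrt{n}} \le \kappa
  \quad \text{for all } 1 \le i \le m.
\]

The absolute value makes each constraint two-sided, which is the "symmetric" feature distinguishing the SBP from the classical one-sided binary perceptron.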
Our results for online algorithms are the first essentially tight unconditional lower bounds in the online setting, and our results for stable algorithms give the first applications of Ramsey theory to the study of the limits of algorithms. More importantly, our techniques yield a new toolkit for studying statistical-computational gaps in other random models.
Date:
Location: TROY 2012
Speaker: Eren C. Kizildag, Columbia University