Convex-concave minimax problems arise in many applications, including robust learning and Stackelberg models. Most first-order methods address unconstrained or projection-friendly settings, while functional constraints remain far less explored. We study a class of convex-concave and convex-strongly-concave minimax problems with functional constraints. By exploiting strong duality, we incorporate the inner functional constraints into the objective, which allows us to apply a proximal augmented Lagrangian framework. Each resulting subproblem is then solved by an inexact accelerated proximal gradient scheme that handles the inexact gradients arising from approximately solving an auxiliary maximization subproblem. We show that the proposed method returns a primal-dual ε-KKT point within Õ(ε^{-1}) first-order iterations in the convex-strongly-concave case, from which primal ε-optimality can be derived. For the convex-concave case, we establish an Õ(ε^{-3/2}) iteration complexity.
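The overall structure described above, an outer proximal augmented Lagrangian loop whose subproblems are solved inexactly by an accelerated first-order method, can be sketched on a toy constrained minimization. This is only an illustrative sketch, not the paper's algorithm: the toy problem, the penalty parameter rho, the proximal weight beta, the Lipschitz estimate, and the inner iteration budget below are all assumptions chosen for demonstration.

```python
import numpy as np

def solve_subproblem(x0, grad, L, iters=200):
    # Nesterov-accelerated gradient descent on a smooth convex subproblem;
    # running a fixed number of iterations makes the subproblem solve inexact.
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(iters):
        x_new = y - grad(y) / L
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

def proximal_augmented_lagrangian(f_grad, c, c_grad, x0,
                                  rho=10.0, beta=1.0, outer=50):
    # Outer loop: minimize the augmented Lagrangian plus a proximal term
    # (1/(2*beta))*||x - x_k||^2, then update the multiplier lam.
    x, lam = x0.copy(), 0.0
    for _ in range(outer):
        xk = x.copy()
        def grad(z):
            # Gradient of f(z) + (1/(2*rho)) * max(0, lam + rho*c(z))^2
            # plus the proximal term; standard AL gradient for c(z) <= 0.
            slack = max(0.0, lam + rho * c(z))
            return f_grad(z) + slack * c_grad(z) + (z - xk) / beta
        # Crude Lipschitz bound for this toy instance (illustrative only).
        L = 2.0 + 2.0 * rho + 1.0 / beta
        x = solve_subproblem(x, grad, L)
        lam = max(0.0, lam + rho * c(x))  # multiplier ascent step
    return x, lam

# Toy instance: min (x1-1)^2 + (x2-2)^2  s.t.  x1 + x2 <= 1,
# whose KKT point is x* = (0, 1) with multiplier lam* = 2.
f_grad = lambda x: 2.0 * (x - np.array([1.0, 2.0]))
c = lambda x: x[0] + x[1] - 1.0
c_grad = lambda x: np.array([1.0, 1.0])
x_star, lam_star = proximal_augmented_lagrangian(f_grad, c, c_grad, np.zeros(2))
```

The inner solver here is plain accelerated gradient descent standing in for the paper's inexact accelerated proximal gradient scheme; in the minimax setting the inner gradients would additionally be inexact because the auxiliary maximization over the dual variable is itself solved only approximately.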
For more information, please visit the Math Frontier Seminar website.