Accelerated Single-Call Methods for Constrained Min-Max Optimization

Abstract

We study first-order methods for constrained min-max optimization. Existing methods either require two gradient calls or two projections in each iteration, which may be costly in some applications. In this paper, we first show that the Optimistic Gradient (OG) method, a single-call single-projection algorithm, has an $O(1/\sqrt{T})$ convergence rate for inclusion problems with operators that satisfy the weak Minty variational inequality (MVI). Our second result is the first single-call single-projection algorithm, the Accelerated Reflected Gradient (ARG) method, that achieves the optimal $O(1/T)$ convergence rate for inclusion problems that satisfy negative comonotonicity. Both the weak MVI and negative comonotonicity are well-studied assumptions that capture a rich set of non-convex non-concave min-max optimization problems. Finally, we show that the Reflected Gradient (RG) method, another single-call single-projection algorithm, has an $O(1/\sqrt{T})$ last-iterate convergence rate for constrained convex-concave min-max optimization, answering an open problem of [Hsieh et al., 2019].
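To make the single-call single-projection structure concrete, here is a minimal sketch of the generic textbook forms of the OG and RG updates on a bilinear toy problem over a box. This is an illustration under assumed placeholder choices (the operator `F`, constraint set, step size `eta`, and iteration count are all assumptions for the example), not the paper's exact algorithms or parameter settings.

```python
import numpy as np

def F(z):
    # Monotone operator of the bilinear toy game min_x max_y x*y
    # over the box [-1, 1]^2, whose saddle point is (0, 0).
    x, y = z
    return np.array([y, -x])

def project(z, lo=-1.0, hi=1.0):
    # Euclidean projection onto the box [lo, hi]^2.
    return np.clip(z, lo, hi)

def optimistic_gradient(z0, eta=0.1, T=1000):
    # Optimistic Gradient (OG): one operator call and one projection
    # per iteration, reusing the previous operator evaluation:
    #   z_{t+1} = Proj( z_t - eta * (2 F(z_t) - F(z_{t-1})) )
    z = z0.copy()
    g_prev = F(z)
    for _ in range(T):
        g = F(z)                                  # single call per step
        z = project(z - eta * (2 * g - g_prev))   # single projection
        g_prev = g
    return z

def reflected_gradient(z0, eta=0.1, T=1000):
    # Reflected Gradient (RG): evaluate F at the reflected point:
    #   z_{t+1} = Proj( z_t - eta * F(2 z_t - z_{t-1}) )
    z_prev, z = z0.copy(), z0.copy()
    for _ in range(T):
        z_next = project(z - eta * F(2 * z - z_prev))
        z_prev, z = z, z_next
    return z

z0 = np.array([0.8, -0.6])
print("OG last iterate:", optimistic_gradient(z0))  # approaches (0, 0)
print("RG last iterate:", reflected_gradient(z0))   # approaches (0, 0)
```

Note how each loop body makes exactly one call to `F` and one call to `project`, in contrast to extragradient-style methods, which need two of each per iteration.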

Publication
The 11th International Conference on Learning Representations (ICLR 2023)
Yang Cai