Zheng, T., Zhu, L., So, A. M.-C., Blanchet, J., & Li, J. (2022). Universal Gradient Descent Ascent Method for Nonconvex-Nonconcave Minimax Optimization. arXiv:2212.12978.


Abstract

Nonconvex-nonconcave minimax optimization has received intense attention over the last decade due to its broad applications in machine learning. Most existing algorithms rely on one-sided information, such as the convexity (resp. concavity) of the primal (resp. dual) functions, or other specific structures, such as the Polyak-Łojasiewicz (PŁ) and Kurdyka-Łojasiewicz (KŁ) conditions. However, verifying these regularity conditions is challenging in practice. To meet this challenge, we propose a novel universally applicable single-loop algorithm, the doubly smoothed gradient descent ascent method (DS-GDA), which naturally balances the primal and dual updates. That is, DS-GDA with the same hyperparameters is able to uniformly solve nonconvex-concave, convex-nonconcave, and nonconvex-nonconcave problems with one-sided KŁ properties, achieving convergence with $\mathcal{O}(\epsilon^{-4})$ complexity. Sharper (even optimal) iteration complexity can be obtained when the KŁ exponent is known. Specifically, under the one-sided KŁ condition with exponent $\theta$, DS-GDA converges with an iteration complexity of $\mathcal{O}(\epsilon^{-2\max\{2\theta, 1\}})$. These rates all match the corresponding best results in the literature. Moreover, we show that DS-GDA is practically applicable to general nonconvex-nonconcave problems even without any regularity conditions, such as the PŁ condition, KŁ condition, or weak Minty variational inequality condition. For various challenging nonconvex-nonconcave examples in the literature, including *Forsaken*, *Bilinearly-coupled minimax*, *Sixth-order polynomial*, and *PolarGame*, the proposed DS-GDA can all get …
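
To make the "doubly smoothed" update structure concrete, below is a minimal sketch of a DS-GDA-style loop in plain Python. It is not the authors' implementation: the toy objective f(x, y) = x*y, the smoothing weights r1 and r2, the step sizes, and the anchor averaging rates are all illustrative assumptions, chosen only to show how both the primal and the dual update are regularized toward slowly moving anchor variables.

```python
# A minimal doubly smoothed GDA-style loop on a toy bilinear objective f(x, y) = x * y.
# All hyperparameters (r1, r2, step sizes, averaging rates) and the objective itself are
# illustrative assumptions, not values taken from the paper.

def grad_x(x, y):  # ∂f/∂x for f(x, y) = x * y
    return y

def grad_y(x, y):  # ∂f/∂y for f(x, y) = x * y
    return x

r1, r2 = 1.0, 1.0          # primal/dual smoothing weights (assumed)
eta_x, eta_y = 0.05, 0.05  # step sizes (assumed)
beta1, beta2 = 0.9, 0.9    # anchor averaging rates (assumed)

x, y = 1.0, 1.0            # primal and dual iterates
z, v = x, y                # auxiliary anchors that regularize each side

for _ in range(5000):
    # Descent step on x for the doubly regularized surrogate
    #   F(x, y; z, v) = f(x, y) + (r1/2) * (x - z)**2 - (r2/2) * (y - v)**2
    x = x - eta_x * (grad_x(x, y) + r1 * (x - z))
    # Ascent step on y for the same surrogate
    y = y + eta_y * (grad_y(x, y) - r2 * (y - v))
    # Exponential averaging slowly pulls the anchors toward the current iterates,
    # which is what keeps the primal and dual sides balanced in this sketch
    z = beta1 * z + (1.0 - beta1) * x
    v = beta2 * v + (1.0 - beta2) * y

print(f"x = {x:.4f}, y = {y:.4f}")  # (0, 0) is the unique stationary point of x * y
```

On this toy problem, plain simultaneous GDA spirals away from the origin, whereas the doubly regularized iterates in the sketch settle near (0, 0); this is only meant to illustrate the kind of primal-dual balancing the abstract describes, not to reproduce the paper's guarantees.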

Authors
Taoli Zheng, Linglingzhi Zhu, Anthony Man-Cho So, José Blanchet, Jiajin Li
Publication date
2024/2/13
Journal
Advances in Neural Information Processing Systems
Volume
36