Afrooz Jalilzadeh, Uday Shanbhag, Jose Blanchet, Peter W. Glynn (2022) Smoothed Variable Sample-Size Accelerated Proximal Methods for Nonsmooth Stochastic Convex Programs. Stochastic Systems 12(4):373-410. https://doi.org/10.1287/stsy.2022.0095

View Publication

Abstract

We consider the unconstrained minimization of the function F, where F = f + g, f is an expectation-valued nonsmooth convex or strongly convex function, and g is a closed, convex, and proper function. (I) Strongly convex f. When f is μ-strongly convex in x, traditional stochastic subgradient (SSG) schemes often display poor behavior, arising in part from noisy subgradients and diminishing steplengths. Instead, we apply a variable sample-size accelerated proximal scheme (VS-APM) on F_η, the Moreau envelope of F; we term such a scheme (mVS-APM), and in contrast with (SSG) schemes, (mVS-APM) utilizes constant steplengths and increasingly exact gradients. We consider two settings. (a) Bounded domains. In this setting, (mVS-APM) displays linear convergence in the inexact gradient steps, each of which requires utilizing an inner stochastic subgradient scheme. Specifically, (mVS-APM) achieves an optimal oracle complexity in inner stochastic subgradient steps of O(1/ε) with an iteration complexity of O(log(1/ε)) in inexact (outer …
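The Moreau envelope referenced above is the smoothed function F_η(x) = min_y { F(y) + (1/(2η))‖x − y‖² }, whose gradient steps the outer scheme approximates. The ingredients the abstract describes for (VS-APM) — an accelerated proximal update, a constant steplength, and gradient estimates averaged over a growing sample size — can be sketched in a small toy example. The snippet below is only an illustrative sketch: the quadratic f, the ℓ1 choice of g, the geometric batch schedule N_k, and the helper names (vs_apm, prox_l1, noisy_grad) are assumptions for the demo, not the authors' (mVS-APM) applied to the Moreau envelope.

```python
# Illustrative variable sample-size accelerated proximal loop (not the paper's exact scheme).
import numpy as np

def prox_l1(x, lam):
    """Proximal operator of g(x) = lam * ||x||_1 (soft-thresholding)."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def noisy_grad(x, A, b, rng, sigma=1.0):
    """One stochastic gradient of f(x) = 0.5 * ||Ax - b||^2 with additive noise."""
    return A.T @ (A @ x - b) + sigma * rng.standard_normal(x.shape)

def vs_apm(A, b, lam=0.1, iters=30, rng=None):
    """Accelerated proximal gradient with a growing sample size N_k,
    so gradient estimates become increasingly exact while the steplength stays constant."""
    rng = rng or np.random.default_rng(0)
    n = A.shape[1]
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad f
    step = 1.0 / L                           # constant steplength
    x = y = np.zeros(n)
    t = 1.0
    for k in range(iters):
        N_k = min(2 ** k, 4096)              # geometric batch schedule (capped for the demo)
        g_est = np.mean([noisy_grad(y, A, b, rng) for _ in range(N_k)], axis=0)
        x_new = prox_l1(y - step * g_est, step * lam)    # inexact proximal-gradient step
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t ** 2))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)    # Nesterov extrapolation
        x, t = x_new, t_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    A = rng.standard_normal((40, 20))
    x_true = np.zeros(20); x_true[:3] = [1.0, -2.0, 0.5]
    b = A @ x_true
    print(vs_apm(A, b, lam=0.1, iters=15)[:5])
```

As in the abstract's description, the averaging over N_k samples plays the role of "increasingly exact gradients": early iterations are cheap and noisy, while later iterations approach deterministic accelerated proximal steps without ever shrinking the steplength.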

Authors
Afrooz Jalilzadeh, Uday Shanbhag, Jose Blanchet, Peter W. Glynn
Publication date
2022/12
Journal
Stochastic Systems
Volume
12
Issue
4
Pages
373-410
Publisher
INFORMS