Blanchet, J., Goldfarb, D., Iyengar, G., Li, F., & Zhou, C. (2017). Unbiased Simulation for Optimizing Stochastic Function Compositions. arXiv preprint arXiv:1711.07564. https://arxiv.org/abs/1711.07564
Abstract
In this paper, we introduce an unbiased gradient simulation algorithm for solving convex optimization problems with stochastic function compositions. We show that the unbiased gradient generated by the algorithm has finite variance and finite expected computation cost. We then combine the unbiased gradient simulation with two variance-reduced algorithms (namely SVRG and SCSG) and show that the resulting optimization algorithms based on unbiased gradient simulations exhibit satisfactory convergence properties. Specifically, in the SVRG case, the algorithm with simulated gradients converges linearly to the optimum in expectation and almost surely under strong convexity. Finally, in numerical experiments, we apply the algorithms to two important cases of stochastic function composition optimization: maximizing Cox's partial likelihood and training conditional random fields.
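To make the SVRG-based variant concrete, the sketch below shows a generic SVRG loop in which the per-sample gradient oracle is treated as a black box; in the paper, that role would be played by the unbiased simulated gradient of the stochastic composition, which is not reproduced here. All names (svrg, grad_i) and the toy least-squares usage are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def svrg(grad_i, w0, n, step=0.01, epochs=20, inner=None, seed=0):
    """Minimal SVRG sketch: grad_i(w, i) returns a stochastic gradient for
    sample i; in the paper this would be an unbiased simulated gradient."""
    rng = np.random.default_rng(seed)
    inner = inner or 2 * n
    w = w0.copy()
    for _ in range(epochs):
        w_snap = w.copy()
        # Full (or batch-averaged) gradient at the snapshot point.
        mu = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
        for _ in range(inner):
            i = rng.integers(n)
            # Variance-reduced update: stochastic gradient corrected by the snapshot.
            w -= step * (grad_i(w, i) - grad_i(w_snap, i) + mu)
    return w

if __name__ == "__main__":
    # Toy least-squares usage (assumed example, not from the paper).
    rng = np.random.default_rng(1)
    A, b = rng.normal(size=(100, 5)), rng.normal(size=100)
    g = lambda w, i: 2 * A[i] * (A[i] @ w - b[i])
    w_hat = svrg(g, np.zeros(5), n=100)
    print("residual norm:", np.linalg.norm(A @ w_hat - b))
```

The design choice SVRG exploits is that the correction term `grad_i(w, i) - grad_i(w_snap, i) + mu` remains an unbiased estimate of the full gradient while its variance shrinks as the iterate approaches the snapshot, which is what underlies the linear convergence claim in the strongly convex case.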
Authors
Jose Blanchet, Donald Goldfarb, Garud Iyengar, Fengpei Li, Chaoxu Zhou
Publication date
2017/11/20
Journal
arXiv preprint arXiv:1711.07564