Nguyen, V. A., Zhang, X., Blanchet, J., & Georghiou, A. (2020). Distributionally Robust Parametric Maximum Likelihood Estimation. arXiv:2010.05321
Abstract
We consider the parameter estimation problem of a probabilistic generative model prescribed using a natural exponential family of distributions. For this problem, the typical maximum likelihood estimator tends to overfit when the training sample size is limited, is sensitive to noise, and may perform poorly on downstream predictive tasks. To mitigate these issues, we propose a distributionally robust maximum likelihood estimator that minimizes the worst-case expected log-loss uniformly over a parametric Kullback-Leibler ball around a parametric nominal distribution. Leveraging the analytical expression of the Kullback-Leibler divergence between two distributions in the same natural exponential family, we show that the min-max estimation problem is tractable in a broad setting, including the robust training of generalized linear models. Our novel robust estimator also enjoys statistical consistency and delivers promising empirical results in both regression and classification tasks.
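The tractability claim hinges on the KL divergence between two members of the same natural exponential family admitting a closed form. As a minimal illustrative sketch (not the authors' code), the Gaussian case below evaluates the closed-form divergence and cross-checks it against a direct numerical integration of the defining integral:

```python
import math

def kl_gaussian(mu1, s1, mu2, s2):
    """Closed-form KL( N(mu1, s1^2) || N(mu2, s2^2) ).

    Both distributions belong to the same natural exponential
    family (Gaussians), so the divergence is analytic in the
    parameters -- the property the paper exploits.
    """
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def kl_numeric(mu1, s1, mu2, s2, lo=-30.0, hi=30.0, n=50_000):
    """Midpoint-rule check of the same divergence from the densities."""
    def logpdf(x, mu, s):
        return -0.5 * math.log(2 * math.pi * s**2) - (x - mu)**2 / (2 * s**2)
    h = (hi - lo) / n
    total = 0.0
    for i in range(n):
        x = lo + (i + 0.5) * h
        p = math.exp(logpdf(x, mu1, s1))
        total += p * (logpdf(x, mu1, s1) - logpdf(x, mu2, s2)) * h
    return total

closed = kl_gaussian(0.0, 1.0, 1.0, 2.0)   # log 2 + 1/4 - 1/2
numeric = kl_numeric(0.0, 1.0, 1.0, 2.0)
print(closed, numeric)  # the two values agree closely
```

Because the divergence is an explicit function of the natural parameters rather than an intractable integral, constraining the adversary to a KL ball within the same family yields a finite-dimensional, well-behaved inner maximization.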
Authors
Viet Anh Nguyen, Xuhui Zhang, Jose Blanchet, Angelos Georghiou
Publication date
2020
Journal
Advances in Neural Information Processing Systems
Volume
33
Pages
7922-7932