Lotidis, K., Bambos, N., Blanchet, J. & Li, J. (2023). Wasserstein Distributionally Robust Linear-Quadratic Estimation under Martingale Constraints. Proceedings of The 26th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 206:8629-8644. Available from https://proceedings.mlr.press/v206/lotidis23a.html.

Abstract

We focus on robust estimation of the unobserved state of a discrete-time stochastic system with linear dynamics. A standard analysis of this estimation problem assumes a baseline innovation model; with Gaussian innovations we recover the Kalman filter. However, in many settings, the data are insufficient or too corrupted to validate the baseline model. To cope with this problem, we minimize the worst-case mean-squared estimation error over adversarial models chosen within a Wasserstein neighborhood of the baseline. We also constrain the adversarial innovations to form a martingale difference sequence. The martingale constraint relaxes the i.i.d. assumptions often imposed on the baseline model. Moreover, we show that the martingale constraints guarantee that the adversarial dynamics remain adapted to the natural time-generated information. Therefore, adding the martingale constraint allows us to improve on over-conservative policies that also protect against unrealistic omniscient adversaries. We establish a strong duality result, which we use to develop an efficient subgradient method for computing the distributionally robust estimation policy. If the baseline innovations are Gaussian, we show that the worst-case adversary remains Gaussian. Our numerical experiments indicate that the martingale constraint may also add a layer of robustness to the choice of the adversarial power.
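To give a flavor of the worst-case analysis described above, the following is a minimal toy sketch (not the paper's algorithm, and ignoring dynamics and martingale constraints): a scalar static estimation problem with a Gaussian baseline noise model, where the adversary may perturb the noise within a Wasserstein-2 ball. For zero-mean one-dimensional Gaussians, the W2 distance equals the difference of standard deviations, so the worst-case noise std is simply the baseline std plus the radius, and a subgradient-style descent on the estimator gain recovers the closed-form robust gain. All variable names and parameter values here are illustrative assumptions.

```python
# Toy sketch (illustrative only): robust linear estimation of a scalar state
# x ~ N(0, p) observed as y = x + v. Baseline noise v ~ N(0, sigma^2); the
# adversary may replace it by any zero-mean Gaussian within Wasserstein-2
# radius rho. For zero-mean 1-D Gaussians, W2(N(0, s1^2), N(0, s2^2)) =
# |s1 - s2|, so the worst-case noise std is sigma + rho.
p, sigma, rho = 1.0, 0.5, 0.3   # assumed toy parameter values
worst_std = sigma + rho

def worst_case_mse(k):
    # MSE of the linear estimator xhat = k * y under the worst-case noise:
    # bias term (1 - k)^2 * p plus noise term k^2 * worst_std^2.
    return (1.0 - k) ** 2 * p + k ** 2 * worst_std ** 2

# Gradient descent on the estimator gain k (the objective is smooth here,
# so the subgradient is just the gradient).
k = 0.0
for _ in range(2000):
    grad = -2.0 * (1.0 - k) * p + 2.0 * k * worst_std ** 2
    k -= 0.05 * grad

# Closed-form robust gain for comparison: shrink toward the prior more
# aggressively than the nominal Kalman gain p / (p + sigma^2) would.
k_closed = p / (p + worst_std ** 2)
print(k, k_closed)  # the iterate converges to the closed-form gain
```

Note how the robust gain uses the inflated noise variance `(sigma + rho)**2` in place of the baseline `sigma**2`, which is the sense in which the robust estimator hedges against the adversary; the dynamic, martingale-constrained setting of the paper is substantially richer than this static sketch.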

Authors
Kyriakos Lotidis, Nicholas Bambos, Jose Blanchet, Jiajin Li
Publication date
2023/4/11
Conference
International Conference on Artificial Intelligence and Statistics
Pages
8629-8644
Publisher
PMLR