Systems | Information | Learning | Optimization

Statistical Aspects of Wasserstein Distributionally Robust Optimization Estimators

Wasserstein-based distributionally robust optimization problems are formulated as min-max games in which a statistician chooses a parameter to minimize an expected loss against an adversary (say, nature) that wishes to maximize the loss by choosing an appropriate probability model within a certain non-parametric class. Recently, these formulations have been studied in the setting in which the non-parametric class chosen by nature is defined as a Wasserstein-distance neighborhood around the empirical measure. It turns out that by appropriately choosing the loss and the geometry of the Wasserstein distance, one can recover a wide range of classical statistical estimators (including the Lasso, graphical Lasso, SVM, and group Lasso, among many others). This talk studies several rich statistical quantities associated with these problems; for example, the optimal (in a certain sense) choice of the adversarial perturbation, weak convergence of natural confidence regions associated with these formulations, and asymptotic normality of the DRO estimators. (This talk is based on joint work with Y. Kang, K. Murthy, and N. Si.)
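
For concreteness, a minimal sketch of the min-max formulation described in the abstract, written in notation assumed here rather than taken from the talk itself:

\[
\min_{\theta \in \Theta} \;\; \sup_{P \,:\, \mathcal{W}_c(P, P_n) \le \delta} \; \mathbb{E}_{P}\!\left[\, \ell(X; \theta) \,\right],
\]

where \(P_n\) denotes the empirical measure of the data, \(\mathcal{W}_c\) is the Wasserstein distance induced by a transport cost \(c\), \(\delta\) is the radius of the adversary's neighborhood, and \(\ell\) is the statistician's loss. The recovery phenomenon mentioned above corresponds, for instance, to choosing a linear-regression loss together with a suitable norm-based cost, in which case the DRO problem is known to reduce to a square-root-Lasso-type regularized estimator.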
February 24 @ 12:30 pm (1h)

Remote

Jose Blanchet
