Distributionally Robust Optimization and Robust Statistics


January 29, 2024 | Jose Blanchet, Jiajin Li, Sirui Lin, Xuhui Zhang
This paper reviews Distributionally Robust Optimization (DRO) and its relationship to robust statistics. DRO is a principled approach for constructing statistical estimators that hedge against the impact of deviations in the expected loss between the training and deployment environments, and many well-known estimators in statistics and machine learning are distributionally robust in a precise sense. The paper aims to bridge the gap between classical results and their DRO-equivalent formulations, and to clarify the difference between DRO and classical statistical robustness: DRO estimators are pessimistic against an adversarial environment, leading to a min-max type formulation, whereas classical robust statistics estimators are optimistic, leading to a min-min type formulation.

The review covers several families of DRO formulations, including φ-divergence-based, optimal transport-based, and integral probability metric-based DRO, together with the statistical properties of DRO estimators: the selection of the radius δ_n, asymptotic normality, finite-sample guarantees, and optimality. It also examines the tractability of DRO and its application in the Bayesian framework, and discusses robust statistics with a focus on its connection to Rockafellian relaxations. The paper concludes with trending topics in DRO, including dynamic decision-making problems and causal inference.
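To make the min-max versus min-min contrast concrete, a generic sketch of the two formulations is given below; the notation (loss ℓ, parameter set Θ, empirical measure P_n, discrepancy D, and radius δ_n) is chosen here for illustration and is not quoted from the paper.

\[
  \hat{\theta}^{\mathrm{DRO}}_n \in \arg\min_{\theta \in \Theta}\; \sup_{P:\, D(P, P_n) \le \delta_n} \mathbb{E}_{P}\big[\ell(\theta, X)\big] \qquad \text{(pessimistic, min-max)}
\]
\[
  \hat{\theta}^{\mathrm{RS}}_n \in \arg\min_{\theta \in \Theta}\; \inf_{P:\, D(P, P_n) \le \delta_n} \mathbb{E}_{P}\big[\ell(\theta, X)\big] \qquad \text{(optimistic, min-min)}
\]

The DRO estimator protects against the worst-case distribution in a neighborhood of the empirical measure, while the min-min counterpart, in the spirit of classical robust statistics and Rockafellian relaxations, fits the model to the most favorable distribution in that neighborhood (for instance, one that downweights apparent outliers).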