Robust and Efficient Estimation by Minimising a Density Power Divergence

AYANENDRANATH BASU, IAN R. HARRIS, NILS L. HJORT, M. C. JONES
This paper introduces a new family of density-based divergence measures, called density power divergences, for robust parameter estimation and model fitting. Minimising a density power divergence avoids the nonparametric density estimation required by existing minimum divergence methods such as minimum Hellinger distance estimation. The family is indexed by a tuning parameter α ≥ 0 that controls the trade-off between robustness and efficiency. At α = 0 the divergence reduces to the Kullback–Leibler divergence, so the method coincides with maximum likelihood estimation; at α = 1 it becomes the squared L2 distance between densities (integrated squared error), giving a strongly robust but relatively inefficient estimator. Intermediate values of α trade a small loss of efficiency for substantial gains in robustness. The paper establishes the asymptotic normality of the minimum density power divergence estimator, studies its robustness properties, and proposes a robust model selection criterion based on the same divergence. The estimator is shown to be resistant to outliers while remaining highly efficient when the model holds. Its performance is investigated in several parametric families, including the normal, where it handles outliers effectively, and the approach is extended to regression models, where it is likewise robust and efficient. The paper concludes that the density power divergence estimator provides a flexible, robust alternative to maximum likelihood estimation, with the balance between robustness and efficiency tuned through the parameter α.
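For reference, the density power divergence between a true density g and a model density f takes the form

$$
d_\alpha(g, f) = \int \Big\{ f^{1+\alpha}(z) - \Big(1 + \frac{1}{\alpha}\Big)\, g(z)\, f^{\alpha}(z) + \frac{1}{\alpha}\, g^{1+\alpha}(z) \Big\}\, dz, \qquad \alpha > 0,
$$

with $d_0(g, f)$ defined as the limit as α → 0, namely the Kullback–Leibler divergence $\int g \log(g/f)\, dz$. Since the last term does not involve the model, the minimum density power divergence estimator of θ minimises the empirical objective

$$
H_n(\theta) = \int f_\theta^{1+\alpha}(z)\, dz \;-\; \Big(1 + \frac{1}{\alpha}\Big) \frac{1}{n} \sum_{i=1}^{n} f_\theta^{\alpha}(X_i),
$$

which requires no nonparametric estimate of g.

As a concrete illustration (a minimal sketch, not code from the paper), this estimator can be computed numerically for a normal location-scale model; for N(μ, σ²) the integral term has the closed form (2πσ²)^(−α/2)(1 + α)^(−1/2). The function names, the Nelder-Mead optimiser, and the choice α = 0.5 below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Empirical DPD objective H_n(theta) for a normal model
    (the term involving g alone is dropped, as it is free of theta)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)              # reparametrise so sigma > 0
    f = norm.pdf(x, loc=mu, scale=sigma)   # model density at the data
    # Closed form of the integral of f^(1+alpha) for N(mu, sigma^2)
    integral_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    return integral_term - (1 + 1 / alpha) * np.mean(f ** alpha)

def mdpde_normal(x, alpha=0.5):
    """Minimum density power divergence estimate of (mu, sigma)."""
    start = np.array([np.median(x), np.log(np.std(x))])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    mu_hat, log_sigma_hat = res.x
    return mu_hat, np.exp(log_sigma_hat)

# Toy check: 10% gross outliers barely move the MDPDE,
# while the maximum likelihood estimates are badly distorted.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 90), rng.normal(10, 1, 10)])
print("MLE  (mean, sd):", x.mean(), x.std())
print("MDPDE alpha=0.5:", mdpde_normal(x, alpha=0.5))
```

Small α keeps the estimator close to maximum likelihood efficiency, while larger α buys more robustness at some efficiency cost; this is the trade-off the paper quantifies.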