Measuring Statistical Dependence with Hilbert-Schmidt Norms


June 2005 | Arthur Gretton, Olivier Bousquet, Alexander Smola, Bernhard Schölkopf
The paper "Measuring Statistical Dependence with Hilbert-Schmidt Norms" by Arthur Gretton, Olivier Bousquet, Alexander Smola, and Bernhard Schölkopf introduces a new method for measuring statistical dependence using the Hilbert-Schmidt norm of the cross-covariance operator in reproducing kernel Hilbert spaces (RKHSs). This method, termed the Hilbert-Schmidt Independence Criterion (HSIC), offers several advantages over previous kernel-based independence criteria: 1. **Simplicity**: The empirical estimate of HSIC is simpler than other kernel dependence tests and does not require user-defined regularization. 2. **Population Convergence**: HSIC converges to the population quantity at an exponential rate, ensuring that independence tests based on HSIC do not suffer from slow learning rates. 3. **Competitive Performance**: In the context of independent component analysis (ICA), HSIC performs competitively with other kernel-based criteria and specialized ICA methods. The authors derive the Hilbert-Schmidt norm of the cross-covariance operator and show that the empirical estimate of HSIC is equivalent to the quadratic dependence measure proposed by Achard et al. However, they provide a more detailed analysis and prove that HSIC is indeed a valid dependence criterion under all circumstances. The paper also discusses the computational efficiency of HSIC, showing that it can be computed in \(O(m^2)\) time, compared to \(O(m^3)\) for other kernel methods. Additionally, it presents an efficient approximation to the empirical HSIC using an incomplete Cholesky decomposition. Experimental results on ICA problems demonstrate that HSIC is superior to previous methods and is more robust to outliers compared to specialized ICA algorithms. The authors conclude that HSIC is a practical and effective criterion for testing independence, with simpler definition, fewer tuning parameters, and competitive performance.The paper "Measuring Statistical Dependence with Hilbert-Schmidt Norms" by Arthur Gretton, Olivier Bousquet, Alexander Smola, and Bernhard Schölkopf introduces a new method for measuring statistical dependence using the Hilbert-Schmidt norm of the cross-covariance operator in reproducing kernel Hilbert spaces (RKHSs). This method, termed the Hilbert-Schmidt Independence Criterion (HSIC), offers several advantages over previous kernel-based independence criteria: 1. **Simplicity**: The empirical estimate of HSIC is simpler than other kernel dependence tests and does not require user-defined regularization. 2. **Population Convergence**: HSIC converges to the population quantity at an exponential rate, ensuring that independence tests based on HSIC do not suffer from slow learning rates. 3. **Competitive Performance**: In the context of independent component analysis (ICA), HSIC performs competitively with other kernel-based criteria and specialized ICA methods. The authors derive the Hilbert-Schmidt norm of the cross-covariance operator and show that the empirical estimate of HSIC is equivalent to the quadratic dependence measure proposed by Achard et al. However, they provide a more detailed analysis and prove that HSIC is indeed a valid dependence criterion under all circumstances. The paper also discusses the computational efficiency of HSIC, showing that it can be computed in \(O(m^2)\) time, compared to \(O(m^3)\) for other kernel methods. Additionally, it presents an efficient approximation to the empirical HSIC using an incomplete Cholesky decomposition. 
Experimental results on ICA problems demonstrate that HSIC is superior to previous methods and more robust to outliers than specialized ICA algorithms. The authors conclude that HSIC is a practical and effective criterion for testing independence, with a simpler definition, fewer tuning parameters, and competitive performance.