A Review of Feature Selection Methods Based on Mutual Information

Jorge R. Vergara · Pablo A. Estévez
This paper provides a comprehensive review of information-theoretic feature selection methods, focusing on mutual information (MI). It defines key concepts such as feature relevance, redundancy, and complementarity, and introduces the Markov blanket. The problem of optimal feature selection is defined, and a unified theoretical framework is presented to explain the evolution and advantages of different MI-based feature selection methods. The paper also discusses open problems in the field, including the need to further develop a unifying framework, improve efficiency in high-dimensional spaces, investigate the relationship between MI and Bayes classification error, and explore the effect of finite samples on statistical criteria. The review highlights the importance of considering complementarity in feature selection and suggests future research directions.
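To make the core idea concrete, the sketch below illustrates a univariate MI filter: score each feature by its estimated mutual information with the class label and rank features by that score. This is a generic illustration of the MI-based filtering principle the review surveys, not an implementation of any specific method from the paper; the function names and the toy data are ours.

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits for discrete variables,
    computed from empirical joint and marginal frequencies."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        px = np.mean(x == xv)
        for yv in np.unique(y):
            py = np.mean(y == yv)
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:  # 0 * log(0) is taken as 0
                mi += pxy * np.log2(pxy / (px * py))
    return mi

def rank_features(X, y):
    """Univariate MI filter: score each column of X by I(X_j; Y)
    and return feature indices sorted from most to least relevant."""
    scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]
    order = np.argsort(scores)[::-1]
    return order, scores

# Toy data: feature 0 copies the label, feature 1 is independent of it,
# feature 2 is constant (irrelevant).
X = np.array([[0, 1, 5],
              [0, 0, 5],
              [1, 1, 5],
              [1, 0, 5]])
y = np.array([0, 0, 1, 1])
order, scores = rank_features(X, y)  # feature 0 ranks first (1 bit of MI)
```

Note that such univariate scoring ignores redundancy and complementarity between features, which is precisely the limitation that the multivariate criteria reviewed in the paper (and Markov-blanket-based approaches) are designed to address.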