Multi-class AdaBoost

2009 | Ji Zhu, Hui Zou, Saharon Rosset and Trevor Hastie
This paper introduces a new multi-class AdaBoost algorithm that directly extends the two-class AdaBoost algorithm, without reducing the multi-class problem to multiple two-class problems. The proposed algorithm is equivalent to a forward stagewise additive modeling algorithm that minimizes a novel exponential loss function for multi-class classification, and this loss function is shown to be Fisher-consistent. The algorithm is easy to implement and performs well in terms of misclassification error rate. The paper also provides a statistical justification for the algorithm, showing that the extra term in the exponential loss function is not artificial but follows naturally from the multi-class generalization of the exponential loss in the binary case.

The algorithm is tested on both simulated and real-world data, where it compares favorably with other multi-class boosting algorithms such as AdaBoost.MH. The paper concludes that the new algorithm is a natural extension of AdaBoost to the multi-class case and demonstrates the usefulness of the forward stagewise additive modeling view of boosting.
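The abstract does not reproduce the algorithm itself, but the procedure it describes (known in the paper as SAMME) can be sketched in a few lines. Below is an illustrative pure-Python implementation using one-dimensional decision stumps as weak learners; the function names (`fit_stump`, `samme`) and the toy weak learner are our own choices, not the paper's. The multi-class modification is the extra `log(K - 1)` term added to each learner's weight `alpha`, which relaxes the requirement on the weak learner from better-than-1/2 accuracy to merely better-than-random (accuracy above 1/K); for K = 2 the term vanishes and the classical two-class AdaBoost update is recovered.

```python
import math

def fit_stump(X, y, w, K):
    """Fit a weighted decision stump on 1-D data: pick the threshold and
    per-side class labels that minimize the weighted training error."""
    best = None  # (error, threshold, left_label, right_label)
    for t in sorted(set(X)):
        left = [i for i in range(len(X)) if X[i] <= t]
        right = [i for i in range(len(X)) if X[i] > t]

        def majority(idx):
            counts = [0.0] * K
            for i in idx:
                counts[y[i]] += w[i]
            return max(range(K), key=counts.__getitem__)

        cl = majority(left) if left else 0
        cr = majority(right) if right else 0
        err = (sum(w[i] for i in left if y[i] != cl)
               + sum(w[i] for i in right if y[i] != cr))
        if best is None or err < best[0]:
            best = (err, t, cl, cr)
    err, t, cl, cr = best
    return (lambda x, t=t, cl=cl, cr=cr: cl if x <= t else cr), err

def samme(X, y, K, M=10):
    """Multi-class AdaBoost (SAMME) with decision stumps as weak learners."""
    n = len(X)
    w = [1.0 / n] * n
    learners = []  # list of (alpha, stump) pairs
    for _ in range(M):
        stump, err = fit_stump(X, y, w, K)
        err = max(err, 1e-12)          # guard against log(0)
        if err >= 1.0 - 1.0 / K:       # no better than random guessing: stop
            break
        # The log(K - 1) term is the multi-class modification.
        alpha = math.log((1.0 - err) / err) + math.log(K - 1)
        learners.append((alpha, stump))
        # Upweight misclassified points, then renormalize.
        w = [wi * (math.exp(alpha) if stump(xi) != yi else 1.0)
             for wi, xi, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]

    def predict(x):
        votes = [0.0] * K
        for alpha, stump in learners:
            votes[stump(x)] += alpha
        return max(range(K), key=votes.__getitem__)
    return predict

# Three 1-D clusters, one per class. A single stump has only two output
# labels and so cannot separate three classes, but a few rounds of
# boosting can combine stumps into a correct three-class ensemble.
X = [0, 1, 2, 3, 10, 11, 12, 13, 20, 21, 22, 23]
y = [0, 0, 0, 0, 1, 1, 1, 1, 2, 2, 2, 2]
predict = samme(X, y, K=3)
```

The final classifier is the weighted plurality vote over the weak learners, exactly the forward stagewise additive model the abstract refers to: each round adds one term to the additive fit, chosen to reduce the multi-class exponential loss.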