Logistic Model Trees

2005 | Niels Landwehr, Mark Hall, Eibe Frank
Logistic Model Trees (LMT) combine tree induction with logistic regression for classification. The algorithm uses a stagewise fitting process, based on LogitBoost, to build logistic regression models at the leaves of a decision tree; because each boosting iteration adds a simple regression function of a single attribute, relevant attributes are selected naturally. The logistic models fitted at higher levels of the tree are incrementally refined to produce the models at lower levels, so child nodes do not start from scratch.

LMT is evaluated on 36 UCI datasets, where it outperforms several state-of-the-art methods, including C4.5, CART, standalone logistic regression, model trees, functional trees, naive Bayes trees, and LOTUS. It is competitive with boosted decision trees while producing more interpretable models, and it adapts the size of the tree to the complexity of the data.

On the practical side, nominal attributes are converted to binary indicators, missing values are imputed with the mean (for numeric attributes) or mode (for nominal ones), and the number of LogitBoost iterations is chosen by five-fold cross-validation. Using simple, single-attribute regression functions keeps the computational cost low.

The paper also compares LMT to related tree-based methods such as model trees, stepwise model tree induction, logistic regression trees, functional trees, and naive Bayes trees. Overall, LMT produces accurate and compact classifiers, making it a strong alternative to traditional tree-based methods for classification tasks.
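To make the stagewise fitting concrete, here is a minimal sketch of two-class LogitBoost with one-attribute ("simple") regression functions, roughly in the spirit of what LMT fits at each node. The function names, iteration count, and working-response clipping constant are illustrative assumptions, not the paper's reference implementation.

```python
import numpy as np

def fit_simple_regression(X, z, w):
    """Weighted least-squares fit of z on each single attribute;
    return (attr, slope, intercept) with the lowest weighted SSE."""
    best = None
    for j in range(X.shape[1]):
        x = X[:, j]
        wsum = w.sum()
        xm = (w * x).sum() / wsum          # weighted mean of attribute j
        zm = (w * z).sum() / wsum          # weighted mean of response
        var = (w * (x - xm) ** 2).sum()
        slope = 0.0 if var == 0 else (w * (x - xm) * (z - zm)).sum() / var
        intercept = zm - slope * xm
        sse = (w * (z - (slope * x + intercept)) ** 2).sum()
        if best is None or sse < best[0]:
            best = (sse, j, slope, intercept)
    return best[1], best[2], best[3]

def logitboost(X, y, n_iter=30):
    """Two-class LogitBoost (y in {0,1}) using one-attribute
    regression functions; returns a list of (attr, slope, intercept)."""
    F = np.zeros(len(y))
    model = []
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-2.0 * F))
        w = np.clip(p * (1 - p), 1e-10, None)      # case weights
        z = np.clip((y - p) / w, -4.0, 4.0)        # clipped working response
        j, a, b = fit_simple_regression(X, z, w)
        model.append((j, 0.5 * a, 0.5 * b))        # add half the fitted function
        F += 0.5 * (a * X[:, j] + b)
    return model

def predict_proba(model, X):
    """Class-1 probability from the additive committee."""
    F = np.zeros(X.shape[0])
    for j, a, b in model:
        F += a * X[:, j] + b
    return 1.0 / (1.0 + np.exp(-2.0 * F))
```

Because each iteration commits to a single attribute, attributes that never reduce the weighted squared error simply never enter the model, which is the built-in attribute selection the summary refers to.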
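The data preparation mentioned above can also be sketched in a few lines: mean/mode imputation of missing values, followed by conversion of a nominal column into binary indicators. The helper names here are hypothetical, not taken from the paper's implementation.

```python
from statistics import mode

def impute_column(col, nominal):
    """Fill None entries with the mode (nominal) or mean (numeric)."""
    observed = [v for v in col if v is not None]
    fill = mode(observed) if nominal else sum(observed) / len(observed)
    return [fill if v is None else v for v in col]

def binarize_column(col):
    """Replace a nominal column with one 0/1 indicator per category."""
    categories = sorted(set(col))
    return {c: [1.0 if v == c else 0.0 for v in col] for c in categories}
```

After this step every attribute is numeric, so the single-attribute regression functions inside LogitBoost can be fitted uniformly over the whole attribute set.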