Deep Forest

6 Jul 2020 | Zhi-Hua Zhou, Ji Feng
This paper explores the possibility of building deep models using non-differentiable modules, specifically decision trees. The authors propose the gcForest approach, which generates a "deep forest" that retains three key characteristics of deep neural networks: layer-by-layer processing, in-model feature transformation, and sufficient model complexity. gcForest is a decision tree ensemble method with fewer hyper-parameters than deep neural networks and can automatically determine its model complexity based on data. Experiments show that gcForest performs robustly across different hyper-parameter settings and datasets, even outperforming deep neural networks in some cases. The study opens the door to deep learning based on non-differentiable modules, highlighting the potential of constructing deep models without backpropagation.
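The cascade idea behind gcForest can be illustrated with a minimal sketch: each level trains a few forests, the forests' class-probability vectors are concatenated with the original features as input to the next level, and the cascade stops growing when validation accuracy no longer improves, which is how model complexity adapts to the data. This is a simplified illustration, not the authors' implementation: it omits gcForest's multi-grained scanning and the k-fold cross-validation used to generate the augmented class vectors, and the estimator settings are arbitrary.

```python
# Simplified gcForest-style cascade (illustrative sketch only; omits
# multi-grained scanning and the paper's k-fold augmented-feature scheme).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split


def train_cascade(X_tr, y_tr, X_va, y_va, max_levels=5):
    """Grow cascade levels until validation accuracy stops improving."""
    levels, best_acc = [], 0.0
    A_tr, A_va = X_tr, X_va  # augmented feature matrices at the current level
    for _ in range(max_levels):
        forests = [
            RandomForestClassifier(n_estimators=50, random_state=0),
            ExtraTreesClassifier(n_estimators=50, random_state=0),
        ]
        for f in forests:
            f.fit(A_tr, y_tr)
        # Averaged class vectors give this level's prediction.
        proba_va = sum(f.predict_proba(A_va) for f in forests)
        acc = accuracy_score(y_va, np.argmax(proba_va, axis=1))
        if acc <= best_acc:  # depth is determined automatically from data
            break
        best_acc = acc
        levels.append(forests)
        # In-model feature transformation: class vectors become new features,
        # concatenated with the original input for the next level.
        A_tr = np.hstack([X_tr] + [f.predict_proba(A_tr) for f in forests])
        A_va = np.hstack([X_va] + [f.predict_proba(A_va) for f in forests])
    return levels, best_acc


X, y = make_classification(n_samples=300, n_features=20, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, test_size=0.5, random_state=0)
levels, acc = train_cascade(X_tr, y_tr, X_va, y_va)
```

Note that, unlike a neural network, no gradient flows between levels: each level is trained independently on the features produced by the previous one, which is what makes the whole model non-differentiable yet still deep.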