30 January 2024 | Bo Liu, Shusen Wei, Fan Zhang, Nawei Guo, Hongyu Fan and Wei Yao
This paper proposes a novel multi-task distillation learning (MTDL) framework for tomato leaf disease diagnosis. The framework aims to improve the performance of disease classification and severity prediction while reducing model complexity. The MTDL framework consists of three main components: knowledge disentanglement, mutual knowledge transfer, and knowledge integration. The framework employs a multi-stage learning strategy to leverage the complementary nature of classification and severity prediction tasks.
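One way to picture the multi-stage strategy is as a stage-dependent combination of the two task losses and a distillation term. The sketch below is illustrative only: the stage names, weights, and loss composition are hypothetical stand-ins, not the paper's actual schedule.

```python
def multitask_loss(cls_loss, sev_loss, distill_loss, stage):
    """Combine classification, severity, and distillation losses by stage.

    Stages (hypothetical names mirroring the framework's three components):
      - "disentangle": learn each task's knowledge separately
      - "transfer":    add a mutual knowledge-transfer (distillation) term
      - "integrate":   joint fine-tuning with distillation emphasized
    """
    if stage == "disentangle":
        return cls_loss + sev_loss
    elif stage == "transfer":
        return cls_loss + sev_loss + 0.5 * distill_loss
    elif stage == "integrate":
        return 0.5 * (cls_loss + sev_loss) + distill_loss
    raise ValueError(f"unknown stage: {stage}")
```

In practice each stage would run for a number of epochs before moving to the next, so the shared backbone first acquires task-specific knowledge and only later exchanges it between the classification and severity heads.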
The dataset used in this study is aggregated from three distinct sources: the AI Challenger 2018 Crop Leaf Disease Challenge, the PlantDoc dataset, and the Taiwan Tomato Disease dataset. It covers a variety of tomato leaf diseases across a total of 61 categories and is divided into training, validation, and test sets in an 8:1:1 ratio.
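An 8:1:1 split can be produced by shuffling the sample indices and slicing. The helper below is a minimal sketch (the function name and seed are assumptions, not from the paper):

```python
import random

def split_811(samples, seed=0):
    """Shuffle samples and split into train/val/test at an 8:1:1 ratio."""
    rng = random.Random(seed)          # fixed seed for a reproducible split
    idx = list(samples)
    rng.shuffle(idx)
    n_train = int(len(idx) * 0.8)
    n_val = int(len(idx) * 0.1)
    return (idx[:n_train],
            idx[n_train:n_train + n_val],
            idx[n_train + n_val:])

train, val, test = split_811(range(1000))   # 800 / 100 / 100 samples
```

With 61 categories of varying size, a stratified split (applying the same ratio within each category) would typically be preferred so that rare diseases appear in all three sets; the paper's exact splitting procedure is not detailed here.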
The MTDL framework is evaluated against several baseline methods, including ResNet101, ResNet50, DenseNet121, VGG16, EfficientNet, ShuffleNetV2, MobileNetV3, and SqueezeNet. The results show that the MTDL framework outperforms these methods in both disease classification and severity prediction. Specifically, the MTDL-optimized EfficientNet surpasses the single-task ResNet101 by 0.68% in classification accuracy and 1.52% in severity estimation, while using only 9.46% of its parameters.
The framework also introduces a decoupled teacher-free knowledge distillation (DTF-KD) method to simplify the learning process. Rather than training a separate teacher network, it constructs a virtual teacher that guides learning with separate instructions for the correct class and the non-correct classes. The results demonstrate that the DTF-KD method significantly improves the accuracy of both the MTDL framework and its variants.
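Teacher-free distillation commonly replaces a trained teacher with a hand-crafted label distribution, and a decoupled formulation separates the loss into a target-class term and a non-target-class term. The sketch below illustrates that idea with plain KL divergences; the function names, weights, and exact formulation are assumptions and may differ from the paper's method.

```python
import math

def virtual_teacher(n_classes, correct_idx, p_correct=0.9):
    """Hand-crafted soft label: the correct class receives p_correct and the
    remaining probability mass is spread uniformly over the other classes."""
    p_rest = (1.0 - p_correct) / (n_classes - 1)
    return [p_correct if i == correct_idx else p_rest
            for i in range(n_classes)]

def decoupled_kd_loss(student_probs, teacher_probs, correct_idx,
                      alpha=1.0, beta=1.0):
    """Decoupled distillation loss (illustrative): a binary KL term on the
    correct class plus a KL term over the renormalized non-correct classes."""
    pt = teacher_probs[correct_idx]
    ps = student_probs[correct_idx]
    # Target-class term: binary KL between (pt, 1-pt) and (ps, 1-ps).
    target_kl = (pt * math.log(pt / ps)
                 + (1 - pt) * math.log((1 - pt) / (1 - ps)))
    # Non-target term: KL over the non-correct classes, renormalized to sum to 1.
    t_rest = [p / (1 - pt) for i, p in enumerate(teacher_probs) if i != correct_idx]
    s_rest = [p / (1 - ps) for i, p in enumerate(student_probs) if i != correct_idx]
    nontarget_kl = sum(t * math.log(t / s) for t, s in zip(t_rest, s_rest))
    return alpha * target_kl + beta * nontarget_kl
```

Decoupling the two terms lets the correct-class confidence and the relative ranking of non-correct classes be weighted independently, which is the sense in which the virtual teacher provides "separate instructions" for each.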
The experimental results show that the MTDL framework effectively leverages the staged learning of knowledge and the complementarity between different tasks. The framework achieves a balanced enhancement in both overall performance and category-specific outcomes. The results also indicate that the MTDL framework is effective in achieving a balance between performance and efficiency, particularly in heterogeneous settings where the teacher and student models have different architectures.
The study concludes that the proposed MTDL framework has practical potential for intelligent agriculture applications, demonstrating the effectiveness of multi-task learning in improving the performance of disease diagnosis systems.