30 January 2024 | Bo Liu, Shusen Wei, Fan Zhang, Nawei Guo, Hongyu Fan and Wei Yao
The paper presents a novel multi-task distillation learning (MTDL) framework for the comprehensive diagnosis of tomato leaf diseases. The framework addresses the challenges of symptom variation, limited labeled data, and model complexity in automated disease recognition. It employs a multi-stage strategy of knowledge disentanglement, mutual learning, and knowledge integration to exploit the complementary nature of disease classification and severity prediction. The MTDL framework is designed to adapt to disease identification scenarios with varying computational budgets and performance requirements.
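The mutual-learning stage mentioned above can be sketched as a deep-mutual-learning-style loss in which two peer networks exchange soft predictions. The function names and the exact loss form below are illustrative assumptions, not the paper's implementation:

```python
import math

def softmax(logits):
    # Numerically stable softmax over a list of raw scores.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kl_divergence(p, q):
    # KL(p || q) between two discrete probability distributions.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def mutual_learning_losses(logits_a, logits_b, target):
    # Each peer minimizes its own cross-entropy plus a KL term that pulls
    # its prediction toward the other peer's prediction, so the two
    # networks teach each other during training.
    pa, pb = softmax(logits_a), softmax(logits_b)
    loss_a = -math.log(pa[target]) + kl_divergence(pb, pa)  # b teaches a
    loss_b = -math.log(pb[target]) + kl_divergence(pa, pb)  # a teaches b
    return loss_a, loss_b
```

When the two peers agree, the KL terms vanish and each loss reduces to plain cross-entropy; disagreement adds a penalty that transfers knowledge in both directions.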
The experimental results demonstrate that the proposed framework improves performance while reducing model complexity. The MTDL-optimized EfficientNet outperforms a single-task ResNet101 by 0.68% in classification accuracy and by 1.52% in severity estimation, while using only 9.46% of its parameters. These findings highlight the practical potential of the MTDL framework for intelligent agriculture applications.
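To put the 9.46% figure in perspective, a back-of-the-envelope calculation gives the student's approximate size; the ~44.5M parameter count for ResNet101 is the commonly cited torchvision figure, an assumption not taken from the paper:

```python
# Illustrative arithmetic only; the teacher's parameter count is assumed.
resnet101_params = 44.5e6          # ~44.5M parameters (torchvision figure)
student_fraction = 0.0946          # 9.46% of the teacher's parameters
student_params = resnet101_params * student_fraction
print(f"{student_params / 1e6:.2f}M")  # → 4.21M
```

Roughly a 4.2M-parameter student matching and exceeding a 44M-parameter teacher illustrates the efficiency claim.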
The paper also introduces a decoupled teacher-free knowledge distillation (DTF-KD) method that simplifies training by reducing the reliance on pre-trained teacher models. Instead, a hand-crafted virtual teacher guides the student by providing separate instructions for the correct class and the non-correct classes, enhancing the flexibility and adaptability of the framework.
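One common way to realize such a virtual teacher, following teacher-free KD, is a hand-crafted distribution that places probability `a` on the correct class and spreads the remainder uniformly over the others, with the distillation loss decoupled into a correct-class term and a non-correct-class term. The concrete formula and the hyperparameters `a`, `alpha`, and `beta` below are assumptions for illustration, not the paper's exact method:

```python
import math

def softmax(logits):
    # Numerically stable softmax.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    s = sum(exps)
    return [e / s for e in exps]

def virtual_teacher(num_classes, target, a=0.9):
    # Hand-crafted "teacher" distribution: probability a on the correct
    # class, the remaining 1 - a split uniformly over the other classes.
    return [a if k == target else (1.0 - a) / (num_classes - 1)
            for k in range(num_classes)]

def decoupled_tf_kd_loss(student_logits, target, a=0.9, alpha=1.0, beta=1.0):
    # KL divergence from the virtual teacher to the student, split into a
    # correct-class term and a non-correct-class term so each part can be
    # weighted separately (alpha, beta are illustrative hyperparameters).
    p = softmax(student_logits)
    t = virtual_teacher(len(student_logits), target, a)
    correct = t[target] * math.log(t[target] / p[target])
    non_correct = sum(t[k] * math.log(t[k] / p[k])
                      for k in range(len(p)) if k != target)
    return alpha * correct + beta * non_correct
```

Because the two terms are weighted independently, the guidance on the correct class can be tuned separately from the regularizing signal spread over the non-correct classes, which is what "decoupled" refers to here.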
The effectiveness of the multi-stage distillation learning in the MTDL framework is validated through performance improvements across various network architectures. The trade-off between performance and efficiency is also investigated, showing that the MTDL framework can achieve significant improvements in accuracy while maintaining or reducing the number of parameters and floating-point operations (FLOPs).