19 Mar 2024 | Grzegorz Rypeś, Sebastian Cygert, Valeriya Khan, Tomasz Trzcinski, Bartosz Zielinski & Bartlomiej Twardowski
The paper introduces SEED, a novel ensemble method for class-incremental learning (CIL) that mitigates catastrophic forgetting by fine-tuning only one expert per task. SEED represents each class as a Gaussian distribution in the latent space of every expert and, for each new task, selects the expert whose class distributions overlap the least; the remaining experts stay frozen. This selective training preserves diversity and heterogeneity among the experts while maintaining high stability. Evaluated on multiple benchmarks, SEED achieves state-of-the-art performance in exemplar-free CIL scenarios, including equal task splits and task distribution shifts, outperforming prior methods in both accuracy and stability while using fewer parameters. The paper also includes an ablation study and discusses the trade-off between plasticity and stability, as well as the limitations of the method.
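The expert-selection step described above can be sketched in code. This is a hypothetical simplification, not the paper's implementation: it assumes diagonal-covariance Gaussians per class and uses symmetric KL divergence as the (dis)similarity between class distributions, choosing the expert in whose latent space the new classes overlap least (i.e., are most separated).

```python
import numpy as np

def fit_gaussian(features):
    # Fit a per-class Gaussian with diagonal covariance (simplifying assumption).
    return features.mean(axis=0), features.var(axis=0) + 1e-6

def symmetric_kl(mu_p, var_p, mu_q, var_q):
    # Symmetric KL divergence between two diagonal Gaussians.
    kl_pq = 0.5 * np.sum(np.log(var_q / var_p) + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)
    kl_qp = 0.5 * np.sum(np.log(var_p / var_q) + (var_q + (mu_q - mu_p) ** 2) / var_p - 1.0)
    return kl_pq + kl_qp

def select_expert(experts, new_task_class_features):
    """Pick the expert whose latent space separates the new task's classes best,
    i.e. whose class Gaussians overlap least (highest summed pairwise divergence).

    experts: list of callables mapping raw features -> latent features
    new_task_class_features: list of arrays, one (n_samples, dim) array per class
    """
    best_expert, best_score = None, -np.inf
    for idx, encode in enumerate(experts):
        gaussians = [fit_gaussian(encode(x)) for x in new_task_class_features]
        # Sum pairwise divergences between the new classes' Gaussians.
        score = sum(
            symmetric_kl(*gaussians[i], *gaussians[j])
            for i in range(len(gaussians))
            for j in range(i + 1, len(gaussians))
        )
        if score > best_score:
            best_expert, best_score = idx, score
    return best_expert
```

In this sketch only the selected expert would then be fine-tuned on the new task, which is how SEED limits interference with knowledge stored in the other experts.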