Elastic Feature Consolidation for Cold Start Exemplar-free Incremental Learning

2024 | Simone Magistri, Tomaso Trinci, Albin Soutif-Cormerais, Joost van de Weijer, Andrew D. Bagdanov
Elastic Feature Consolidation (EFC) is a novel approach for Exemplar-Free Class Incremental Learning (EFCIL) that addresses the challenge of learning new tasks without access to previous task data. The method introduces an Empirical Feature Matrix (EFM) to regularize feature drift in directions important for previous tasks, while allowing plasticity in other directions.

EFC also employs an Asymmetric Prototype Replay loss (PR-ACE) to balance new-task data with Gaussian prototypes, improving the trade-off between plasticity and stability. The EFM induces a pseudo-metric in feature space, which is used to estimate prototype drift and update Gaussian prototypes in an asymmetric cross-entropy loss.

Experimental results on CIFAR-100, Tiny-ImageNet, ImageNet-Subset, and ImageNet-1K show that EFC significantly outperforms state-of-the-art methods, particularly in the challenging Cold Start scenario where the first task is too small to learn a high-quality backbone. EFC achieves better performance by maintaining model plasticity while effectively mitigating forgetting. The method is evaluated in both Warm Start and Cold Start scenarios, with EFC demonstrating superior performance in the latter. The approach is also storage-efficient, as it keeps only class prototypes and a single covariance matrix from the most recent task for prototype generation. The results show that EFC reduces task-recency bias and improves classifier accuracy in incremental learning settings.
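To make the two ingredients concrete, below is a minimal PyTorch sketch of (a) the EFM-weighted drift regularizer and (b) Gaussian prototype sampling for replay. It is an illustration of the ideas described above, not the authors' implementation: the function names, the shared-covariance sampling helper, and the exact loss weighting are assumptions made for the sketch.

```python
import torch

def efm_drift_regularizer(feat_new, feat_old, efm):
    """Penalize feature drift along directions the Empirical Feature Matrix
    (EFM) marks as important for previous tasks.

    feat_new, feat_old: (batch, d) features from the current / previous backbone
    efm:                (d, d) positive semi-definite matrix (the EFM)

    The EFM induces a pseudo-metric: drift in important directions is costly,
    drift in the (near-)null directions is almost free, which preserves plasticity.
    """
    drift = feat_new - feat_old                          # (batch, d)
    quad = torch.einsum('bi,ij,bj->b', drift, efm, drift)  # drift^T E drift per sample
    return quad.mean()


def sample_gaussian_prototypes(prototypes, covariance, n_per_class):
    """Draw pseudo-features for old classes from class-mean prototypes and a
    single shared covariance (the summary notes one covariance from the most
    recent task is used for generation). Hypothetical helper for this sketch.

    prototypes: (num_old_classes, d) class means
    covariance: (d, d) shared covariance matrix
    """
    dist = torch.distributions.MultivariateNormal(prototypes, covariance_matrix=covariance)
    samples = dist.sample((n_per_class,))                # (n_per_class, C_old, d)
    labels = torch.arange(prototypes.shape[0]).repeat(n_per_class)
    return samples.reshape(-1, prototypes.shape[-1]), labels
```

A PR-ACE-style objective would then combine a cross-entropy on current-task features with a cross-entropy on the sampled prototype features, treating the two streams asymmetrically (e.g. restricting the logits used for new-task samples); the precise asymmetry and balancing follow the paper and are not reproduced here.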