Memory Aware Synapses: Learning what (not) to forget

5 Oct 2018 | Rahaf Aljundi, Francesca Babiloni, Mohamed Elhoseiny, Marcus Rohrbach, Tinne Tuytelaars
Memory Aware Synapses (MAS) is a novel approach to lifelong learning that enables models to learn what (not) to forget. Inspired by neuroplasticity, MAS computes the importance of neural network parameters in an unsupervised and online manner: it estimates the sensitivity of the network's learned output function to changes in each parameter, allowing the model to preserve important knowledge from previous tasks while adapting to new ones. The authors connect MAS to Hebb's rule, a model of learning in the brain, and demonstrate state-of-the-art performance on object recognition and triplet prediction tasks. Because the importance weights are estimated from unlabeled data, the method can specialize to the conditions it encounters at test time. MAS outperforms existing lifelong learning methods in both forgetting reduction and memory efficiency, and remains effective on short and long task sequences, with minimal forgetting even in complex scenarios. The method is memory-efficient and can be applied to any pretrained network, making it a versatile solution for continual learning.
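The importance estimate described above can be sketched in a few lines. The snippet below is a minimal illustration, not the authors' implementation: it assumes a single linear layer f(x) = Wx so the gradient of the squared L2 norm of the output can be written analytically, and the function names (`mas_importance`, `mas_penalty`) are hypothetical. MAS defines a parameter's importance as the average magnitude of the gradient of ||f(x)||² with respect to that parameter over (possibly unlabeled) inputs, and penalizes changes to important parameters when learning a new task.

```python
import numpy as np

def mas_importance(W, X):
    """Accumulate per-weight importance Omega over input samples X (rows).

    For a linear map f(x) = W @ x, the gradient of ||f(x)||^2 with
    respect to W is 2 * f(x) x^T; MAS averages its magnitude over samples.
    """
    omega = np.zeros_like(W)
    for x in X:
        out = W @ x                        # network output f(x)
        grad = 2.0 * np.outer(out, x)      # d ||f(x)||^2 / dW
        omega += np.abs(grad)              # gradient magnitude, as in MAS
    return omega / len(X)

def mas_penalty(W, W_old, omega, lam=1.0):
    """Regularizer added to the new task's loss: penalize moving
    important weights away from their values after the previous task."""
    return lam * np.sum(omega * (W - W_old) ** 2)

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 4))            # toy "pretrained" weights
X = rng.standard_normal((10, 4))           # unlabeled samples
omega = mas_importance(W, X)
penalty = mas_penalty(W, W.copy(), omega)  # zero when weights are unchanged
print(omega.shape, penalty)
```

Because the estimate needs only the network's outputs (no labels or stored task data), the same routine can be rerun on whatever unlabeled data is available at deployment, which is what lets MAS adapt its importance weights to specific test conditions.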