Transfer without Forgetting
October 2022
Continual Learning · Distillation · Transfer Learning
The study explores the connection between Continual Learning and Transfer Learning, highlighting how catastrophic forgetting limits the benefit of network pretraining. To address this, we introduce Transfer without Forgetting (TwF), a method that relies on a fixed pretrained sibling network and a layer-wise loss to retain knowledge from the source domain. Experimental results show that TwF consistently outperforms other CL methods in Class-Incremental accuracy across different datasets and buffer sizes.
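To make the idea of a fixed pretrained sibling and a layer-wise retention loss concrete, below is a minimal PyTorch-style sketch. The backbone, the chosen layer indices, and the weighting coefficient `alpha` are illustrative assumptions, and the sketch reduces the layer-wise term to a plain feature-matching loss; the actual TwF objective described in the paper includes further components not shown here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def make_backbone():
    # Stand-in convolutional backbone; any feature extractor with
    # accessible intermediate activations would work the same way.
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 10),
    )


class TwFLikeLearner(nn.Module):
    """Illustrative sketch: trainable learner + frozen pretrained sibling."""

    def __init__(self, alpha: float = 0.5):
        super().__init__()
        self.model = make_backbone()            # continually trained network
        self.sibling = make_backbone()          # fixed copy of the pretrained weights
        self.sibling.load_state_dict(self.model.state_dict())
        for p in self.sibling.parameters():     # the sibling is never updated
            p.requires_grad_(False)
        self.alpha = alpha                      # assumed weighting of the layer-wise term
        self.layer_ids = [1, 3]                 # intermediate activations to match

    def _features(self, net, x):
        feats, h = [], x
        for i, layer in enumerate(net):
            h = layer(h)
            if i in self.layer_ids:
                feats.append(h)
        return feats, h

    def loss(self, x, y):
        feats, logits = self._features(self.model, x)
        with torch.no_grad():
            sib_feats, _ = self._features(self.sibling, x)
        task_loss = F.cross_entropy(logits, y)
        # Layer-wise retention term: keep the current features close to
        # those of the frozen pretrained sibling on the same input.
        distill = sum(F.mse_loss(f, sf) for f, sf in zip(feats, sib_feats))
        return task_loss + self.alpha * distill
```

As a usage note, `loss(x, y)` would simply be minimized on each incoming batch (and, in a rehearsal setting, on buffer samples as well), so the gradient combines the new-task objective with the pressure to stay close to the source-domain features.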