
Continual Learning with Knowledge Distillation: A Survey
  • Songze Li,
  • Tonghua Su,
  • Xuyao Zhang,
  • Zhongjie Wang

Abstract

The foremost challenge in continual learning is devising strategies that alleviate catastrophic forgetting, preserving a model's memory of prior knowledge while it learns new tasks. Knowledge distillation, a form of data regularization, is attracting increasing attention in continual learning because it constrains the model's outputs on previous tasks to emulate those of old-task models while new tasks are learned, thereby mitigating forgetting. This paper offers a comprehensive survey of continual learning methods that employ knowledge distillation for image classification. We inductively categorize these methods according to the source of the distilled knowledge and analyze their distillation schemes in detail. Furthermore, because continual learning characteristically cannot access historical data, we introduce a novel taxonomy of continual learning approaches from the perspective of auxiliary data usage. In addition, we conduct extensive experiments on CIFAR-100, TinyImageNet, and ImageNet-100 across nine continual learning methods that integrate knowledge distillation, analyzing in depth how knowledge distillation alleviates forgetting in different continual learning scenarios. Our experimental evidence demonstrates that knowledge distillation does reduce forgetting in most scenarios.
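
To make the mechanism concrete, the following is a minimal PyTorch sketch of the logit-distillation term used by methods in the Learning-without-Forgetting family: a frozen copy of the previous-task model supervises the current model on the old classes while new-task data are learned. The function names, the temperature of 2.0, and the alpha weight are illustrative assumptions, not the formulation of any specific method surveyed here.

import torch
import torch.nn.functional as F

def distillation_loss(new_logits_old_classes, old_logits, temperature=2.0):
    # Soften both output distributions with the temperature and penalize their
    # divergence, scaled by T^2 as in standard knowledge distillation.
    log_p_new = F.log_softmax(new_logits_old_classes / temperature, dim=1)
    p_old = F.softmax(old_logits / temperature, dim=1)
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * temperature ** 2

def continual_step(model, old_model, x, y, num_old_classes, alpha=1.0, temperature=2.0):
    # One training step: cross-entropy on the new task (plasticity) plus
    # distillation toward the frozen old-task model's outputs (stability).
    logits = model(x)                      # current model, heads for all classes seen so far
    with torch.no_grad():
        old_logits = old_model(x)          # frozen, eval-mode copy kept from the previous task
    ce = F.cross_entropy(logits, y)
    kd = distillation_loss(logits[:, :num_old_classes], old_logits, temperature)
    return ce + alpha * kd

The alpha weight trades off stability against plasticity; many surveyed methods set it adaptively, for example in proportion to the number of old classes.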
29 Dec 2023: Submitted to TechRxiv
02 Jan 2024: Published in TechRxiv