Although the concept of catastrophic forgetting is straightforward, its causes have received little systematic study. In this paper, we systematically explore and reveal three causes of catastrophic forgetting in Class Incremental Learning (CIL). From the perspective of representation learning, (i) intra-phase forgetting happens when the learner fails to correctly align same-phase data as training proceeds, and (ii) inter-phase confusion happens when the learner confuses current-phase data with previous-phase data. From the task-specific point of view, the CIL model suffers from the problem of (iii) classifier deviation. After investigating existing strategies, we observe that there is a lack of study on how to prevent inter-phase confusion. To initiate research on this specific issue, we propose a simple yet effective framework, Contrastive Class Concentration for CIL (C4IL). Our framework leverages the class concentration effect of contrastive learning, yielding a representation distribution with better intra-class compactness and inter-class separability. Empirically, we observe that C4IL significantly lowers the probability of inter-phase confusion and, as a result, improves performance across multiple CIL settings on multiple datasets.
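The class concentration effect mentioned above can be illustrated with a supervised contrastive objective: same-class embeddings are pulled together and different-class embeddings pushed apart, which directly targets intra-class compactness and inter-class separability. The following is a minimal dependency-free sketch of such a loss; the function name `sup_con_loss` and its exact form are illustrative assumptions, not the paper's actual C4IL objective.

```python
import math

def sup_con_loss(embeddings, labels, temperature=0.1):
    """Illustrative supervised contrastive loss (not the paper's exact
    C4IL objective): for each anchor, same-class samples are positives
    and all other samples form the normalizing denominator."""
    # L2-normalize so that dot products become cosine similarities.
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    z = [normalize(v) for v in embeddings]
    n = len(z)
    # Pairwise cosine similarities, scaled by the temperature.
    sim = [[sum(a * b for a, b in zip(z[i], z[j])) / temperature
            for j in range(n)] for i in range(n)]

    loss, count = 0.0, 0
    for i in range(n):
        # Positives: other samples sharing the anchor's class label.
        pos = [j for j in range(n) if j != i and labels[j] == labels[i]]
        if not pos:
            continue
        denom = sum(math.exp(sim[i][j]) for j in range(n) if j != i)
        for j in pos:
            loss += -math.log(math.exp(sim[i][j]) / denom)
            count += 1
    return loss / count
```

Under this objective, a labeling in which same-class embeddings coincide yields a lower loss than one in which classes are mixed, which is the sense in which contrastive learning "concentrates" classes in representation space.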