Class-incremental medical image segmentation (CIMIS) aims to preserve knowledge of previously learned classes while learning new ones, without relying on old-class labels. However, existing methods either 1) adopt one-size-fits-all strategies that treat all spatial regions and feature channels equally, which can hinder the preservation of accurate old knowledge, or 2) align only the local prototypes of old classes with their global counterparts while overlooking the local representations of old classes in new data, leading to knowledge degradation. To mitigate these issues, we propose Prototype-Guided Calibration Distillation (PGCD) and Dual-Aligned Prototype Distillation (DAPD) for CIMIS. Specifically, PGCD exploits prototype-to-feature similarity to calibrate class-specific distillation intensity across spatial regions, effectively reinforcing reliable old knowledge and suppressing misleading information from old classes. Complementarily, DAPD aligns the local prototypes of old classes extracted by the current model with both global and local prototypes, further enhancing segmentation performance on old categories. Comprehensive evaluations on two widely used multi-organ segmentation benchmarks demonstrate that our method outperforms state-of-the-art approaches, highlighting its robustness and generalization capability.
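To make the two distillation terms concrete, the following is a minimal sketch of how prototype-guided calibration and dual prototype alignment could be realized, assuming a PyTorch training pipeline. All tensor shapes, weighting choices, and helper names (`pgcd_loss`, `dapd_loss`, `masked_average_pooling`) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the two losses described in the abstract.
# Shapes, weighting scheme, and function names are assumptions for illustration.
import torch
import torch.nn.functional as F


def masked_average_pooling(feat, mask):
    """Local prototypes: per-class average of features over each class mask.

    feat: (B, C, H, W) features; mask: (B, K, H, W) soft or hard class masks.
    Returns (K, C) prototypes averaged over the batch.
    """
    B, C, H, W = feat.shape
    K = mask.shape[1]
    feat_flat = feat.view(B, C, -1)                              # (B, C, HW)
    mask_flat = mask.view(B, K, -1)                              # (B, K, HW)
    num = torch.einsum('bkn,bcn->kc', mask_flat, feat_flat)      # sum over batch and pixels
    den = mask_flat.sum(dim=(0, 2)).clamp(min=1e-6).unsqueeze(1) # (K, 1)
    return num / den                                             # (K, C)


def pgcd_loss(feat_new, feat_old, global_protos):
    """Prototype-Guided Calibration Distillation (sketch).

    Prototype-to-feature similarity on the old model's features calibrates how
    strongly each spatial location is distilled: locations that closely match a
    stored old-class prototype are treated as reliable old knowledge and
    distilled more strongly; dissimilar locations are down-weighted.
    """
    f_old = F.normalize(feat_old, dim=1)                         # (B, C, H, W)
    protos = F.normalize(global_protos, dim=1)                   # (K, C)
    sim = torch.einsum('kc,bchw->bkhw', protos, f_old)           # (B, K, H, W)

    # Per-location calibration weight in [0, 1]: best-matching old class.
    weight = sim.max(dim=1, keepdim=True).values.clamp(min=0)    # (B, 1, H, W)

    # Spatially reweighted feature distillation between current and old model.
    per_pixel = (feat_new - feat_old).pow(2).mean(dim=1, keepdim=True)
    return (weight * per_pixel).sum() / weight.sum().clamp(min=1e-6)


def dapd_loss(feat_new, feat_old, old_class_mask, global_protos):
    """Dual-Aligned Prototype Distillation (sketch).

    Local prototypes of old classes computed from the *current* model are
    aligned with both the stored global prototypes and the local prototypes
    computed from the old model on the same batch.
    """
    local_new = masked_average_pooling(feat_new, old_class_mask)  # (K, C)
    local_old = masked_average_pooling(feat_old, old_class_mask)  # (K, C)
    align_global = 1 - F.cosine_similarity(local_new, global_protos, dim=1)
    align_local = 1 - F.cosine_similarity(local_new, local_old, dim=1)
    return (align_global + align_local).mean()


if __name__ == "__main__":
    B, C, H, W, K = 2, 64, 32, 32, 3               # batch, channels, size, old classes
    feat_new = torch.randn(B, C, H, W, requires_grad=True)
    feat_old = torch.randn(B, C, H, W)              # frozen old-model features
    protos = torch.randn(K, C)                      # stored global old-class prototypes
    mask = torch.softmax(torch.randn(B, K, H, W), dim=1)  # old-class pseudo-label masks

    loss = pgcd_loss(feat_new, feat_old, protos) + dapd_loss(feat_new, feat_old, mask, protos)
    loss.backward()
    print(float(loss))
```

The key design choice illustrated here is that PGCD modulates a standard feature-distillation term with a spatial reliability map derived from prototype similarity, while DAPD adds a prototype-level constraint that ties the current model's old-class representations to both global and batch-local references.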