Robust mammography registration is essential for clinical applications such as tracking disease progression and monitoring longitudinal changes in breast tissue. However, progress has been limited by the absence of public datasets and standardized benchmarks: existing studies are often not directly comparable because they rely on private data and inconsistent evaluation frameworks. To address this, we present MGRegBench, a public benchmark dataset for mammogram registration. It comprises over 5,000 image pairs, 100 of which carry manual anatomical landmarks and segmentation masks for rigorous evaluation, making MGRegBench one of the largest public 2D registration datasets with manual annotations. Using this resource, we benchmark a diverse set of registration methods: classical (ANTs), learning-based (VoxelMorph, TransMorph), implicit neural representation (IDIR), a classic mammography-specific approach, and a recent state-of-the-art deep learning method, MammoRegNet. Each method was adapted to this modality from the authors' original implementation or re-implemented from scratch. Our contributions are: (1) the first public dataset of this scale with manual landmarks and masks for mammography registration; (2) the first like-for-like comparison of diverse methods on this modality; and (3) an extensive analysis of deep learning-based registration. We publicly release our code and data to establish a foundational resource for fair comparisons and to catalyze future research. The source code and data are available at https://github.com/KourtKardash/MGRegBench.