A family of loss functions built on pair-based computation has been proposed in the literature, offering a wide range of solutions for deep metric learning. In this paper, we provide a general weighting framework for understanding recent pair-based loss functions. Our contributions are three-fold: (1) we establish a General Pair Weighting (GPW) framework, which casts the sampling problem of deep metric learning into a unified view of pair weighting through gradient analysis, providing a powerful tool for understanding recent pair-based loss functions; (2) we show that with GPW, various existing pair-based methods can be compared and discussed comprehensively, with clear differences and key limitations identified; (3) we propose a new loss, called multi-similarity loss (MS loss), under the GPW framework, which is implemented in two iterative steps (i.e., mining and weighting). This allows it to fully consider three types of similarity for pair weighting, providing a more principled approach for collecting and weighting informative pairs. Finally, the proposed MS loss obtains new state-of-the-art performance on four image retrieval benchmarks, where it outperforms the most recent approaches, such as ABE\cite{Kim_2018_ECCV} and HTL, by a large margin: 60.6% to 65.7% on CUB200, and 80.9% to 88.0% on the In-Shop Clothes Retrieval dataset at Recall@1. Code is available at https://github.com/MalongTech/research-ms-loss.
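The two iterative steps mentioned above (mining informative pairs, then weighting them) can be sketched in plain NumPy. This is a minimal illustration, not the reference implementation: it assumes L2-normalized embeddings (so the similarity matrix is cosine similarity), and the hyperparameter names `alpha`, `beta`, `lam`, and `eps` are placeholders for the scaling, shifting, and mining-margin parameters of the loss.

```python
import numpy as np

def ms_loss(embeddings, labels, alpha=2.0, beta=50.0, lam=0.5, eps=0.1):
    """Sketch of a multi-similarity-style loss: pair mining, then pair weighting.

    embeddings: (n, d) array, assumed L2-normalized row-wise.
    labels:     (n,) integer class ids.
    """
    n = embeddings.shape[0]
    sim = embeddings @ embeddings.T  # cosine similarity matrix
    loss = 0.0
    for i in range(n):
        pos = labels == labels[i]
        pos[i] = False                 # exclude the anchor itself
        neg = labels != labels[i]
        pos_s, neg_s = sim[i][pos], sim[i][neg]
        if pos_s.size == 0 or neg_s.size == 0:
            continue
        # Step 1 (mining): keep negatives harder than the easiest positive,
        # and positives harder than the hardest negative, within margin eps.
        hard_neg = neg_s[neg_s + eps > pos_s.min()]
        hard_pos = pos_s[pos_s - eps < neg_s.max()]
        # Step 2 (weighting): soft-weight the mined pairs via log-sum-exp,
        # so harder pairs receive larger gradient weights.
        if hard_pos.size:
            loss += np.log1p(np.sum(np.exp(-alpha * (hard_pos - lam)))) / alpha
        if hard_neg.size:
            loss += np.log1p(np.sum(np.exp(beta * (hard_neg - lam)))) / beta
    return loss / n
```

Easy pairs are filtered out in the mining step and contribute nothing, while each surviving pair is weighted relative to both its own similarity and the other pairs sharing its anchor.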