Distribution shifts between training and test data are inevitable over the lifecycle of a deployed model, leading to performance decay. Adapting a model on test samples can help mitigate this drop in performance. However, most test-time adaptation methods have focused on synthetic corruption shifts, leaving a variety of distribution shifts underexplored. In this paper, we focus on distribution shifts that evolve gradually over time, which are common in the wild but challenging for existing methods, as we show. To address this, we propose STAD, a Bayesian filtering method that adapts a deployed model to temporal distribution shifts by learning the time-varying dynamics in the last set of hidden features. Without requiring labels, our model infers time-evolving class prototypes that act as a dynamic classification head. Through experiments on real-world temporal distribution shifts, we show that our method excels in handling small batch sizes and label shift.
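The abstract does not give STAD's update equations, so the following is only a minimal illustrative sketch of the general idea of Bayesian filtering over class prototypes: a Kalman-style random-walk model for each prototype, pseudo-labels from nearest-prototype assignment (no ground-truth labels), and classification via the time-evolving prototypes. The class name `PrototypeFilter`, the noise parameters `q` and `r`, and the pseudo-labeling rule are assumptions for illustration, not details from the paper.

```python
import numpy as np

class PrototypeFilter:
    """Kalman-style filter over per-class prototypes in feature space.

    Illustrative sketch only: assumes random-walk dynamics for each
    prototype and an isotropic Gaussian observation model for features.
    The hyperparameters q (process noise) and r (observation noise)
    are hypothetical, not taken from the paper.
    """

    def __init__(self, init_prototypes, q=1e-2, r=1.0):
        self.mu = np.asarray(init_prototypes, dtype=float)  # (C, D) prototype means
        self.p = np.ones(len(self.mu))                      # (C,) scalar variances
        self.q, self.r = q, r

    def predict(self):
        # Random-walk transition: prototypes drift, uncertainty grows over time.
        self.p = self.p + self.q

    def update(self, feats):
        # Pseudo-label each test feature by its nearest current prototype
        # (no labels are required at test time).
        d = ((feats[:, None, :] - self.mu[None, :, :]) ** 2).sum(-1)  # (N, C)
        labels = d.argmin(axis=1)
        for c in np.unique(labels):
            z = feats[labels == c].mean(axis=0)           # batch observation for class c
            k = self.p[c] / (self.p[c] + self.r)          # Kalman gain
            self.mu[c] = self.mu[c] + k * (z - self.mu[c])
            self.p[c] = (1.0 - k) * self.p[c]
        return labels

    def classify(self, feats):
        # Dynamic classification head: assign to the nearest time-evolving prototype.
        d = ((feats[:, None, :] - self.mu[None, :, :]) ** 2).sum(-1)
        return d.argmin(axis=1)


# Usage: adapt over a stream of test batches from a frozen feature extractor.
rng = np.random.default_rng(0)
protos = rng.normal(size=(3, 8))                # initial prototypes (e.g. class means on training data)
f = PrototypeFilter(protos)
for t in range(5):                              # simulated temporal stream of test batches
    batch = rng.normal(size=(16, 8)) + 0.1 * t  # features drifting gradually over time
    f.predict()
    f.update(batch)
preds = f.classify(rng.normal(size=(4, 8)))
```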


