Recent breakthroughs in self-supervised Joint-Embedding Predictive Architectures (JEPAs) have established that regularizing Euclidean representations toward isotropic Gaussian priors yields provable gains in training stability and downstream generalization. We introduce KerJEPAs, a new, flexible family of self-supervised learning algorithms with kernel-based regularizers. One instance of this family corresponds to the recently introduced LeJEPA Epps-Pulley regularizer, which approximates a sliced maximum mean discrepancy (MMD) between the embedding distribution and a Gaussian prior using a Gaussian kernel. By expanding the class of viable kernels and priors and computing the closed-form high-dimensional limit of sliced MMDs, we develop alternative KerJEPAs with a number of favorable properties, including improved training stability and design flexibility.
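To make the regularizer concrete, below is a minimal sketch of a sliced-MMD penalty between a batch of embeddings and a standard isotropic Gaussian prior, using a Gaussian kernel on each one-dimensional slice, in the spirit of the Epps-Pulley instance described above. All names (`sliced_mmd2`, `num_slices`, `bandwidth`) are illustrative, and the plain Monte Carlo V-statistic estimator shown here is an assumption for exposition, not the paper's exact closed-form formulation.

```python
import torch


def gaussian_kernel(x, y, bandwidth=1.0):
    # x: (n,), y: (m,) -> (n, m) Gram matrix of the 1-D Gaussian (RBF) kernel.
    diff = x[:, None] - y[None, :]
    return torch.exp(-diff.pow(2) / (2 * bandwidth ** 2))


def sliced_mmd2(z, num_slices=128, bandwidth=1.0):
    """Monte Carlo estimate of the sliced squared MMD between the empirical
    distribution of embeddings z (shape: n x d) and an isotropic standard
    Gaussian prior. Each slice projects onto a random direction; the
    projection of N(0, I_d) onto a unit vector is N(0, 1), so the 1-D
    prior samples are standard normal."""
    n, d = z.shape
    # Random unit directions ("slices") on the sphere S^{d-1}.
    u = torch.randn(num_slices, d, device=z.device)
    u = u / u.norm(dim=1, keepdim=True)
    total = z.new_zeros(())
    for s in range(num_slices):
        proj = z @ u[s]                          # (n,) projected embeddings
        prior = torch.randn(n, device=z.device)  # 1-D Gaussian prior samples
        k_zz = gaussian_kernel(proj, proj, bandwidth)
        k_pp = gaussian_kernel(prior, prior, bandwidth)
        k_zp = gaussian_kernel(proj, prior, bandwidth)
        # Biased (V-statistic) estimate of MMD^2 on this slice.
        total = total + k_zz.mean() + k_pp.mean() - 2 * k_zp.mean()
    return total / num_slices
```

In a JEPA-style training loop, a term like this would be added with a weight to the predictive loss; drawing fresh random slice directions at each step keeps the estimate unbiased over directions.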