Probabilistic relaxations of graph cuts offer a differentiable alternative to spectral clustering, enabling end-to-end and online learning without eigendecompositions; however, prior work has centered on RatioCut and lacked general guarantees and principled gradients. We present a unified probabilistic framework that covers a broad class of cuts, including Normalized Cut. The framework provides tight analytic upper bounds on the expected discrete cut, derived via integral representations and Gauss hypergeometric functions, with closed-form forward and backward passes. Together, these results deliver a rigorous, numerically stable foundation for scalable, differentiable graph partitioning across a wide range of clustering and contrastive learning objectives.
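To make the setting concrete, the sketch below shows a generic plug-in probabilistic relaxation of Normalized Cut from soft cluster assignments; it is an illustration only, not the tight hypergeometric upper bound of the paper (whose closed form involves the Gauss hypergeometric function, available in SciPy as `scipy.special.hyp2f1`). The function name `soft_normalized_cut` and the toy data are hypothetical.

```python
import numpy as np

def soft_normalized_cut(W, P, eps=1e-12):
    """Plug-in probabilistic relaxation of Normalized Cut.

    W : (n, n) symmetric nonnegative affinity matrix.
    P : (n, k) soft cluster-assignment probabilities (rows sum to 1).

    Illustrative relaxation only; not the paper's analytic upper bound.
    """
    d = W.sum(axis=1)                              # node degrees
    assoc = np.einsum('ic,ij,jc->c', P, W, P)      # expected within-cluster association
    vol = P.T @ d                                  # expected cluster volumes
    cut = vol - assoc                              # expected weight leaving each cluster
    return float(np.sum(cut / (vol + eps)))

# Toy usage: two well-separated blobs with near-hard assignments.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.1, (5, 2)), rng.normal(3, 0.1, (5, 2))])
W = np.exp(-np.sum((X[:, None] - X[None]) ** 2, axis=-1))
np.fill_diagonal(W, 0.0)
P = np.zeros((10, 2))
P[:5, 0] = P[5:, 1] = 0.95
P[:5, 1] = P[5:, 0] = 0.05
print(soft_normalized_cut(W, P))  # small value for a good partition
```

Because the objective is a smooth function of the assignment probabilities, it can be minimized with standard gradient-based optimizers, which is the kind of differentiable, eigendecomposition-free training the abstract refers to.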