Graphons, as limits of graph sequences, provide a framework for analyzing the asymptotic behavior of graph neural operators. Spectral convergence of sampled graphs to their limiting graphon yields operator-level convergence rates, which in turn support transferability analyses of GNNs. This note summarizes known bounds under three regimes: no regularity assumptions on the graphon, global Lipschitz continuity, and piecewise-Lipschitz continuity. It highlights the tradeoff between the strength of the assumptions and the resulting rates, and illustrates the empirical tightness of the bounds on synthetic and real data.
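As a rough illustration of the sampling setup the abstract refers to (not code from the note itself), the sketch below draws graphs of increasing size from a graphon and tracks how the leading eigenvalues of the normalized adjacency matrix approach those of a fine discretization of the graphon operator. The graphon W(x, y) = exp(-(x + y)), the helper names, and the grid and graph sizes are all illustrative assumptions.

```python
# Minimal sketch, assuming a smooth (Lipschitz) graphon; all choices below are illustrative.
import numpy as np

def sample_graph_from_graphon(W, n, rng):
    """Sample an n-node graph: latent positions x_i ~ U[0,1], edges ~ Bernoulli(W(x_i, x_j))."""
    x = rng.uniform(size=n)
    P = W(x[:, None], x[None, :])              # edge-probability matrix
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)                          # keep strict upper triangle
    return A + A.T                             # symmetric adjacency, no self-loops

def top_eigenvalues(M, k):
    """Top-k eigenvalues (by magnitude) of the size-normalized symmetric matrix M / n."""
    vals = np.linalg.eigvalsh(M / M.shape[0])
    return vals[np.argsort(-np.abs(vals))[:k]]

W = lambda x, y: np.exp(-(x + y))              # assumed graphon, values in (0, 1]
rng = np.random.default_rng(0)

# Reference spectrum: eigenvalues of a fine discretization of the graphon operator.
grid = (np.arange(2000) + 0.5) / 2000
ref = top_eigenvalues(W(grid[:, None], grid[None, :]), 5)

# Spectral gap between sampled graphs and the graphon shrinks as n grows.
for n in (100, 400, 1600):
    A = sample_graph_from_graphon(W, n, rng)
    emp = top_eigenvalues(A, 5)
    print(n, np.max(np.abs(emp - ref)))
```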