We propose Koopman Spectral Wasserstein Gradient Descent (KSWGD), a generative modeling framework that combines operator-theoretic spectral analysis with optimal transport. The key insight is that the spectral structure required for accelerated Wasserstein gradient descent can be estimated directly from trajectory data via Koopman operator approximation, eliminating the need for explicit knowledge of the target potential or for neural network training. We provide a rigorous convergence analysis and establish a connection to Feynman-Kac theory that clarifies the method's probabilistic foundation. Experiments across diverse settings, including compact manifold sampling, metastable multi-well systems, image generation, and a high-dimensional stochastic partial differential equation, demonstrate that KSWGD consistently converges faster than existing methods while maintaining high sample quality.
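To make the data-driven spectral estimation concrete, the sketch below shows a generic extended dynamic mode decomposition (EDMD) routine that recovers approximate Koopman eigenvalues and eigenfunctions from snapshot pairs of a trajectory. This is not the paper's KSWGD algorithm; the function `edmd_spectrum`, the monomial dictionary `poly_features`, and the synthetic Ornstein-Uhlenbeck trajectory are illustrative assumptions meant only to show how spectral structure can be read off trajectory data without knowing the target potential.

```python
import numpy as np

def edmd_spectrum(X, Y, features):
    """Estimate Koopman eigenvalues/eigenvectors from snapshot pairs.

    X, Y     : (n_samples, dim) arrays, Y[i] is the one-step evolution of X[i].
    features : callable mapping (n_samples, dim) -> (n_samples, n_features).
    """
    Phi_X = features(X)                     # dictionary evaluated at current states
    Phi_Y = features(Y)                     # dictionary evaluated at next states
    G = Phi_X.T @ Phi_X / len(X)            # Gram matrix
    A = Phi_X.T @ Phi_Y / len(X)            # cross-covariance matrix
    K = np.linalg.lstsq(G, A, rcond=None)[0]  # finite-dimensional Koopman approximation
    eigvals, eigvecs = np.linalg.eig(K)
    return eigvals, eigvecs

def poly_features(Z):
    # Simple monomial dictionary in 2D: [1, z1, z2, z1^2, z1*z2, z2^2].
    z1, z2 = Z[:, 0], Z[:, 1]
    return np.stack([np.ones(len(Z)), z1, z2, z1**2, z1 * z2, z2**2], axis=1)

# Hypothetical trajectory data: one Euler-Maruyama step of an
# Ornstein-Uhlenbeck process (quadratic potential), step size dt = 0.1.
rng = np.random.default_rng(0)
n, dt = 500, 0.1
X = rng.normal(size=(n, 2))
Y = X - dt * X + np.sqrt(2 * dt) * rng.normal(size=(n, 2))

eigvals, eigvecs = edmd_spectrum(X, Y, poly_features)
print(np.sort(np.abs(eigvals))[::-1])  # leading Koopman eigenvalue moduli (relaxation rates)
```

In this toy setting the estimated eigenvalue moduli approximate the relaxation rates of the underlying diffusion; KSWGD, as described in the abstract, would instead feed such spectral estimates into an accelerated Wasserstein gradient descent scheme.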