Generative adversarial networks (GANs) are a widely used framework for learning generative models. Wasserstein GANs (WGANs), one of the most successful variants of GANs, require solving a min-max optimization problem to global optimality, but are in practice successfully trained using stochastic gradient descent-ascent. In this paper, we show that, when the generator is a one-layer network, stochastic gradient descent-ascent converges to a global solution with polynomial time and sample complexity.
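The training dynamic discussed above can be illustrated in miniature. The sketch below is not the paper's one-layer WGAN setting; it only demonstrates the stochastic gradient descent-ascent update rule on a toy strongly-convex-strongly-concave min-max objective (a hypothetical example chosen for clarity), where the min player takes a gradient step downhill and the max player a gradient step uphill on noisy gradients.

```python
import numpy as np

# Hedged sketch (not the paper's construction): SGDA on the toy objective
#   f(w, v) = w*v + 0.5*w**2 - 0.5*v**2,
# whose unique saddle point is (w, v) = (0, 0).
rng = np.random.default_rng(0)

def sgda(steps=500, eta=0.1, noise=0.01):
    w, v = 1.0, 1.0                                    # initial min / max parameters
    for _ in range(steps):
        # stochastic gradients: exact gradient plus Gaussian "mini-batch" noise
        gw = (v + w) + noise * rng.standard_normal()   # df/dw
        gv = (w - v) + noise * rng.standard_normal()   # df/dv
        w -= eta * gw                                  # descent step for the min player
        v += eta * gv                                  # ascent step for the max player
    return w, v

w, v = sgda()
print(w, v)   # both iterates end up near the saddle point (0, 0)
```

For this objective the simultaneous updates contract toward the saddle point, so the iterates settle into a small noise-driven neighborhood of (0, 0); the paper's contribution is establishing an analogous global guarantee, with polynomial rates, in the nonconvex WGAN setting.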