Self-consuming generative models have received significant attention in recent years. In this paper, we study a self-consuming generative model with heterogeneous preferences that generalizes the model of Ferbach et al. (2024). The model is retrained round by round on a mixture of real data and its own synthetic outputs from the previous round. We investigate the asymptotic behavior of the retraining dynamics in four regimes, using techniques that include nonlinear Perron--Frobenius theory. Our analysis improves upon that of Ferbach et al. (2024) and yields convergence results in settings where the standard Banach contraction mapping argument does not apply. We also establish stability and instability results for the retraining dynamics.
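As a point of reference for the retraining loop described above, a minimal sketch of a Ferbach et al. (2024)-style mixture formulation is the following; the mixing weight $\lambda \in (0,1]$, the reference distribution $p_{\mathrm{data}}$, the model class $\mathcal{P}$, and the round-$t$ model $p_t$ are illustrative symbols introduced here, not notation taken from this abstract:
\[
p_{t+1} \;\in\; \operatorname*{arg\,max}_{p \in \mathcal{P}} \; \mathbb{E}_{x \sim \lambda\, p_{\mathrm{data}} + (1-\lambda)\, p_t}\bigl[\log p(x)\bigr],
\qquad t = 0, 1, 2, \ldots,
\]
i.e., at each round the model is refit to a mixture of a fraction $\lambda$ of real data and a fraction $1-\lambda$ of the previous round's synthetic samples. The precise preference-dependent variant analyzed in this paper is defined in the body of the work.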