We study the problem of probabilistic inference on the joint distribution defined by a normalizing flow model. Given a pre-trained flow model $p(\boldsymbol{x})$, we wish to estimate $p(\boldsymbol{x}_2 \mid \boldsymbol{x}_1)$ for an arbitrary partitioning of the variables $\boldsymbol{x} = (\boldsymbol{x}_1, \boldsymbol{x}_2)$. We first show that this task is computationally hard for a large class of flow models. Motivated by this hardness result, we propose a framework for $\textit{approximate}$ probabilistic inference. Specifically, our method trains a new generative model whose composition with the given model approximates the target conditional distribution. By parametrizing this new distribution as another flow model, we can train it efficiently using variational inference and also handle conditioning under arbitrary differentiable transformations. We experimentally demonstrate that our approach outperforms Langevin dynamics in terms of sample quality, while requiring far fewer parameters and much less training time than regular variational inference. We further validate the flexibility of our method on a variety of inference tasks with applications to inverse problems.
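The variational setup described in the abstract can be illustrated with a toy numerical sketch. Everything here is an assumption for illustration, not the paper's actual method: a 2D correlated Gaussian stands in for the pretrained flow's joint density $p(\boldsymbol{x}_1, \boldsymbol{x}_2)$, a single affine map plays the role of the new flow $q(\boldsymbol{x}_2 \mid \boldsymbol{x}_1)$, and the objective is a Monte Carlo estimate of the reverse KL divergence used in variational inference.

```python
import numpy as np

# Hypothetical "pretrained flow": a correlated 2D Gaussian stands in for
# the joint density p(x1, x2) that a real flow model would provide.
RHO = 0.8

def log_p_joint(x1, x2):
    # Log density of N(0, [[1, rho], [rho, 1]]).
    det = 1.0 - RHO**2
    quad = (x1**2 - 2*RHO*x1*x2 + x2**2) / det
    return -0.5*quad - 0.5*np.log(det) - np.log(2*np.pi)

def reverse_kl_objective(mu, log_s, x1, z):
    """Monte Carlo estimate of E_q[log q(x2|x1) - log p(x1, x2)], which
    equals KL(q || p(.|x1)) up to the constant -log p(x1)."""
    s = np.exp(log_s)
    x2 = mu + s * z                    # one-layer affine "flow" q(x2|x1)
    # Change of variables: log q(x2) = log N(z; 0, 1) - log |dx2/dz|.
    log_q = -0.5*z**2 - 0.5*np.log(2*np.pi) - log_s
    return np.mean(log_q - log_p_joint(x1, x2))

rng = np.random.default_rng(0)
z = rng.standard_normal(20000)         # reparametrized base samples
x1 = 1.0                               # the value we condition on

# The exact conditional is N(rho*x1, 1 - rho^2), so the matching affine
# flow should attain a lower variational objective than an untrained one.
loss_opt = reverse_kl_objective(RHO*x1, np.log(np.sqrt(1 - RHO**2)), x1, z)
loss_naive = reverse_kl_objective(0.0, 0.0, x1, z)
print(loss_opt < loss_naive)
```

In a real instance of the framework, `log_p_joint` would be the pretrained flow's exact log-likelihood, `q` would be a full flow network, and `mu` and `log_s` would be optimized by gradient descent on this same reparametrized objective.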