The Expectation-Maximization (EM) algorithm is perhaps the most widely used algorithm for inference in latent variable problems. A theoretical understanding of its performance, however, remains largely lacking. Recent results established that EM enjoys global convergence for Gaussian Mixture Models. For Mixed Linear Regression, however, only local convergence results have been established, and those only in the high-SNR regime. We show here that EM converges for mixed linear regression with two components (it is known that it may fail to converge for three or more), and moreover that this convergence holds under random initialization. Our analysis reveals that EM behaves very differently in Mixed Linear Regression than in Gaussian Mixture Models, and hence our proofs require the development of several new ideas.
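To make the setting concrete, the following is a minimal sketch of the EM iteration for symmetric two-component mixed linear regression, where each response is y_i = s_i ⟨x_i, β*⟩ + noise with a hidden sign s_i ∈ {±1}. The noise level sigma is assumed known here for simplicity; the data-generating parameters are illustrative and not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic symmetric two-component MLR: y_i = s_i * <x_i, beta*> + noise,
# with hidden signs s_i in {+1, -1}. Parameters below are illustrative.
n, d, sigma = 2000, 5, 0.5
beta_star = rng.standard_normal(d)
X = rng.standard_normal((n, d))
signs = rng.choice([-1.0, 1.0], size=n)
y = signs * (X @ beta_star) + sigma * rng.standard_normal(n)

beta = rng.standard_normal(d)  # random initialization
for _ in range(100):
    # E-step: posterior mean of the sign, E[s_i | x_i, y_i, beta],
    # which for Gaussian noise is tanh(y_i <x_i, beta> / sigma^2).
    w = np.tanh(y * (X @ beta) / sigma**2)
    # M-step: weighted least squares with the imputed signs.
    beta = np.linalg.solve(X.T @ X, X.T @ (w * y))

# The model is identifiable only up to a global sign flip.
err = min(np.linalg.norm(beta - beta_star), np.linalg.norm(beta + beta_star))
```

With a high-SNR draw like this one, the iterates approach ±β* from a random start; the global sign ambiguity is inherent to the symmetric two-component model.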