We consider the symmetric binary perceptron model, a simple model of neural networks that has garnered significant attention in the statistical physics, information theory and probability theory communities, with recent connections made to the performance of learning algorithms in Baldassi et al. '15. We establish that the partition function of this model, normalized by its expected value, converges to a lognormal distribution. As a consequence, we establish several conjectures for this model: (i) the contiguity conjecture of Aubin et al. '19 between the planted and unplanted models in the satisfiable regime; (ii) the sharp threshold conjecture; (iii) the frozen 1-RSB conjecture in the symmetric case, first conjectured by Krauth-M\'ezard '89 in the asymmetric case. In a recent concurrent work of Perkins-Xu [PX21], the last two conjectures were also established by proving that the partition function concentrates on an exponential scale. This left open the contiguity conjecture and the lognormal limit characterization, which are established here. In particular, our proof technique relies on a dense counterpart of the small graph conditioning method, which was developed for sparse models in the celebrated work of Robinson and Wormald.
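For concreteness, the display below sketches the standard definition of the symmetric binary perceptron partition function whose normalized limit is characterized above; the threshold $\kappa$, constraint density $\alpha$, and Gaussian constraint vectors $g_a$ are the conventional notation of Aubin et al. '19 and are not fixed in the abstract itself.
\[
Z_n(\kappa) \;=\; \sum_{\sigma \in \{-1,+1\}^n} \; \prod_{a=1}^{\lfloor \alpha n \rfloor} \mathbf{1}\!\left\{ \frac{|\langle g_a, \sigma \rangle|}{\sqrt{n}} \le \kappa \right\}, \qquad g_a \overset{\mathrm{i.i.d.}}{\sim} \mathcal{N}(0, I_n).
\]
In this notation, the main result asserts that $Z_n(\kappa)/\mathbb{E}[Z_n(\kappa)]$ converges in distribution to an almost surely positive random variable with a lognormal law in the satisfiable regime; heuristically, the almost-sure positivity of this limit is what underlies the contiguity between the planted and unplanted models.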