Growing evidence suggests that synaptic weights in the brain follow heavy-tailed distributions, yet most theoretical analyses of recurrent neural networks (RNNs) assume Gaussian connectivity. We systematically study the activity of RNNs whose random weights are drawn from biologically plausible Lévy alpha-stable distributions. While mean-field theory for the infinite system predicts that the quiescent state is always unstable -- implying ubiquitous chaos -- our finite-size analysis reveals a sharp transition between quiescent and chaotic dynamics. We theoretically predict the gain at which the system transitions from quiescent to chaotic dynamics and validate the prediction through simulations. Compared to Gaussian networks, heavy-tailed RNNs exhibit a broader parameter regime near the edge of chaos, i.e., a slower transition to chaos. This robustness comes with a tradeoff, however: heavier tails reduce the Lyapunov dimension of the attractor, indicating lower effective dimensionality. Our results thus reveal a biologically aligned tradeoff between the robustness of dynamics near the edge of chaos and the richness of high-dimensional neural activity. By analytically characterizing the transition point in finite-size networks -- where mean-field theory breaks down -- we provide a tractable framework for understanding dynamics in realistically sized, heavy-tailed neural circuits.
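To make the setup concrete, the following is a minimal sketch (not the authors' code) of the kind of experiment the abstract describes: a rate RNN with alpha-stable random weights, integrated alongside its linearized (tangent) dynamics to estimate the largest Lyapunov exponent via the standard Benettin renormalization method. The tanh nonlinearity, the weight scaling N^(-1/alpha) (which recovers the usual 1/sqrt(N) at alpha = 2, per the generalized central limit theorem), and all parameter values (N, alpha, g, dt, T) are illustrative assumptions, not values from the paper.

```python
# Sketch: quiescent-to-chaos transition in a heavy-tailed random RNN.
# Assumptions (not from the paper): dx/dt = -x + J tanh(x), J_ij i.i.d.
# symmetric alpha-stable scaled by g * N^(-1/alpha), Euler integration,
# Benettin tangent-vector estimate of the largest Lyapunov exponent.
import numpy as np
from scipy.stats import levy_stable

def largest_lyapunov(N=500, alpha=1.5, g=1.2, dt=0.05, T=200.0, seed=0):
    rng = np.random.default_rng(seed)
    # Heavy-tailed connectivity: symmetric (beta=0) alpha-stable entries.
    J = g * levy_stable.rvs(alpha, 0.0, size=(N, N), random_state=rng) \
        / N ** (1.0 / alpha)
    x = 0.1 * rng.standard_normal(N)            # small kick off the quiescent state
    v = rng.standard_normal(N)
    v /= np.linalg.norm(v)                      # unit tangent vector
    lyap_sum, steps = 0.0, int(T / dt)
    for _ in range(steps):
        r = np.tanh(x)
        x = x + dt * (-x + J @ r)               # Euler step of the rate dynamics
        # Tangent dynamics: dv/dt = (-I + J diag(1 - tanh(x)^2)) v,
        # with the Jacobian evaluated at the pre-step state.
        v = v + dt * (-v + J @ ((1.0 - r ** 2) * v))
        nrm = np.linalg.norm(v)
        lyap_sum += np.log(nrm)                 # accumulate log stretching
        v /= nrm                                # renormalize to avoid overflow
    return lyap_sum / (steps * dt)              # time-averaged growth rate

if __name__ == "__main__":
    # Sweeping the gain g should show lambda_max crossing zero at the
    # finite-size transition from quiescent to chaotic dynamics.
    for g in (0.5, 1.0, 1.5):
        print(f"g={g:.1f}  lambda_max ~ {largest_lyapunov(g=g):.3f}")
```

In this sketch, a negative estimate of lambda_max indicates decay to the quiescent state and a positive one indicates chaos; sweeping g (and repeating across alpha and N) is one way to probe the finite-size transition and the slower onset of chaos that the abstract attributes to heavier tails.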