In this paper, we propose a novel algorithm, the Neuron-wise Parallel Subspace Correction Method (NPSC), for training ReLU neural networks for the numerical solution of partial differential equations (PDEs). Despite extensive research on applying neural networks to numerical PDEs, there is still a serious lack of training algorithms that obtain approximations of adequate accuracy. Based on recent results on the spectral properties of linear layers and on landscape analyses of single-neuron problems, we develop a special type of subspace correction method that treats the linear layer and each neuron in the nonlinear layer separately. We present an optimal preconditioner that resolves the ill-conditioning of the linear layer, so that the linear layer is trained in a number of iterations that is uniform with respect to the number of neurons. For each single-neuron problem, a good local minimum is found by a superlinearly convergent algorithm that avoids regions where the loss function is flat. The performance of the proposed method is demonstrated through numerical experiments on function approximation problems and PDEs.
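To make the structure of such a method concrete, the following is a minimal sketch, not the authors' implementation, of one neuron-wise subspace correction sweep for a one-hidden-layer ReLU model u(x) = Σ_i c_i relu(w_i x + b_i) fitting a target f on sample points. All names (npsc_sweep, the sample sizes, the target) are illustrative assumptions; the paper's optimal preconditioner is replaced by a plain least-squares solve for the linear layer, and BFGS stands in for the superlinearly convergent single-neuron solver.

```python
# Conceptual sketch of a neuron-wise parallel subspace correction sweep.
# Assumptions: plain lstsq replaces the paper's preconditioned linear-layer
# solver; BFGS replaces its superlinearly convergent single-neuron method.
import numpy as np
from scipy.optimize import minimize

def relu(t):
    return np.maximum(t, 0.0)

def npsc_sweep(x, f, w, b, c):
    """One sweep: (1) solve the linear-layer subproblem exactly,
    (2) a separate local minimization for each neuron's (w_i, b_i)."""
    # Step 1: with the dictionary A[:, i] = relu(w_i*x + b_i) fixed,
    # the outer coefficients c solve a linear least-squares problem.
    A = relu(np.outer(x, w) + b)                # (n_samples, n_neurons)
    c, *_ = np.linalg.lstsq(A, f, rcond=None)

    # Step 2: each (w_i, b_i) is corrected against the residual left by
    # the other neurons; these subproblems are mutually independent,
    # which is what makes the method neuron-wise parallel.
    for i in range(len(w)):
        r = f - A @ c + c[i] * A[:, i]          # residual without neuron i
        def loss(p):                            # single-neuron loss
            return np.mean((c[i] * relu(p[0] * x + p[1]) - r) ** 2)
        res = minimize(loss, x0=[w[i], b[i]], method="BFGS")
        w[i], b[i] = res.x
        A[:, i] = relu(w[i] * x + b[i])         # refresh the column
    return w, b, c

# Usage: fit f(x) = |x - 0.5| on [0, 1] with 8 neurons.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
f = np.abs(x - 0.5)
w, b = rng.standard_normal(8), rng.standard_normal(8)
c = np.zeros(8)
for _ in range(20):
    w, b, c = npsc_sweep(x, f, w, b, c)
print("RMSE:", np.sqrt(np.mean((relu(np.outer(x, w) + b) @ c - f) ** 2)))
```

Solving the linear layer before the neuron updates matters: it fixes meaningful coefficients c_i against which each single-neuron residual problem is posed, mirroring the paper's separation of the linear layer from the nonlinear one.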