This paper studies the shallow Ritz method for solving one-dimensional diffusion-reaction problems. The method is capable of improving the order of approximation for non-smooth problems. Following an approach similar to that of [9], we present a damped block Newton (dBN) method to achieve nearly optimal order of approximation. The dBN method optimizes the Ritz functional by alternating between the linear and non-linear parameters of the shallow ReLU neural network (NN). For diffusion-reaction problems, new difficulties arise: (1) for the linear parameters, the mass matrix is dense and even more ill-conditioned than the stiffness matrix, and (2) for the non-linear parameters, the Hessian matrix is dense and may be singular. This paper addresses these challenges, resulting in a dBN method with a computational cost of ${\cal O}(n)$. The ideas presented for diffusion-reaction problems can also be applied to least-squares approximation problems. For both applications, numerical experiments show that, starting from a uniform partition for the non-linear parameters, the dBN method moves the mesh points to nearly optimal locations.
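For concreteness, the following is a minimal sketch of the setting described above; the notation is illustrative and assumed, not quoted from the paper. For a one-dimensional diffusion-reaction problem $-(a u')' + c\, u = f$ on $(0,1)$, the Ritz functional and a shallow ReLU ansatz take the form
\[
J(v) \;=\; \frac12 \int_0^1 \bigl(a(x)\,(v'(x))^2 + c(x)\,v(x)^2\bigr)\,dx \;-\; \int_0^1 f(x)\,v(x)\,dx,
\qquad
u_n(x) \;=\; \alpha_0 + \sum_{i=1}^{n} \alpha_i\,\mathrm{ReLU}(x - b_i),
\]
where the output weights $\{\alpha_i\}$ play the role of the linear parameters and the breakpoints $\{b_i\}$ (the "mesh points") that of the non-linear parameters between which the dBN method alternates.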