This paper studies the shallow Ritz method for solving one-dimensional diffusion-reaction problems. The method is capable of improving the order of approximation for non-smooth problems. Following an approach similar to that of [9], we propose a damped block Newton (dBN) method to achieve a nearly optimal order of approximation. The dBN method optimizes the Ritz functional by alternating between the linear and non-linear parameters of the shallow ReLU neural network (NN). For diffusion-reaction problems, new difficulties arise: (1) for the linear parameters, the mass matrix is dense and even more ill-conditioned than the stiffness matrix, and (2) for the non-linear parameters, the Hessian matrix is dense and may be singular. This paper addresses these challenges, resulting in a dBN method with a computational cost of ${\cal O}(n)$. The ideas presented for diffusion-reaction problems can also be applied to least-squares approximation problems. For both applications, numerical experiments show that, starting from non-linear parameters corresponding to a uniform partition, the dBN method moves the mesh points to nearly optimal locations.
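As a schematic sketch of the alternating structure just described (the ansatz, coefficient names, and functional below are illustrative assumptions, not notation taken from the paper), write the shallow ReLU network with linear parameters $c=(c_0,\dots,c_n)$ and non-linear parameters (breakpoints) $b=(b_1,\dots,b_n)$ as
\[
u_n(x;\,c,b) \;=\; c_0 + \sum_{i=1}^{n} c_i\,\mathrm{ReLU}(x-b_i),
\qquad 0 < b_1 < \dots < b_n < 1,
\]
and consider the Ritz (energy) functional of a one-dimensional diffusion-reaction problem $-(\alpha u')' + \gamma u = f$ on $(0,1)$,
\[
J(u) \;=\; \tfrac12 \int_0^1 \bigl(\alpha\,(u')^2 + \gamma\,u^2\bigr)\,dx \;-\; \int_0^1 f\,u\,dx .
\]
With the breakpoints $b$ held fixed, minimizing $J$ over $c$ reduces to a linear system whose coefficient matrix combines the stiffness matrix with the dense, ill-conditioned mass matrix; with $c$ held fixed, the breakpoints $b$ are updated by a damped Newton step based on the Hessian of $J$ with respect to $b$, which is dense and may be singular.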