This paper studies the shallow Ritz method for solving one-dimensional diffusion-reaction problems. The method is capable of improving the order of approximation for non-smooth problems. Adapting the approach presented in [9], we develop a damped block Newton (dBN) method that achieves a nearly optimal order of approximation. The dBN method optimizes the Ritz functional by alternating between the linear and non-linear parameters of the shallow ReLU neural network (NN). For diffusion-reaction problems, new difficulties arise: (1) for the linear parameters, the mass matrix is dense and even more ill-conditioned than the stiffness matrix, and (2) for the non-linear parameters, the Hessian matrix is dense and may be singular. This paper addresses these challenges, resulting in a dBN method with a computational cost of ${\cal O}(n)$. The ideas presented for diffusion-reaction problems can also be applied to least-squares approximation problems. For both applications, initializing the non-linear parameters as a uniform partition, numerical experiments show that the dBN method moves the mesh points to nearly optimal locations.
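The alternating linear/non-linear optimization described above can be illustrated with a minimal least-squares sketch. Here the shallow ReLU network is $u(x) = c_0 + \sum_i c_i\,\mathrm{relu}(x - b_i)$, the linear parameters $c$ are solved exactly by least squares, and the breakpoints $b$ are updated by a plain gradient step. All names, the target function, and the gradient update are illustrative assumptions; the paper's damped block Newton step and its ${\cal O}(n)$ solvers are not reproduced here.

```python
import numpy as np

def features(x, b):
    # Design matrix: a constant column plus one ReLU column per breakpoint b_i.
    return np.hstack([np.ones((x.size, 1)),
                      np.maximum(x[:, None] - b[None, :], 0.0)])

def fit(f, n=8, iters=200, lr=0.02):
    # Alternating sketch (not the dBN method): exact linear solve for the
    # outer weights c, then a gradient step on the breakpoints b, starting
    # from a uniform partition of [0, 1].
    x = np.linspace(0.0, 1.0, 401)
    y = f(x)
    b = np.linspace(0.05, 0.95, n)                    # uniform initial "mesh"
    for _ in range(iters):
        A = features(x, b)
        c, *_ = np.linalg.lstsq(A, y, rcond=None)     # linear block: exact solve
        r = A @ c - y                                  # residual u(x_j) - y_j
        # Non-linear block: gradient of mean((u - y)^2) w.r.t. each b_i,
        # using d relu(x - b_i)/db_i = -1 for x > b_i and 0 otherwise.
        grad = np.array([-c[i + 1] * np.sum(r[x > b[i]]) for i in range(n)])
        grad *= 2.0 / x.size
        b = np.clip(b - lr * grad, 0.0, 1.0)
    A = features(x, b)
    c, *_ = np.linalg.lstsq(A, y, rcond=None)
    rmse = np.sqrt(np.mean((A @ c - y) ** 2))
    return b, c, rmse
```

For a non-smooth target such as $f(x) = |x - 0.5|$, the breakpoints adjacent to the kink carry nonzero outer weights and hence nonzero gradients, so the alternating iteration moves the "mesh points" while the interior solve keeps the linear parameters optimal at every step.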