We propose an operator learning framework for solving partial differential equations (PDEs) built on local finite-element bases. Operator learning aims to approximate mappings from input functions to output functions, with the output typically represented in a set of basis functions. Non-learnable bases reduce training costs, while learnable bases offer greater flexibility but often require deep network architectures with a large number of trainable parameters. Existing approaches typically rely on deep global bases; however, many PDE solutions exhibit localized behaviors such as shocks and sharp gradients, and in parametrized PDE settings these features may appear in different regions of the domain across different training and testing samples. Motivated by the use of local bases in finite element methods (FEM) for function approximation, we develop a shallow neural network architecture that constructs adaptive FEM bases. With suitable activation functions, such as ReLU, the FEM bases can be assembled exactly within the network, introducing no additional approximation error in the basis construction. This design enables the learning procedure to mimic the adaptive refinement mechanism of FEM, allowing the network to discover basis functions tailored to intrinsic solution features such as shocks. The learned adaptive bases are then employed to represent the solution (output function) of the PDE. This framework reduces the number of trainable parameters while maintaining high approximation accuracy, effectively combining the adaptivity of FEM with the expressive power of operator learning. To evaluate performance, we validate the proposed method on seven families of PDEs with diverse characteristics, demonstrating its accuracy, efficiency, and robustness.
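The statement that FEM bases can be assembled exactly from ReLU activations can be illustrated in the simplest setting: a one-dimensional piecewise-linear (P1) "hat" basis function is an exact linear combination of three ReLU units. The sketch below is illustrative only; the node locations `a < b < c` are placeholder mesh nodes, not parameters from the paper, and the paper's learnable network would treat such nodes as trainable quantities.

```python
import numpy as np

def relu(x):
    """ReLU activation, applied elementwise."""
    return np.maximum(0.0, x)

def hat(x, a, b, c):
    """Exact P1 'hat' basis on nodes a < b < c, built from three ReLU units.

    Piecewise linear: zero outside [a, c], rising to 1 at b, back to 0 at c.
    No approximation error is introduced: the combination of ReLUs is
    identically equal to the FEM hat function.
    """
    return (relu(x - a) / (b - a)
            - (1.0 / (b - a) + 1.0 / (c - b)) * relu(x - b)
            + relu(x - c) / (c - b))

# Check against exact linear interpolation of the nodal values.
x = np.linspace(-0.5, 1.5, 9)
reference = np.interp(x, [-0.5, 0.0, 0.5, 1.0, 1.5], [0.0, 0.0, 1.0, 0.0, 0.0])
print(np.allclose(hat(x, 0.0, 0.5, 1.0), reference))  # → True
```

Making the nodes `a, b, c` trainable parameters is one way to mimic adaptive mesh refinement: gradient descent can move the nodes toward regions where the solution has sharp features, which is the mechanism the abstract alludes to.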