Solving partial differential equations with deep learning makes it possible to reduce simulation times by multiple orders of magnitude and unlock scientific methods that typically rely on large numbers of sequential simulations, such as optimization and uncertainty quantification. Two of the largest challenges of adopting scientific AI for industrial problem settings are that training datasets must be simulated in advance and that neural networks for solving large-scale PDEs exceed the memory capacity of current GPUs. We introduce a distributed programming API in the Julia language for simulating training data in parallel on the cloud, without requiring users to manage the underlying HPC infrastructure. In addition, we show that model-parallel deep learning based on domain decomposition allows us to scale neural networks for solving PDEs to commercial-scale problem settings and achieve over 90% parallel efficiency. Combining our cloud API for training data generation with model-parallel deep learning, we train large-scale neural networks for solving the 3D Navier-Stokes equation and for simulating 3D CO2 flow in porous media. For the CO2 example, we simulate a training dataset based on a commercial carbon capture and storage (CCS) project and train a neural network for CO2 flow simulation on a 3D grid with over 2 million cells; the trained network is five orders of magnitude faster than a conventional numerical simulator and 3,200 times cheaper.
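The abstract describes a distributed Julia API for generating PDE training data in parallel. The sketch below is not that API; it is a minimal illustration of the underlying pattern using only Julia's standard Distributed library, where each worker runs an independent forward simulation and the results are collected into a training set. The function simulate_sample is a hypothetical placeholder standing in for a real PDE solve (e.g., one CO2 flow simulation), and the worker count and sample count are arbitrary.

```julia
# Minimal sketch (not the paper's cloud API): embarrassingly parallel
# training-data generation with Julia's standard Distributed library.
using Distributed
addprocs(4)  # local workers here; in the paper's setting these would be cloud instances

@everywhere function simulate_sample(i)
    # Hypothetical placeholder for one forward simulation: draw a random
    # input field and return an (input, output) training pair. A real
    # workflow would call a numerical PDE solver instead.
    x = randn(64, 64)
    y = x .^ 2  # placeholder "solution"
    return (x, y)
end

# Run the simulations in parallel across all workers and collect the dataset.
dataset = pmap(simulate_sample, 1:100)
```

In the paper's setting, the same map-over-samples structure applies, but each task would launch a large 3D simulation on remote cloud hardware rather than a toy computation on local processes.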