Graph sparsification is an active area of interest in computer science and applied mathematics. In general, sparsification aims to reduce the number of edges in a graph while preserving specific properties, such as cuts and subgraph counts. Computing the sparsest cuts of a graph is known to be NP-hard, and existing sparsification routines generate linear-sized sparsifiers in almost-quadratic running time $O(n^{2 + \epsilon})$. Consequently, obtaining a sparsifier can be computationally demanding, with the cost depending on the level of sparsity required. We propose SGRDN, which extends sparsification to complex reaction-diffusion systems. This approach seeks to sparsify the graph such that the inherent reaction-diffusion dynamics are strictly preserved on the resulting structure. By selectively considering a subset of trajectories, we frame network sparsification as a data assimilation problem within a Reduced Order Model (ROM) space, imposing constraints that conserve, under perturbation, the eigenmodes of the Laplacian matrix $L = D - A$, where $D$ is the degree matrix and $A$ is the adjacency matrix. We derive computationally efficient eigenvalue and eigenvector approximations for perturbed Laplacian matrices and integrate these as spectral preservation constraints in the optimization problem. To further validate the method's broad applicability, we conducted an additional experiment on Neural Ordinary Differential Equations (neural ODEs), in which SGRDN successfully achieved parameter sparsity.
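To make the spectral-preservation idea concrete, the minimal Python sketch below (an illustration, not the SGRDN algorithm itself; the example graph and the choice of removed edge are hypothetical) builds the Laplacian $L = D - A$ for a small undirected graph, removes one edge, and measures the resulting eigenvalue drift, which is the kind of quantity a spectral constraint would bound during sparsification.

```python
import numpy as np

# Adjacency matrix of a small undirected example graph (hypothetical data).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 1],
              [1, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)

def laplacian(adj):
    """Return the graph Laplacian L = D - A, with D the diagonal degree matrix."""
    return np.diag(adj.sum(axis=1)) - adj

# Spectrum of the original graph.
eigvals, eigvecs = np.linalg.eigh(laplacian(A))

# "Sparsify" by dropping a single edge (0, 2) and recompute the spectrum.
A_sparse = A.copy()
A_sparse[0, 2] = A_sparse[2, 0] = 0.0
eigvals_sparse, _ = np.linalg.eigh(laplacian(A_sparse))

# A spectral-preservation constraint would keep this drift small.
print("eigenvalue shift per mode:", np.abs(eigvals - eigvals_sparse))
```

In the full method described in the abstract, such drifts are not computed by full re-diagonalization; instead, efficient perturbation approximations of the eigenvalues and eigenvectors serve as constraints inside the optimization.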