Graph neural networks (GNNs) have recently emerged as a dominant paradigm for machine learning on graphs. Research on GNNs has mainly focused on the family of message passing neural networks (MPNNs). Similar to the Weisfeiler-Leman (WL) test of isomorphism, these models follow an iterative neighborhood aggregation procedure to update vertex representations, and they then compute graph representations by aggregating the representations of the vertices. Although MPNNs have been very successful, they have also been studied intensively in the past few years. Thus, there is a need for novel architectures that will allow research in the field to break away from MPNNs. In this paper, we propose a new graph neural network model, called $\pi$-GNN, which learns a "soft" permutation (i.e., doubly stochastic) matrix for each graph, and thus projects all graphs into a common vector space. The learned matrices impose a "soft" ordering on the vertices of the input graphs, and based on this ordering, the adjacency matrices are mapped into vectors. These vectors can be fed into fully-connected or convolutional layers to handle supervised learning tasks. For large graphs, to make the model more efficient in terms of running time and memory, we further relax the doubly stochastic matrices to row stochastic matrices. We empirically evaluate the model on graph classification and graph regression datasets and show that it achieves performance competitive with state-of-the-art models.
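To make the core idea concrete, the following is a minimal NumPy sketch of the "soft" permutation step, not the authors' implementation. It assumes the soft permutation is obtained by Sinkhorn-Knopp normalization of a score matrix (a standard way to produce an approximately doubly stochastic matrix); the score matrix here is random, standing in for the output of a learnable scoring network. The softly permuted adjacency matrix $PAP^\top$ is then flattened into the vector that would be fed to fully-connected layers.

```python
import numpy as np

def sinkhorn(scores, n_iters=20):
    """Map a real-valued score matrix to an (approximately) doubly stochastic
    matrix by alternately normalizing rows and columns (Sinkhorn-Knopp)."""
    P = np.exp(scores)  # ensure strictly positive entries
    for _ in range(n_iters):
        P = P / P.sum(axis=1, keepdims=True)  # rows sum to 1
        P = P / P.sum(axis=0, keepdims=True)  # columns sum to 1
    return P

rng = np.random.default_rng(0)
n = 5
scores = rng.normal(size=(n, n))  # stand-in for a learned scoring network's output
P = sinkhorn(scores)              # "soft" permutation matrix

# Toy symmetric adjacency matrix without self-loops.
A = (rng.random((n, n)) < 0.4).astype(float)
A = np.triu(A, 1)
A = A + A.T

# Softly reorder the vertices and flatten the result into a fixed-size vector.
x = (P @ A @ P.T).reshape(-1)
print(x.shape)  # (25,)
```

The row-stochastic relaxation mentioned above would simply drop the column-normalization step, halving the work per iteration at the cost of a weaker permutation constraint.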