The Graph Attention Network (GAT) aggregates information from neighboring nodes through an attention mechanism, adaptively assigning a different weight to each neighbor. This adaptive weighting substantially improves the expressive power of graph neural network models.
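To make the adaptive neighbor weighting concrete, below is a minimal sketch of a single GAT attention head in plain NumPy. It is an illustration of the mechanism only, not a reference implementation; the function and variable names are ours.

```python
import numpy as np

def gat_attention_head(H, A, W, a):
    """Single GAT attention head (illustrative sketch).

    H: (N, F) node features
    A: (N, N) adjacency matrix with self-loops (1 = edge, 0 = no edge)
    W: (F, F_out) shared linear transform
    a: (2 * F_out,) attention parameter vector
    """
    Z = H @ W                                   # transform node features
    N = Z.shape[0]
    e = np.full((N, N), -np.inf)                # attention logits, masked to neighbors
    for i in range(N):
        for j in range(N):
            if A[i, j]:
                s = a @ np.concatenate([Z[i], Z[j]])
                e[i, j] = s if s > 0 else 0.2 * s   # LeakyReLU(a^T [W h_i || W h_j])
    # Softmax over each node's neighborhood gives the adaptive weights alpha_ij
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)
    # Aggregate neighbor features with the learned weights
    return alpha @ Z
```

A full GAT layer would typically run several such heads in parallel and concatenate or average their outputs before applying a nonlinearity.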

VIP Content

This paper proposes a new multi-granularity reading comprehension framework and validates its effectiveness on the NQ dataset. We exploit the inherent hierarchical structure of documents to model them at four granularities, while also accounting for the dependency between the two answer granularities in NQ. Experimental results show that the proposed method is highly effective and achieves a substantial improvement over existing methods.

Figure 3: Overall system architecture. All document fragments are fed into the model independently, and their outputs are aggregated to obtain the answer.

We propose a new framework for the NQ dataset; the overall system architecture is shown in Figure 3. The question and each document fragment are fed into the model independently and encoded with a BERT encoder to obtain initial representations of the question and the fragment. Our proposed graph encoder then models these representations further, producing a set of structured representations that are passed to the answer-selection module to obtain the final answer.
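The described flow maps naturally onto a simple PyTorch pipeline. The sketch below is our own illustration under stated assumptions (an external BERT encoder, a pluggable graph encoder module, and a linear answer scorer); the class and argument names are hypothetical and not from the paper.

```python
import torch
import torch.nn as nn

class MultiGranularityReader(nn.Module):
    """Sketch of the described pipeline: encode each (question, fragment) pair
    independently with BERT, refine the representations with a graph encoder,
    then score candidates in an answer-selection head."""

    def __init__(self, bert_encoder, graph_encoder, hidden_size=768):
        super().__init__()
        self.bert = bert_encoder            # e.g. a transformers.BertModel instance
        self.graph_encoder = graph_encoder  # builds the multi-granularity representations
        self.answer_scorer = nn.Linear(hidden_size, 1)

    def forward(self, fragment_inputs):
        # Each tokenized (question, fragment) pair is encoded independently.
        fragment_reps = [self.bert(**inputs).last_hidden_state
                         for inputs in fragment_inputs]
        # The graph encoder further models the representations across granularities
        # (the paper describes four document granularities).
        structured = self.graph_encoder(fragment_reps)
        # Pool each fragment, score it, and aggregate scores across all fragments.
        scores = torch.cat([self.answer_scorer(h.mean(dim=1)) for h in structured], dim=0)
        return scores
```

At inference time the highest-scoring fragment would supply the answer, while the paper's actual answer-selection module also models the dependency between the two answer granularities mentioned above.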


Latest Content

We present a novel method for graph partitioning, based on reinforcement learning and graph convolutional neural networks. The new reinforcement learning based approach is used to refine a given partitioning obtained on a coarser representation of the graph, and the algorithm is applied recursively. The neural network is implemented using graph attention layers and trained using an advantage actor critic (A2C) agent. We present two variants, one for finding an edge separator that minimizes the normalized cut or quotient cut, and one that finds a small vertex separator. The vertex separators are then used to construct a nested dissection ordering for permuting a sparse matrix so that its triangular factorization will incur less fill-in. The partitioning quality is compared with partitions obtained using METIS and Scotch, and the nested dissection ordering is evaluated in the sparse solver SuperLU. Our results show that the proposed method achieves partitioning quality similar to METIS and Scotch. Furthermore, the method generalizes from one class of graphs to another, and works well on a variety of graphs from the SuiteSparse sparse matrix collection.
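The abstract does not give the refinement objective in closed form; purely as an illustration, the normalized-cut quantity it refers to can be computed for a 2-way partition as below, and a refinement agent could, for example, be rewarded with the decrease in this value after each move. The function and variable names are ours.

```python
import numpy as np

def normalized_cut(adj, labels):
    """Normalized cut of a 2-way partition (smaller is better).

    adj: (N, N) symmetric adjacency matrix
    labels: (N,) array of 0/1 part assignments
    """
    in_a = labels == 0
    in_b = labels == 1
    cut = adj[np.ix_(in_a, in_b)].sum()        # total weight of edges crossing the cut
    vol_a = adj[in_a, :].sum()                 # volume (total degree) of part A
    vol_b = adj[in_b, :].sum()                 # volume (total degree) of part B
    return cut / max(vol_a, 1e-12) + cut / max(vol_b, 1e-12)
```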

