Relation extraction (RE) aims to extract the relation between two entities from text corpora, and is a crucial task for Knowledge Graph (KG) construction. Most existing methods predict the relation of an entity pair by learning it from the training sentences that contain the targeted pair. In contrast to existing distant-supervision approaches, which suffer from insufficient training corpora for extracting relations, our proposal mines implicit mutual relations from massive unlabeled corpora and transfers the semantic information of entity pairs into the RE model, making it more expressive and semantically plausible. After constructing an entity proximity graph based on the implicit mutual relations, we preserve the semantic relations of entity pairs by embedding each vertex of the graph into a low-dimensional space. As a result, we can easily and flexibly integrate the implicit mutual relations, along with other entity information such as entity types, into existing RE methods. Our experimental results on the New York Times and Google Distant Supervision datasets suggest that the proposed neural RE framework provides a promising improvement for the RE task and significantly outperforms state-of-the-art methods. Moreover, the component for mining implicit mutual relations is flexible enough to significantly improve the performance of both CNN-based and RNN-based RE models.
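To make the graph-construction and embedding step concrete, the following is a minimal sketch, not the paper's implementation: the entities, edge weights, and the neighbor-averaging embedding routine are all illustrative assumptions (a crude stand-in for DeepWalk/LINE-style graph embedding), shown only to convey how vertices of an entity proximity graph can be mapped into a low-dimensional space.

```python
import math
import random

# Toy entity proximity graph over four entities; edge weight = how often
# the pair co-occurs in unlabeled text (illustrative numbers, not real data).
edges = {("Paris", "France"): 5, ("Berlin", "Germany"): 4,
         ("Paris", "Berlin"): 1, ("France", "Germany"): 2}
nodes = sorted({n for pair in edges for n in pair})
idx = {n: i for i, n in enumerate(nodes)}

# Symmetric weighted adjacency matrix.
A = [[0.0] * len(nodes) for _ in nodes]
for (u, v), w in edges.items():
    A[idx[u]][idx[v]] = A[idx[v]][idx[u]] = float(w)

# Crude embedding: start from random vectors, repeatedly blend each vertex
# with the weighted average of its neighbours, then renormalise, so vertices
# joined by heavy edges drift toward one another in the embedding space.
random.seed(0)
DIM = 4
emb = {n: [random.gauss(0.0, 1.0) for _ in range(DIM)] for n in nodes}
for _ in range(10):
    nxt = {}
    for n in nodes:
        i = idx[n]
        total = sum(A[i]) or 1.0
        vec = list(emb[n])
        for m in nodes:
            w = A[i][idx[m]]
            for d in range(DIM):
                vec[d] += (w / total) * emb[m][d]
        norm = math.sqrt(sum(x * x for x in vec)) or 1.0
        nxt[n] = [x / norm for x in vec]
    emb = nxt

def cosine(a, b):
    """Cosine similarity between two unit-normalised entity embeddings."""
    return sum(x * y for x, y in zip(emb[a], emb[b]))

print(len(emb), len(emb["Paris"]))  # prints "4 4": 4 entities, 4-dim vectors
```

The resulting vectors can then be concatenated with word and position features as an extra input to a CNN- or RNN-based RE model, which is how the abstract's "easy and flexible integration" would look in practice.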