Relation extraction is the task of detecting and identifying semantic relations between entities in text, and linking together the mentions that express the same semantic relation.
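As a toy illustration of the task defined above, a pattern-based extractor maps surface patterns to relation labels and emits (head, relation, tail) triples. The patterns, relation labels, and sentence below are invented for illustration only; real systems use the learned models surveyed in this collection.

```python
import re

# Hypothetical surface patterns mapped to hypothetical relation labels.
# Each pattern captures a head and a tail entity mention.
PATTERNS = {
    r"(?P<head>\w[\w\s.]*?) was born in (?P<tail>\w[\w\s]*)": "place_of_birth",
    r"(?P<head>\w[\w\s.]*?) is the capital of (?P<tail>\w[\w\s]*)": "capital_of",
}

def extract_relations(sentence):
    """Return (head, relation, tail) triples matched by the toy patterns."""
    triples = []
    for pattern, relation in PATTERNS.items():
        for m in re.finditer(pattern, sentence):
            triples.append((m.group("head").strip(), relation, m.group("tail").strip()))
    return triples

print(extract_relations("Barack Obama was born in Hawaii."))
# [('Barack Obama', 'place_of_birth', 'Hawaii')]
```

Hand-written patterns like these do not generalize, which is exactly what motivates the supervised, distantly supervised, and few-shot approaches listed below.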

Knowledge Collection

Relation Extraction Collection

Introductory Tutorials

Surveys

Papers

2007

  1. Razvan Bunescu, Raymond Mooney. Learning to Extract Relations from the Web using Minimal Supervision. ACL 2007. [https://www.aclweb.org/anthology/P07-1073/]

2009

  1. Iris Hendrickx, Su Nam Kim, Zornitsa Kozareva, Preslav Nakov, Diarmuid Ó Séaghdha, Sebastian Padó, Marco Pennacchiotti, Lorenza Romano, Stan Szpakowicz. SemEval-2010 Task 8: Multi-Way Classification of Semantic Relations between Pairs of Nominals. SemEval 2010. [https://www.aclweb.org/anthology/S10-1006/]
  2. Mike Mintz, Steven Bills, Rion Snow and Dan Jurafsky. Distant supervision for relation extraction without labeled data. ACL 2009. [https://web.stanford.edu/~jurafsky/mintz.pdf]

2010

  1. Sebastian Riedel, Limin Yao, and Andrew McCallum. Modeling Relations and Their Mentions without Labeled Text. ECML 2010. [https://link.springer.com/content/pdf/10.1007%2F978-3-642-15939-8_10.pdf]

2011

  1. Raphael Hoffmann, Congle Zhang, Xiao Ling, Luke Zettlemoyer, Daniel S. Weld. Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations. ACL-HLT 2011. [https://www.aclweb.org/anthology/P11-1055/]

2012

  1. Richard Socher, Brody Huval, Christopher D. Manning, Andrew Y. Ng. Semantic Compositionality through Recursive Matrix-Vector Spaces. EMNLP-CoNLL 2012. [https://ai.stanford.edu/~ang/papers/emnlp12-SemanticCompositionalityRecursiveMatrixVectorSpaces.pdf]
  2. Mihai Surdeanu, Julie Tibshirani, Ramesh Nallapati, Christopher D. Manning. Multi-instance Multi-label Learning for Relation Extraction. EMNLP-CoNLL 2012. [https://www.aclweb.org/anthology/D12-1042.pdf]

2013

  1. Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, Jeffrey Dean. Distributed Representations of Words and Phrases and their Compositionality. NIPS 2013. [https://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf]
  2. ChunYang Liu, WenBo Sun, WenHan Chao, WanXiang Che. Convolution Neural Network for Relation Extraction. ADMA 2013. [https://link.springer.com/chapter/10.1007/978-3-642-53917-6_21]

2014

  1. Jeffrey Pennington, Richard Socher, Christopher D. Manning. GloVe: Global Vectors for Word Representation. EMNLP 2014. [https://www.aclweb.org/anthology/D14-1162.pdf]
  2. Daojian Zeng, Kang Liu, Siwei Lai, Guangyou Zhou and Jun Zhao. Relation Classification via Convolutional Deep Neural Network. COLING 2014. [https://www.aclweb.org/anthology/C14-1220.pdf]
  3. Mo Yu, Matthew R. Gormley and Mark Dredze. Factor-based Compositional Embedding Models. NIPS Workshop on Learning Semantics 2014. [https://www.cs.cmu.edu/~mgormley/papers/yu+gormley+dredze.nipsw.2014.pdf]

2015

  1. Cicero Nogueira dos Santos, Bing Xiang, Bowen Zhou. Classifying Relations by Ranking with Convolutional Neural Networks. ACL 2015. [https://www.aclweb.org/anthology/P15-1061.pdf]
  2. Dongxu Zhang, Dong Wang. Relation Classification via Recurrent Neural Network. arXiv preprint arXiv:1508.01006 (2015). [https://arxiv.org/abs/1508.01006]
  3. Thien Huu Nguyen, Ralph Grishman. Relation Extraction: Perspective from Convolutional Neural Networks. NAACL-HLT 2015. [https://www.aclweb.org/anthology/W15-1506.pdf]
  4. Shu Zhang, Dequan Zheng, Xinchen Hu, Ming Yang. Bidirectional Long Short-Term Memory Networks for Relation Classification. PACLIC 2015. [https://www.aclweb.org/anthology/Y15-1009/]
  5. Daojian Zeng, Kang Liu, Yubo Chen and Jun Zhao. Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks. EMNLP 2015. [http://www.emnlp2015.org/proceedings/EMNLP/pdf/EMNLP203.pdf]
  6. Yang Liu, Furu Wei, Sujian Li, Heng Ji, Ming Zhou, Houfeng Wang. A Dependency-Based Neural Network for Relation Classification. ACL 2015. [https://www.aclweb.org/anthology/P15-2047.pdf]
  7. Yan Xu, Lili Mou, Ge Li, Yunchuan Chen, Hao Peng, Zhi Jin. Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Path. EMNLP 2015. [https://arxiv.org/pdf/1508.03720.pdf]
  8. Kun Xu, Yansong Feng, Songfang Huang, Dongyan Zhao. Semantic Relation Classification via Convolutional Neural Networks with Simple Negative Sampling. EMNLP 2015. [https://www.aclweb.org/anthology/D15-1062.pdf]
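Several of the 2015 papers above (items 6 and 7) classify relations using the shortest dependency path (SDP) between the two entities, which discards words irrelevant to the relation. The core path computation can be sketched as plain BFS over the parse tree treated as an undirected graph; the tiny parse below is hand-written for illustration, not the output of a real parser.

```python
from collections import deque

def shortest_dependency_path(heads, start, end):
    """heads[i] is the index of token i's syntactic head (-1 for the root).
    Returns the token indices on the shortest path from start to end."""
    # Build an undirected adjacency list from the head pointers.
    adj = {i: set() for i in range(len(heads))}
    for i, h in enumerate(heads):
        if h >= 0:
            adj[i].add(h)
            adj[h].add(i)
    # Plain BFS, recording parents so the path can be recovered.
    parent = {start: None}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        if node == end:
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for nxt in adj[node]:
            if nxt not in parent:
                parent[nxt] = node
                queue.append(nxt)
    return []

# "Obama(0) visited(1) the(2) White(3) House(4)": "visited" is the root,
# "Obama" and "House" attach to it, "the" and "White" attach to "House".
tokens = ["Obama", "visited", "the", "White", "House"]
heads = [1, -1, 4, 4, 1]
path = shortest_dependency_path(heads, 0, 4)
print([tokens[i] for i in path])  # ['Obama', 'visited', 'House']
```

In SDP-LSTM style models, the word (and dependency-label) sequence along this path is then fed to an LSTM for relation classification.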

2016

  1. Yankai Lin, Shiqi Shen, Zhiyuan Liu, Huanbo Luan, Maosong Sun. Neural Relation Extraction with Selective Attention over Instances. ACL 2016. [http://nlp.csai.tsinghua.edu.cn/~lyk/publications/acl2016_nre.pdf]
  2. Makoto Miwa, Mohit Bansal. End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures. ACL 2016. [https://www.aclweb.org/anthology/P16-1105.pdf]
  3. Peng Zhou, Wei Shi, Jun Tian, Zhenyu Qi, Bingchen Li, Hongwei Hao, Bo Xu. Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification. ACL 2016. [https://www.aclweb.org/anthology/P16-2034/]
  4. Minguang Xiao, Cong Liu. Semantic Relation Classification via Hierarchical Recurrent Neural Network with Attention. COLING 2016. [https://www.aclweb.org/anthology/C16-1119/]
  5. Xiaotian Jiang, Quan Wang, Peng Li, Bin Wang. Relation Extraction with Multi-instance Multi-label Convolutional Neural Networks. COLING 2016. [https://www.aclweb.org/anthology/C16-1139.pdf]
  6. Yatian Shen, Xuanjing Huang. Attention-Based Convolutional Neural Network for Semantic Relation Extraction. COLING 2016. [https://www.aclweb.org/anthology/C16-1238.pdf]
  7. Linlin Wang, Zhu Cao, Gerard de Melo and Zhiyuan Liu. Relation Classification via Multi-Level Attention CNNs. ACL 2016. [https://www.aclweb.org/anthology/P16-1123.pdf]
  8. Yan Xu, Ran Jia, Lili Mou, Ge Li, Yunchuan Chen, Yangyang Lu, Zhi Jin. Improved Relation Classification by Deep Recurrent Neural Networks with Data Augmentation. COLING 2016. [https://arxiv.org/pdf/1601.03651.pdf]
  9. Rui Cai, Xiaodong Zhang and Houfeng Wang. Bidirectional Recurrent Convolutional Neural Network for Relation Classification. ACL 2016. [https://www.aclweb.org/anthology/P16-1072.pdf]
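The selective-attention idea from Lin et al. (2016) above can be sketched in a few lines: under distant supervision, all sentences mentioning the same entity pair form a "bag"; each sentence encoding is weighted by how well it matches a query relation embedding, so noisy sentences receive low attention weight. The random vectors below stand in for a trained sentence encoder and relation embedding.

```python
import math
import random

random.seed(0)
d = 8                                                     # embedding dimension
bag = [[random.gauss(0, 1) for _ in range(d)] for _ in range(5)]  # 5 sentence encodings
relation = [random.gauss(0, 1) for _ in range(d)]         # query relation embedding

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

scores = [dot(s, relation) for s in bag]                  # match score per sentence
m = max(scores)                                           # subtract max for stability
weights = [math.exp(s - m) for s in scores]
alpha = [w / sum(weights) for w in weights]               # softmax attention weights
# Weighted bag representation: noisy sentences contribute little.
bag_repr = [sum(a * s[j] for a, s in zip(alpha, bag)) for j in range(d)]
print(len(bag_repr), sum(alpha))
```

The resulting bag representation is then fed to a relation classifier; cruder alternatives are averaging all sentences or keeping only the highest-scoring one.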

2017

  1. Yi Wu, David Bamman, Stuart Russell. Adversarial Training for Relation Extraction. EMNLP 2017. [https://www.aclweb.org/anthology/D17-1187.pdf]
  2. Tianyu Liu, Kexiang Wang, Baobao Chang, Zhifang Sui. A Soft-label Method for Noise-tolerant Distantly Supervised Relation Extraction. EMNLP 2017. [https://www.aclweb.org/anthology/D17-1189.pdf]
  3. Wenyuan Zeng, Yankai Lin, Zhiyuan Liu, Maosong Sun. Incorporating Relation Paths in Neural Relation Extraction. EMNLP 2017. [https://www.aclweb.org/anthology/D17-1186.pdf]
  4. Ji Young Lee, Franck Dernoncourt, Peter Szolovits. MIT at SemEval-2017 Task 10: Relation Extraction with Convolutional Neural Networks. SemEval 2017. [https://arxiv.org/pdf/1704.01523.pdf]
  5. Desh Raj, Sunil Kumar Sahu, Ashish Anand. Learning local and global contexts using a convolutional recurrent network model for relation classification in biomedical text. CoNLL 2017. [https://www.aclweb.org/anthology/K17-1032.pdf]
  6. Hai Ye, Wenhan Chao, Zhunchen Luo, Zhoujun Li. Jointly Extracting Relations with Class Ties via Effective Deep Ranking. ACL 2017. [https://www.aclweb.org/anthology/P17-1166.pdf]
  7. Meishan Zhang, Yue Zhang, Guohong Fu. End-to-End Neural Relation Extraction with Global Optimization. EMNLP 2017. [https://www.aclweb.org/anthology/D17-1182.pdf]
  8. Fei Li, Meishan Zhang, Guohong Fu, Donghong Ji. A neural joint model for entity and relation extraction from biomedical text. BMC Bioinformatics 2017. [https://bmcbioinformatics.biomedcentral.com/articles/10.1186/s12859-017-1609-9]
  9. Yuntian Feng, Hongjun Zhang, Wenning Hao, Gang Chen. Joint Extraction of Entities and Relations Using Reinforcement Learning and Deep Learning. Computational Intelligence and Neuroscience 2017. [https://www.hindawi.com/journals/cin/2017/7643065/]

2018

  1. Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou. A Walk-based Model on Entity Graphs for Relation Extraction. ACL 2018. [https://www.aclweb.org/anthology/P18-2014.pdf]
  2. Pengda Qin, Weiran Xu, William Yang Wang. DSGAN: Generative Adversarial Training for Distant Supervision Relation Extraction. ACL 2018. [https://www.aclweb.org/anthology/P18-1046.pdf]
  3. Jun Feng, Minlie Huang, Li Zhao, Yang Yang, Xiaoyan Zhu. Reinforcement Learning for Relation Classification from Noisy Data. AAAI 2018. [https://tianjun.me/static/essay_resources/RelationExtraction/Paper/AAAI2018Denoising.pdf]
  4. Pengda Qin, Weiran Xu, William Yang Wang. Robust Distant Supervision Relation Extraction via Deep Reinforcement Learning. ACL 2018. [https://arxiv.org/pdf/1805.09927.pdf]
  5. Xu Han, Zhiyuan Liu, Maosong Sun. Neural Knowledge Acquisition via Mutual Attention between Knowledge Graph and Text. AAAI 2018. [http://nlp.csai.tsinghua.edu.cn/~lzy/publications/aaai2018_jointnre.pdf]
  6. Xu Han, Pengfei Yu, Zhiyuan Liu, Maosong Sun, Peng Li. Hierarchical Relation Extraction with Coarse-to-Fine Grained Attention. EMNLP 2018. [https://www.aclweb.org/anthology/D18-1247.pdf]
  7. Shikhar Vashishth, Rishabh Joshi, Sai Suman Prayaga, Partha Talukdar, Chiranjib Bhattacharyya. RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information. EMNLP 2018. [https://www.aclweb.org/anthology/D18-1157.pdf]
  8. Tianyi Liu, Xinsong Zhang, Wanhao Zhou, Weijia Jia. Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning. EMNLP 2018. [https://arxiv.org/pdf/1808.06738.pdf]
  9. Xu Han, Hao Zhu, Pengfei Yu, Ziyun Wang, Yuan Yao, Zhiyuan Liu, Maosong Sun. FewRel: A Large-Scale Supervised Few-Shot Relation Classification Dataset with State-of-the-Art Evaluation. EMNLP 2018. [https://arxiv.org/pdf/1810.10147.pdf]
  10. Zhengqiu He, Wenliang Chen, Zhenghua Li, Meishan Zhang, Wei Zhang, Min Zhang. SEE: Syntax-aware Entity Embedding for Neural Relation Extraction. AAAI 2018. [https://arxiv.org/pdf/1801.03603.pdf]

2019

  1. Joohong Lee, Sangwoo Seo, Yong Suk Choi. Semantic Relation Classification via Bidirectional LSTM Networks with Entity-aware Attention using Latent Entity Typing. arXiv 2019. [https://arxiv.org/pdf/1901.08163.pdf]
  2. Shanchan Wu, Yifan He. Enriching Pre-trained Language Model with Entity Information for Relation Classification. arXiv 2019. [https://arxiv.org/pdf/1905.08284.pdf]
  3. Yujin Yuan, Liyuan Liu, Siliang Tang, Zhongfei Zhang, Yueting Zhuang, Shiliang Pu, Fei Wu, Xiang Ren. Cross-relation Cross-bag Attention for Distantly-supervised Relation Extraction. AAAI 2019. [https://arxiv.org/pdf/1812.10604.pdf]
  4. Shanchan Wu, Kai Fan, Qiong Zhang. Improving Distantly Supervised Relation Extraction with Neural Noise Converter and Conditional Optimal Selector. AAAI 2019. [https://arxiv.org/pdf/1811.05616.pdf]
  5. Xinsong Zhang, Pengshuai Li, Weijia Jia, Hai Zhao. Multi-labeled Relation Extraction with Attentive Capsule Network. AAAI 2019. [https://arxiv.org/pdf/1811.04354.pdf]
  6. Ryuichi Takanobu, Tianyang Zhang, Jiexi Liu, Minlie Huang. A Hierarchical Framework for Relation Extraction with Reinforcement Learning. AAAI 2019. [https://arxiv.org/pdf/1811.03925.pdf]
  7. Sahil Garg, Aram Galstyan, Greg Ver Steeg, Irina Rish, Guillermo Cecchi, Shuyang Gao. Kernelized Hashcode Representations for Relation Extraction. AAAI 2019. [https://arxiv.org/pdf/1711.04044.pdf]

2020

  1. Yang Li, Guodong Long, Tao Shen, Tianyi Zhou, Lina Yao, Huan Huo, Jing Jiang. Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction. AAAI 2020. [https://arxiv.org/pdf/1911.11899.pdf]
  2. Tapas Nayak, Hwee Tou Ng. Effective Modeling of Encoder-Decoder Architecture for Joint Entity and Relation Extraction. AAAI 2020. [https://arxiv.org/pdf/1911.09886.pdf]

Video Tutorials

  1. CS 124: From Languages to Information, Dan Jurafsky. [Week 5: Relation Extraction and Question Answering]
  2. University of Michigan on Coursera, Dragomir R. Radev. [Lecture 48: Relation Extraction]
  3. Stanford CS224U: Natural Language Understanding | Lecture 7 – Relation Extraction. [https://www.bilibili.com/video/av56067156?p=7]

Code

  1. OpenNRE: https://github.com/thunlp/OpenNRE
  2. FewRel, a large-scale, human-annotated relation extraction dataset released by the Tsinghua University Natural Language Processing Lab: https://github.com/ProKil/FewRel#fewrel-dataset-toolkits-and-baseline-models

Domain Experts

  1. The University of Washington: Luke Zettlemoyer
  2. The University of Texas at Austin: Raymond J. Mooney
  3. Stanford University: Dan Jurafsky, Bill MacCartney, Christopher Potts
  4. New York University: Ralph Grishman
  5. Institute of Automation, Chinese Academy of Sciences: Jun Zhao

Datasets

  1. ACE 2005 Multilingual Training Corpus: [https://catalog.ldc.upenn.edu/LDC2006T06]
  2. SemEval-2010 Task 8 Dataset: multi-way classification of semantic relations between pairs of nominals. [http://semeval2.fbk.eu/semeval2.php?location=tasks#T11]
  3. NYT Dataset: generated by aligning Freebase relations with the New York Times corpus; sentences from 2005-2006 serve as the training set and sentences from 2007 as the test set. [http://iesl.cs.umass.edu/riedel/ecml/]
  4. Few-shot Datasets: FewRel, a relation classification dataset containing 100 relation types and 70,000 instances, substantially larger than earlier human-annotated datasets of its kind. FewRel: A Large-Scale Supervised Few-Shot Relation Classification Dataset with State-of-the-Art Evaluation. [https://www.aclweb.org/anthology/D18-1514/]
  5. TAC Relation Extraction Dataset: TACRED, built by the Stanford NLP Group, is a large-scale relation extraction dataset with 106,264 examples drawn from the English newswire and web text used in the NIST TAC KBP English slot filling evaluations from 2009 to 2014. [https://nlp.stanford.edu/projects/tacred/]
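The NYT dataset entry above describes the distant-supervision construction recipe (Mintz et al. 2009; Riedel et al. 2010): any sentence mentioning both entities of a knowledge-base triple is heuristically labeled with that triple's relation. The tiny KB, corpus, and relation names below are made-up examples; the second sentence shows why this labeling is noisy.

```python
# Toy knowledge base: (head entity, tail entity) -> relation.
kb = {
    ("Barack Obama", "Hawaii"): "/people/person/place_of_birth",
    ("Steve Jobs", "Apple"): "/business/company/founders",
}

corpus = [
    "Barack Obama was born in Hawaii .",
    "Barack Obama later visited Hawaii on vacation .",  # noisy: not a birth fact
    "Steve Jobs founded Apple in 1976 .",
]

def distant_label(corpus, kb):
    """Label every sentence containing both entities of a KB triple."""
    labeled = []
    for sentence in corpus:
        for (head, tail), relation in kb.items():
            if head in sentence and tail in sentence:
                labeled.append((sentence, head, tail, relation))
    return labeled

for example in distant_label(corpus, kb):
    print(example)
```

The vacation sentence gets labeled /people/person/place_of_birth even though it does not express that relation; handling this label noise is the focus of the multi-instance, attention, and reinforcement-learning papers listed above.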

VIP Content

Title: Relation Adversarial Network for Low Resource Knowledge Graph Completion

Abstract: Knowledge Graph Completion (KGC) improves a knowledge graph by filling in missing links via link prediction or relation extraction. One of the main difficulties of KGC is the lack of resources. Previous approaches assume there are enough training triples to learn general-purpose vectors for entities and relations, or that there are enough labeled sentences to train a capable relation extraction model. However, low-resource relations are very common in KGs, and newly added relations often lack sufficient known samples for training. In this work, we aim to predict new facts in a challenging setting where only limited training instances are available. We propose a general framework, the weighted Relation Adversarial Network, which uses an adversarial process to help adapt the knowledge/features learned from high-resource relations to different but related low-resource relations. Specifically, the framework employs a relation discriminator to distinguish samples from different relations, helping to learn relation-invariant features that transfer more easily from source relations to target relations. Experimental results show that the proposed method outperforms previous approaches on both link prediction and relation extraction in low-resource settings.


Latest Papers

Reader reviews of literary fiction on social media, especially those in persistent, dedicated forums, create and are in turn driven by underlying narrative frameworks. In their comments about a novel, readers generally include only a subset of characters and their relationships, thus offering a limited perspective on that work. Yet in aggregate, these reviews capture an underlying narrative framework comprising different actants (people, places, things), their roles, and interactions that we label the "consensus narrative framework". We represent this framework in the form of an actant-relationship story graph. Extracting this graph is a challenging computational problem, which we pose as a latent graphical model estimation problem. Posts and reviews are viewed as samples of subgraphs/networks of the hidden narrative framework. Inspired by the qualitative narrative theory of Greimas, we formulate a graphical generative Machine Learning (ML) model where nodes represent actants, and multi-edges and self-loops among nodes capture context-specific relationships. We develop a pipeline of interlocking automated methods to extract key actants and their relationships, and apply it to thousands of reviews and comments posted on Goodreads.com. We manually derive the ground truth narrative framework from SparkNotes, and then use word embedding tools to compare relationships in ground truth networks with our extracted networks. We find that our automated methodology generates highly accurate consensus narrative frameworks: for our four target novels, with approximately 2900 reviews per novel, we report average coverage/recall of important relationships of over 80% and an average edge detection rate of over 89%. These extracted narrative frameworks can generate insight into how people (or classes of people) read and how they recount what they have read to others.
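The evaluation described in this abstract compares an extracted actant-relationship graph against a ground-truth graph and reports recall of important relationships (edges). That comparison can be sketched as a set operation over edges; the character names and edge sets below are invented for illustration, not taken from the paper's data.

```python
def edge_recall(extracted, ground_truth):
    """Fraction of ground-truth edges recovered, ignoring edge direction."""
    normalize = lambda edges: {frozenset(e) for e in edges}
    gt = normalize(ground_truth)
    return len(normalize(extracted) & gt) / len(gt)

# Hypothetical actant-relationship edges for a novel.
ground_truth = [("Gatsby", "Daisy"), ("Nick", "Gatsby"),
                ("Tom", "Daisy"), ("Nick", "Jordan")]
extracted = [("Daisy", "Gatsby"), ("Gatsby", "Nick"), ("Tom", "Daisy")]

print(edge_recall(extracted, ground_truth))  # 0.75
```

The paper's full evaluation additionally uses word embeddings to check that the relationship labels on matched edges agree, not just that the edges exist.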
