Relation extraction is the task of detecting and classifying semantic relations between entities in text, and of linking together the mentions that express the same relation. The task requires detecting and classifying relation mentions within a set of artifacts, typically drawn from text or XML documents. It is closely related to information extraction (IE), but IE additionally requires de-duplicating relations (disambiguation) and usually refers to extracting many different kinds of relations.
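To make the task concrete, here is a minimal, purely illustrative pattern-based extractor: it detects an entity pair in a sentence and classifies the relation linking them using two hand-written lexical patterns. The patterns, relation names, and example sentence are all hypothetical, not taken from any system cited below.

```python
import re

# Hand-written (pattern, relation label) pairs -- a deliberately tiny,
# illustrative inventory; real systems learn these signals instead.
PATTERNS = [
    (re.compile(r"(\w[\w ]*?) was born in ([\w ]+)"), "born_in"),
    (re.compile(r"(\w[\w ]*?) is the capital of ([\w ]+)"), "capital_of"),
]

def extract_relations(sentence):
    """Return (head entity, relation, tail entity) triples found in a sentence."""
    triples = []
    for pattern, relation in PATTERNS:
        for head, tail in pattern.findall(sentence):
            triples.append((head.strip(), relation, tail.strip()))
    return triples

print(extract_relations("Marie Curie was born in Warsaw"))
```

The papers collected below replace such brittle patterns with supervised, distantly supervised, and neural models.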

Relation Extraction Compendium

Papers

2007

  1. Razvan Bunescu, Raymond Mooney. Learning to Extract Relations from the Web using Minimal Supervision. ACL 2007. [https://www.aclweb.org/anthology/P07-1073/]

2009

  1. Iris Hendrickx, Su Nam Kim, Zornitsa Kozareva, Preslav Nakov, Diarmuid Ó Séaghdha, Sebastian Padó, Marco Pennacchiotti, Lorenza Romano, Stan Szpakowicz. SemEval-2010 Task 8: Multi-Way Classification of Semantic Relations between Pairs of Nominals. ACL 2009. [https://www.aclweb.org/anthology/S10-1006/]
  2. Mike Mintz, Steven Bills, Rion Snow and Dan Jurafsky. Distant supervision for relation extraction without labeled data. ACL 2009. [https://web.stanford.edu/~jurafsky/mintz.pdf]

2010

  1. Sebastian Riedel, Limin Yao, and Andrew McCallum. Modeling Relations and Their Mentions without Labeled Text. ECML 2010. [https://link.springer.com/content/pdf/10.1007%2F978-3-642-15939-8_10.pdf]

2011

  1. Raphael Hoffmann, Congle Zhang, Xiao Ling, Luke Zettlemoyer, Daniel S. Weld. Knowledge-Based Weak Supervision for Information Extraction of Overlapping Relations. ACL-HLT 2011. [https://www.aclweb.org/anthology/P11-1055/]

2012

  1. Richard Socher, Brody Huval, Christopher D. Manning, Andrew Y. Ng. Semantic Compositionality through Recursive Matrix-Vector Spaces. EMNLP-CoNLL 2012. [https://ai.stanford.edu/~ang/papers/emnlp12-SemanticCompositionalityRecursiveMatrixVectorSpaces.pdf]
  2. Mihai Surdeanu, Julie Tibshirani, Ramesh Nallapati, Christopher D. Manning. Multi-instance Multi-label Learning for Relation Extraction. EMNLP-CoNLL 2012. [https://www.aclweb.org/anthology/D12-1042.pdf]

2013

  1. Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, Jeffrey Dean. Distributed Representations of Words and Phrases and their Compositionality. NIPS 2013. [https://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf]
  2. ChunYang Liu, WenBo Sun, WenHan Chao, WanXiang Che. Convolution Neural Network for Relation Extraction. ADMA 2013. [https://link.springer.com/chapter/10.1007/978-3-642-53917-6_21]

2014

  1. Jeffrey Pennington, Richard Socher, Christopher D. Manning. GloVe: Global Vectors for Word Representation. EMNLP 2014. [https://www.aclweb.org/anthology/D14-1162.pdf]
  2. Daojian Zeng, Kang Liu, Siwei Lai, Guangyou Zhou and Jun Zhao. Relation Classification via Convolutional Deep Neural Network. COLING 2014. [https://www.aclweb.org/anthology/C14-1220.pdf]
  3. Mo Yu, Matthew R. Gormley and Mark Dredze. Factor-based Compositional Embedding Models. NIPS Workshop on Learning Semantics 2014. [https://www.cs.cmu.edu/~mgormley/papers/yu+gormley+dredze.nipsw.2014.pdf]

2015

  1. Cicero Nogueira dos Santos, Bing Xiang, Bowen Zhou. Classifying Relations by Ranking with Convolutional Neural Networks. ACL 2015. [https://www.aclweb.org/anthology/P15-1061.pdf]
  2. Dongxu Zhang, Dong Wang. Relation Classification via Recurrent Neural Network. arXiv preprint arXiv:1508.01006 (2015). [https://arxiv.org/abs/1508.01006]
  3. Thien Huu Nguyen, Ralph Grishman. Relation Extraction: Perspective from Convolutional Neural Networks. NAACL-HLT 2015. [https://www.aclweb.org/anthology/W15-1506.pdf]
  4. Shu Zhang, Dequan Zheng, Xinchen Hu, Ming Yang. Bidirectional Long Short-Term Memory Networks for Relation Classification. PACLIC 2015. [https://www.aclweb.org/anthology/Y15-1009/]
  5. Daojian Zeng, Kang Liu, Yubo Chen and Jun Zhao. Distant Supervision for Relation Extraction via Piecewise Convolutional Neural Networks. EMNLP 2015. [http://www.emnlp2015.org/proceedings/EMNLP/pdf/EMNLP203.pdf]
  6. Yang Liu, Furu Wei, Sujian Li, Heng Ji, Ming Zhou, Houfeng Wang. A Dependency-Based Neural Network for Relation Classification. ACL 2015. [https://www.aclweb.org/anthology/P15-2047.pdf]
  7. Yan Xu, Lili Mou, Ge Li, Yunchuan Chen, Hao Peng, Zhi Jin. Classifying Relations via Long Short Term Memory Networks along Shortest Dependency Paths. EMNLP 2015. [https://arxiv.org/pdf/1508.03720.pdf]
  8. Kun Xu, Yansong Feng, Songfang Huang, Dongyan Zhao. Semantic Relation Classification via Convolutional Neural Networks with Simple Negative Sampling. EMNLP 2015. [https://www.aclweb.org/anthology/D15-1062.pdf]

2016

  1. Yankai Lin, Shiqi Shen, Zhiyuan Liu, Huanbo Luan, Maosong Sun. Neural Relation Extraction with Selective Attention over Instances. ACL 2016. [http://nlp.csai.tsinghua.edu.cn/~lyk/publications/acl2016_nre.pdf]
  2. Makoto Miwa, Mohit Bansal. End-to-End Relation Extraction using LSTMs on Sequences and Tree Structures. ACL 2016. [https://www.aclweb.org/anthology/P16-1105.pdf]
  3. Peng Zhou, Wei Shi, Jun Tian, Zhenyu Qi, Bingchen Li, Hongwei Hao, Bo Xu. Attention-Based Bidirectional Long Short-Term Memory Networks for Relation Classification. ACL 2016.  [https://www.aclweb.org/anthology/P16-2034/]
  4. Minguang Xiao, Cong Liu. Semantic Relation Classification via Hierarchical Recurrent Neural Network with Attention. COLING 2016. [https://www.aclweb.org/anthology/C16-1119/]
  5. Xiaotian Jiang, Quan Wang, Peng Li, Bin Wang. Relation Extraction with Multi-instance Multi-label Convolutional Neural Networks. COLING 2016. [https://www.aclweb.org/anthology/C16-1139.pdf]
  6. Yatian Shen, Xuanjing Huang. Attention-Based Convolutional Neural Network for Semantic Relation Extraction. COLING 2016. [https://www.aclweb.org/anthology/C16-1238.pdf]
  7. Linlin Wang, Zhu Cao, Gerard de Melo and Zhiyuan Liu. Relation Classification via Multi-Level Attention CNNs. ACL 2016. [https://www.aclweb.org/anthology/P16-1123.pdf]
  8. Yan Xu, Ran Jia, Lili Mou, Ge Li, Yunchuan Chen, Yangyang Lu, Zhi Jin. Improved Relation Classification by Deep Recurrent Neural Networks with Data Augmentation. COLING 2016. [https://arxiv.org/pdf/1601.03651.pdf]
  9. Rui Cai, Xiaodong Zhang and Houfeng Wang. Bidirectional Recurrent Convolutional Neural Network for Relation Classification. ACL 2016. [https://www.aclweb.org/anthology/P16-1072.pdf]

2017

  1. Yi Wu, David Bamman, Stuart Russell. Adversarial Training for Relation Extraction. EMNLP 2017. [https://www.aclweb.org/anthology/D17-1187.pdf]
  2. Tianyu Liu, Kexiang Wang, Baobao Chang, Zhifang Sui. A Soft-label Method for Noise-tolerant Distantly Supervised Relation Extraction. EMNLP 2017. [https://www.aclweb.org/anthology/D17-1189.pdf]
  3. Wenyuan Zeng, Yankai Lin, Zhiyuan Liu, Maosong Sun. Incorporating Relation Paths in Neural Relation Extraction. EMNLP 2017. [https://www.aclweb.org/anthology/D17-1186.pdf]
  4. Ji Young Lee, Franck Dernoncourt, Peter Szolovits. MIT at SemEval-2017 Task 10: Relation Extraction with Convolutional Neural Networks. SemEval 2017. [https://arxiv.org/pdf/1704.01523.pdf]
  5. Desh Raj, Sunil Kumar Sahu, Ashish Anand. Learning local and global contexts using a convolutional recurrent network model for relation classification in biomedical text. CoNLL 2017. [https://www.aclweb.org/anthology/K17-1032.pdf]
  6. Hai Ye, Wenhan Chao, Zhunchen Luo, Zhoujun Li. Jointly Extracting Relations with Class Ties via Effective Deep Ranking. ACL 2017. [https://www.aclweb.org/anthology/P17-1166.pdf]
  7. Meishan Zhang, Yue Zhang, Guohong Fu. End-to-End Neural Relation Extraction with Global Optimization. EMNLP 2017. [https://www.aclweb.org/anthology/D17-1182.pdf]
  8. Fei Li, Meishan Zhang, Guohong Fu, Donghong Ji. A neural joint model for entity and relation extraction from biomedical text. BMC Bioinformatics 2017. [https://bmcbioinformatics.biomedcentral.com/articles/10.1186/s12859-017-1609-9]
  9. Yuntian Feng, Hongjun Zhang, Wenning Hao, Gang Chen. Joint Extraction of Entities and Relations Using Reinforcement Learning and Deep Learning. Computational Intelligence and Neuroscience 2017. [https://www.hindawi.com/journals/cin/2017/7643065/]

2018

  1. Fenia Christopoulou, Makoto Miwa, Sophia Ananiadou. A Walk-based Model on Entity Graphs for Relation Extraction. ACL 2018. [https://www.aclweb.org/anthology/P18-2014.pdf]
  2. Pengda Qin, Weiran Xu, William Yang Wang. DSGAN: Generative Adversarial Training for Distant Supervision Relation Extraction. ACL 2018. [https://www.aclweb.org/anthology/P18-1046.pdf]
  3. Jun Feng, Minlie Huang, Li Zhao, Yang Yang, Xiaoyan Zhu. Reinforcement Learning for Relation Classification from Noisy Data. AAAI 2018. [https://tianjun.me/static/essay_resources/RelationExtraction/Paper/AAAI2018Denoising.pdf]
  4. Pengda Qin, Weiran Xu, William Yang Wang. Robust Distant Supervision Relation Extraction via Deep Reinforcement Learning. ACL 2018. [https://arxiv.org/pdf/1805.09927.pdf]
  5. Xu Han, Zhiyuan Liu, Maosong Sun. Neural Knowledge Acquisition via Mutual Attention between Knowledge Graph and Text. AAAI 2018. [http://nlp.csai.tsinghua.edu.cn/~lzy/publications/aaai2018_jointnre.pdf]
  6. Xu Han, Pengfei Yu, Zhiyuan Liu, Maosong Sun, Peng Li. Hierarchical Relation Extraction with Coarse-to-Fine Grained Attention. EMNLP 2018. [https://www.aclweb.org/anthology/D18-1247.pdf]
  7. Shikhar Vashishth, Rishabh Joshi, Sai Suman Prayaga, Partha Talukdar, Chiranjib Bhattacharyya. RESIDE: Improving Distantly-Supervised Neural Relation Extraction using Side Information. EMNLP 2018. [https://www.aclweb.org/anthology/D18-1157.pdf]
  8. Tianyi Liu, Xinsong Zhang, Wanhao Zhou, Weijia Jia. Neural Relation Extraction via Inner-Sentence Noise Reduction and Transfer Learning. EMNLP 2018. [https://arxiv.org/pdf/1808.06738.pdf]
  9. Xu Han, Hao Zhu, Pengfei Yu, Ziyun Wang, Yuan Yao, Zhiyuan Liu, Maosong Sun. FewRel: A Large-Scale Supervised Few-Shot Relation Classification Dataset with State-of-the-Art Evaluation. EMNLP 2018. [https://arxiv.org/pdf/1810.10147.pdf]
  10. Zhengqiu He, Wenliang Chen, Zhenghua Li, Meishan Zhang, Wei Zhang, Min Zhang. SEE: Syntax-aware Entity Embedding for Neural Relation Extraction. AAAI 2018. [https://arxiv.org/pdf/1801.03603.pdf]

2019

  1. Joohong Lee, Sangwoo Seo, Yong Suk Choi. Semantic Relation Classification via Bidirectional LSTM Networks with Entity-aware Attention using Latent Entity Typing. arXiv 2019. [https://arxiv.org/pdf/1901.08163.pdf]
  2. Shanchan Wu, Yifan He. Enriching Pre-trained Language Model with Entity Information for Relation Classification. arXiv 2019. [https://arxiv.org/pdf/1905.08284.pdf]
  3. Yujin Yuan, Liyuan Liu, Siliang Tang, Zhongfei Zhang, Yueting Zhuang, Shiliang Pu, Fei Wu, Xiang Ren. Cross-relation Cross-bag Attention for Distantly-supervised Relation Extraction. AAAI 2019. [https://arxiv.org/pdf/1812.10604.pdf]
  4. Shanchan Wu, Kai Fan, Qiong Zhang. Improving Distantly Supervised Relation Extraction with Neural Noise Converter and Conditional Optimal Selector. AAAI 2019. [https://arxiv.org/pdf/1811.05616.pdf]
  5. Xinsong Zhang, Pengshuai Li, Weijia Jia, Hai Zhao. Multi-labeled Relation Extraction with Attentive Capsule Network. AAAI 2019. [https://arxiv.org/pdf/1811.04354.pdf]
  6. Ryuichi Takanobu, Tianyang Zhang, Jiexi Liu, Minlie Huang. A Hierarchical Framework for Relation Extraction with Reinforcement Learning. AAAI 2019. [https://arxiv.org/pdf/1811.03925.pdf]
  7. Sahil Garg, Aram Galstyan, Greg Ver Steeg, Irina Rish, Guillermo Cecchi, Shuyang Gao. Kernelized Hashcode Representations for Relation Extraction. AAAI 2019. [https://arxiv.org/pdf/1711.04044.pdf]

2020

  1. Yang Li, Guodong Long, Tao Shen, Tianyi Zhou, Lina Yao, Huan Huo, Jing Jiang. Self-Attention Enhanced Selective Gate with Entity-Aware Embedding for Distantly Supervised Relation Extraction. AAAI 2020. [https://arxiv.org/pdf/1911.11899.pdf]
  2. Tapas Nayak, Hwee Tou Ng. Effective Modeling of Encoder-Decoder Architecture for Joint Entity and Relation Extraction. AAAI 2020. [https://arxiv.org/pdf/1911.09886.pdf]

Video Tutorials

  1. Stanford CS 124: From Languages to Information, Dan Jurafsky. [Week 5: Relation Extraction and Question]
  2. University of Michigan (Coursera), Dragomir R. Radev. [Lecture 48: Relation Extraction]
  3. Stanford CS224U: Natural Language Understanding, Lecture 7: Relation Extraction. [https://www.bilibili.com/video/av56067156?p=7]

Code

  1. OpenNRE: https://github.com/thunlp/OpenNRE
  2. FewRel, a large-scale hand-annotated relation extraction dataset released by the Tsinghua University Natural Language Processing Lab: https://github.com/ProKil/FewRel#fewrel-dataset-toolkits-and-baseline-models

Domain Experts

  1. University of Washington: Luke Zettlemoyer
  2. The University of Texas at Austin: Raymond J. Mooney
  3. Stanford University: Dan Jurafsky, Bill MacCartney, Christopher Potts
  4. New York University: Ralph Grishman
  5. Institute of Automation, Chinese Academy of Sciences: Jun Zhao (赵军)

Datasets

  1. ACE 2005 Multilingual Training Corpus:[https://catalog.ldc.upenn.edu/LDC2006T06]
  2. SemEval-2010 Task 8 Dataset: multi-way classification of semantic relations between pairs of nominals. [http://semeval2.fbk.eu/semeval2.php?location=tasks#T11]
  3. NYT Dataset: generated by aligning Freebase relations with the New York Times corpus; sentences from 2005-2006 are used as the training corpus and sentences from 2007 as the test corpus. [http://iesl.cs.umass.edu/riedel/ecml/]
  4. Few-shot Datasets: FewRel, a few-shot relation classification dataset with 100 relation types and 70,000 instances, far larger than previous hand-annotated datasets of its kind. FewRel: A Large-Scale Supervised Few-Shot Relation Classification Dataset with State-of-the-Art Evaluation. [https://www.aclweb.org/anthology/D18-1514/]
  5. TAC Relation Extraction Dataset: TACRED, developed by the Stanford NLP Group, is a large relation extraction dataset with 106,264 examples built over English newswire and web text used in the NIST TAC KBP English slot-filling evaluations from 2009 to 2014. [https://nlp.stanford.edu/projects/tacred/]
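The distant-supervision alignment behind the NYT dataset can be sketched in a few lines: any sentence that mentions both entities of a knowledge-base triple is (noisily) labeled with that triple's relation. The KB fact and sentences below are illustrative placeholders, not actual Freebase or NYT data; note how the second sentence shows the label noise this heuristic introduces.

```python
# Toy knowledge base: (head entity, tail entity) -> relation.
KB = {("Barack Obama", "Hawaii"): "place_of_birth"}

def distant_label(sentences, kb):
    """Label every sentence containing both entities of a KB fact with its relation."""
    labeled = []
    for sent in sentences:
        for (head, tail), relation in kb.items():
            if head in sent and tail in sent:
                labeled.append((sent, head, tail, relation))
    return labeled

corpus = [
    "Barack Obama was born in Hawaii.",
    "Barack Obama visited Hawaii last week.",  # noisy match: wrong context
    "Hawaii is a US state.",                   # only one entity, not labeled
]
print(distant_label(corpus, KB))
```

Much of the distantly supervised work listed above (multi-instance learning, selective attention, reinforcement-learning denoising) is aimed precisely at coping with such noisy labels.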

VIP Content

End-to-end relation extraction aims to identify named entities while extracting the relations between them. Most recent work models the two subtasks jointly, either by unifying them in a single structured prediction network or through multi-task learning over shared representations.

Recently, Zexuan Zhong and Danqi Chen of Princeton University introduced a remarkably simple approach that achieves new state-of-the-art results on the standard benchmarks (ACE04, ACE05, and SciERC). The method is built from two independent pre-trained encoders, using the entity model only to provide input features for the relation model. Through a series of careful ablations, the study validates the importance of learning distinct contextual representations for entities and relations, of fusing entity information at the input layer of the relation model, and of incorporating global context.

The authors also propose an efficient approximation of this method that requires only a single pass over each encoder at inference time, yielding an 8-16x speedup with only a small drop in accuracy.
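A minimal sketch of how entity information can be fused at the relation model's input layer: each entity span is wrapped in typed marker tokens before the text is fed to the encoder. The helper below is a hypothetical illustration of that idea, not the authors' released code.

```python
def add_typed_markers(tokens, head_span, head_type, tail_span, tail_type):
    """Wrap subject/object entity spans in typed markers, e.g. <S:PER>...</S:PER>.

    Spans are (start, end) token indices, start inclusive and end exclusive.
    """
    hs, he = head_span
    ts, te = tail_span
    out = []
    for i, tok in enumerate(tokens):
        if i == hs:
            out.append(f"<S:{head_type}>")   # open subject marker
        if i == ts:
            out.append(f"<O:{tail_type}>")   # open object marker
        out.append(tok)
        if i == he - 1:
            out.append(f"</S:{head_type}>")  # close subject marker
        if i == te - 1:
            out.append(f"</O:{tail_type}>")  # close object marker
    return out

tokens = ["Zexuan", "Zhong", "works", "at", "Princeton"]
print(add_typed_markers(tokens, (0, 2), "PER", (4, 5), "ORG"))
```

The marked sequence would then be encoded by the relation model, with the relation classified from the marker positions' representations.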


Latest Papers

With the increasing legalization of medical and recreational use of cannabis, more research is needed to understand the association between depression and consumer behavior related to cannabis consumption. Big social media data has the potential to provide deeper insights about these associations to public health analysts. In this interdisciplinary study, we demonstrate the value of incorporating domain-specific knowledge in the learning process to identify the relationships between cannabis use and depression. We develop an end-to-end knowledge-infused deep learning framework (Gated-K-BERT) that leverages the pre-trained BERT language representation model and a domain-specific declarative knowledge source (the Drug Abuse Ontology, DAO) to jointly extract entities and their relationship using a gated fusion sharing mechanism. Our model is further tailored to give more focus to the entity mentions in the sentence through an entity-position-aware attention layer, where the ontology is used to locate the target entities' positions. Experimental results show that including the knowledge-aware attentive representation in association with BERT can extract the cannabis-depression relationship with better coverage than state-of-the-art relation extractors.
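The entity-position-aware attention mentioned in the abstract can be illustrated with a toy weighting scheme: tokens closer to a target entity's position receive larger weights via a softmax over negative distances. This is an illustrative sketch of the general idea, not the Gated-K-BERT implementation.

```python
import math

def position_aware_weights(num_tokens, entity_pos, temperature=1.0):
    """Softmax over negative token-to-entity distances: nearer tokens weigh more."""
    scores = [-abs(i - entity_pos) / temperature for i in range(num_tokens)]
    m = max(scores)                              # stabilize the softmax
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

w = position_aware_weights(5, entity_pos=2)
print(w)  # weights peak at the entity position (index 2)
```

In the paper's setting, such position information is derived from ontology-located entity mentions and combined with learned attention scores rather than used on its own.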
