Event extraction is the task of extracting event information from unstructured text and presenting it in a structured form. For example, from the sentence "Mao Zedong was born in Xiangtan, Hunan in 1893," one can extract the event {type: Birth, person: Mao Zedong, time: 1893, birthplace: Xiangtan, Hunan}. The event extraction task typically comprises two sub-tasks: event type identification and event argument filling.
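To make the target representation concrete, below is a minimal Python sketch of such a structured event record. The `Event` class and its field names are illustrative only and are not tied to any particular annotation schema.

```python
# Minimal sketch of the structured output targeted by event extraction.
# The class and field names are illustrative, not a dataset schema.
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Event:
    event_type: str                                           # event type identification sub-task
    arguments: Dict[str, str] = field(default_factory=dict)   # event argument filling sub-task

# The example sentence above would yield roughly this record:
birth_event = Event(
    event_type="Birth",
    arguments={
        "person": "Mao Zedong",
        "time": "1893",
        "place": "Xiangtan, Hunan",
    },
)
print(birth_event)
```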


Paper title: Meta-Learning with Dynamic-Memory-Based Prototypical Network for Few-Shot Event Detection

Abstract: Event detection (ED), a sub-task of event extraction, involves identifying triggers and classifying event mentions. Existing methods rely mainly on supervised learning and require large-scale labeled event datasets, which are unfortunately not readily available in many real-world application scenarios. In this paper, we formulate the ED task under limited labeled data as a few-shot learning problem. We propose a Dynamic-Memory-Based Prototypical Network (DMB-PN), which exploits a Dynamic Memory Network (DMN) not only to learn better prototypes for event types but also to produce more robust sentence encodings for event mentions. Unlike conventional prototypical networks, which compute event prototypes by simple averaging, our model is more robust and, thanks to the multi-hop mechanism of DMNs, can extract contextual information from event mentions multiple times. Experimental results show that DMB-PN not only handles sample scarcity more effectively than a range of baseline models, but also remains more robust when the variety of event types is large and the number of instances per type is extremely small.
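As an illustration only (not the authors' DMB-PN implementation), the sketch below contrasts the standard prototypical-network prototype, obtained by averaging support embeddings, with a prototype refined over several attention "hops" in the spirit of a dynamic memory network. The function names, hop rule, and toy data are all assumptions made for this example.

```python
# Illustrative contrast between a mean prototype and a multi-hop refined prototype.
import numpy as np

def mean_prototype(support: np.ndarray) -> np.ndarray:
    """Standard prototypical network: prototype = mean of support embeddings."""
    return support.mean(axis=0)

def multihop_prototype(support: np.ndarray, hops: int = 3) -> np.ndarray:
    """Refine the prototype by repeatedly attending over the support set.

    Each hop reweights the support embeddings by their similarity to the
    current prototype, so atypical mentions contribute less -- a rough
    analogue of the multi-hop memory updates described in the abstract.
    """
    proto = support.mean(axis=0)
    for _ in range(hops):
        scores = support @ proto                  # similarity of each mention to the prototype
        weights = np.exp(scores - scores.max())   # softmax attention weights
        weights /= weights.sum()
        proto = weights @ support                 # attention-weighted update
    return proto

# Toy support set: 5 event mentions embedded in 8 dimensions.
rng = np.random.default_rng(0)
support_embeddings = rng.normal(size=(5, 8))
print(mean_prototype(support_embeddings))
print(multihop_prototype(support_embeddings))
```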

Authors: Shumin Deng, Ningyu Zhang, Jiaojian Kang, Yichi Zhang, Wei Zhang, Huajun Chen


Latest Papers

Event extraction has long been treated as a sentence-level task in the IE community. We argue that this setting does not match human information-seeking behavior and leads to incomplete and uninformative extraction results. We propose a document-level neural event argument extraction model by formulating the task as conditional generation following event templates. We also compile a new document-level event extraction benchmark dataset, WikiEvents, which includes complete event and coreference annotation. On the task of argument extraction, we achieve absolute gains of 7.6% F1 and 5.7% F1 over the next best model on the RAMS and WikiEvents datasets, respectively. On the more challenging task of informative argument extraction, which requires implicit coreference reasoning, we achieve a 9.3% F1 gain over the best baseline. To demonstrate the portability of our model, we also create the first end-to-end zero-shot event extraction framework, achieving 97% of the fully supervised model's trigger extraction performance and 82% of its argument extraction performance given access to only 10 of the 33 event types on ACE.
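As a rough, hypothetical sketch of the conditional-generation formulation described above: the model input pairs an unfilled event template with the document, and a sequence-to-sequence model is trained to emit the template with the argument placeholders replaced by spans from the document. The template text and `<arg>` placeholder convention below are invented for illustration and may not match the templates actually used on RAMS or WikiEvents.

```python
# Sketch of casting argument extraction as conditional generation over an event template.
def build_generation_input(template: str, document: str) -> str:
    """Concatenate an unfilled event template with the document as the model input.

    A sequence-to-sequence model would then be trained to output the template
    with each <arg> placeholder replaced by an argument span from the document.
    """
    return f"{template} </s> {document}"

# Hypothetical attack-event template and document.
template = "<arg1> attacked <arg2> using <arg3>"
document = "Rebels shelled the airport with mortars on Tuesday."
print(build_generation_input(template, document))
# A trained model would ideally generate:
# "Rebels attacked the airport using mortars"
```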
