Event extraction is the task of extracting event information from unstructured text and presenting it in a structured form. For example, from the sentence "Mao Zedong was born in Xiangtan, Hunan in 1893", we extract the event {type: Birth, person: Mao Zedong, time: 1893, place of birth: Xiangtan, Hunan}. Event extraction typically comprises two subtasks: event type detection and event argument filling.
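
To make the structured output concrete, here is a minimal Python sketch of such an event record; the `Event` class and its field names are illustrative assumptions, not a standard schema:

```python
from dataclasses import dataclass, field

@dataclass
class Event:
    """A structured event record extracted from unstructured text."""
    event_type: str  # e.g. "Birth"
    arguments: dict[str, str] = field(default_factory=dict)  # role -> text span

# The example sentence above, expressed as a structured event.
birth_event = Event(
    event_type="Birth",
    arguments={
        "person": "Mao Zedong",
        "time": "1893",
        "place of birth": "Xiangtan, Hunan",
    },
)
print(birth_event)
```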


Event argument extraction (EAE) is an important task in information extraction that identifies the arguments filling specific roles in an event. In this work, we cast EAE as a question-based cloze task and empirically analyze the performance of fixed discrete token templates. Since producing manually annotated question templates is usually time-consuming and labor-intensive, we further propose a novel approach named "Learning to Ask", which learns optimized question templates for EAE without human annotation. Experiments on the ACE-2005 dataset show that our optimized-question-based method achieves state-of-the-art performance in both the few-shot and fully supervised settings.
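
A minimal sketch of the question-based cloze formulation, assuming the Hugging Face `transformers` library; the template wording is a hypothetical hand-written template (the method described above learns such templates automatically), and `bert-base-uncased` is just a convenient stand-in model:

```python
# pip install transformers torch
from transformers import pipeline

# A masked language model acts as the cloze reader.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "Mao Zedong was born in Xiangtan, Hunan in 1893."

# Hand-written cloze question for the "time" role of a Birth event.
template = f"{sentence} Mao Zedong was born in the year [MASK]."

# The top-ranked fillers for [MASK] serve as candidate argument values.
# Note: fill-mask predicts one token at a time, so multi-token arguments
# such as "Xiangtan, Hunan" would need span-based decoding in practice.
for candidate in fill_mask(template, top_k=3):
    print(candidate["token_str"], round(candidate["score"], 3))
```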


Few-shot Learning (FSL) aims to make predictions based on a limited number of samples. Structured data such as knowledge graphs and ontology libraries have been leveraged to benefit the few-shot setting in various tasks. However, the priors adopted by existing methods suffer from missing knowledge, knowledge noise, and knowledge heterogeneity, which hinder few-shot performance. In this study, we explore knowledge injection for FSL with pre-trained language models and propose ontology-enhanced prompt-tuning (OntoPrompt). Specifically, we develop an ontology transformation based on an external knowledge graph to address the missing-knowledge issue, completing structured knowledge and converting it to text. We further introduce span-sensitive knowledge injection via a visible matrix to select informative knowledge and handle the knowledge-noise issue. To bridge the gap between knowledge and text, we propose a collective training algorithm that optimizes the representations jointly. We evaluate OntoPrompt on three tasks (relation extraction, event extraction, and knowledge graph completion) with eight datasets. Experimental results demonstrate that our approach obtains better few-shot performance than the baselines.
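
A minimal sketch of the visible-matrix idea behind span-sensitive knowledge injection, assuming NumPy; the token layout, the anchor span, and the ontology snippet are illustrative assumptions, not OntoPrompt's actual implementation:

```python
import numpy as np

# The input: sentence tokens followed by injected ontology text.
sentence = ["Mao", "Zedong", "was", "born", "in", "Xiangtan"]
knowledge = ["person", ":", "revolutionary"]  # hypothetical ontology snippet
n, k = len(sentence), len(knowledge)

# The knowledge attaches to the "Mao Zedong" span (assumed anchor).
anchor_start, anchor_end = 0, 2

# visible[i, j] == 1 means token i may attend to token j.
visible = np.zeros((n + k, n + k), dtype=int)
visible[:n, :n] = 1  # sentence tokens all see each other
visible[n:, n:] = 1  # knowledge tokens all see each other
# Knowledge is visible only to (and from) its anchor span, so it cannot
# perturb unrelated sentence tokens (mitigating knowledge noise).
visible[anchor_start:anchor_end, n:] = 1
visible[n:, anchor_start:anchor_end] = 1

print(visible)
```

Such a mask is typically applied to the self-attention logits of a pre-trained language model, e.g. by adding large negative values at the zero entries.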
