
    Document-Level Event Temporal Relation Extraction with Context Information

    • Abstract: Event temporal relation extraction is an important natural language understanding task with broad applications in downstream tasks such as knowledge graph construction, question answering, and narrative generation. Existing methods typically treat the task as a sentence-level event pair classification problem and solve it with a classification model. Because such methods rely on limited local sentence information, the accuracy of the extracted temporal relations is low, and the global consistency of the temporal relations across a document cannot be guaranteed. To address this problem, this paper proposes a document-level event temporal relation extraction method that fuses context information: a neural network model based on Bi-LSTM (bidirectional long short-term memory) learns temporal relation representations for the event pairs in a document, and a self-attention mechanism then incorporates information from the other event pairs in the context, yielding richer event pair representations for temporal relation classification. These context-enhanced representations improve document-level extraction by strengthening the classification of all event pairs in the document. Experiments on the TB-Dense (TimeBank-Dense) and MATRES (multi-axis temporal relations for start-points) datasets show that this method outperforms the current mainstream sentence-level methods.
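    To make the described pipeline concrete, the following is a minimal PyTorch sketch of the architecture the abstract outlines: a Bi-LSTM encodes the document, each event pair is represented by the hidden states of its two event triggers, and self-attention over all event pairs in the same document fuses context from the other pairs before classification. This is not the authors' implementation; the class name DocTemporalRelModel, all dimensions, and the trigger-indexing scheme are illustrative assumptions.

```python
import torch
import torch.nn as nn

# Minimal sketch (assumptions, not the paper's code): Bi-LSTM document
# encoder -> event pair representations from trigger hidden states ->
# self-attention across all pairs of one document -> relation classifier.
class DocTemporalRelModel(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=100, hidden_dim=128,
                 num_labels=6, num_heads=4):  # e.g., 6 TB-Dense relations
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.bilstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True,
                              bidirectional=True)
        pair_dim = 4 * hidden_dim  # two trigger states, each 2*hidden_dim
        # Self-attention lets each pair attend to the other pairs in the
        # document, injecting the context information the method relies on.
        self.self_attn = nn.MultiheadAttention(pair_dim, num_heads,
                                               batch_first=True)
        self.classifier = nn.Linear(pair_dim, num_labels)

    def forward(self, token_ids, pair_idx):
        # token_ids: (1, doc_len); pair_idx: (num_pairs, 2) trigger positions
        h, _ = self.bilstm(self.embed(token_ids))         # (1, doc_len, 2H)
        e1 = h[0, pair_idx[:, 0]]                         # (num_pairs, 2H)
        e2 = h[0, pair_idx[:, 1]]
        pairs = torch.cat([e1, e2], dim=-1).unsqueeze(0)  # (1, P, 4H)
        ctx, _ = self.self_attn(pairs, pairs, pairs)      # fuse other pairs
        return self.classifier(ctx.squeeze(0))            # (P, num_labels)

# Toy usage: a 12-token "document" with three annotated event pairs.
model = DocTemporalRelModel()
tokens = torch.randint(0, 10000, (1, 12))
pairs = torch.tensor([[1, 5], [5, 9], [1, 9]])
logits = model(tokens, pairs)  # one score vector per event pair
print(logits.shape)            # torch.Size([3, 6])
```

    Because every pair's representation is updated with information from all other pairs in the document before classification, decisions such as A-before-B and B-before-C can inform the A-versus-C prediction, which is how the method targets global consistency.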

       
