arXiv

Event-Centric Question Answering via Contrastive Learning and Invertible Event Transformation

Authors: Lu, Junru; Tan, Xingwei; Pergola, Gabriele; Gui, Lin; He, Yulan

Affiliations: Department of Computer Science, University of Warwick, United Kingdom; Department of Informatics, King's College London, United Kingdom; The Alan Turing Institute, United Kingdom

Publication: arXiv

Year: 2022


Subject: Linear transformations

Abstract: Human reading comprehension often requires reasoning about event semantic relations in narratives, represented by Event-centric Question-Answering (QA). To address event-centric QA, we propose a novel QA model with contrastive learning and invertible event transformation, called TranCLR. Our proposed model utilizes an invertible transformation matrix to project semantic vectors of events into a common event embedding space, trained with contrastive learning, thus naturally injecting event semantic knowledge into mainstream QA pipelines. The transformation matrix is fine-tuned with the annotated event relation types between events that occur in questions and those in answers, using event-aware question vectors. Experimental results on the Event Semantic Relation Reasoning (ESTER) dataset show significant improvements in both generative and extractive settings compared to the existing strong baselines, achieving over an 8.4% gain in token-level F1 score and a 3.0% gain in Exact Match (EM) score under the multi-answer setting. Qualitative analysis reveals the high quality of the answers generated by TranCLR, demonstrating the feasibility of injecting event knowledge into QA model learning. Our code and models can be found at https://***/LuJunru/TranCLR. Copyright © 2022, The Authors. All rights reserved.
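To make the two ingredients named in the abstract concrete — an invertible transformation matrix projecting event vectors into a common embedding space, and a contrastive objective over the projected vectors — the following NumPy sketch illustrates the general mechanism. It is not the paper's implementation: the dimensions, the random matrix `T`, the toy positive pairs, and the generic InfoNCE-style loss are all hypothetical stand-ins for components that TranCLR learns from annotated event relation types.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8  # hypothetical embedding dimension (not from the paper)
n = 4  # hypothetical number of event vectors in a batch

# An invertible transformation matrix T. A random Gaussian matrix is
# almost surely invertible; in TranCLR this matrix would be learned.
T = rng.normal(size=(d, d))
assert abs(np.linalg.det(T)) > 1e-8  # sanity-check invertibility

events = rng.normal(size=(n, d))  # stand-in event semantic vectors
projected = events @ T            # common event embedding space

# Invertibility means the projection loses no information:
recovered = projected @ np.linalg.inv(T)
assert np.allclose(recovered, events)

def info_nce(z, pos_idx, tau=0.1):
    """Generic InfoNCE-style contrastive loss: pull each anchor toward
    its designated positive, push it away from the other vectors."""
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # cosine geometry
    sims = z @ z.T / tau
    np.fill_diagonal(sims, -np.inf)  # never treat self as a candidate
    log_probs = sims - np.log(np.exp(sims).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(z)), pos_idx].mean()

# Toy positive pairs (0 with 1, 2 with 3) standing in for event pairs
# that share an annotated relation type.
loss = info_nce(projected, pos_idx=np.array([1, 0, 3, 2]))
```

Minimizing such a loss while keeping `T` invertible lets the model reshape the event embedding space for relation reasoning without discarding the original semantic content of the encoder's vectors.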
