
Improving Continual Relation Extraction through Prototypical Contrastive Learning

Authors: Hu, Chengwei; Yang, Deqing; Jin, Haoliang; Chen, Zhen; Xiao, Yanghua

Affiliations: School of Data Science, Fudan University, Shanghai, China; School of Computer Science, Fudan University, Shanghai, China

Publication: arXiv

Year: 2022


Keywords: Extraction

Abstract: Continual relation extraction (CRE) aims to extract relations from the continuous and iterative arrival of new data, where the major challenge is the catastrophic forgetting of old tasks. To alleviate this critical problem and enhance CRE performance, we propose a novel Continual Relation Extraction framework with Contrastive Learning, namely CRECL, which is built with a classification network and a prototypical contrastive network to achieve incremental-class learning for CRE. Specifically, in the contrastive network a given instance is contrasted with the prototype of each candidate relation stored in the memory module. This contrastive learning scheme makes the data distributions of all tasks more distinguishable, further alleviating catastrophic forgetting. Our experimental results not only demonstrate CRECL's advantage over state-of-the-art baselines on two public datasets, but also verify the effectiveness of CRECL's contrastive learning in improving CRE performance. © 2022, CC BY.
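The core idea in the abstract — contrasting an instance embedding against the stored prototype of each candidate relation — can be illustrated with a minimal sketch. This is a hypothetical InfoNCE-style formulation, not the paper's actual implementation; the function name, the temperature value, and the toy data are all assumptions.

```python
import numpy as np

def prototypical_contrastive_loss(instance, prototypes, target, temperature=0.1):
    """Contrast one instance embedding against candidate relation prototypes.

    A sketch of the scheme described in the abstract: the instance is
    pulled toward the prototype of its true relation (`target`) and
    pushed away from the other candidate prototypes held in memory.
    """
    # L2-normalize so dot products become cosine similarities
    inst = instance / np.linalg.norm(instance)
    protos = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    logits = protos @ inst / temperature           # similarity to each prototype
    logits -= logits.max()                         # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()  # softmax over candidate relations
    return -np.log(probs[target])                  # negative log-likelihood of the true relation

# Toy usage: 3 candidate relation prototypes, 4-dimensional embeddings
rng = np.random.default_rng(0)
prototypes = rng.normal(size=(3, 4))
instance = prototypes[1] + 0.1 * rng.normal(size=4)  # instance near prototype 1
loss = prototypical_contrastive_loss(instance, prototypes, target=1)
```

Minimizing this loss over all tasks' prototypes, rather than classifying only against the current task's labels, is what keeps the data distributions of old and new tasks separated.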
