Author Affiliations: Guangxi Power Grid Co., Ltd., Nanning, People's Republic of China; Huazhong University of Science and Technology, School of Computer Science and Technology, Cognitive Computing and Intelligent Information Processing (CCIIP) Lab, Wuhan, People's Republic of China
Publication: Knowledge-Based Systems (Knowl Based Syst)
Year/Volume: 2025, Vol. 310
Subject Classification: 08 [Engineering] 0812 [Engineering: Computer Science and Technology (degrees awardable in Engineering or Science)]
Funding: Guangxi Power Project of Science and Technology [046100KC23040005]
Keywords: Document-level relation extraction; Reasoning path; Key information detection
Abstract: Document-level relation extraction (DocRE) aims to identify complicated relations among entity pairs in a document. It is challenging because a document usually contains multiple entity pairs, and the two entities within a pair may be scattered across the document with different entity-specific labels, even over a long distance. Existing works usually focus on extracting crucial evidence sentences as a concise clue for relation inference. However, they suffer from two limitations: (1) indiscriminately utilizing all words in the evidence sentences for relation inference may inevitably introduce noise (e.g., unrelated chunks/clauses), thus incurring a decline in performance; (2) they do not consider the noisy labeling phenomenon, as a considerable amount of unlabeled instances exist in the DocRE scenario. To address these issues, we propose a novel approach, named Path-Aware Reasoning Network (PARN), for DocRE. Specifically, PARN explicitly models the shortest inferring paths for each given entity pair over a word-level constructed document graph, and then incorporates a path-aware constraint into the self-attention mechanism, thereby directing the model's attention towards the crucial segments along these paths. Furthermore, a co-regularization loss is proposed to alleviate the intrinsic noisy labeling problem by increasing the prediction consistency of the output. Extensive experiments on three DocRE datasets demonstrate the superiority of the proposed model compared to previous state-of-the-art methods.
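The path-modeling step the abstract describes (finding the shortest inferring path between an entity pair over a word-level document graph) can be sketched with a plain breadth-first search. Note this is only a minimal illustration: the toy graph below, its words, and its edges are hypothetical, and the paper's actual graph-construction rules and path-aware attention constraint are not reproduced here.

```python
from collections import deque

def shortest_path(graph, src, dst):
    """BFS over an adjacency-dict graph; returns the shortest src->dst
    node sequence, or None if dst is unreachable."""
    prev = {src: None}          # maps each visited node to its BFS parent
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            path = []           # walk parents back to src, then reverse
            while node is not None:
                path.append(node)
                node = prev[node]
            return path[::-1]
        for nb in graph.get(node, []):
            if nb not in prev:
                prev[nb] = node
                queue.append(nb)
    return None

# Toy word-level "document graph" (hypothetical): edges mimic word
# adjacency and coreference links between two sentences.
graph = {
    "Alice": ["founded", "she"],
    "founded": ["Alice", "Acme"],
    "Acme": ["founded", "headquartered"],
    "headquartered": ["Acme", "Paris"],
    "Paris": ["headquartered", "moved"],
    "she": ["Alice", "moved"],
    "moved": ["she", "Paris"],
}

# Shortest inferring path for the entity pair (Alice, Paris)
print(shortest_path(graph, "Alice", "Paris"))
# prints ['Alice', 'she', 'moved', 'Paris']
```

In PARN such a path would then constrain self-attention toward the words on it; here the coreference shortcut ("she" ... "moved") wins over the longer syntactic chain through "founded" and "headquartered".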