Author Affiliations: School of Software Engineering and Hubei Key Laboratory of Smart Internet Technology, Huazhong University of Science and Technology (HUST), Wuhan, China; School of Electronic Information and Communications, Huazhong University of Science and Technology, Wuhan, China; Faculty of Artificial Intelligence in Education, Central China Normal University, Wuhan, China; School of Computer Science and Technology, Huazhong University of Science and Technology, Wuhan, China
Publication: IEEE Transactions on Audio, Speech and Language Processing
Year/Volume: 2025, Vol. 33
Pages: 586-597
Funding: Hubei Intelligent Edge Computing Research Institute; Hubei Science and Technology Talent Service Project; National Natural Science Foundation of China
Keywords: Training; Encoding; Adaptation models; Text recognition; Knowledge engineering; Feature extraction; Bidirectional control; Annotations; Vocabulary; Vectors
Abstract: Implicit discourse relation recognition (IDRR) aims to recognize the discourse relation between two text segments without an explicit connective. Recently, prompt learning has been applied to the IDRR task with great performance improvements over various neural network-based approaches. However, the discrete nature of the state-of-the-art prompting approach requires manual design of templates and answers, a big hurdle for its practical application. In this paper, we propose a continuous version of prompt learning together with connective knowledge distillation, called AdaptPrompt, to reduce manual design efforts via continuous prompting while further improving performance via knowledge transfer. In particular, we design and train a few virtual tokens to form continuous templates and automatically select the most suitable one by gradient search in the embedding space. We also design an answer-relation mapping rule to generate a few virtual answers as the answer space. Furthermore, we note the importance of annotated connectives in the training dataset and design a teacher-student architecture for knowledge transfer. Experiments on the up-to-date PDTB Corpus V3.0 validate our design objectives, showing better relation recognition performance than state-of-the-art competitors.
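The abstract describes two mechanisms: trainable virtual tokens that form a continuous prompt template, and a teacher-student setup that distills knowledge from annotated connectives. The sketch below illustrates one plausible reading of that setup, not the paper's actual implementation: the encoder choice (roberta-base), the number of virtual tokens, the pooling position, the number of relation classes, and the distillation loss weights are all illustrative assumptions.

```python
# Minimal sketch of continuous prompt tuning with connective knowledge
# distillation, loosely following the abstract above. Module names,
# hyperparameters, and the loss formulation are assumptions for
# illustration, not the authors' configuration.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel


class ContinuousPromptIDRR(nn.Module):
    """Encoder with trainable virtual prompt tokens (assumed setup)."""

    def __init__(self, model_name="roberta-base",
                 num_virtual_tokens=4, num_relations=4):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        # Continuous template: virtual tokens optimized by gradient descent
        # in the embedding space instead of hand-written discrete templates.
        self.virtual_tokens = nn.Parameter(
            torch.randn(num_virtual_tokens, hidden) * 0.02)
        self.classifier = nn.Linear(hidden, num_relations)

    def forward(self, input_ids, attention_mask):
        # Look up word embeddings, then prepend the trainable prompt.
        word_emb = self.encoder.embeddings.word_embeddings(input_ids)
        batch = input_ids.size(0)
        prompt = self.virtual_tokens.unsqueeze(0).expand(batch, -1, -1)
        inputs_embeds = torch.cat([prompt, word_emb], dim=1)
        prompt_mask = torch.ones(batch, prompt.size(1),
                                 dtype=attention_mask.dtype,
                                 device=attention_mask.device)
        mask = torch.cat([prompt_mask, attention_mask], dim=1)
        out = self.encoder(inputs_embeds=inputs_embeds, attention_mask=mask)
        # Pool the final hidden state of the first virtual token (assumed).
        return self.classifier(out.last_hidden_state[:, 0])


def distillation_loss(student_logits, teacher_logits, labels,
                      alpha=0.5, temperature=2.0):
    """Teacher-student transfer: the teacher is trained with the annotated
    connectives visible, the student is not. Standard KD loss combining
    cross-entropy with KL divergence to the teacher's softened distribution;
    the exact weighting is an assumption."""
    ce = F.cross_entropy(student_logits, labels)
    kd = F.kl_div(F.log_softmax(student_logits / temperature, dim=-1),
                  F.softmax(teacher_logits / temperature, dim=-1),
                  reduction="batchmean") * temperature ** 2
    return alpha * ce + (1 - alpha) * kd
```

In this reading, the virtual tokens and answer mapping replace manual template and answer engineering, while the distillation term lets the connective-aware teacher guide the student that must classify argument pairs without an explicit connective.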