PANDA: Prompt Transfer Meets Knowledge Distillation for Efficient Model Adaptation

Authors: Zhong, Qihuang; Ding, Liang; Liu, Juhua; Du, Bo; Tao, Dacheng

Affiliations: School of Computer Science, National Engineering Research Center for Multimedia Software, Institute of Artificial Intelligence, Hubei Key Laboratory of Multimedia and Network Communication Engineering, Wuhan University, Wuhan, China; School of Computer Science, Faculty of Engineering, The University of Sydney, Sydney, Australia; College of Computing & Data Science, Nanyang Technological University, #32 Block N4 #02a-014, 50 Nanyang Avenue, Singapore 639798, Singapore

Publication: arXiv

Year: 2022


Subject: Distillation

Abstract: Prompt Transfer (PoT) is a recently proposed approach for improving prompt-tuning by initializing the target prompt with an existing prompt trained on similar source tasks. However, such a vanilla PoT approach usually achieves sub-optimal performance, as (i) PoT is sensitive to the similarity of the source-target pair, and (ii) directly fine-tuning the target prompt initialized with the source prompt may lead to forgetting of the useful general knowledge learned from the source task. To tackle these issues, we propose a new metric to accurately predict prompt transferability (regarding (i)), and a novel PoT approach (namely PANDA) that leverages knowledge distillation to effectively alleviate knowledge forgetting (regarding (ii)). Extensive and systematic experiments on 189 combinations of 21 source and 9 target datasets across 5 scales of PLMs demonstrate that: 1) our proposed metric works well to predict prompt transferability; 2) PANDA consistently outperforms the vanilla PoT approach by 2.3% average score (up to 24.1%) across all tasks and model sizes; and 3) with our PANDA approach, prompt-tuning can achieve competitive and even better performance than model-tuning at various PLM scales. We have publicly released our code at https://***/WHU-ZQH/PANDA. Copyright © 2022, The Authors. All rights reserved.
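
The abstract describes two ingredients: initializing a target-task soft prompt from a prompt trained on a source task, and adding a knowledge-distillation term during target-task tuning so the transferred knowledge is not forgotten. The following is only a minimal, self-contained sketch of that idea in a PyTorch-style setup; the TinyPromptClassifier toy model, the alpha weight, and all shapes are illustrative assumptions and not the authors' released implementation (see the repository linked in the abstract for that).

# Minimal, hypothetical sketch of prompt transfer plus a knowledge-distillation
# regularizer, in the spirit of the abstract above. All names and shapes here are
# illustrative assumptions, not the authors' code.
import copy
import torch
import torch.nn.functional as F

torch.manual_seed(0)

PROMPT_LEN, HIDDEN, NUM_LABELS = 8, 16, 2

class TinyPromptClassifier(torch.nn.Module):
    """Toy frozen backbone with a trainable soft prompt, standing in for a PLM."""
    def __init__(self, prompt: torch.Tensor):
        super().__init__()
        self.prompt = torch.nn.Parameter(prompt.clone())  # soft prompt (trainable)
        self.backbone = torch.nn.Linear(PROMPT_LEN * HIDDEN + HIDDEN, NUM_LABELS)
        self.backbone.requires_grad_(False)               # backbone weights stay frozen

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Prepend the flattened prompt to each input representation.
        batch_prompt = self.prompt.flatten().expand(x.size(0), -1)
        return self.backbone(torch.cat([batch_prompt, x], dim=-1))

# 1) Prompt transfer: initialize the target prompt from a prompt tuned on a source task.
source_prompt = torch.randn(PROMPT_LEN, HIDDEN)   # stands in for a prompt trained on the source task
student = TinyPromptClassifier(source_prompt)     # target-task model, prompt initialized from the source prompt
teacher = copy.deepcopy(student)                  # frozen copy that retains the source knowledge
teacher.requires_grad_(False)

# 2) Target-task tuning with a distillation term to curb forgetting of source knowledge.
optimizer = torch.optim.Adam([student.prompt], lr=1e-2)
alpha = 0.5                                       # assumed trade-off weight between task and KD losses

x = torch.randn(32, HIDDEN)                       # toy target-task batch
y = torch.randint(0, NUM_LABELS, (32,))

for _ in range(100):
    student_logits = student(x)
    with torch.no_grad():
        teacher_logits = teacher(x)
    task_loss = F.cross_entropy(student_logits, y)
    kd_loss = F.kl_div(F.log_softmax(student_logits, dim=-1),
                       F.softmax(teacher_logits, dim=-1),
                       reduction="batchmean")
    loss = (1 - alpha) * task_loss + alpha * kd_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

print(f"final loss: {loss.item():.4f}")

The design point the sketch illustrates is that only the soft prompt is optimized, while the frozen teacher (the model as initialized from the source prompt) supplies soft targets that regularize the target-task updates.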
