Pre-trained models for natural language processing: A survey

Authors: QIU XiPeng; SUN TianXiang; XU YiGe; SHAO YunFan; DAI Ning; HUANG XuanJing

Affiliations: School of Computer Science, Fudan University, Shanghai 200433, China; Shanghai Key Laboratory of Intelligent Information Processing, Shanghai 200433, China

Publication: Science China (Technological Sciences)

Year/Volume/Issue: 2020, Vol. 63, No. 10

Pages: 1872-1897

Subject classification: 081203 [Engineering - Computer Application Technology]; 08 [Engineering]; 0835 [Engineering - Software Engineering]; 0812 [Engineering - Computer Science and Technology (degrees conferrable in Engineering or Science)]

Funding: the National Natural Science Foundation of China (Grant Nos. 61751201 and 61672162), the Shanghai Municipal Science and Technology Major Project (Grant No. 2018SHZDZX01), and ZJLab

Keywords: deep learning; neural network; natural language processing; pre-trained model; distributed representation; word embedding; self-supervised learning; language modelling

Abstract: Recently, the emergence of pre-trained models (PTMs) has brought natural language processing (NLP) to a new era. In this survey, we provide a comprehensive review of PTMs for NLP. We first briefly introduce language representation learning and its research progress. Then we systematically categorize existing PTMs based on a taxonomy from four different perspectives. Next, we describe how to adapt the knowledge of PTMs to downstream tasks. Finally, we outline some potential directions of PTMs for future research. This survey is intended to be a hands-on guide for understanding, using, and developing PTMs for various NLP tasks.
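
As a rough illustration of the "adapting PTMs to downstream tasks" step mentioned in the abstract (this sketch is not taken from the survey itself), the code below fine-tunes a pre-trained BERT checkpoint on a toy sentence-classification task. The model name, example data, and hyperparameters are illustrative assumptions; it presumes PyTorch and the Hugging Face transformers library are available.

    # Minimal sketch (not from the survey): fine-tuning a pre-trained model
    # on a downstream classification task.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    # Load a pre-trained checkpoint and attach a fresh classification head.
    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2
    )

    # Toy labelled data standing in for a real downstream dataset.
    texts = ["a delightful read", "a tedious, muddled report"]
    labels = torch.tensor([1, 0])

    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

    # A few gradient steps; real fine-tuning would iterate over many batches.
    model.train()
    for _ in range(3):
        outputs = model(**batch, labels=labels)
        outputs.loss.backward()
        optimizer.step()
        optimizer.zero_grad()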
