
Refine Search Results

Document Type

  • 241 journal articles
  • 205 conference papers

Collection Scope

  • 446 electronic documents
  • 0 print holdings

Date Distribution

Subject Classification

  • 400 Engineering
    • 359 computer science and technology...
    • 115 electrical engineering
    • 64 software engineering
    • 49 information and communication engineering
    • 30 control science and engineering
    • 13 electronic science and technology (...
    • 4 mechanical engineering
    • 4 instrument science and technology
    • 4 biomedical engineering (...
    • 3 materials science and engineering (...
    • 3 cyberspace security
    • 2 civil engineering
    • 2 chemical engineering and technology
    • 2 petroleum and natural gas engineering
    • 2 transportation engineering
  • 63 Science
    • 31 physics
    • 21 biology
    • 9 mathematics
    • 8 chemistry
    • 3 systems science
  • 59 Management
    • 37 management science and engineering (...
    • 16 library, information and archives man...
    • 8 public administration
  • 39 Medicine
    • 19 basic medicine (...
    • 15 clinical medicine
    • 4 public health and preventive medi...
  • 25 Literature
    • 24 foreign languages and literature
  • 8 Agriculture
  • 6 Education
    • 6 education
  • 4 Law
    • 2 law
    • 2 sociology
  • 1 Economics

Topic

  • 446 pre-trained lang...
  • 39 natural language...
  • 33 deep learning
  • 24 task analysis
  • 23 prompt learning
  • 21 bert
  • 16 transfer learnin...
  • 16 transformers
  • 16 training
  • 15 few-shot learnin...
  • 15 sentiment analys...
  • 14 contrastive lear...
  • 14 named entity rec...
  • 13 transformer
  • 13 knowledge graph
  • 13 prompt tuning
  • 12 bit error rate
  • 11 multi-task learn...
  • 10 knowledge graph ...
  • 10 knowledge distil...

Institution

  • 9 alibaba grp peop...
  • 7 tsinghua univ de...
  • 6 chinese acad sci...
  • 6 univ chinese aca...
  • 5 yunnan univ sch ...
  • 4 peking univ peop...
  • 4 harbin inst tech...
  • 4 chinese univ hon...
  • 4 yuan ze univ dep...
  • 4 beijing language...
  • 4 peng cheng lab p...
  • 4 renmin univ chin...
  • 3 east china univ ...
  • 3 hong kong polyte...
  • 3 beihang univ sch...
  • 3 beijing univ tec...
  • 3 univ chinese aca...
  • 3 univ manchester ...
  • 3 guangxi normal u...
  • 3 natl univ def te...

Author

  • 9 wang chengyu
  • 8 huang jun
  • 7 qiu minghui
  • 6 li peng
  • 6 sun maosong
  • 6 luo xudong
  • 5 liu zhiyuan
  • 4 zhou jie
  • 4 liu cheng
  • 4 li ge
  • 4 chen xiang
  • 4 ou yu-yen
  • 4 ko youngjoong
  • 4 lin yankai
  • 4 zhang yu
  • 3 zhang xuejie
  • 3 jin zhi
  • 3 wen ji-rong
  • 3 yang hao
  • 3 chen yufeng

Language

  • 442 English
  • 7 German
  • 7 French
  • 7 Italian
  • 4 other
Search criteria: Subject = "Pre-trained Language Model"
446 records; showing 1-10
Pre-trained language model for code-mixed text in Indonesian, Javanese, and English using transformer
SOCIAL NETWORK ANALYSIS AND MINING, 2025, Vol. 15, Issue 1, pp. 1-17
Authors: Hidayatullah, Ahmad Fathan; Apong, Rosyzie Anna; Lai, Daphne Teck Ching; Qazi, Atika
Affiliations: Univ Brunei Darussalam Sch Digital Sci Jalan Tungku Link BE-1410 Gadong Bandar Seri Beg Brunei; Univ Islam Indonesia Dept Informat Jalan Kaliurang Km 14-5 Sleman 55584 Yogyakarta Indonesia; Univ Brunei Darussalam Ctr Lifelong Learning Jalan Tungku Link BE-1410 Gadong Bandar Seri Beg Brunei
Pre-trained language models (PLMs) have become increasingly popular due to their ability to achieve state-of-the-art performance on various natural language processing tasks with less training data and time. However, ...
PPTopicPLM: plug-and-play topic-enhanced pre-trained language model for short-text rumor detection
JOURNAL OF SUPERCOMPUTING, 2025, Vol. 81, Issue 1, pp. 1-20
Authors: Zeng, Jiangfeng; Li, Xinyu; Ma, Xiao
Affiliations: Cent China Normal Univ Sch Informat Management Luoyu St Wuhan 430079 Hubei Peoples R China; Zhongnan Univ Econ & Law Sch Informat Engn Nanhu Ave Wuhan 430073 Hubei Peoples R China
Recently, many pre-trained language models (PLMs) have been investigated for rumor detection and have obtained superior results. However, existing PLM-based approaches are challenged and limited when addressing short te...
Diachronic semantic encoding based on pre-trained language model for temporal knowledge graph reasoning
KNOWLEDGE-BASED SYSTEMS, 2025, Vol. 318
Authors: Deng, Yunteng; Song, Jia; Yang, Zhongliang; Long, Yilin; Zeng, Li; Zhou, Linna
Affiliations: Beijing Univ Posts & Telecommun Sch Cyberspace Secur Beijing 100876 Peoples R China; Shenzhen Stock Exchange Shenzhen 518010 Guangdong Peoples R China
Temporal Knowledge Graph Reasoning (TKGR) aims to infer missing facts at specific timestamps. However, most existing methods primarily focus on the local and global evolutionary characteristics of temporal knowledge g...
Talent Supply and Demand Matching Based on Prompt Learning and the Pre-trained Language Model
APPLIED SCIENCES-BASEL, 2025, Vol. 15, Issue 5, pp. 2536-2536
Authors: Li, Kunping; Liu, Jianhua; Zhuang, Cunbo
Affiliations: Beijing Inst Technol Sch Mech Engn Lab Digital Mfg Beijing 100081 Peoples R China; Beijing Inst Technol Tangshan Res Inst Hebei Key Lab Intelligent Assembly & Detect Techno Tangshan 063000 Peoples R China
In the context of the accelerating new technological revolution and industrial transformation, the issue of talent supply and demand matching has become increasingly urgent. Precise matching of talent supply and demand i...
Explainable reasoning over temporal knowledge graphs by pre-trained language model
INFORMATION PROCESSING & MANAGEMENT, 2025, Vol. 62, Issue 1
Authors: Li, Qing; Wu, Guanzhong
Affiliations: Northwestern Polytech Univ Sch Comp Sci Xian 710000 Shaanxi Peoples R China
Temporal knowledge graph reasoning (TKGR) is considered a crucial task for modeling evolving knowledge, aiming to infer the unknown connections between entities at specific times. Traditional TKGR methods...
Efficient word segmentation for enhancing Chinese spelling check in pre-trained language model
KNOWLEDGE AND INFORMATION SYSTEMS, 2025, Vol. 67, Issue 1, pp. 603-632
Authors: Li, Fangfang; Jiang, Jie; Tang, Dafu; Shan, Youran; Duan, Junwen; Zhang, Shichao
Affiliations: Cent South Univ Comp Sci & Engn Changsha 410083 Peoples R China; Natl Univ Def Technol Coll Syst Engn Changsha 410073 Peoples R China; Zhejiang Coll Secur Technol Coll Artificial Intelligence Wenzhou 325016 Peoples R China; Guangxi Normal Univ Key Lab Educ Blockchain & Intelligent Technol Minist Educ Guilin 541004 Peoples R China; Guangxi Normal Univ Guangxi Key Lab Multisource Informat Min & Secur Guilin 541004 Peoples R China
In existing pre-trained language models, Chinese spelling check (CSC) often considers phonetic and graphic details at the character level and ignores the essential role of word segmentation. To address this issue, an...
Few-shot medical relation extraction via prompt tuning enhanced pre-trained language model
NEUROCOMPUTING, 2025, Vol. 633
Authors: He, Guoxiu; Huang, Chen
Affiliations: East China Normal Univ Sch Econ & Management Shanghai 200062 Peoples R China; Singapore Univ Technol & Design Singapore 487372 Singapore
Medical relation extraction is crucial for developing structured information to support intelligent healthcare systems. However, acquiring large volumes of labeled medical data is challenging due to the specialized na...
Graph-aware pre-trained language model for political sentiment analysis in Filipino social media
ENGINEERING APPLICATIONS OF ARTIFICIAL INTELLIGENCE, 2025, Vol. 146
Authors: Aquino, Jean Aristide; Liew, Di Jie; Chang, Yung-Chun
Affiliations: Taipei Med Univ Grad Inst Data Sci Taipei Taiwan; Taipei Med Univ Hosp Clin Big Data Res Ctr Taipei Taiwan
Elections are emotionally and sentimentally charged events that offer unique opportunities for analysis of sentiments not typically observed during non-election periods. Unlike recurring phenomena, elections are inher...
MCFC: A Momentum-Driven Clicked Feature Compressed Pre-trained Language Model for Information Retrieval
13th International Conference on Natural Language Processing and Chinese Computing
Authors: Li, Dongyang; Ding, Ruixue; Xie, Pengjun; He, Xiaofeng
Affiliations: East China Normal Univ Shanghai Peoples R China; Alibaba Grp Hangzhou Peoples R China
Information Retrieval (IR) pre-trained language models are trained on large-scale retrieval-based corpora to promote task-specific knowledge capacity. Previous works focus on general retrieval pre-trained datase...
Prompting disentangled embeddings for knowledge graph completion with pre-trained language model
EXPERT SYSTEMS WITH APPLICATIONS, 2025, Vol. 268
Authors: Geng, Yuxia; Chen, Jiaoyan; Zeng, Yuhang; Chen, Zhuo; Zhang, Wen; Pan, Jeff Z.; Wang, Yuxiang; Xu, Xiaoliang
Affiliations: Powerchina Huadong Engn Corp Ltd Hangzhou 311112 Peoples R China; Hangzhou Dianzi Univ Sch Comp Sci Hangzhou 310018 Peoples R China; Univ Manchester Dept Comp Sci Manchester M13 9PL England; Hangzhou Dianzi Univ HDU ITMO Joint Inst Hangzhou 310018 Peoples R China; Zhejiang Univ Coll Comp Sci & Technol Hangzhou 310028 Peoples R China; Zhejiang Univ Sch Software Technol Ningbo 315048 Peoples R China; Univ Edinburgh Sch Informat Edinburgh EH8 9AB Scotland
Both graph structures and textual information play a critical role in Knowledge Graph Completion (KGC). Following the success of pre-trained language models (PLMs) such as BERT, they have been applied to text encoding for...