Consultation and Suggestions

Refine Search Results

Document Type

  • 242 journal articles
  • 206 conference papers

Collection Scope

  • 448 electronic documents
  • 0 print holdings

Date Distribution

Subject Classification

  • 404 Engineering
    • 363 Computer Science and Technology...
    • 118 Electrical Engineering
    • 66 Software Engineering
    • 52 Information and Communication Engineering
    • 32 Control Science and Engineering
    • 15 Electronic Science and Technology (...
    • 4 Mechanical Engineering
    • 4 Instrument Science and Technology
    • 4 Biomedical Engineering (...
    • 4 Cyberspace Security
    • 3 Materials Science and Engineering (...
    • 3 Transportation Engineering
    • 2 Civil Engineering
    • 2 Chemical Engineering and Technology
    • 2 Petroleum and Natural Gas Engineering
  • 64 Science
    • 31 Physics
    • 21 Biology
    • 10 Mathematics
    • 8 Chemistry
    • 3 Systems Science
  • 60 Management
    • 38 Management Science and Engineering (...
    • 17 Library, Information and Archival Man...
    • 8 Public Administration
  • 38 Medicine
    • 19 Basic Medicine (...
    • 15 Clinical Medicine
    • 4 Public Health and Preventive Med...
  • 25 Literature
    • 24 Foreign Languages and Literature
  • 8 Agriculture
  • 6 Education
    • 6 Education
  • 4 Law
    • 2 Law
    • 2 Sociology
  • 1 Economics

Topic

  • 448 pre-trained lang...
  • 39 natural language...
  • 33 deep learning
  • 24 task analysis
  • 23 prompt learning
  • 21 bert
  • 16 transfer learnin...
  • 16 transformers
  • 16 training
  • 15 contrastive lear...
  • 15 few-shot learnin...
  • 15 sentiment analys...
  • 14 knowledge graph
  • 14 named entity rec...
  • 13 transformer
  • 13 prompt tuning
  • 12 bit error rate
  • 11 multi-task learn...
  • 11 semantics
  • 10 knowledge graph ...

Institution

  • 9 alibaba grp peop...
  • 7 tsinghua univ de...
  • 6 chinese acad sci...
  • 6 univ chinese aca...
  • 5 yunnan univ sch ...
  • 4 peking univ peop...
  • 4 harbin inst tech...
  • 4 chinese univ hon...
  • 4 yuan ze univ dep...
  • 4 beijing language...
  • 4 peng cheng lab p...
  • 4 renmin univ chin...
  • 3 east china univ ...
  • 3 hong kong polyte...
  • 3 beihang univ sch...
  • 3 beijing univ tec...
  • 3 univ chinese aca...
  • 3 univ manchester ...
  • 3 guangxi normal u...
  • 3 natl univ def te...

Author

  • 9 wang chengyu
  • 8 huang jun
  • 7 qiu minghui
  • 6 li peng
  • 6 sun maosong
  • 6 luo xudong
  • 5 liu zhiyuan
  • 4 zhou jie
  • 4 liu cheng
  • 4 li ge
  • 4 chen xiang
  • 4 ou yu-yen
  • 4 ko youngjoong
  • 4 lin yankai
  • 4 zhang yu
  • 3 zhang xuejie
  • 3 jin zhi
  • 3 wen ji-rong
  • 3 yang hao
  • 3 chen yufeng

Language

  • 443 English
  • 7 German
  • 7 French
  • 7 Italian
  • 5 Other

Search query: Subject = "Pre-trained Language Model"
448 records in total; showing 111-120
Bridging the Fairness Gap: Enhancing Pre-trained Models with LLM-Generated Sentences
2025 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2025
Authors: Yu, Liu Guo, Ludie Kuang, Ping Zhou, Fan University of Electronic Science and Technology of China Sichuan Chengdu 610054 China
Pre-trained language models (PLMs) are trained on data that inherently contains gender biases, leading to undesirable impacts. Traditional debiasing methods often rely on external corpora, which may lack quality, dive...
TF-BERT: Tensor-based fusion BERT for multimodal sentiment analysis
NEURAL NETWORKS, 2025, Vol. 185, p. 107222
Authors: Hou, Jingming Omar, Nazlia Tiun, Sabrina Saad, Saidah He, Qian Univ Kebangsaan Malaysia Fac Informat Sci & Technol Ctr Artificial Intelligence Technol Bangi 43600 Selangor Malaysia Guilin Univ Elect Technol State & Local Joint Engn Res Ctr Satellite Nav & L Guangxi Collaborat Innovat Ctr Cloud Comp & Big Da Guangxi Key Lab Cryptog & Informat Secur Guilin Peoples R China
Multimodal Sentiment Analysis (MSA) has gained significant attention due to the limitations of unimodal sentiment recognition in complex real-world applications. Traditional approaches typically focus on using the Tra...
Parameter-efficient online knowledge distillation for pretrained language models
EXPERT SYSTEMS WITH APPLICATIONS, 2025, Vol. 265
Authors: Wang, Yukun Wang, Jin Zhang, Xuejie Yunnan Univ Sch Informat Sci & Engn Kunming Peoples R China
With the advancement of natural language processing (NLP), the size of datasets and pre-trained language models (PLMs) has exponentially grown. These vast models exhibit robust capabilities in generation, comprehensio...
MCKP: Multi-aspect contextual knowledge-enhanced prompting for conversational recommender systems
INFORMATION SCIENCES, 2025, Vol. 686
Authors: Wang, Yulin Zhang, Yihao Zhu, Junlin Li, Yao Zhou, Wei Chongqing Univ Technol Sch Artificial Intelligence Chongqing 400054 Peoples R China Chongqing Univ Sch Big Data & Software Engn Chongqing 400044 Peoples R China
Empowering conversational recommender systems (CRSs) with knowledge facilitates the generation of high-quality human-like recommendation proposals to users. Despite substantial endeavors in developing knowledge-based ...
Leveraging sensory knowledge into Text-to-Text Transfer Transformer for enhanced emotion analysis
INFORMATION PROCESSING & MANAGEMENT, 2025, Vol. 62, No. 1
Authors: Zhao, Qingqing Xia, Yuhan Long, Yunfei Xu, Ge Wang, Jia Chinese Acad Social Sci Inst Linguist Beijing 100732 Peoples R China Univ Essex Sch Comp Sci & Elect Engn Colchester CO4 3SQ Essex England Minjiang Univ Coll Comp & Control Engn Fuzhou 350108 Peoples R China Xian Jiaotong Liverpool Univ Dept Intelligent Sci Suzhou 215123 Peoples R China
This study proposes an innovative model (i.e., SensoryT5), which integrates sensory knowledge into the T5 (Text-to-Text Transfer Transformer) framework for emotion classification tasks. By embedding sensory knowledge ...
PRIME, a temperature-guided language model revolutionizes protein engineering
Acta Pharmaceutica Sinica B, 2025
Authors: Yuanxi Yu Qianhui Wang Yike Zou Zhangjiang Institute for Advanced Study Shanghai Jiao Tong University Shanghai 201203 China School of Pharmaceutical Sciences Shanghai Jiao Tong University Shanghai 200240 China
Do Pre-trained Language Models Indeed Understand Software Engineering Tasks?
IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, 2023, Vol. 49, No. 10, pp. 4639-4655
Authors: Li, Yao Zhang, Tao Luo, Xiapu Cai, Haipeng Fang, Sen Yuan, Dawei Macau Univ Sci & Technol Sch Comp Sci & Engn Macau 999078 Peoples R China Hong Kong Polytech Univ Dept Comp Hong Kong 999077 Peoples R China Washington State Univ Sch Elect Engn & Comp Sci Pullman WA 99163 USA
Artificial intelligence (AI) for software engineering (SE) tasks has recently achieved promising performance. In this article, we investigate to what extent the pre-trained language model truly understands those SE ta...
SETEM: Self-Ensemble Training with Pre-trained Language Models for Entity Matching
KNOWLEDGE-BASED SYSTEMS, 2024, Vol. 293
Authors: Ding, Huahua Dai, Chaofan Wu, Yahui Ma, Wubin Zhou, Haohao Natl Univ Def Technol Sci & Technol Informat Syst Engn Lab Changsha 410073 Peoples R China
Entity Matching (EM) aims to determine whether records in two datasets refer to the same real-world entity. Existing work often uses pre-trained language models (PLMs) for feature representation, converting EM to a ...
On the Effectiveness of Pre-trained Language Models for Legal Natural Language Processing: An Empirical Study
IEEE ACCESS, 2022, Vol. 10, pp. 75835-75858
Authors: Song, Dezhao Gao, Sally He, Baosheng Schilder, Frank Thomson Reuters Eagan MN 55123 USA Thomson Reuters New York NY 10036 USA Meta Platforms Inc Menlo Pk CA 94025 USA
We present the first comprehensive empirical evaluation of pre-trained language models (PLMs) for legal natural language processing (NLP) in order to examine their effectiveness in this domain. Our study covers eight ...
Semantic Importance-Aware Communications Using Pre-trained Language Models
IEEE COMMUNICATIONS LETTERS, 2023, Vol. 27, No. 9, pp. 2328-2332
Authors: Guo, Shuaishuai Wang, Yanhu Li, Shujing Saeed, Nasir Shandong Univ Sch Control Sci & Engn Jinan 250061 Peoples R China Shandong Univ Shandong Key Lab Wireless Commun Technol Jinan 250061 Peoples R China United Arab Emirates Univ UAEU Dept Elect & Commun Engn Al Ain U Arab Emirates
This letter proposes a semantic importance-aware communication (SIAC) scheme using pre-trained language models (e.g., ChatGPT, BERT, etc.). Specifically, we propose a cross-layer design with a pre-trained language mod...