Refine Search Results

Document Type

  • 241 journal articles
  • 205 conference papers

Collection Scope

  • 446 electronic documents
  • 0 print holdings

Date Distribution

Discipline Classification

  • 400 Engineering
    • 359 Computer Science & Technology...
    • 115 Electrical Engineering
    • 64 Software Engineering
    • 49 Information & Communication Engineering
    • 30 Control Science & Engineering
    • 13 Electronic Science & Technology (...
    • 4 Mechanical Engineering
    • 4 Instrument Science & Technology
    • 4 Biomedical Engineering (...
    • 3 Materials Science & Engineering (...
    • 3 Cyberspace Security
    • 2 Civil Engineering
    • 2 Chemical Engineering & Technology
    • 2 Petroleum & Natural Gas Engineering
    • 2 Transportation Engineering
  • 63 Science
    • 31 Physics
    • 21 Biology
    • 9 Mathematics
    • 8 Chemistry
    • 3 Systems Science
  • 59 Management
    • 37 Management Science & Engineering (...
    • 16 Library, Information & Archives Manag...
    • 8 Public Administration
  • 39 Medicine
    • 19 Basic Medicine (...
    • 15 Clinical Medicine
    • 4 Public Health & Preventive Medi...
  • 25 Literature
    • 24 Foreign Languages & Literature
  • 8 Agronomy
  • 6 Education
    • 6 Education
  • 4 Law
    • 2 Law
    • 2 Sociology
  • 1 Economics

Topics

  • 446 篇 pre-trained lang...
  • 39 篇 natural language...
  • 33 篇 deep learning
  • 24 篇 task analysis
  • 23 篇 prompt learning
  • 21 篇 bert
  • 16 篇 transfer learnin...
  • 16 篇 transformers
  • 16 篇 training
  • 15 篇 few-shot learnin...
  • 15 篇 sentiment analys...
  • 14 篇 contrastive lear...
  • 14 篇 named entity rec...
  • 13 篇 transformer
  • 13 篇 knowledge graph
  • 13 篇 prompt tuning
  • 12 篇 bit error rate
  • 11 篇 multi-task learn...
  • 10 篇 knowledge graph ...
  • 10 篇 knowledge distil...

Affiliations

  • 9 篇 alibaba grp peop...
  • 7 篇 tsinghua univ de...
  • 6 篇 chinese acad sci...
  • 6 篇 univ chinese aca...
  • 5 篇 yunnan univ sch ...
  • 4 篇 peking univ peop...
  • 4 篇 harbin inst tech...
  • 4 篇 chinese univ hon...
  • 4 篇 yuan ze univ dep...
  • 4 篇 beijing language...
  • 4 篇 peng cheng lab p...
  • 4 篇 renmin univ chin...
  • 3 篇 east china univ ...
  • 3 篇 hong kong polyte...
  • 3 篇 beihang univ sch...
  • 3 篇 beijing univ tec...
  • 3 篇 univ chinese aca...
  • 3 篇 univ manchester ...
  • 3 篇 guangxi normal u...
  • 3 篇 natl univ def te...

Authors

  • 9 篇 wang chengyu
  • 8 篇 huang jun
  • 7 篇 qiu minghui
  • 6 篇 li peng
  • 6 篇 sun maosong
  • 6 篇 luo xudong
  • 5 篇 liu zhiyuan
  • 4 篇 zhou jie
  • 4 篇 liu cheng
  • 4 篇 li ge
  • 4 篇 chen xiang
  • 4 篇 ou yu-yen
  • 4 篇 ko youngjoong
  • 4 篇 lin yankai
  • 4 篇 zhang yu
  • 3 篇 zhang xuejie
  • 3 篇 jin zhi
  • 3 篇 wen ji-rong
  • 3 篇 yang hao
  • 3 篇 chen yufeng

Language

  • 442 English
  • 7 German
  • 7 French
  • 7 Italian
  • 4 other
Search criteria: Subject = "Pre-trained Language Model"
446 records; showing 41–50
CAM-BERT: Chinese Aerospace Manufacturing Pre-trained Language Model
6th International Conference on Natural Language Processing (ICNLP)
Authors: Dai, Jinchi Wang, Shengren Wang, Peiyan Li, Ruiting Chen, Jiaxin Li, Xinrong Shenyang Aerosp Univ Dept Sch Comp Sci Shenyang Peoples R China AVIC Shenyang Aircraft Co Ltd Dept Network Informat Ctr Shenyang Peoples R China
In the era of intelligent manufacturing and Industry 4.0, there is a growing demand for specialized Chinese pre-trained language models designed for aerospace manufacturing. This is essential to overcome the limit...
Commonsense Knowledge Base Completion with Relational Graph Attention Network and Pre-trained Language Model
31st ACM International Conference on Information and Knowledge Management (CIKM)
Authors: Ju, Jinghao Yang, Deqing Liu, Jingping Fudan Univ Sch Data Sci Shanghai Peoples R China East China Univ Sci & Technol Sch Informat Sci & Engn Shanghai Peoples R China
Many commonsense knowledge graphs (CKGs) still suffer from incompleteness, although they have been applied successfully in many natural language processing tasks. Due to the scale and sparsity of CKGs, existing knowled...
BERT4CTR: An Efficient Framework to Combine Pre-trained Language Model with Non-textual Features for CTR Prediction
29th ACM SIGKDD Conference on Knowledge Discovery and Data Mining (KDD)
Authors: Wang, Dong Salamatian, Kave Xia, Yunqing Deng, Weiwei Zhang, Qi Microsoft Corp STCA Beijing Peoples R China Univ Savoie Annecy France
Although deep pre-trained language models have shown promising benefit in a large set of industrial scenarios, including Click-Through-Rate (CTR) prediction, how to integrate pre-trained language models that handle on...
Named-Entity Recognition for a Low-resource Language using Pre-trained Language Model
37th Annual ACM Symposium on Applied Computing
Authors: Yohannes, Hailemariam Mehari Amagasa, Toshiyuki Univ Tsukuba Syst & Informat Engn Tsukuba Ibaraki Japan Univ Tsukuba Ctr Computat Sci Tsukuba Ibaraki Japan
This paper proposes a method for Named-Entity Recognition (NER) for a low-resource language, Tigrinya, using a pre-trained language model. Tigrinya is a morphologically rich language, although one of the underrepresen...
BioHanBERT: A Hanzi-aware Pre-trained Language Model for Chinese Biomedical Text Mining
21st IEEE International Conference on Data Mining (IEEE ICDM)
Authors: Wang, Xiaosu Xiong, Yun Niu, Hao Yue, Jingwen Zhu, Yangyong Yu, Philip S. Fudan Univ Shanghai Key Lab Data Sci Sch Comp Sci Shanghai Peoples R China Fudan Univ Shanghai Inst Adv Commun & Data Sci Shanghai Peoples R China Univ Illinois Dept Comp Sci Chicago IL USA
Unsupervised pre-trained language models (PLMs) have boosted the development of effective biomedical text mining models. But biomedical texts contain a huge number of long-tail concepts and terminologies, which ma...
A Sentence Quality Evaluation Framework for Machine Reading Comprehension Incorporating Pre-trained Language Model
19th International Conference on Advanced Intelligent Computing Technology and Applications (ICIC)
Authors: Meng, Fan-Jun He, Ji-Fei Xu, Xing-Jian Zhao, Ya-Juan Sun, Li-Jun Inner Mongolia Normal Univ Coll Comp Sci & Technol Hohhot 010022 Peoples R China
The multi-choice Machine Reading Comprehension (MRC) task involves selecting the correct answer from a limited set of options given a passage and a question. MRC tasks have experienced two main peaks: the explosion of deep...
NMT Enhancement based on Knowledge Graph Mining with Pre-trained Language Model
22nd IEEE International Conference on Advanced Communication Technology (ICACT)
Authors: Yang, Hao Qin, Ying Deng, Yao Wang, Minghan Huawei Co Ltd Translate Serv Ctr Beijing Peoples R China
Pre-trained language models like BERT, RoBERTa, GPT, etc. have achieved SOTA effects on multiple NLP tasks (e.g. sentiment classification, information extraction, event extraction, etc.). We propose a simple method b...
Drug-BERT : pre-trained language model Specialized for Korean Drug Crime  19
Drug-BERT : Pre-trained Language Model Specialized for Korea...
收藏 引用
19th IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB)
作者: Lee, Jeong Min Lee, Suyeon Byon, Sungwon Jung, Eui-Suk Baek, Myung-Sun Elect & Telecommun Res Inst Daejeon South Korea Univ Sci & Technol Daejeon South Korea Yonsei Univ Dept Artificial Intelligence Seoul South Korea
We propose Drug-BERT, a specialized pre-trained language model designed for detecting drug-related content in the Korean language. Given the severity of the current drug issue in South Korea, effective responses are i... 详细信息
来源: 评论
Question Answering based Clinical Text Structuring Using Pre-trained Language Model
IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
Authors: Qiu, Jiahui Zhou, Yangming Ma, Zhiyuan Ruan, Tong Liu, Jinlin Sun, Jing East China Univ Sci & Technol Sch Informat Sci & Engn Shanghai 200237 Peoples R China Shanghai Jiao Tong Univ Ruijin Hosp Sch Med Shanghai 200025 Peoples R China
Clinical text structuring is a critical and fundamental task for clinical research. Traditional methods, such as task-specific end-to-end models and pipeline models, usually suffer from the lack of datasets and error pro...
Software Vulnerabilities Detection Based on a Pre-trained Language Model
IEEE 22nd International Conference on Trust, Security and Privacy in Computing and Communications (TrustCom) / BigDataSE Conference / CSE Conference / EUC Conference / ISCI Conference
Authors: Xu, Wenlin Li, Tong Wang, Jinsong Duan, Haibo Tang, Yahui Yunnan Univ Sch Informat Sci & Engn Kunming Yunnan Peoples R China Yunnan Agr Univ Sch Big Data Kunming Yunnan Peoples R China Yunnan Univ Finance & Econ Informat Management Ctr Kunming Yunnan Peoples R China Chongqing Univ Posts & Telecommun Sch Software Chongqing Peoples R China
Software vulnerability detection is crucial in cyber security, protecting software systems from malicious attacks. The majority of earlier techniques relied on security professionals to provide software featu...