
Refine Search Results

Document Type

  • 1 Conference paper

Collection

  • 1 Electronic resource
  • 0 Print holdings

Date Distribution

Subject Classification

  • 1 Education
    • 1 Education
    • 1 Psychology (may confer Education...
  • 1 Engineering
    • 1 Information and Communication Engineering
    • 1 Control Science and Engineering
    • 1 Computer Science and Technology...
    • 1 Software Engineering
  • 1 Management
    • 1 Library, Information and Archives Management

Topics

  • 1 text classification
  • 1 natural language processing
  • 1 bidirectional encoder representations...
  • 1 convolutional neural network

Institutions

  • 1 Yancheng Teachers University
  • 1 Chengdu University of Technology
  • 1 Henan University of Science and Technology

Authors

  • 1 Wang Ziying
  • 1 Wang Chenxu
  • 1 Li Yulin

Language

  • 1 English
Search query: Subject = "Bidirectional Encoder Representations Transformer"
1 record found (showing 1-10)
A Novel Approach for Text Classification by Combining Pre-trained BERT Model with CNN Classifier
2023 IEEE 6th International Conference on Information Systems and Computer Aided Education, ICISCAE 2023
Authors: Wang, Chenxu; Li, Yulin; Wang, Ziying. Affiliations: Yancheng Teachers University, College of Information Engineering and College of Software, Yancheng, Jiangsu, China; Henan University of Science and Technology, College of Software, Luoyang, Henan, China; Chengdu University of Technology, College of Computer Science and Cyber Security (Oxford Brookes College), Chengdu, Sichuan, China
Abstract: Language model pre-training has emerged as a highly effective approach for acquiring universal language representations. Among the state-of-the-art models in this field, BERT (bidirectional encoder representations from transformers)...
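
The abstract describes feeding a pre-trained BERT encoder into a CNN classifier for text classification. A minimal sketch of that general architecture follows; it is an assumption about the details, not the paper's exact model, and the checkpoint name, kernel sizes, and filter counts are placeholders. It uses PyTorch and the Hugging Face transformers library.

```python
# Sketch of a BERT + CNN text classifier (assumed architecture, not the
# paper's exact model). Requires `torch` and `transformers`.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer


class BertCnnClassifier(nn.Module):
    def __init__(self, num_classes=2, num_filters=128, kernel_sizes=(3, 4, 5)):
        super().__init__()
        # Pre-trained BERT encoder; checkpoint name is a placeholder.
        self.bert = AutoModel.from_pretrained("bert-base-uncased")
        hidden = self.bert.config.hidden_size
        # One 1-D convolution per kernel size, applied over the token axis.
        self.convs = nn.ModuleList(
            [nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes]
        )
        self.classifier = nn.Linear(num_filters * len(kernel_sizes), num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings: (batch, seq_len, hidden).
        tokens = self.bert(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state
        x = tokens.transpose(1, 2)  # (batch, hidden, seq_len) for Conv1d
        # Convolve, apply ReLU, then max-pool over the sequence dimension.
        pooled = [torch.relu(conv(x)).max(dim=2).values for conv in self.convs]
        return self.classifier(torch.cat(pooled, dim=1))


tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = BertCnnClassifier(num_classes=2)
batch = tokenizer(
    ["a short example sentence for the sketch"],
    return_tensors="pt",
    padding=True,
)
logits = model(batch["input_ids"], batch["attention_mask"])  # (1, num_classes)
```

Max-pooling each convolution's output over the token dimension yields a fixed-size feature vector per kernel size regardless of sequence length; concatenating these features and passing them to a linear layer is the usual way a CNN head sits on top of contextual embeddings.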