Search condition: Subject = "Prefix sequence compression"
1 record found, showing 1-1
P-BERT: Toward Long Sequence Modeling by Enabling Language Representation With Prefix Sequence Compression, Soft Position Embedding, and Data Augmentation for Patent Relevance Assessment
IEEE ACCESS, 2025, Vol. 13, pp. 41928-41942
Authors: Wang, Fei; Shi, Xingchen; Wang, Dongsheng; Lou, Yinxia
Affiliations: Jiangsu Univ Sci & Technol, Sch Comp Sci, Zhenjiang 212003, Peoples R China; Jianghan Univ, Sch Artificial Intelligence, Wuhan 430056, Peoples R China
Abstract (excerpt): Recent works have increasingly adopted pre-trained language models, such as BERT, to model technical semantics for patent relevance assessment. However, existing truncation and divide-merge strategies, used to handle ...
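The excerpt above mentions the truncation and divide-merge strategies commonly used to fit long patent text into a fixed-length encoder such as BERT. As a minimal, model-free sketch of those two baselines (the 512-token limit and the chunk-averaging merge step are illustrative assumptions, not details from this paper):

```python
from typing import List

MAX_LEN = 512  # typical BERT input limit; assumed here for illustration


def truncate(tokens: List[str], max_len: int = MAX_LEN) -> List[str]:
    """Baseline truncation: keep only the first max_len tokens,
    discarding everything after the cutoff."""
    return tokens[:max_len]


def divide_merge(tokens: List[str], max_len: int = MAX_LEN) -> List[List[str]]:
    """Baseline divide-merge: split the document into max_len-sized
    chunks; a downstream encoder would embed each chunk separately and
    merge the chunk vectors (e.g. by averaging) into one representation."""
    return [tokens[i:i + max_len] for i in range(0, len(tokens), max_len)]


# A hypothetical 1300-token patent document:
doc = ["tok%d" % i for i in range(1300)]
print(len(truncate(doc)))                    # 512
print([len(c) for c in divide_merge(doc)])   # [512, 512, 276]
```

Truncation loses everything past the cutoff, while divide-merge keeps all tokens but breaks cross-chunk context, which is the shortcoming the paper's prefix sequence compression is positioned against.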