Search criteria: Subject = "pipeline parallelism data parallelism"
1 record found.
Chimera: Efficiently Training Large-Scale Neural Networks with Bidirectional Pipelines
International Conference for High Performance Computing, Networking, Storage and Analysis (SC21)
Authors: Li, Shigang; Hoefler, Torsten (Swiss Fed Inst Technol, Zurich, Switzerland)
Training large deep learning models at scale is very challenging. This paper proposes Chimera, a novel pipeline parallelism scheme which combines bidirectional pipelines for efficiently training large-scale models. Ch...
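
The abstract above only names the central idea: two pipelines running over the same workers in opposite directions. As a rough intuition aid, the following is a minimal, self-contained Python sketch of a bidirectional pipeline forward pass. It is not the schedule from the Chimera paper itself; the stage count D, the number of micro-batches per direction, the reversed stage-to-worker mapping for the "up" pipeline, and the greedy conflict-resolution rule are all assumptions made here purely for illustration.

    # Minimal sketch of a bidirectional pipeline forward pass (illustration only;
    # NOT the schedule from the Chimera paper). D, PER_DIR, the reversed layout of
    # the "up" pipeline, and the greedy tie-breaking rule are assumptions.
    from collections import namedtuple

    D = 4             # pipeline stages == workers (assumed)
    PER_DIR = D // 2  # micro-batches injected into each direction (assumed)

    Task = namedtuple("Task", "direction micro_batch")

    def worker_for(direction, stage):
        # The "down" pipeline maps stage s to worker s; the "up" pipeline is laid
        # out in reverse, so its stage s runs on worker D-1-s.
        return stage if direction == "down" else D - 1 - stage

    def simulate():
        # progress[task] = next stage this micro-batch still has to run
        progress = {Task(d, m): 0 for d in ("down", "up") for m in range(PER_DIR)}
        timeline = []
        while progress:
            busy = [None] * D
            # Greedy rule: micro-batches that are further along go first, so the
            # pipeline keeps draining; a conflicting micro-batch simply waits one
            # step (a bubble).
            for task in sorted(progress, key=lambda k: -progress[k]):
                w = worker_for(task.direction, progress[task])
                if busy[w] is None:
                    busy[w] = task
                    progress[task] += 1
                    if progress[task] == D:   # passed through every stage
                        del progress[task]
            timeline.append(busy)
        return timeline

    if __name__ == "__main__":
        for t, busy in enumerate(simulate()):
            cells = ["--" if x is None else f"{x.direction[0]}{x.micro_batch}" for x in busy]
            print(f"t={t}: " + "  ".join(cells))

Running it prints, per time step, which micro-batch each worker executes ("d" for the downward pipeline, "u" for the upward one). With D = 4 this yields a 6-step forward schedule in which workers 0 and 3 start micro-batches from opposite ends at t = 0, making both the interleaving and the remaining bubbles visible.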