Refine Search Results

Document Type

  • 3 journal articles
  • 1 conference paper

Collection Scope

  • 4 electronic documents
  • 0 print holdings

Date Distribution

Subject Classification

  • 3 Engineering
    • 2 Computer Science and Technology...
    • 1 Electrical Engineering
  • 1 Science
    • 1 Mathematics
    • 1 Statistics (degrees may be awarded in Science, ...
  • 1 Medicine
    • 1 Basic Medicine (degrees may be awarded in Medicine...
    • 1 Clinical Medicine

Topics

  • 4 wake-sleep algor...
  • 1 language modelin...
  • 1 mt neurons
  • 1 variational auto...
  • 1 deep learning
  • 1 mst neurons
  • 1 optical flow
  • 1 natural gradient
  • 1 learning
  • 1 helmholtz machin...
  • 1 neural network
  • 1 q-states neurons
  • 1 learning theory
  • 1 statistical dyna...
  • 1 fisher-rao metri...
  • 1 two-layered neur...
  • 1 factor analysis
  • 1 latent variable

Institutions

  • 1 tohoku univ res ...
  • 1 tohoku univ gsis...
  • 1 saarland univ sp...
  • 1 univ chinese aca...
  • 1 max planck inst ...
  • 1 univ leipzig d-0...
  • 1 tokyo inst techn...
  • 1 santa fe inst sa...

Authors

  • 1 ando m
  • 1 niu shuzi
  • 1 kabashima y
  • 1 katayama k
  • 1 ay nihat
  • 1 shen xiaoyu
  • 1 shimasaki s
  • 1 horiguchi t
  • 1 klakow dietrich
  • 1 su hui

Language

  • 2 English
  • 2 Other

Search criteria: Subject = "Wake-sleep algorithm"
4 records, showing results 1-10 below
Models of MT and MST areas using wake-sleep algorithm
NEURAL NETWORKS, 2004, Vol. 17, No. 3, pp. 339-351
Authors: Katayama, K; Ando, M; Horiguchi, T (Tohoku Univ, GSIS, Dept Math & Comp Sci, Sendai, Miyagi 9808579, Japan; Tohoku Univ, Res Inst Elect Commun, Sendai, Miyagi 9808577, Japan)
We present two-layered neural network models with Q (≥ 2)-states neurons for a system with middle temporal (MT) neurons and medial superior temporal (MST) neurons by using a wake-sleep algorith...
Dynamical analysis of the wake-sleep algorithm
ELECTRONICS AND COMMUNICATIONS IN JAPAN PART III-FUNDAMENTAL ELECTRONIC SCIENCE, 2002, Vol. 85, No. 1, pp. 41-49
Authors: Shimasaki, S; Kabashima, Y (Tokyo Inst Technol, Interdisciplinary Grad Sch Sci & Engn, Yokohama, Kanagawa 2268502, Japan)
We employ statistical dynamics to study the convergence of the wake-sleep (W-S) algorithm, which is a learning algorithm for neural network models having hidden units. Although there have been several reports on the e...
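The wake-sleep algorithm referenced in the two journal records above alternates between a "wake" phase, which trains the generative model on hidden states inferred by a recognition model from real data, and a "sleep" phase, which trains the recognition model on fantasy data sampled from the generative model. As a point of reference only, here is a minimal sketch of the generic delta-rule scheme for a one-hidden-layer Helmholtz machine with binary units; the layer sizes, learning rate, and toy data are illustrative assumptions, not details taken from either paper.

```python
# Minimal wake-sleep sketch for a one-hidden-layer Helmholtz machine with
# binary units (illustrative assumption; not the models of the papers above).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    """Sample binary units with success probabilities p."""
    return (rng.random(p.shape) < p).astype(float)

n_v, n_h, lr = 8, 4, 0.05          # assumed layer sizes and learning rate

# Generative parameters: prior bias over h, and h -> v weights/bias.
b_g = np.zeros(n_h)
W_g = np.zeros((n_v, n_h))
c_g = np.zeros(n_v)

# Recognition parameters: v -> h weights/bias.
W_r = np.zeros((n_h, n_v))
b_r = np.zeros(n_h)

data = sample(np.full((100, n_v), 0.3))   # toy binary training patterns

for epoch in range(50):
    for v in data:
        # Wake phase: infer h with the recognition model, then update the
        # generative parameters toward reproducing (h, v).
        h = sample(sigmoid(W_r @ v + b_r))
        b_g += lr * (h - sigmoid(b_g))
        p_v = sigmoid(W_g @ h + c_g)
        W_g += lr * np.outer(v - p_v, h)
        c_g += lr * (v - p_v)

        # Sleep phase: dream (h, v) from the generative model, then update
        # the recognition parameters to recover h from the dreamed v.
        h_s = sample(sigmoid(b_g))
        v_s = sample(sigmoid(W_g @ h_s + c_g))
        q_h = sigmoid(W_r @ v_s + b_r)
        W_r += lr * np.outer(h_s - q_h, v_s)
        b_r += lr * (h_s - q_h)
```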
On the locality of the natural gradient for learning in deep Bayesian networks
INFORMATION GEOMETRY, 2023, Vol. 6, No. 1, pp. 1-49
Authors: Ay, Nihat (Max Planck Inst Math Sci, D-04103 Leipzig, Germany; Univ Leipzig, D-04109 Leipzig, Germany; Santa Fe Inst, Santa Fe, NM 87501, USA)
We study the natural gradient method for learning in deep Bayesian networks, including neural networks. There are two natural geometries associated with such learning systems consisting of visible and hidden units. On...
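The record above studies natural-gradient learning, in which the Euclidean gradient is preconditioned by the inverse Fisher information matrix of the model's parameter manifold (the Fisher-Rao metric). The sketch below is only a generic illustration of that idea on a univariate Gaussian N(mu, sigma^2) parameterized by (mu, log sigma), where the Fisher matrix is diagonal and known in closed form; it is not the deep Bayesian network construction analyzed in the paper, and the data and step size are assumptions.

```python
# Natural-gradient ascent on the log-likelihood of a univariate Gaussian,
# parameterized by (mu, log_sigma). Generic illustration with assumed data.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=0.5, size=1000)   # assumed toy data

mu, log_sigma, lr = 0.0, 0.0, 0.1

for step in range(200):
    sigma = np.exp(log_sigma)
    # Euclidean gradient of the average log-likelihood.
    g_mu = np.mean(x - mu) / sigma**2
    g_ls = np.mean((x - mu) ** 2) / sigma**2 - 1.0
    # In (mu, log_sigma) coordinates the Fisher matrix is diag(1/sigma^2, 2),
    # so the natural gradient just rescales each component.
    mu += lr * (sigma**2 * g_mu)
    log_sigma += lr * (g_ls / 2.0)

print(mu, np.exp(log_sigma))   # converges to roughly 2.0 and 0.5
```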
Wake-Sleep Variational Autoencoders for Language Modeling
24th International Conference on Neural Information Processing (ICONIP)
Authors: Shen, Xiaoyu; Su, Hui; Niu, Shuzi; Klakow, Dietrich (Saarland Univ, Spoken Language Syst LSV, Saarbrucken, Germany; Univ Chinese Acad Sci, Software Inst, Beijing, Peoples R China)
Variational Autoencoders (VAEs) are known to easily suffer from the KL-vanishing problem when combined with powerful autoregressive models like recurrent neural networks (RNNs), which prohibits their wide application...
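The last record targets the KL-vanishing problem that arises when a VAE's decoder is a strong autoregressive model. For context only, a common baseline mitigation (not the wake-sleep VAE method proposed in the paper) is KL-cost annealing, where the weight on the KL term is ramped from 0 to 1 during training. The sketch below assumes PyTorch and illustrative hyperparameters (vocabulary size, hidden sizes, warm-up length).

```python
# Text VAE with a linearly annealed KL weight (a generic baseline sketch,
# not the method of the ICONIP paper above). Sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, EMB, HID, Z = 1000, 64, 128, 32

class TextVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, EMB)
        self.encoder = nn.GRU(EMB, HID, batch_first=True)
        self.to_mu = nn.Linear(HID, Z)
        self.to_logvar = nn.Linear(HID, Z)
        self.z_to_h = nn.Linear(Z, HID)
        self.decoder = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, VOCAB)

    def forward(self, tokens):
        x = self.embed(tokens)                        # (B, T, EMB)
        _, h_enc = self.encoder(x)                    # (1, B, HID)
        mu, logvar = self.to_mu(h_enc[-1]), self.to_logvar(h_enc[-1])
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterize
        h0 = torch.tanh(self.z_to_h(z)).unsqueeze(0)  # condition decoder on z
        dec, _ = self.decoder(x, h0)                  # teacher forcing
        return self.out(dec), mu, logvar

def annealed_loss(logits, targets, mu, logvar, step, warmup=10000):
    """Reconstruction loss plus a KL term whose weight ramps from 0 to 1."""
    recon = F.cross_entropy(logits.reshape(-1, VOCAB), targets.reshape(-1))
    kl = -0.5 * torch.mean(torch.sum(1 + logvar - mu**2 - logvar.exp(), dim=1))
    beta = min(1.0, step / warmup)
    return recon + beta * kl

# Dummy usage: reconstruct the same tokens (a real LM would shift targets by one).
model = TextVAE()
tokens = torch.randint(0, VOCAB, (8, 20))
logits, mu, logvar = model(tokens)
loss = annealed_loss(logits, tokens, mu, logvar, step=500)
loss.backward()
```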