Author affiliations: The Redwood Center for Theoretical Neuroscience, The University of California, Berkeley, CA 94720, United States; The Intelligent Systems Lab, Research Institutes of Sweden, Kista 16440, Sweden; International Research and Training Center for Information Technologies and Systems, Kiev 03680, Ukraine; The Department of Computer Science, Electrical and Space Engineering, Luleå University of Technology, Luleå 97187, Sweden; IBM Research, Zurich 8803, Switzerland
Publication: arXiv
Year/Volume/Issue: 2021
Core indexing:
Subject: Vectors
Abstract: This is Part II of the two-part comprehensive survey devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to a family of computational models that use high-dimensional distributed representations and rely on the algebraic properties of their key operations to incorporate the advantages of structured symbolic representations and vector distributed representations. Holographic Reduced Representations [1], [2] is an influential HDC/VSA model that is well known in the machine learning domain and often used to refer to the whole family. However, for the sake of consistency, we use HDC/VSA to refer to the field. Part I of this survey [3] covered foundational aspects of the field, such as the historical context leading to the development of HDC/VSA, key elements of any HDC/VSA model, known HDC/VSA models, and the transformation of input data of various types into high-dimensional vectors suitable for HDC/VSA. This second part surveys existing applications, the role of HDC/VSA in cognitive computing and architectures, as well as directions for future work. Most of the applications lie within the Machine Learning/Artificial Intelligence domain; however, we also cover other applications to provide a complete picture. The survey is written to be useful for both newcomers and practitioners. Copyright © 2021, The Authors. All rights reserved.
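For orientation, the sketch below illustrates the kind of algebraic operations the abstract refers to. It is not taken from the survey: it uses a bipolar, MAP-style (Multiply-Add-Permute) model as one concrete example, and the dimensionality, role/filler names, and helper functions are illustrative assumptions (other HDC/VSA models, e.g. Holographic Reduced Representations, bind with circular convolution instead).

```python
import numpy as np

# Illustrative sketch of HDC/VSA-style operations with bipolar hypervectors
# (MAP-style model); dimensionality and names are assumptions, not the paper's code.

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (assumed for illustration)

def random_hv():
    """Random bipolar hypervector; quasi-orthogonal to any other in high dimensions."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding via element-wise multiplication; the result is dissimilar to both inputs."""
    return a * b

def bundle(*hvs):
    """Bundling (superposition) via element-wise sum and sign; similar to each input."""
    return np.sign(np.sum(hvs, axis=0))

def similarity(a, b):
    """Normalized dot product (cosine-like similarity for bipolar vectors)."""
    return np.dot(a, b) / D

# Encode a simple role-filler record: {color: red, shape: circle}
color, shape = random_hv(), random_hv()
red, circle = random_hv(), random_hv()
record = bundle(bind(color, red), bind(shape, circle))

# Binding with the (self-inverse) role vector recovers a noisy version of its filler.
print(similarity(bind(record, color), red))     # ~0.5: clearly above chance
print(similarity(bind(record, color), circle))  # ~0: unrelated filler
```

The point of the sketch is only that structured (role-filler) information can be composed and queried with simple vector algebra; the survey itself reviews the full range of models and applications built on such operations.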