Author Affiliations: Crypton AI, London SW7 2AZ, England; Maniyar Capital, London SW7 2AZ, England; Imperial Coll London, Dept Elect & Elect Engn, London SW7 2AZ, England
Publication: IEEE SIGNAL PROCESSING MAGAZINE (IEEE Signal Process Mag)
Year/Volume/Issue: 2022, Vol. 39, No. 5
Pages: 63-70
Core Indexing:
Subjects: Complexity theory; Tensors; Data analysis; Software libraries; Neural networks; Market research; Mathematical models; Big Data; Machine learning; Notch filters; Computational efficiency; Approximation error; Signal processing algorithms
Abstract: Tensors (multiway arrays) and tensor decompositions (TDs) have recently received tremendous attention in the data analytics community, due to their ability to mitigate the curse of dimensionality associated with modern large-dimensional big data [1], [2]. Indeed, TDs allow for data volume (e.g., the parameter complexity) to be reduced from scaling exponentially to scaling linearly in the tensor dimensions, which facilitates applications in areas including the compression and interpretability of neural networks [1], [3], multimodal learning [1], and completion of knowledge graphs [4], [5]. At the heart of TD techniques is the tensor contraction product (TCP), an operator used for representing even the most unmanageable higher-order tensors through a set of small-scale core tensors that are interconnected via TCP operations [2].
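
To make the exponential-to-linear storage reduction and the role of the tensor contraction product (TCP) mentioned in the abstract concrete, the following is a minimal illustrative sketch (not taken from the article). It builds a tensor-train-style chain of small core tensors and contracts them back into a full order-N tensor with NumPy; the dimension sizes, ranks, and function names are assumptions made purely for illustration.

import numpy as np

# Illustrative sketch only: a chain of small core tensors linked by tensor
# contraction products (TCPs), in the spirit of a tensor-train representation.
# All sizes below (N, dim, R) are assumed for illustration.
N = 6     # tensor order
dim = 4   # size of every mode (assumed)
R = 3     # rank connecting neighbouring cores (assumed)

rng = np.random.default_rng(0)
# One small core per mode, shaped (rank_in, dim, rank_out), with boundary ranks 1.
cores = [rng.standard_normal((1 if k == 0 else R, dim, 1 if k == N - 1 else R))
         for k in range(N)]

def contract_chain(cores):
    # Successive TCPs: contract the trailing rank index of the running result
    # with the leading rank index of the next core.
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=1)
    # Drop the dummy boundary ranks of size 1 to obtain the order-N tensor.
    return full.squeeze(axis=(0, -1))

full_tensor = contract_chain(cores)
print(full_tensor.shape)                                   # (4, 4, 4, 4, 4, 4)

# Storage: the full tensor grows exponentially in N, the cores only linearly.
print("full tensor entries:", dim ** N)                    # 4096
print("core entries:", sum(c.size for c in cores))         # 168 with these sizes

With these assumed sizes, the chained cores store 168 parameters versus 4096 for the full tensor, illustrating the exponential-to-linear reduction in parameter complexity that the abstract attributes to TCP-based tensor decompositions.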