Accelerating Tensor Contraction Products via Tensor-Train Decomposition [Tips & Tricks]

Authors: Kisil, Ilya; Calvi, Giuseppe G.; Konstantinidis, Kriton; Xu, Yao Lei; Mandic, Danilo P.

Affiliations: Crypton AI, London SW7 2AZ, England; Maniyar Capital, London SW7 2AZ, England; Imperial College London, Dept. of Electrical & Electronic Engineering, London SW7 2AZ, England

Publication: IEEE SIGNAL PROCESSING MAGAZINE (IEEE Signal Process Mag)

Year/Volume/Issue: 2022, Vol. 39, No. 5

Pages: 63-70

Subject Classification: 0808 [Engineering - Electrical Engineering]; 08 [Engineering]

Keywords: Complexity theory; Tensors; Data analysis; Software libraries; Neural networks; Market research; Mathematical models; Big Data; Machine learning; Notch filters; Computational efficiency; Approximation error; Signal processing algorithms

Abstract: Tensors (multiway arrays) and tensor decompositions (TDs) have recently received tremendous attention in the data analytics community, due to their ability to mitigate the curse of dimensionality associated with modern large-dimensional big data [1], [2]. Indeed, TDs allow for data volume (e.g., the parameter complexity) to be reduced from scaling exponentially to scaling linearly in the tensor dimensions, which facilitates applications in areas including the compression and interpretability of neural networks [1], [3], multimodal learning [1], and completion of knowledge graphs [4], [5]. At the heart of TD techniques is the tensor contraction product (TCP), an operator used for representing even the most unmanageable higher-order tensors through a set of small-scale core tensors that are interconnected via TCP operations [2].
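To illustrate the idea in the abstract, the following is a minimal NumPy sketch of the standard TT-SVD procedure, which represents a higher-order tensor as a chain of small third-order cores connected by contraction products (the function name `tt_svd` and the rank-truncation parameter are illustrative, not the article's own implementation):

```python
import numpy as np

def tt_svd(tensor, max_rank):
    """Decompose a d-way tensor into tensor-train (TT) cores
    via a sequence of truncated SVDs on reshaped unfoldings."""
    shape = tensor.shape
    d = len(shape)
    cores = []
    rank_prev = 1
    # First unfolding: separate mode 0 from the remaining modes.
    mat = tensor.reshape(rank_prev * shape[0], -1)
    for k in range(d - 1):
        U, S, Vt = np.linalg.svd(mat, full_matrices=False)
        rank = min(max_rank, len(S))
        # Left singular vectors become the k-th TT core.
        cores.append(U[:, :rank].reshape(rank_prev, shape[k], rank))
        # Push the remaining factor to the right and re-unfold.
        mat = (np.diag(S[:rank]) @ Vt[:rank]).reshape(rank * shape[k + 1], -1)
        rank_prev = rank
    # Whatever remains is the last core.
    cores.append(mat.reshape(rank_prev, shape[-1], 1))
    return cores

def tt_reconstruct(cores):
    """Contract the TT cores back into a full tensor (a chain of TCPs)."""
    rec = cores[0]
    for c in cores[1:]:
        rec = np.tensordot(rec, c, axes=([-1], [0]))
    # Drop the dummy boundary ranks of size 1.
    return rec.reshape([c.shape[1] for c in cores])
```

With `max_rank` large enough the reconstruction is exact; truncating it trades approximation error for the linear (rather than exponential) parameter scaling the abstract refers to.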
