
High-order tensor completion via gradient-based optimization under tensor train format

Authors: Yuan, Longhao; Zhao, Qibin; Gui, Lihua; Cao, Jianting

Affiliations: Saitama Inst Technol, Grad Sch Engn, Fukaya, Saitama, Japan; RIKEN Ctr Adv Intelligence Project (AIP), Tensor Learning Unit, Tokyo, Japan; Guangdong Univ Technol, Sch Automat, Guangzhou, Guangdong, Peoples R China; Hangzhou Dianzi Univ, Sch Comp Sci & Technol, Hangzhou, Zhejiang, Peoples R China

Publication: SIGNAL PROCESSING-IMAGE COMMUNICATION

Year/Volume: 2019, Vol. 73

Pages: 53-61

Subject Classification: 0808 [Engineering - Electrical Engineering]; 08 [Engineering]

Funding: JSPS KAKENHI, Japan [17K00326, 15H04002, 18K04178]; JST CREST, Japan [JP-MJCR1784]; National Natural Science Foundation of China

Keywords: Tensor completion; Visual data recovery; Tensor train decomposition; Higher-order tensorization; Gradient-based optimization

Abstract: Tensor train (TT) decomposition has drawn attention for its powerful representation ability and performance stability on high-order tensors. In this paper, we propose a novel approach to recovering the missing entries of incomplete data represented by higher-order tensors. We attempt to find a low-rank TT decomposition of the incomplete data that captures the latent features of the whole data, and then reconstruct the missing entries from it. By applying gradient descent algorithms, the tensor completion problem is solved efficiently through optimization models. We propose two TT-based algorithms, Tensor Train Weighted Optimization (TT-WOPT) and Tensor Train Stochastic Gradient Descent (TT-SGD), to optimize the TT decomposition factors. In addition, we propose a method named Visual Data Tensorization (VDT) that transforms visual data into higher-order tensors, which improves the performance of our algorithms. Experiments on synthetic and visual data show the high efficiency and performance of our algorithms compared with state-of-the-art completion algorithms, especially in high-order, high-missing-rate, and large-scale tensor completion settings.
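To make the weighted-optimization idea in the abstract concrete, below is a minimal NumPy sketch of a TT-WOPT-style completion step, not the authors' implementation: it reconstructs the tensor from its TT cores, restricts the residual to the observed entries via a binary mask W, and takes a plain gradient step on every core. All function names (tt_to_full, tt_wopt_step, etc.), the initialization scale, the step size, and the iteration count are illustrative assumptions; TT-SGD would instead sample a subset of observed entries at each step rather than using the full mask.

```python
import numpy as np

def tt_to_full(cores):
    """Contract TT cores (each of shape r_{k-1} x n_k x r_k) into the full tensor."""
    full = cores[0]
    for core in cores[1:]:
        full = np.tensordot(full, core, axes=(-1, 0))
    return full[0, ..., 0]                      # drop boundary ranks r_0 = r_N = 1

def left_contraction(cores, k):
    """Contract cores 0..k-1 into a matrix of shape (n_1*...*n_{k-1}, r_{k-1})."""
    m = np.ones((1, 1))                         # boundary case r_0 = 1 (k = 0)
    for core in cores[:k]:
        m = np.tensordot(m, core, axes=(-1, 0)).reshape(-1, core.shape[-1])
    return m

def right_contraction(cores, k):
    """Contract cores k+1..N-1 into a matrix of shape (r_k, n_{k+1}*...*n_N)."""
    m = np.ones((1, 1))                         # boundary case r_N = 1 (k = N-1)
    for core in reversed(cores[k + 1:]):
        m = np.tensordot(core, m, axes=(-1, 0)).reshape(core.shape[0], -1)
    return m

def tt_wopt_step(cores, Y, W, lr):
    """One gradient-descent step on f = 0.5 * ||W * (X(cores) - Y)||_F^2.

    W is a binary mask (1 = observed), so the residual is restricted to
    observed entries: the weighted-optimization idea. A sketch, not the
    paper's optimizer.
    """
    dims = Y.shape
    E = W * (tt_to_full(cores) - Y)             # masked residual, equals df/dX
    grads = []
    for k in range(len(cores)):
        P = int(np.prod(dims[:k], dtype=int))
        Q = int(np.prod(dims[k + 1:], dtype=int))
        E3 = E.reshape(P, dims[k], Q)           # unfold the residual around mode k
        L = left_contraction(cores, k)          # (P, r_{k-1})
        R = right_contraction(cores, k)         # (r_k, Q)
        grads.append(np.einsum('pa,piq,bq->aib', L, E3, R))
    for core, g in zip(cores, grads):           # update all cores with the same X
        core -= lr * g
    return 0.5 * np.sum(E * E)

# Toy run: a random 4-way tensor of TT ranks (1,3,3,3,1), 70% of entries missing.
rng = np.random.default_rng(0)
dims, ranks = (8, 8, 8, 8), (1, 3, 3, 3, 1)
truth = [rng.standard_normal((ranks[i], dims[i], ranks[i + 1])) for i in range(4)]
Y = tt_to_full(truth)
W = (rng.random(dims) < 0.3).astype(float)      # 30% observed
cores = [0.3 * rng.standard_normal(c.shape) for c in truth]
for _ in range(5000):                           # step size/iterations are untuned
    loss = tt_wopt_step(cores, W * Y, W, lr=1e-4)
print(f"final masked squared error: {loss:.4f}")
```

The gradient in tt_wopt_step follows from unfolding the reconstruction around mode k: X reshaped to (prod n_{<k}, n_k, prod n_{>k}) is linear in core G_k, with the left and right contractions of the remaining cores as its coefficients, so the chain rule reduces to the single einsum shown above.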
