Author Affiliation: School of Computer Science and Engineering, Center for Cyber Security, University of Electronic Science and Technology of China, Xi Yuan Ave., Chengdu 611731, Sichuan, China
Publication: Journal of Ambient Intelligence and Humanized Computing (J. Ambient Intell. Humanized Comput.)
Year/Volume/Issue: 2023, Vol. 14, No. 6
Pages: 7679-7693
Core Index:
Subject Classification: 07 [Natural Sciences]; 08 [Engineering]; 070105 [Natural Sciences - Operations Research and Cybernetics]; 071101 [Natural Sciences - Systems Theory]; 0710 [Natural Sciences - Biology]; 0810 [Engineering - Information and Communication Engineering]; 0711 [Natural Sciences - Systems Science]; 0835 [Engineering - Software Engineering]; 0714 [Natural Sciences - Statistics (degrees awarded in Science or Economics)]; 0836 [Engineering - Biological Engineering]; 081101 [Engineering - Control Theory and Control Engineering]; 0701 [Natural Sciences - Mathematics]; 0811 [Engineering - Control Science and Engineering]; 0812 [Engineering - Computer Science and Technology (degrees awarded in Engineering or Science)]
Abstract: Time series prediction is a subset of temporal data mining that seeks to forecast a series' future values from the historical observations available within specified time periods. Recent studies have shown the superiority of deep neural networks in predicting time series. Unfortunately, most models overlook the differences and interdependencies between variables when tackling the multivariate long sequence time-series forecasting problem. To explicitly learn these dependencies and account for them at a fine-grained level, thereby resolving their dynamics, we introduce Graph Convolutional Networks into the Transformer in this article. To counteract the local insensitivity of the Transformer, we further integrate a Temporal Convolutional Network as a component of the self-attention layer. Experimental results on four open datasets show that our model performs noticeably better than a diverse range of state-of-the-art benchmarks. © 2023, The Author(s), under exclusive licence to Springer-Verlag GmbH Germany, part of Springer Nature.
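To make the described architecture concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' code; all module and parameter names are illustrative assumptions). It shows the two ideas named in the abstract: a graph convolution over the variable dimension with a learned adjacency to capture inter-variable dependencies, and a Transformer-style block whose self-attention is paired with a causal temporal convolution branch to restore local sensitivity.

    # Illustrative sketch only; module names and shapes are assumptions.
    import torch
    import torch.nn as nn


    class VariableGraphConv(nn.Module):
        """Graph convolution across variables with a learned dependency graph."""

        def __init__(self, num_vars: int, d_model: int):
            super().__init__()
            self.adj = nn.Parameter(torch.eye(num_vars))  # learned variable adjacency
            self.proj = nn.Linear(d_model, d_model)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time, num_vars, d_model)
            a = torch.softmax(self.adj, dim=-1)            # row-normalised adjacency
            x = torch.einsum("vw,btwd->btvd", a, x)        # mix features across variables
            return self.proj(x)


    class ConvAttentionBlock(nn.Module):
        """Self-attention augmented with a causal temporal convolution branch."""

        def __init__(self, d_model: int, n_heads: int, kernel_size: int = 3):
            super().__init__()
            self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
            self.tcn = nn.Conv1d(d_model, d_model, kernel_size,
                                 padding=kernel_size - 1)  # pad, then trim to stay causal
            self.norm1 = nn.LayerNorm(d_model)
            self.norm2 = nn.LayerNorm(d_model)
            self.ff = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                    nn.Linear(4 * d_model, d_model))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, time, d_model)
            attn_out, _ = self.attn(x, x, x)
            conv_out = self.tcn(x.transpose(1, 2))[..., : x.size(1)].transpose(1, 2)
            x = self.norm1(x + attn_out + conv_out)        # fuse global and local context
            return self.norm2(x + self.ff(x))

In this sketch the attention branch captures long-range, global dependencies along the time axis, while the convolution branch supplies the local patterns that plain self-attention tends to miss; how exactly the paper fuses the two branches and parameterises the variable graph is specified in the article itself.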