
MSA-GCN: Multiscale Adaptive Graph Convolution Network for Gait Emotion Recognition

Authors: Yin, Yunfei; Jing, Li; Huang, Faliang; Yang, Guangchao; Wang, Zhuowei

Affiliations: School of Computer Science and Technology, Chongqing University, Chongqing, China; Guangxi Key Lab of Human-Machine Interaction and Intelligent Decision, Nanning Normal University, Nanning 530001, China; Faculty of Engineering and Information Technology, University of Technology Sydney, Australia

Publication: arXiv

Year: 2022


Subject: Emotion Recognition

Abstract: Gait emotion recognition plays a crucial role in intelligent systems. Most existing methods recognize emotions by focusing on local actions over time. However, they ignore that the effective distances of different emotions in the time domain differ, and that local actions during walking are quite similar. Thus, emotions should be represented by global states instead of indirect local actions. To address these issues, a novel MultiScale Adaptive Graph Convolution Network (MSA-GCN) is presented in this work, which constructs dynamic temporal receptive fields and performs multiscale information aggregation to recognize emotions. In our model, an adaptive selective spatial-temporal graph convolution is designed to select the convolution kernel dynamically and obtain the soft spatiotemporal features of different emotions. Moreover, a Cross-Scale mapping Fusion Mechanism (CSFM) is designed to construct an adaptive adjacency matrix to enhance information interaction and reduce redundancy. Compared with previous state-of-the-art methods, the proposed method achieves the best performance on two public datasets, improving the mAP by 2%. We also conduct extensive ablation studies to show the effectiveness of the different components of our method. Copyright © 2022, The Authors. All rights reserved.
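To make the two mechanisms named in the abstract (an adaptive adjacency matrix over skeleton joints and a soft selection among temporal convolution kernels of different scales) concrete, the following is a minimal PyTorch sketch. It is an illustrative assumption, not the authors' released implementation: the class names `AdaptiveGraphConv` and `SelectiveTemporalConv`, the identity base adjacency, and the tensor sizes are all placeholders chosen for the example.

```python
# Minimal sketch of an adaptive graph convolution and a multiscale
# kernel-selection block, in the spirit of the abstract. All names and
# shapes here are illustrative assumptions, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AdaptiveGraphConv(nn.Module):
    """Graph convolution over skeleton joints with a learned residual adjacency."""

    def __init__(self, in_channels, out_channels, num_joints):
        super().__init__()
        # A fixed skeleton adjacency would normally come from the dataset;
        # an identity matrix stands in for it here (assumption).
        self.register_buffer("base_adj", torch.eye(num_joints))
        # Learnable residual adjacency lets the model adapt joint connectivity.
        self.adaptive_adj = nn.Parameter(torch.zeros(num_joints, num_joints))
        self.proj = nn.Conv2d(in_channels, out_channels, kernel_size=1)

    def forward(self, x):
        # x: (batch, channels, time, joints)
        adj = self.base_adj + self.adaptive_adj       # adaptive adjacency
        adj = F.softmax(adj, dim=-1)                  # row-normalize
        x = torch.einsum("nctv,vw->nctw", x, adj)     # aggregate over joints
        return self.proj(x)


class SelectiveTemporalConv(nn.Module):
    """Softly selects among temporal kernels of different sizes (multiscale)."""

    def __init__(self, channels, kernel_sizes=(3, 5, 7)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, kernel_size=(k, 1), padding=(k // 2, 0))
            for k in kernel_sizes
        )
        # Gating network producing one weight per temporal branch.
        self.gate = nn.Linear(channels, len(kernel_sizes))

    def forward(self, x):
        # x: (batch, channels, time, joints)
        feats = torch.stack([b(x) for b in self.branches], dim=1)  # (N, B, C, T, V)
        context = x.mean(dim=(2, 3))                               # (N, C)
        weights = F.softmax(self.gate(context), dim=-1)            # (N, B)
        return (weights[:, :, None, None, None] * feats).sum(dim=1)


if __name__ == "__main__":
    x = torch.randn(2, 16, 48, 25)  # batch, channels, frames, joints (assumed sizes)
    y = SelectiveTemporalConv(16)(AdaptiveGraphConv(16, 16, 25)(x))
    print(y.shape)  # torch.Size([2, 16, 48, 25])
```

In this sketch the gating weights play the role of dynamically choosing the temporal receptive field per sample, while the learned residual adjacency loosely mirrors the adaptive adjacency matrix attributed to the CSFM; the paper's actual cross-scale fusion is more involved than what is shown here.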
