
From Recognition to Prediction: Leveraging Sequence Reasoning for Action Anticipation

Authors: Liu, Xin; Hao, Chao; Yu, Zitong; Yue, Huanjing; Yang, Jingyu

Affiliations: School of Electrical and Information Engineering, Tianjin University, China; Computer Vision and Pattern Recognition Laboratory, School of Engineering Sciences, Lappeenranta-Lahti University of Technology LUT, Finland; School of Computing and Information Technology, Great Bay University, China

Publication: arXiv

Year: 2024


Subject: Video analysis

Abstract: The action anticipation task refers to predicting what action will happen based on observed video, which requires the model both to summarize the present and to reason about the future. Experience and common sense suggest that there are significant correlations between different actions, which provide valuable prior knowledge for action anticipation. However, previous methods have not effectively modeled this underlying statistical relationship. To address this issue, we propose a novel end-to-end, attention-based video modeling architecture named Anticipation via Recognition and Reasoning (ARR). ARR decomposes the action anticipation task into action recognition and sequence reasoning, and learns the statistical relationship between actions through next action prediction (NAP). Compared with existing temporal aggregation strategies, ARR extracts more effective features from observable video and thus makes more reasonable predictions. In addition, to address the challenge that relationship modeling requires extensive training data, we propose an unsupervised pre-training approach for the decoder that leverages the inherent temporal dynamics of video to enhance the network's reasoning capability. Extensive experiments on the EPIC-Kitchens-100, EGTEA Gaze+, and 50Salads datasets demonstrate the efficacy of the proposed methods. The code is available at https://***/linuxsino/ARR. Copyright © 2024, The Authors. All rights reserved.
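The abstract's central idea — learning the statistical relationship between actions via next-action prediction — can be illustrated with a deliberately simplified, count-based sketch. Note that ARR itself uses an attention-based decoder over video features; the bigram counting, function names, and kitchen-style action labels below are illustrative assumptions, not the paper's method.

```python
from collections import Counter, defaultdict

def fit_nap(sequences):
    """Count action-bigram statistics from observed action sequences."""
    transitions = defaultdict(Counter)
    for seq in sequences:
        for prev, nxt in zip(seq, seq[1:]):
            transitions[prev][nxt] += 1
    return transitions

def anticipate(transitions, last_action):
    """Predict the most likely next action given the last observed one."""
    counts = transitions.get(last_action)
    if not counts:
        return None  # action never observed as a predecessor
    return counts.most_common(1)[0][0]

# Toy action sequences with made-up labels, standing in for the
# annotated action streams of a video dataset.
sequences = [
    ["take knife", "cut tomato", "put tomato"],
    ["take knife", "cut onion", "put onion"],
    ["take knife", "cut tomato", "wash knife"],
]
model = fit_nap(sequences)
print(anticipate(model, "take knife"))  # → "cut tomato" (seen twice after it)
```

A count-based model like this captures only first-order co-occurrence; the point of a learned sequence-reasoning decoder, as the abstract argues, is to model such action-to-action regularities from richer observed context rather than from label bigrams alone.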
