Render in-between: Motion guided video synthesis for action interpolation

Authors: Ho, Hsuan-I; Chen, Xu; Song, Jie; Hilliges, Otmar

Affiliations: Department of Computer Science, ETH Zürich; Max Planck Institute for Intelligent Systems, Tübingen, Germany

Publication: arXiv

Year: 2021

Subject: Signal sampling

Abstract: Frame rate greatly impacts the style and viewing experience of a video. Especially for footage depicting fast motion, a higher frame rate yields smoother imagery and fewer motion-blur artifacts. Upsampling low-frame-rate videos of human activity is an interesting yet challenging task with many potential applications, ranging from gaming to entertainment and sports broadcasting. The main difficulty in synthesizing video frames in this setting stems from the highly complex and non-linear nature of human motion and from the complex appearance and texture of the body. We propose to address these issues with a motion-guided frame-upsampling framework that is capable of producing realistic human motion and appearance. A novel motion model is trained to interpolate the non-linear skeletal motion between frames by leveraging a large-scale motion-capture dataset (AMASS). The high-frame-rate pose predictions are then used by a neural rendering pipeline to produce the full-frame output, taking pose and background consistency into consideration. Our pipeline requires only low-frame-rate videos and unpaired human motion data; it does not require high-frame-rate videos for training. Furthermore, we contribute the first evaluation dataset for this task, consisting of high-quality, high-frame-rate videos of human activities. Compared with state-of-the-art video interpolation techniques, our method produces interpolated frames with better quality and accuracy, as evidenced by state-of-the-art results on pixel-level and distributional metrics and by comparative user evaluations. Our code and the collected dataset are available at https://***/Render-In-Between. Copyright © 2021, The Authors. All rights reserved.
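
Note: The abstract describes a two-stage design: first interpolate skeletal poses with a learned motion model, then render full frames conditioned on the predicted poses and the background. The following is a minimal Python sketch of that control flow only, not the authors' implementation; the motion_model and renderer interfaces are assumptions, and the linear-blend fallback stands in for the learned non-linear interpolator.

    import numpy as np

    def interpolate_poses(pose_a, pose_b, num_inbetween, motion_model=None):
        """Produce in-between poses for one keyframe interval.

        pose_a, pose_b: (J, 3) joint arrays (hypothetical layout).
        motion_model:   learned interpolator trained on motion-capture data
                        (e.g. AMASS); assumed callable interface.
        """
        if motion_model is not None:
            return motion_model(pose_a, pose_b, num_inbetween)
        # Naive linear fallback: the paper argues human motion is highly
        # non-linear, so a learned model should replace this in practice.
        ts = np.linspace(0.0, 1.0, num_inbetween + 2)[1:-1]
        return np.stack([(1 - t) * pose_a + t * pose_b for t in ts])

    def upsample_video(frames, poses, factor, motion_model, renderer):
        """Two-stage upsampling: interpolate poses, then render full frames.

        renderer: assumed neural-rendering callable mapping (pose, background)
                  to an image while keeping appearance consistent.
        """
        out = []
        for (f0, f1), (p0, p1) in zip(zip(frames, frames[1:]),
                                      zip(poses, poses[1:])):
            out.append(f0)
            for p in interpolate_poses(p0, p1, factor - 1, motion_model):
                out.append(renderer(pose=p, background=f0))
        out.append(frames[-1])
        return out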
