DI-Net: Decomposed Implicit Garment Transfer Network for Digital Clothed 3D Human

Authors: Zhong, Xiaojing; Su, Yukun; Wu, Zhonghua; Lin, Guosheng; Wu, Qingyao

Affiliations: School of Software Engineering, South China University of Technology, Guangzhou, China; School of Computer Science and Engineering, Nanyang Technological University, Singapore

Published in: arXiv

Year: 2023

Keywords: Textures

Abstract: 3D virtual try-on enjoys many potential applications and hence has attracted wide attention. However, it remains a challenging task that has not been adequately solved. Existing 2D virtual try-on methods cannot be directly extended to 3D, since they lack the ability to perceive the depth of each pixel. Moreover, 3D virtual try-on approaches are mostly built on a fixed topological structure and require heavy computation. To address these problems, we propose a Decomposed Implicit garment transfer network (DI-Net), which can effortlessly reconstruct a 3D human mesh with the new try-on result and preserve the texture from an arbitrary perspective. Specifically, DI-Net consists of two modules: 1) a complementary warping module that warps the reference image to the same pose as the source image through dense correspondence learning and sparse flow learning; and 2) a geometry-aware decomposed transfer module that decomposes the garment transfer into image-layout-based transfer and texture-based transfer, achieving surface and texture reconstruction by constructing pixel-aligned implicit functions. Experimental results show the effectiveness and superiority of our method on the 3D virtual try-on task, where it yields higher-quality results than existing methods. © 2023, CC BY.
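The abstract's second module relies on pixel-aligned implicit functions: a 3D query point is projected onto the image plane, the image feature at that pixel is sampled, and a small network conditioned on that feature plus the point's depth predicts occupancy. The sketch below illustrates that general query pattern only; it is not the authors' code, and the orthographic projection, the random feature map, and the linear `mlp` stand-in are all hypothetical placeholders.

```python
import numpy as np

def bilinear_sample(feat, x, y):
    """Bilinearly sample a (C, H, W) feature map at a continuous pixel (x, y)."""
    C, H, W = feat.shape
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, W - 1), min(y0 + 1, H - 1)
    wx, wy = x - x0, y - y0
    return ((1 - wx) * (1 - wy) * feat[:, y0, x0]
            + wx * (1 - wy) * feat[:, y0, x1]
            + (1 - wx) * wy * feat[:, y1, x0]
            + wx * wy * feat[:, y1, x1])

def query_occupancy(point, feat, mlp):
    """Pixel-aligned implicit query: project a 3D point onto the image,
    sample the aligned feature, and predict inside/outside from it."""
    x, y, z = point  # assumed orthographic camera: (x, y) index the image directly
    f = bilinear_sample(feat, x, y)
    return mlp(np.concatenate([f, [z]]))  # condition on sampled feature + depth

# Toy stand-ins: a random feature map and a fixed sigmoid "MLP".
rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 16, 16))
w = rng.standard_normal(9)
mlp = lambda v: 1.0 / (1.0 + np.exp(-w @ v))  # maps to an occupancy in (0, 1)

occ = query_occupancy((7.3, 5.8, 0.2), feat, mlp)
```

Because the function is queried per point rather than per mesh vertex, a surface can be extracted at any resolution (e.g. via marching cubes), which is what frees such methods from a fixed mesh topology.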
