3D Reconstruction of Clothes using a Human Body Model and its Application to Image-based Virtual Try-On

Image-based virtual try-on (VTON) approaches have attracted attention because they do not require 3D modeling. However, 2D cloth warping algorithms cannot capture the 3D spatial transformations needed for diverse target human poses. To address this problem, we propose a hybrid 2D/3D method. First, a 3D clothing mesh is reconstructed by leveraging a 3D human body model in a rest pose. Owing to this correspondence, the resulting 3D clothing models can easily be transferred to target human models with different poses and shapes estimated from 2D images. Finally, the deformed clothing models are rendered and blended with the target human representations. Experimental results on an open dataset show that the shapes of the reconstructed clothing are more natural than 2D image-based deformation results when human poses and shapes are estimated accurately.
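The paper itself does not include code here, but the core idea of the transfer step (clothing reconstructed in correspondence with a rest-pose body model is re-posed together with that body) can be illustrated with a minimal sketch. The nearest-body-vertex binding with inverse-distance weights below is an assumed, simplified stand-in for the paper's actual deformation scheme, and all names and the placeholder geometry are hypothetical.

```python
# Minimal sketch: deform a rest-pose clothing mesh when its underlying body
# model is re-posed. Each clothing vertex is bound to its k nearest rest-pose
# body vertices and follows their displacements (inverse-distance blending).
# This binding is an illustrative assumption, not necessarily the paper's method.
import numpy as np
from scipy.spatial import cKDTree

def transfer_clothing(cloth_v_rest, body_v_rest, body_v_posed, k=4):
    """Move clothing vertices by blending displacements of the k nearest
    rest-pose body vertices."""
    tree = cKDTree(body_v_rest)
    dists, idx = tree.query(cloth_v_rest, k=k)          # (N, k) neighbors per clothing vertex
    w = 1.0 / np.maximum(dists, 1e-8)                   # inverse-distance weights
    w /= w.sum(axis=1, keepdims=True)
    disp = body_v_posed[idx] - body_v_rest[idx]         # (N, k, 3) neighbor displacements
    return cloth_v_rest + np.einsum('nk,nkd->nd', w, disp)

if __name__ == "__main__":
    # Placeholder geometry; in practice the body vertices would come from a
    # parametric body model (e.g., SMPL) fitted to the source and target images.
    rng = np.random.default_rng(0)
    body_rest  = rng.normal(size=(6890, 3))   # rest-pose body vertices
    body_posed = body_rest + 0.05              # stand-in for re-posed body vertices
    cloth_rest = rng.normal(size=(5000, 3))    # reconstructed clothing vertices
    cloth_posed = transfer_clothing(cloth_rest, body_rest, body_posed)
    print(cloth_posed.shape)
```

In this sketch the deformed clothing vertices could then be rendered and blended with the target person image, as described in the abstract.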
