Interacting Two-Hand 3D Pose and Shape Reconstruction from Single Color Image

Baowen Zhang

Institute of Software, Chinese Academy of Sciences
University of Chinese Academy of Sciences
Southeast University
Google
Simon Fraser University
Alibaba


We propose a novel deep learning architecture that estimates the 3D hand poses as well as fine-grained hand shapes of interacting hands from a single color image.



Abstract

We propose a novel deep learning framework to reconstruct the 3D hand poses and shapes of two interacting hands from a single color image. Previous methods designed for a single hand cannot be easily applied to the two-hand scenario because of the heavy inter-hand occlusion and the larger solution space. To address the occlusion and the similar appearance between hands that may confuse the network, we design a hand pose-aware attention module to extract the features associated with each individual hand. We then leverage the context present in two-hand interaction and propose a context-aware cascaded refinement that improves the pose and shape accuracy of each hand conditioned on the context between the interacting hands. Extensive experiments on the main benchmark datasets demonstrate that our method predicts accurate 3D hand pose and shape from a single color image, and achieves state-of-the-art performance.
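The pose-aware attention idea can be illustrated with a minimal sketch: image features are re-weighted by a per-hand attention map aggregated from that hand's joint heatmaps, so each branch sees features tied to one hand despite overlap and similar appearance. This is a toy pure-Python stand-in, not the paper's learned module; the `attend` function and its scalar feature maps are illustrative assumptions.

```python
def attend(features, heatmaps):
    """Toy pose-aware attention (illustrative sketch).

    features: H x W grid of feature values (one scalar per cell here,
              standing in for a feature vector per pixel).
    heatmaps: list of H x W joint heatmaps for ONE hand.
    Returns the features weighted by that hand's aggregated attention map.
    """
    h, w = len(features), len(features[0])
    # Aggregate the hand's joint heatmaps into one attention map
    # (max over joints, so any joint of this hand activates the cell).
    attn = [[max(hm[i][j] for hm in heatmaps) for j in range(w)]
            for i in range(h)]
    # Suppress features outside this hand's attention region.
    return [[features[i][j] * attn[i][j] for j in range(w)]
            for i in range(h)]
```

In the paper the attention is learned and applied to CNN feature maps; here the max-aggregation and element-wise weighting only convey the structure of the computation.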




Network Architecture

Our network first predicts 2.5D heatmaps for the joints of the two hands. It then uses three branches to recover the MANO model parameters of each hand and the relative transformation between the two hands. Finally, it refines the hand shape parameters jointly in a cascaded manner to respect the correlation context between the interacting hands.
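The cascaded refinement step can be sketched as follows. This is a toy pure-Python illustration of the control flow only: each stage sees the current estimates of both hands, so later stages can exploit the interaction context. The `toy_stage` update rule and the `TRUE_LEFT`/`TRUE_RIGHT` targets are hypothetical stand-ins for the paper's learned regressors and MANO parameters.

```python
def cascaded_refinement(left, right, stages):
    """Refine both hands' parameter vectors stage by stage.

    Each stage receives the CURRENT estimates of both hands, so its
    correction for one hand is conditioned on the other hand as well
    (the interaction context a per-hand regressor cannot see).
    """
    for stage in stages:
        d_left, d_right = stage(left, right)
        left = [p + d for p, d in zip(left, d_left)]
        right = [p + d for p, d in zip(right, d_right)]
    return left, right


# Hypothetical ground-truth vectors standing in for MANO parameters.
TRUE_LEFT, TRUE_RIGHT = [0.2, -0.1, 0.4], [-0.3, 0.5, 0.0]

def toy_stage(left, right):
    # Stand-in for one learned refinement stage: predict half of the
    # remaining residual for each hand (a real stage would regress
    # this correction from features of both hands).
    d_left = [0.5 * (t - p) for p, t in zip(left, TRUE_LEFT)]
    d_right = [0.5 * (t - p) for p, t in zip(right, TRUE_RIGHT)]
    return d_left, d_right

# Three cascaded stages shrink the residual of both hands geometrically.
left, right = cascaded_refinement([0.0, 0.0, 0.0], [0.0, 0.0, 0.0],
                                  [toy_stage] * 3)
```

The design point the sketch conveys is that refinement is joint and iterative: errors in one hand's estimate influence, and are corrected using, the other hand's current estimate.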



Paper, Code and Data

B. Zhang, Y. Wang, X. Deng, Y. Zhang, P. Tan, C. Ma, H. Wang

Interacting Two-Hand 3D Pose and Shape Reconstruction from Single Color Image.

ICCV, 2021.

[Paper]     [Supp]     [Bibtex]     [Code]    



Results


Qualitative comparison of interacting-hand reconstruction between our method and state-of-the-art single-hand reconstruction methods.



Acknowledgements

This work was supported in part by the National Natural Science Foundation of China (No. 62076061, No. 61473276, No. 61872346), the Beijing Natural Science Foundation (4212029, L182052), and the Newton Prize 2019 China Award (NP2PB/100047). The website is modified from this template.