Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Pose2Mesh: Graph Convolutional Network for 3D Human Pose and Mesh Recovery from a 2D Human Pose

Hongsuk Choi, Gyeongsik Moon, Kyoung Mu Lee

2020-08-20 · ECCV 2020 · 3D Human Pose Estimation · 3D Hand Pose Estimation
Paper · PDF · Code (official) · Code

Abstract

Most of the recent deep learning-based 3D human pose and mesh estimation methods regress the pose and shape parameters of human mesh models, such as SMPL and MANO, from an input image. The first weakness of these methods is an appearance domain gap problem, due to different image appearance between train data from controlled environments, such as a laboratory, and test data from in-the-wild environments. The second weakness is that the estimation of the pose parameters is quite challenging owing to the representation issues of 3D rotations. To overcome the above weaknesses, we propose Pose2Mesh, a novel graph convolutional neural network (GraphCNN)-based system that estimates the 3D coordinates of human mesh vertices directly from the 2D human pose. The 2D human pose as input provides essential human body articulation information, while having a relatively homogeneous geometric property between the two domains. Also, the proposed system avoids the representation issues, while fully exploiting the mesh topology using a GraphCNN in a coarse-to-fine manner. We show that our Pose2Mesh outperforms the previous 3D human pose and mesh estimation methods on various benchmark datasets. For the codes, see https://github.com/hongsukchoi/Pose2Mesh_RELEASE.
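The core operation the abstract describes is a graph convolution over the skeleton/mesh graph. The sketch below shows a single graph convolution step in pure Python; it is illustrative only, not the authors' coarse-to-fine Pose2Mesh architecture, and the 5-joint skeleton and weight values are made up for the example.

```python
# Minimal graph convolution over a 2D-pose skeleton graph (illustrative
# sketch, not the authors' full Pose2Mesh architecture).

def normalized_adjacency(edges, n):
    """Build A_hat = D^-1 (A + I): row-normalized adjacency with self-loops."""
    a = [[0.0] * n for _ in range(n)]
    for i, j in edges:
        a[i][j] = a[j][i] = 1.0
    for i in range(n):
        a[i][i] = 1.0                       # self-loop
    for i in range(n):
        deg = sum(a[i])
        a[i] = [v / deg for v in a[i]]      # row normalization
    return a

def graph_conv(x, a_hat, w):
    """One GraphCNN layer: relu(A_hat @ X @ W)."""
    n, f_in, f_out = len(x), len(x[0]), len(w[0])
    agg = [[sum(a_hat[i][k] * x[k][f] for k in range(n)) for f in range(f_in)]
           for i in range(n)]               # aggregate neighboring joints
    return [[max(0.0, sum(agg[i][f] * w[f][o] for f in range(f_in)))
             for o in range(f_out)] for i in range(n)]

# Toy 5-joint "skeleton": pelvis-spine-head plus two shoulders (hypothetical).
edges = [(0, 1), (1, 2), (1, 3), (1, 4)]
a_hat = normalized_adjacency(edges, 5)
pose_2d = [[0.0, 0.0], [0.0, 1.0], [0.0, 2.0], [-1.0, 1.5], [1.0, 1.5]]
w = [[1.0, 0.0, 0.5], [0.0, 1.0, 0.5]]     # toy weights: 2 -> 3 features
features = graph_conv(pose_2d, a_hat, w)   # 5 joints x 3 features
```

Stacking such layers while progressively upsampling the graph from the sparse joint set to the dense mesh gives the coarse-to-fine scheme mentioned in the abstract.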

Results

Task | Dataset | Metric | Value | Model
3D Hand Pose Estimation | FreiHAND | PA-F@15mm | 0.969 | Pose2Mesh
3D Hand Pose Estimation | FreiHAND | PA-F@5mm | 0.674 | Pose2Mesh
3D Hand Pose Estimation | FreiHAND | PA-MPJPE (mm) | 7.7 | Pose2Mesh
3D Hand Pose Estimation | FreiHAND | PA-MPVPE (mm) | 7.8 | Pose2Mesh
3D Hand Pose Estimation | HO-3D v2 | AUC_J | 0.754 | Pose2Mesh
3D Hand Pose Estimation | HO-3D v2 | AUC_V | 0.749 | Pose2Mesh
3D Hand Pose Estimation | HO-3D v2 | F@15mm | 0.909 | Pose2Mesh
3D Hand Pose Estimation | HO-3D v2 | F@5mm | 0.441 | Pose2Mesh
3D Hand Pose Estimation | HO-3D v2 | PA-MPJPE (mm) | 12.5 | Pose2Mesh
3D Hand Pose Estimation | HO-3D v2 | PA-MPVPE (mm) | 12.7 | Pose2Mesh
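For reading the table: MPJPE/MPVPE are mean per-joint/per-vertex position errors in millimeters, and the PA- prefix means a rigid Procrustes alignment is applied before measuring. A minimal sketch of the unaligned MPJPE and a threshold-based score, on made-up joint coordinates (the Procrustes step is omitted, and the published F@5mm/F@15mm values use a bidirectional point-cloud F-score rather than this per-joint fraction):

```python
import math

def mpjpe(pred, gt):
    """Mean per-joint position error, in the units of the input (here mm)."""
    dists = [math.dist(p, g) for p, g in zip(pred, gt)]
    return sum(dists) / len(dists)

def frac_within(pred, gt, thresh):
    """Fraction of joints within `thresh` of ground truth (PCK-style;
    simpler than the mesh F-score reported in the table)."""
    return sum(math.dist(p, g) <= thresh for p, g in zip(pred, gt)) / len(pred)

# Hypothetical 3-joint example, coordinates in mm.
gt   = [(0.0, 0.0, 0.0), (100.0, 0.0, 0.0), (0.0, 100.0, 0.0)]
pred = [(3.0, 0.0, 0.0), (100.0, 4.0, 0.0), (0.0, 100.0, 20.0)]
print(mpjpe(pred, gt))           # (3 + 4 + 20) / 3 = 9.0 mm
print(frac_within(pred, gt, 5))  # 2 of 3 joints within 5 mm
```

AUC_J and AUC_V in the table are the areas under the corresponding percentage-correct curves as the threshold is swept over a range.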

Related Papers

- Systematic Comparison of Projection Methods for Monocular 3D Human Pose Estimation on Fisheye Images (2025-06-24)
- ExtPose: Robust and Coherent Pose Estimation by Extending ViTs (2025-06-18)
- PoseGRAF: Geometric-Reinforced Adaptive Fusion for Monocular 3D Human Pose Estimation (2025-06-17)
- Monocular 3D Hand Pose Estimation with Implicit Camera Alignment (2025-06-10)
- Learning Pyramid-structured Long-range Dependencies for 3D Human Pose Estimation (2025-06-03)
- UPTor: Unified 3D Human Pose Dynamics and Trajectory Prediction for Human-Robot Interaction (2025-05-20)
- PoseBench3D: A Cross-Dataset Analysis Framework for 3D Human Pose Estimation (2025-05-16)
- HDiffTG: A Lightweight Hybrid Diffusion-Transformer-GCN Architecture for 3D Human Pose Estimation (2025-05-07)