Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


PARE: Part Attention Regressor for 3D Human Body Estimation

Muhammed Kocabas, Chun-Hao P. Huang, Otmar Hilliges, Michael J. Black

Published: 2021-04-17 · ICCV 2021
Tasks: 3D Human Pose Estimation · 3D Human Pose and Shape Estimation · 3D Multi-Person Pose Estimation
Links: Paper · PDF · Code (official)

Abstract

Despite significant progress, we show that state-of-the-art 3D human pose and shape estimation methods remain sensitive to partial occlusion and can produce dramatically wrong predictions even though much of the body is observable. To address this, we introduce a soft attention mechanism, called the Part Attention REgressor (PARE), that learns to predict body-part-guided attention masks. We observe that state-of-the-art methods rely on global feature representations, making them sensitive to even small occlusions. In contrast, PARE's part-guided attention mechanism overcomes these issues by exploiting information about the visibility of individual body parts while leveraging information from neighboring body parts to predict occluded parts. We show qualitatively that PARE learns sensible attention masks, and quantitative evaluation confirms that PARE achieves more accurate and robust reconstruction results than existing approaches on both occlusion-specific and standard benchmarks. The code and data are available for research purposes at https://pare.is.tue.mpg.de/
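The core idea of part-guided attention — predicting one spatial attention map per body part and using it to pool localized features, so that an occluded region only degrades the parts it covers — can be sketched as follows. This is an illustrative NumPy sketch of soft spatial-attention pooling, not the official PARE implementation; the function name, shapes, and the assumption of a `(H, W, C)` backbone feature map are hypothetical.

```python
import numpy as np

def part_attention_pool(features, part_logits):
    """Soft part-attention pooling (illustrative sketch, not official PARE code).

    features:    (H, W, C) spatial feature map from a CNN backbone (assumed layout)
    part_logits: (H, W, J) unnormalized attention logits, one map per body part
    returns:     (J, C) one pooled feature vector per body part
    """
    H, W, C = features.shape
    J = part_logits.shape[-1]
    logits = part_logits.reshape(H * W, J)
    # Spatial softmax: each part's attention mask sums to 1 over all pixels,
    # so each part can focus on its own visible region independently.
    exp = np.exp(logits - logits.max(axis=0, keepdims=True))
    attn = exp / exp.sum(axis=0, keepdims=True)        # (H*W, J)
    # Attention-weighted sum of features for each part.
    pooled = attn.T @ features.reshape(H * W, C)       # (J, C)
    return pooled
```

Because each part gets its own normalized mask, features for a visible part are pooled from its own neighborhood even when another part's region is occluded — which is the property the abstract credits for PARE's occlusion robustness.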

Results

| Task                            | Dataset | Metric                 | Value   | Model |
|---------------------------------|---------|------------------------|---------|-------|
| 3D Human Pose Estimation        | AGORA   | B-MPJPE                | 146.2   | PARE  |
| 3D Human Pose Estimation        | AGORA   | B-MVE                  | 140.9   | PARE  |
| 3D Human Pose Estimation        | AGORA   | B-NMJE                 | 174     | PARE  |
| 3D Human Pose Estimation        | AGORA   | B-NMVE                 | 167.7   | PARE  |
| 3D Human Pose Estimation        | EMDB    | Average MPJAE (deg)    | 24.673  | PARE  |
| 3D Human Pose Estimation        | EMDB    | Average MPJAE-PA (deg) | 22.3842 | PARE  |
| 3D Human Pose Estimation        | EMDB    | Average MPJPE (mm)     | 113.887 | PARE  |
| 3D Human Pose Estimation        | EMDB    | Average MPJPE-PA (mm)  | 72.203  | PARE  |
| 3D Human Pose Estimation        | EMDB    | Average MVE (mm)       | 133.247 | PARE  |
| 3D Human Pose Estimation        | EMDB    | Average MVE-PA (mm)    | 85.3788 | PARE  |
| 3D Human Pose Estimation        | EMDB    | Jitter (10m/s^3)       | 75.1137 | PARE  |
| 3D Multi-Person Pose Estimation | AGORA   | B-MPJPE                | 146.2   | PARE  |
| 3D Multi-Person Pose Estimation | AGORA   | B-MVE                  | 140.9   | PARE  |
| 3D Multi-Person Pose Estimation | AGORA   | B-NMJE                 | 174     | PARE  |
| 3D Multi-Person Pose Estimation | AGORA   | B-NMVE                 | 167.7   | PARE  |
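The table's MPJPE and "-PA" metrics differ only in a rigid pre-alignment step: PA (Procrustes-aligned) variants fit a similarity transform (rotation, scale, translation) from the prediction to the ground truth before measuring joint error, isolating pose accuracy from global placement. A minimal NumPy sketch of this common evaluation recipe (not the AGORA or EMDB reference code; function names are illustrative):

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean per-joint position error: average Euclidean distance, same units as input."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

def pa_mpjpe(pred, gt):
    """Procrustes-aligned MPJPE: rigidly align pred (J, 3) to gt (J, 3) first.

    Solves the orthogonal Procrustes problem with scale via SVD of the
    cross-covariance of the mean-centered point sets.
    """
    mu_p, mu_g = pred.mean(axis=0), gt.mean(axis=0)
    p, g = pred - mu_p, gt - mu_g
    U, S, Vt = np.linalg.svd(p.T @ g)       # cross-covariance, (3, 3)
    A = U @ Vt                              # optimal rotation (applied as p @ A)
    if np.linalg.det(A) < 0:                # fix reflections
        Vt[-1] *= -1
        S = S.copy(); S[-1] *= -1
        A = U @ Vt
    scale = S.sum() / (p ** 2).sum()        # optimal isotropic scale
    aligned = scale * p @ A + mu_g
    return mpjpe(aligned, gt)
```

After alignment, a prediction that is correct up to a global rotation, scale, and translation scores near zero PA-MPJPE even when its raw MPJPE is large — which is why the PA columns above are consistently lower.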

Related Papers

Systematic Comparison of Projection Methods for Monocular 3D Human Pose Estimation on Fisheye Images (2025-06-24)
ExtPose: Robust and Coherent Pose Estimation by Extending ViTs (2025-06-18)
PoseGRAF: Geometric-Reinforced Adaptive Fusion for Monocular 3D Human Pose Estimation (2025-06-17)
Learning Pyramid-structured Long-range Dependencies for 3D Human Pose Estimation (2025-06-03)
UPTor: Unified 3D Human Pose Dynamics and Trajectory Prediction for Human-Robot Interaction (2025-05-20)
PoseBench3D: A Cross-Dataset Analysis Framework for 3D Human Pose Estimation (2025-05-16)
HDiffTG: A Lightweight Hybrid Diffusion-Transformer-GCN Architecture for 3D Human Pose Estimation (2025-05-07)
Continuous Normalizing Flows for Uncertainty-Aware Human Pose Estimation (2025-05-04)