Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Mo2Cap2: Real-time Mobile 3D Motion Capture with a Cap-mounted Fisheye Camera

Weipeng Xu, Avishek Chatterjee, Michael Zollhoefer, Helge Rhodin, Pascal Fua, Hans-Peter Seidel, Christian Theobalt

2018-03-15 · Egocentric Pose Estimation · Pose Estimation · 3D Pose Estimation
Paper · PDF

Abstract

We propose the first real-time approach for the egocentric estimation of 3D human body pose in a wide range of unconstrained everyday activities. This setting has a unique set of challenges, such as mobility of the hardware setup and robustness to long capture sessions with fast recovery from tracking failures. We tackle these challenges with a novel lightweight setup that converts a standard baseball cap into a device for high-quality pose estimation based on a single cap-mounted fisheye camera. From the captured egocentric live stream, our CNN-based 3D pose estimation approach runs at 60 Hz on a consumer-level GPU. In addition to the novel hardware setup, our other main contributions are: 1) a large ground-truth training corpus of top-down fisheye images and 2) a novel disentangled 3D pose estimation approach that takes the unique properties of the egocentric viewpoint into account. As shown by our evaluation, we achieve lower 3D joint error as well as better 2D overlay than the existing baselines.
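The top-down fisheye viewpoint the abstract refers to differs from a standard pinhole camera: image radius grows with the angle from the optical axis, which keeps the whole body in view from a cap-mounted position. As a minimal sketch of this geometry, the widely used equidistant fisheye model maps a 3D point via r = f·θ; the intrinsics (`f`, `cx`, `cy`) below are illustrative placeholders, not the calibration used in the paper.

```python
import math

def equidistant_project(X, Y, Z, f=300.0, cx=512.0, cy=512.0):
    """Project a 3D point in camera coordinates (Z along the optical axis)
    with the equidistant fisheye model r = f * theta.
    f, cx, cy are made-up placeholder intrinsics for illustration."""
    theta = math.atan2(math.hypot(X, Y), Z)  # angle from the optical axis
    r = f * theta                            # equidistant: radius grows linearly with angle
    phi = math.atan2(Y, X)                   # azimuth around the image center
    return cx + r * math.cos(phi), cy + r * math.sin(phi)
```

A point on the optical axis lands at the image center, and a point 90° off-axis is still mapped to a finite radius (r = f·π/2), unlike the pinhole model, where it would project to infinity.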

Results

| Task | Dataset | Metric | Value | Model |
|------|---------|--------|-------|-------|
| 3D Human Pose Estimation | GlobalEgoMocap Test Dataset | Average MPJPE (mm) | 102.3 | Mo2Cap2 |
| 3D Human Pose Estimation | GlobalEgoMocap Test Dataset | PA-MPJPE | 74.46 | Mo2Cap2 |
| 3D Human Pose Estimation | SceneEgo | Average MPJPE (mm) | 200.3 | Mo2Cap2 |
| 3D Human Pose Estimation | SceneEgo | PA-MPJPE | 121.2 | Mo2Cap2 |
| Pose Estimation | GlobalEgoMocap Test Dataset | Average MPJPE (mm) | 102.3 | Mo2Cap2 |
| Pose Estimation | GlobalEgoMocap Test Dataset | PA-MPJPE | 74.46 | Mo2Cap2 |
| Pose Estimation | SceneEgo | Average MPJPE (mm) | 200.3 | Mo2Cap2 |
| Pose Estimation | SceneEgo | PA-MPJPE | 121.2 | Mo2Cap2 |
| 3D | GlobalEgoMocap Test Dataset | Average MPJPE (mm) | 102.3 | Mo2Cap2 |
| 3D | GlobalEgoMocap Test Dataset | PA-MPJPE | 74.46 | Mo2Cap2 |
| 3D | SceneEgo | Average MPJPE (mm) | 200.3 | Mo2Cap2 |
| 3D | SceneEgo | PA-MPJPE | 121.2 | Mo2Cap2 |
| 1 Image, 2*2 Stitchi | GlobalEgoMocap Test Dataset | Average MPJPE (mm) | 102.3 | Mo2Cap2 |
| 1 Image, 2*2 Stitchi | GlobalEgoMocap Test Dataset | PA-MPJPE | 74.46 | Mo2Cap2 |
| 1 Image, 2*2 Stitchi | SceneEgo | Average MPJPE (mm) | 200.3 | Mo2Cap2 |
| 1 Image, 2*2 Stitchi | SceneEgo | PA-MPJPE | 121.2 | Mo2Cap2 |
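The two metrics in the table have standard definitions: MPJPE is the mean Euclidean distance between predicted and ground-truth 3D joints, and PA-MPJPE is the same error after first aligning the prediction to the ground truth with a Procrustes (similarity) transform, which removes global rotation, scale, and translation. A minimal sketch of both, assuming NumPy and `(J, 3)` joint arrays:

```python
import numpy as np

def mpjpe(pred, gt):
    """Mean Per-Joint Position Error: mean Euclidean joint distance
    (in whatever units the inputs use, e.g. mm)."""
    return np.linalg.norm(pred - gt, axis=-1).mean()

def pa_mpjpe(pred, gt):
    """Procrustes-aligned MPJPE: best similarity transform of pred onto gt
    (rotation + uniform scale + translation), then MPJPE."""
    mu_p, mu_g = pred.mean(axis=0), gt.mean(axis=0)
    p, g = pred - mu_p, gt - mu_g
    # Optimal rotation from the SVD of the cross-covariance (orthogonal Procrustes)
    U, s, Vt = np.linalg.svd(p.T @ g)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # flip to avoid a reflection
        Vt[-1] *= -1
        s[-1] *= -1
        R = Vt.T @ U.T
    scale = s.sum() / (p ** 2).sum()
    aligned = scale * p @ R.T + mu_g
    return mpjpe(aligned, gt)
```

PA-MPJPE is always at most MPJPE, which matches the table: on both datasets the PA-MPJPE value is well below the corresponding average MPJPE.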

Related Papers

- $π^3$: Scalable Permutation-Equivariant Visual Geometry Learning (2025-07-17)
- Revisiting Reliability in the Reasoning-based Pose Estimation Benchmark (2025-07-17)
- DINO-VO: A Feature-based Visual Odometry Leveraging a Visual Foundation Model (2025-07-17)
- From Neck to Head: Bio-Impedance Sensing for Head Pose Estimation (2025-07-17)
- AthleticsPose: Authentic Sports Motion Dataset on Athletic Field and Evaluation of Monocular 3D Pose Estimation Ability (2025-07-17)
- SpatialTrackerV2: 3D Point Tracking Made Easy (2025-07-16)
- SGLoc: Semantic Localization System for Camera Pose Estimation from 3D Gaussian Splatting Representation (2025-07-16)
- Efficient Calisthenics Skills Classification through Foreground Instance Selection and Depth Estimation (2025-07-16)