Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


EagerMOT: 3D Multi-Object Tracking via Sensor Fusion

Aleksandr Kim, Aljoša Ošep, Laura Leal-Taixé

Published: 2021-04-29
Tasks: Sensor Fusion, Motion Planning, Multi-Object Tracking and Segmentation, Multi-Object Tracking, Object Tracking, Multiple Object Tracking, 3D Multi-Object Tracking

Abstract

Multi-object tracking (MOT) enables mobile robots to perform well-informed motion planning and navigation by localizing surrounding objects in 3D space and time. Existing methods rely on depth sensors (e.g., LiDAR) to detect and track targets in 3D space, but only up to a limited sensing range due to the sparsity of the signal. On the other hand, cameras provide a dense and rich visual signal that helps to localize even distant objects, but only in the image domain. In this paper, we propose EagerMOT, a simple tracking formulation that eagerly integrates all available object observations from both sensor modalities to obtain a well-informed interpretation of the scene dynamics. Using images, we can identify distant incoming objects, while depth estimates allow for precise trajectory localization as soon as objects are within the depth-sensing range. With EagerMOT, we achieve state-of-the-art results across several MOT tasks on the KITTI and NuScenes datasets. Our code is available at https://github.com/aleksandrkim61/EagerMOT.
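The abstract describes a two-stage fusion: tracks are first associated with 3D detections (precise localization within depth-sensing range), and tracks left unmatched are then associated with 2D image detections, so distant camera-only objects keep their identity. A minimal greedy sketch of that idea is below; all names, distance/IoU thresholds, and the greedy matching itself are illustrative assumptions, not the authors' implementation (see the linked repository for the actual code).

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class Track:
    track_id: int
    center_3d: Optional[Tuple[float, float, float]] = None  # last 3D position
    box_2d: Optional[Tuple[float, float, float, float]] = None  # last image box

def iou_2d(a, b):
    """Intersection-over-union of two axis-aligned image boxes (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def dist_3d(a, b):
    return sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5

def associate(tracks, dets_3d, dets_2d, max_dist=2.0, min_iou=0.3):
    """Two-stage greedy association (illustrative thresholds):
    stage 1 matches 3D detections to tracks by center distance;
    stage 2 matches 2D detections to the remaining unmatched tracks by IoU."""
    matches_3d, matches_2d = {}, {}
    free = set(range(len(tracks)))
    # Stage 1: 3D detections give precise localization within depth range.
    for j, det in enumerate(dets_3d):
        cands = [(dist_3d(tracks[i].center_3d, det), i)
                 for i in free if tracks[i].center_3d is not None]
        if cands:
            d, i = min(cands)
            if d <= max_dist:
                matches_3d[i] = j
                free.discard(i)
    # Stage 2: 2D detections keep distant, camera-only objects alive.
    for j, det in enumerate(dets_2d):
        cands = [(iou_2d(tracks[i].box_2d, det), i)
                 for i in free if tracks[i].box_2d is not None]
        if cands:
            s, i = max(cands)
            if s >= min_iou:
                matches_2d[i] = j
                free.discard(i)
    return matches_3d, matches_2d
```

For example, a track with a recent LiDAR position is claimed by a nearby 3D detection in stage 1, while a track that only ever had an image box (an object beyond depth-sensing range) can still be matched to a 2D detection in stage 2.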

Results

Task | Dataset | Metric | Value | Model
Video | KITTI Test (Online Methods) | HOTA | 74.39 | EagerMOT
Video | KITTI Test (Online Methods) | IDSW | 239 | EagerMOT
Video | KITTI Test (Online Methods) | MOTA | 87.82 | EagerMOT
Multi-Object Tracking | nuScenes | AMOTA | 0.68 | EagerMOT
Multi-Object Tracking | nuScenes | MOTA | 0.57 | EagerMOT
Multi-Object Tracking | nuScenes | Recall | 0.73 | EagerMOT
Multi-Object Tracking | nuScenes | AMOTA | 0.66 | PolarMOT
Object Tracking | nuScenes | AMOTA | 0.68 | EagerMOT
Object Tracking | nuScenes | MOTA | 0.57 | EagerMOT
Object Tracking | nuScenes | Recall | 0.73 | EagerMOT
Object Tracking | nuScenes | AMOTA | 0.66 | PolarMOT
Object Tracking | KITTI Test (Online Methods) | HOTA | 74.39 | EagerMOT
Object Tracking | KITTI Test (Online Methods) | IDSW | 239 | EagerMOT
Object Tracking | KITTI Test (Online Methods) | MOTA | 87.82 | EagerMOT
3D Multi-Object Tracking | nuScenes | AMOTA | 0.68 | EagerMOT
3D Multi-Object Tracking | nuScenes | MOTA | 0.57 | EagerMOT
3D Multi-Object Tracking | nuScenes | Recall | 0.73 | EagerMOT
3D Multi-Object Tracking | nuScenes | AMOTA | 0.66 | PolarMOT
Multi-Object Tracking and Segmentation | KITTI MOTS | AssA | 73.75 | EagerMOT
Multi-Object Tracking and Segmentation | KITTI MOTS | DetA | 76.11 | EagerMOT
Multi-Object Tracking and Segmentation | KITTI MOTS | HOTA | 74.66 | EagerMOT
Multiple Object Tracking | KITTI Test (Online Methods) | HOTA | 74.39 | EagerMOT
Multiple Object Tracking | KITTI Test (Online Methods) | IDSW | 239 | EagerMOT
Multiple Object Tracking | KITTI Test (Online Methods) | MOTA | 87.82 | EagerMOT

Related Papers

MVA 2025 Small Multi-Object Tracking for Spotting Birds Challenge: Dataset, Methods, and Results (2025-07-17)
Vision-based Perception for Autonomous Vehicles in Obstacle Avoidance Scenarios (2025-07-16)
YOLOv8-SMOT: An Efficient and Robust Framework for Real-Time Small Object Tracking via Slice-Assisted Training and Adaptive Association (2025-07-16)
Data-Driven Meta-Analysis and Public-Dataset Evaluation for Sensor-Based Gait Age Estimation (2025-07-15)
HiM2SAM: Enhancing SAM2 with Hierarchical Motion Estimation and Memory Optimization towards Long-term Tracking (2025-07-10)
Event-RGB Fusion for Spacecraft Pose Estimation Under Harsh Lighting (2025-07-08)
Robustifying 3D Perception through Least-Squares Multi-Agent Graphs Object Tracking (2025-07-07)
UMDATrack: Unified Multi-Domain Adaptive Tracking Under Adverse Weather Conditions (2025-07-01)