Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

ATOM: Accurate Tracking by Overlap Maximization

Martin Danelljan, Goutam Bhat, Fahad Shahbaz Khan, Michael Felsberg

2018-11-19 · CVPR 2019
Tasks: Visual Object Tracking, Visual Tracking, Object Tracking, General Classification
Links: Paper · PDF · Code (official)

Abstract

While recent years have witnessed astonishing improvements in visual tracking robustness, the advancements in tracking accuracy have been limited. As the focus has been directed towards the development of powerful classifiers, the problem of accurate target state estimation has been largely overlooked. In fact, most trackers resort to a simple multi-scale search in order to estimate the target bounding box. We argue that this approach is fundamentally limited since target estimation is a complex task, requiring high-level knowledge about the object. We address this problem by proposing a novel tracking architecture, consisting of dedicated target estimation and classification components. High level knowledge is incorporated into the target estimation through extensive offline learning. Our target estimation component is trained to predict the overlap between the target object and an estimated bounding box. By carefully integrating target-specific information, our approach achieves previously unseen bounding box accuracy. We further introduce a classification component that is trained online to guarantee high discriminative power in the presence of distractors. Our final tracking framework sets a new state-of-the-art on five challenging benchmarks. On the new large-scale TrackingNet dataset, our tracker ATOM achieves a relative gain of 15% over the previous best approach, while running at over 30 FPS. Code and models are available at https://github.com/visionml/pytracking.
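The quantity the offline-trained estimation component regresses is the intersection-over-union (IoU) between a candidate bounding box and the target. As an illustrative sketch (not part of the pytracking API), the IoU training target for a candidate box can be computed like this, assuming boxes in (x, y, w, h) format:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes given as (x, y, w, h).

    This is the overlap measure that ATOM's target estimation component is
    trained to predict for a candidate box; the function itself is an
    illustrative helper, not code from the official repository.
    """
    ax1, ay1, aw, ah = box_a
    bx1, by1, bw, bh = box_b
    ax2, ay2 = ax1 + aw, ay1 + ah   # bottom-right corner of box_a
    bx2, by2 = bx1 + bw, by1 + bh   # bottom-right corner of box_b

    # Width/height of the intersection rectangle (clamped at zero).
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h

    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0
```

For example, two 2x2 boxes offset by (1, 1) overlap in a 1x1 region, giving an IoU of 1/7; a perfect candidate box scores 1.0.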

Results

Task                   | Dataset     | Metric               | Value | Model
-----------------------|-------------|----------------------|-------|------
Object Tracking        | FE108       | Averaged Precision   | 71.3  | ATOM
Object Tracking        | FE108       | Success Rate         | 46.5  | ATOM
Object Tracking        | LaSOT       | AUC                  | 51.4  | ATOM
Object Tracking        | LaSOT       | Normalized Precision | 57.6  | ATOM
Object Tracking        | LaSOT       | Precision            | 50.5  | ATOM
Object Tracking        | GOT-10k     | Average Overlap      | 61    | ATOM
Object Tracking        | GOT-10k     | Success Rate 0.5     | 74.2  | ATOM
Object Tracking        | TrackingNet | Accuracy             | 70.34 | ATOM
Object Tracking        | TrackingNet | Normalized Precision | 77.11 | ATOM
Object Tracking        | TrackingNet | Precision            | 64.84 | ATOM
Visual Object Tracking | LaSOT       | AUC                  | 51.4  | ATOM
Visual Object Tracking | LaSOT       | Normalized Precision | 57.6  | ATOM
Visual Object Tracking | LaSOT       | Precision            | 50.5  | ATOM
Visual Object Tracking | GOT-10k     | Average Overlap      | 61    | ATOM
Visual Object Tracking | GOT-10k     | Success Rate 0.5     | 74.2  | ATOM
Visual Object Tracking | TrackingNet | Accuracy             | 70.34 | ATOM
Visual Object Tracking | TrackingNet | Normalized Precision | 77.11 | ATOM
Visual Object Tracking | TrackingNet | Precision            | 64.84 | ATOM

Related Papers

- 2025-07-17 · MVA 2025 Small Multi-Object Tracking for Spotting Birds Challenge: Dataset, Methods, and Results
- 2025-07-16 · YOLOv8-SMOT: An Efficient and Robust Framework for Real-Time Small Object Tracking via Slice-Assisted Training and Adaptive Association
- 2025-07-10 · HiM2SAM: Enhancing SAM2 with Hierarchical Motion Estimation and Memory Optimization towards Long-term Tracking
- 2025-07-08 · What You Have is What You Track: Adaptive and Robust Multimodal Tracking
- 2025-07-07 · Robustifying 3D Perception through Least-Squares Multi-Agent Graphs Object Tracking
- 2025-07-01 · UMDATrack: Unified Multi-Domain Adaptive Tracking Under Adverse Weather Conditions
- 2025-06-30 · Mamba-FETrack V2: Revisiting State Space Model for Frame-Event based Visual Object Tracking
- 2025-06-30 · Visual and Memory Dual Adapter for Multi-Modal Object Tracking