Papers With Code 2


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Towards Grand Unification of Object Tracking

Bin Yan, Yi Jiang, Peize Sun, Dong Wang, Zehuan Yuan, Ping Luo, Huchuan Lu

2022-07-14 · Visual Object Tracking · Multi-Object Tracking and Segmentation · Multi-Object Tracking · Object Tracking · Multiple Object Tracking · Video Object Tracking

Paper · PDF · Code (official)

Abstract

We present a unified method, termed Unicorn, that can simultaneously solve four tracking problems (SOT, MOT, VOS, MOTS) with a single network using the same model parameters. Because the object tracking problem itself has fragmented definitions, most existing trackers are developed to address a single task or a subset of tasks, and overspecialize in the characteristics of those specific tasks. By contrast, Unicorn provides a unified solution, adopting the same input, backbone, embedding, and head across all tracking tasks. For the first time, we accomplish the grand unification of the tracking network architecture and learning paradigm. Unicorn performs on par with or better than its task-specific counterparts on 8 tracking datasets, including LaSOT, TrackingNet, MOT17, BDD100K, DAVIS16-17, MOTS20, and BDD100K MOTS. We believe that Unicorn will serve as a solid step towards the general vision model. Code is available at https://github.com/MasterBin-IIAU/Unicorn.
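The abstract's core design claim is that one set of parameters (same input, backbone, embedding, and head) serves all four tracking tasks: the per-task difference is only how many reference targets are passed to the shared association head. The toy sketch below illustrates that idea under stated assumptions; every class and function name here is illustrative, not the actual Unicorn implementation.

```python
# Illustrative sketch of a unified tracker: one backbone, one embedding,
# one association head, all sharing the same parameters across tasks
# (SOT, MOT, VOS, MOTS). Names and shapes are assumptions for this demo,
# not Unicorn's real architecture.
import numpy as np

class UnifiedTracker:
    """The same parameters serve every tracking task."""

    def __init__(self, feat_dim: int = 8, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.backbone_w = rng.standard_normal((feat_dim, feat_dim))
        self.embed_w = rng.standard_normal((feat_dim, feat_dim))

    def backbone(self, frame: np.ndarray) -> np.ndarray:
        # Stand-in for a shared conv backbone: linear map + tanh.
        return np.tanh(frame @ self.backbone_w)

    def embed(self, feats: np.ndarray) -> np.ndarray:
        # Task-agnostic target embedding, L2-normalized per target.
        e = feats @ self.embed_w
        return e / np.linalg.norm(e, axis=-1, keepdims=True)

    def associate(self, ref: np.ndarray, cur: np.ndarray) -> np.ndarray:
        # Unified head: match each current detection to a reference target
        # by cosine similarity. SOT/VOS pass a single reference row;
        # MOT/MOTS pass many — the head itself is unchanged.
        return np.argmax(cur @ ref.T, axis=1)

tracker = UnifiedTracker()
targets = np.eye(8)[:3]                         # three toy "targets"
ref = tracker.embed(tracker.backbone(targets))  # reference frame
cur = tracker.embed(tracker.backbone(targets))  # same targets, next frame
matches = tracker.associate(ref, cur)
print(matches.tolist())  # -> [0, 1, 2]: each detection matched to its target
```

Because the embeddings are L2-normalized, self-similarity is exactly 1, so identical targets across frames match to themselves; this is only a correspondence toy, with no box regression or mask head.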

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Video | BDD100K val | mIDF1 | 54 | Unicorn |
| Video | BDD100K val | mMOTA | 41.2 | Unicorn |
| Video | NT-VOT211 | AUC | 34.52 | Unicorn |
| Video | NT-VOT211 | Precision | 47.77 | Unicorn |
| Multi-Object Tracking | MOTS20 | IDF1 | 65.9 | Unicorn |
| Multi-Object Tracking | MOTS20 | sMOTSA | 65.3 | Unicorn |
| Multi-Object Tracking | MOT17 | HOTA | 61.7 | Unicorn |
| Multi-Object Tracking | MOT17 | IDF1 | 75.5 | Unicorn |
| Multi-Object Tracking | MOT17 | MOTA | 77.2 | Unicorn |
| Object Tracking | MOTS20 | IDF1 | 65.9 | Unicorn |
| Object Tracking | MOTS20 | sMOTSA | 65.3 | Unicorn |
| Object Tracking | MOT17 | HOTA | 61.7 | Unicorn |
| Object Tracking | MOT17 | IDF1 | 75.5 | Unicorn |
| Object Tracking | MOT17 | MOTA | 77.2 | Unicorn |
| Object Tracking | LaSOT | AUC | 68.5 | Unicorn |
| Object Tracking | LaSOT | Normalized Precision | 76.6 | Unicorn |
| Object Tracking | LaSOT | Precision | 74.1 | Unicorn |
| Object Tracking | TrackingNet | Accuracy | 83 | Unicorn |
| Object Tracking | TrackingNet | Normalized Precision | 86.4 | Unicorn |
| Object Tracking | TrackingNet | Precision | 82.2 | Unicorn |
| Object Tracking | BDD100K val | mIDF1 | 54 | Unicorn |
| Object Tracking | BDD100K val | mMOTA | 41.2 | Unicorn |
| Object Tracking | NT-VOT211 | AUC | 34.52 | Unicorn |
| Object Tracking | NT-VOT211 | Precision | 47.77 | Unicorn |
| Multi-Object Tracking and Segmentation | BDD100K val | mMOTSA | 29.6 | Unicorn |
| Multiple Object Tracking | BDD100K val | mIDF1 | 54 | Unicorn |
| Multiple Object Tracking | BDD100K val | mMOTA | 41.2 | Unicorn |
| Visual Object Tracking | LaSOT | AUC | 68.5 | Unicorn |
| Visual Object Tracking | LaSOT | Normalized Precision | 76.6 | Unicorn |
| Visual Object Tracking | LaSOT | Precision | 74.1 | Unicorn |
| Visual Object Tracking | TrackingNet | Accuracy | 83 | Unicorn |
| Visual Object Tracking | TrackingNet | Normalized Precision | 86.4 | Unicorn |
| Visual Object Tracking | TrackingNet | Precision | 82.2 | Unicorn |

Related Papers

- MVA 2025 Small Multi-Object Tracking for Spotting Birds Challenge: Dataset, Methods, and Results (2025-07-17)
- YOLOv8-SMOT: An Efficient and Robust Framework for Real-Time Small Object Tracking via Slice-Assisted Training and Adaptive Association (2025-07-16)
- HiM2SAM: Enhancing SAM2 with Hierarchical Motion Estimation and Memory Optimization towards Long-term Tracking (2025-07-10)
- Robustifying 3D Perception through Least-Squares Multi-Agent Graphs Object Tracking (2025-07-07)
- UMDATrack: Unified Multi-Domain Adaptive Tracking Under Adverse Weather Conditions (2025-07-01)
- Mamba-FETrack V2: Revisiting State Space Model for Frame-Event based Visual Object Tracking (2025-06-30)
- Visual and Memory Dual Adapter for Multi-Modal Object Tracking (2025-06-30)
- R1-Track: Direct Application of MLLMs to Visual Object Tracking via Reinforcement Learning (2025-06-27)