Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Tracking-by-Trackers with a Distilled and Reinforced Model

Matteo Dunnhofer, Niki Martinel, Christian Micheloni

2020-07-08 · Visual Object Tracking · Visual Tracking · Object Tracking · Knowledge Distillation · Video Object Tracking

Paper · PDF · Code (official)

Abstract

Visual object tracking has generally been tackled by reasoning independently about fast processing algorithms, accurate online adaptation methods, and the fusion of trackers. In this paper, we unify these goals by proposing a novel tracking methodology that takes advantage of other visual trackers, both offline and online. A compact student model is trained by combining knowledge distillation and reinforcement learning. The former transfers and compresses the tracking knowledge of other trackers; the latter enables the learning of evaluation measures that are then exploited online. After learning, the student can be used to build (i) a very fast single-shot tracker, (ii) a tracker with a simple and effective online adaptation mechanism, and (iii) a tracker that performs fusion of other trackers. Extensive validation shows that the proposed algorithms compete with real-time state-of-the-art trackers.
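To make the distillation idea concrete, here is a minimal sketch, not the paper's actual implementation: a student's bounding-box prediction is supervised by whichever teacher tracker best matches the ground truth on a frame. The function names (`iou`, `distill_loss`) and the plain L1 loss are our own illustrative assumptions.

```python
# Illustrative sketch of tracking knowledge distillation (hypothetical,
# not the authors' code). Boxes are (x, y, w, h) tuples.

def iou(a, b):
    """Intersection-over-union of two (x, y, w, h) boxes."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    ix1, iy1 = max(ax, bx), max(ay, by)
    ix2, iy2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = aw * ah + bw * bh - inter
    return inter / union if union > 0 else 0.0

def distill_loss(student_box, teacher_boxes, ground_truth):
    """L1 distance between the student's box and the teacher box that
    best overlaps the ground truth, i.e. the 'best teacher' supervises."""
    best = max(teacher_boxes, key=lambda t: iou(t, ground_truth))
    return sum(abs(s - t) for s, t in zip(student_box, best))
```

In the paper's setting the student additionally learns evaluation measures via reinforcement learning, which this sketch omits; here the teacher-selection step alone shows how multiple trackers' knowledge can be compressed into one student.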

Results

| Task                   | Dataset   | Metric           | Value | Model    |
|------------------------|-----------|------------------|-------|----------|
| Video                  | NT-VOT211 | AUC              | 23.58 | TRAS     |
| Video                  | NT-VOT211 | Precision        | 30.64 | TRAS     |
| Object Tracking        | UAV123    | AUC              | 0.679 | TRASFUST |
| Object Tracking        | UAV123    | Precision        | 0.873 | TRASFUST |
| Object Tracking        | LaSOT     | AUC              | 57.6  | TRASFUST |
| Object Tracking        | GOT-10k   | Average Overlap  | 61.7  | TRASFUST |
| Object Tracking        | GOT-10k   | Success Rate 0.5 | 72.9  | TRASFUST |
| Object Tracking        | OTB-2015  | AUC              | 0.701 | TRASFUST |
| Object Tracking        | OTB-2015  | Precision        | 0.931 | TRASFUST |
| Object Tracking        | NT-VOT211 | AUC              | 23.58 | TRAS     |
| Object Tracking        | NT-VOT211 | Precision        | 30.64 | TRAS     |
| Visual Object Tracking | UAV123    | AUC              | 0.679 | TRASFUST |
| Visual Object Tracking | UAV123    | Precision        | 0.873 | TRASFUST |
| Visual Object Tracking | LaSOT     | AUC              | 57.6  | TRASFUST |
| Visual Object Tracking | GOT-10k   | Average Overlap  | 61.7  | TRASFUST |
| Visual Object Tracking | GOT-10k   | Success Rate 0.5 | 72.9  | TRASFUST |
| Visual Object Tracking | OTB-2015  | AUC              | 0.701 | TRASFUST |
| Visual Object Tracking | OTB-2015  | Precision        | 0.931 | TRASFUST |

Related Papers

Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (2025-07-21)
MVA 2025 Small Multi-Object Tracking for Spotting Birds Challenge: Dataset, Methods, and Results (2025-07-17)
Uncertainty-Aware Cross-Modal Knowledge Distillation with Prototype Learning for Multimodal Brain-Computer Interfaces (2025-07-17)
YOLOv8-SMOT: An Efficient and Robust Framework for Real-Time Small Object Tracking via Slice-Assisted Training and Adaptive Association (2025-07-16)
DVFL-Net: A Lightweight Distilled Video Focal Modulation Network for Spatio-Temporal Action Recognition (2025-07-16)
HanjaBridge: Resolving Semantic Ambiguity in Korean LLMs via Hanja-Augmented Pre-Training (2025-07-15)
Feature Distillation is the Better Choice for Model-Heterogeneous Federated Learning (2025-07-14)
KAT-V1: Kwai-AutoThink Technical Report (2025-07-11)