Knowledge Distillation on COCO 2017 val
Metric: AP@0.5 (higher is better)
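Under AP@0.5, a predicted box counts as a true positive when its Intersection-over-Union (IoU) with a ground-truth box is at least 0.5; average precision is then computed over the ranked detections. A minimal sketch of the IoU matching criterion (the box coordinates below are made-up example values):

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two boxes in (x1, y1, x2, y2) format."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Intersection area is zero if the boxes do not overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A detection matches a ground-truth box under AP@0.5 when iou(...) >= 0.5.
pred = (10, 10, 50, 50)
gt = (15, 15, 55, 55)
print(iou(pred, gt) >= 0.5)  # prints True
```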
Results
| # | Model | AP@0.5 | Extra Data | Paper | Date | Code |
|---|---|---|---|---|---|---|
| 1 | ReviewKD++ (T: Faster R-CNN (ResNet-101), S: Faster R-CNN (ResNet-50)) | 61.8 | No | Improving Knowledge Distillation via Regularizin... | 2023-05-26 | Code |
| 2 | ReviewKD++ (T: Faster R-CNN (ResNet-101), S: Faster R-CNN (ResNet-18)) | 57.96 | No | Improving Knowledge Distillation via Regularizin... | 2023-05-26 | Code |
| 3 | ReviewKD++ (T: Faster R-CNN (ResNet-101), S: Faster R-CNN (MobileNet-V2)) | 55.18 | No | Improving Knowledge Distillation via Regularizin... | 2023-05-26 | Code |
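Each entry distills a large teacher detector into a smaller student (e.g. ResNet-101 into ResNet-50). ReviewKD++ itself is a feature-based method, but the common logit-distillation baseline it builds on can be sketched as below: soften teacher and student logits with a temperature T and minimize KL(teacher || student), scaled by T² as in Hinton et al.'s formulation. The logits here are illustrative values, not from the paper.

```python
import math

def softmax_t(logits, temperature):
    """Temperature-softened softmax (numerically stabilized)."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, temperature=4.0):
    """KL(teacher || student) on softened distributions, scaled by T^2
    so gradients stay comparable across temperatures."""
    p = softmax_t(teacher_logits, temperature)
    q = softmax_t(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# The loss is zero when the student matches the teacher exactly,
# and positive otherwise.
teacher = [2.0, 1.0, 0.1]
student = [1.5, 1.2, 0.3]
print(kd_loss(teacher, student))
```

In practice this term is added to the student's ordinary detection loss with a weighting coefficient; feature-based methods such as ReviewKD++ additionally align intermediate feature maps between teacher and student.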