Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


torchdistill: A Modular, Configuration-Driven Framework for Knowledge Distillation

Yoshitomo Matsubara

2020-11-25 · Image Classification · Model Compression · Instance Segmentation · Knowledge Distillation · Neural Network Compression · Object Detection
Paper · PDF · Code (official)

Abstract

While knowledge distillation (transfer) has been attracting attention from the research community, recent developments in the field have heightened the need for reproducible studies and well-generalized frameworks that lower the barriers to high-quality, reproducible deep learning research. Several researchers have voluntarily published the frameworks used in their knowledge distillation studies to help others reproduce their original work. Such frameworks, however, are usually neither well generalized nor maintained, so researchers still have to write a substantial amount of code to refactor or build on them when introducing new methods, models, and datasets or designing experiments. In this paper, we present our open-source framework, built on PyTorch and dedicated to knowledge distillation studies. The framework is designed to let users define experiments through declarative PyYAML configuration files, and it helps researchers complete the recently proposed ML Code Completeness Checklist. Using the framework, we demonstrate a variety of efficient training strategies and implement a range of knowledge distillation methods. We also reproduce some of the original experimental results on the ImageNet and COCO datasets from papers presented at major machine learning conferences such as ICLR, NeurIPS, CVPR, and ECCV, including recent state-of-the-art methods. All the source code, configurations, log files, and trained model weights are publicly available at https://github.com/yoshitomo-matsubara/torchdistill .
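To illustrate what a declarative, PyYAML-driven experiment definition might look like, here is a minimal hypothetical sketch. The key names (datasets, models, train, criterion, and so on) are illustrative assumptions for this example, not torchdistill's exact schema; consult the repository's published configurations for the real format.

```yaml
# Hypothetical sketch of a declarative distillation experiment config.
# All key names and values below are illustrative, not torchdistill's actual schema.
datasets:
  ilsvrc2012:
    name: 'ImageNet'
    root: './resources/datasets/ilsvrc2012'
models:
  teacher_model:
    name: 'resnet34'          # larger pretrained teacher
    params:
      num_classes: 1000
      pretrained: True
  student_model:
    name: 'resnet18'          # smaller student to be trained
    params:
      num_classes: 1000
      pretrained: False
train:
  num_epochs: 100
  optimizer:
    type: 'SGD'
    params:
      lr: 0.1
      momentum: 0.9
      weight_decay: 0.0001
  criterion:
    type: 'KDLoss'            # soft-target distillation loss
    params:
      temperature: 4.0
      alpha: 0.5
```

The point of such a file is that swapping a teacher, student, dataset, or distillation method becomes a configuration change rather than a code change.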

Results

Task                  | Dataset       | Metric   | Value | Model
Object Detection      | COCO test-dev | box mAP  | 36.9  | Mask R-CNN (Bottleneck-injected ResNet-50, FPN)
Object Detection      | COCO test-dev | box mAP  | 35.9  | Faster R-CNN (Bottleneck-injected ResNet-50 and FPN)
Instance Segmentation | COCO test-dev | mask AP  | 33.6  | Mask R-CNN (Bottleneck-injected ResNet-50, FPN)
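The distillation methods reproduced with the framework generally build on the classic soft-target formulation of knowledge distillation. As a minimal, framework-independent sketch in pure Python (an assumption for illustration, not torchdistill's actual API), the per-example loss blends hard-label cross-entropy with a temperature-scaled KL divergence against the teacher:

```python
import math

def softmax(logits, temperature=1.0):
    # Numerically stable temperature-scaled softmax over a list of logits.
    scaled = [z / temperature for z in logits]
    m = max(scaled)
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, label, temperature=4.0, alpha=0.5):
    """Soft-target knowledge distillation loss for a single example:
    alpha * cross-entropy(student, hard label)
    + (1 - alpha) * T^2 * KL(teacher soft targets || student soft predictions).
    The T^2 factor keeps gradient magnitudes comparable across temperatures."""
    p_student = softmax(student_logits)
    ce = -math.log(p_student[label])
    q_teacher = softmax(teacher_logits, temperature)
    q_student = softmax(student_logits, temperature)
    kl = sum(t * math.log(t / s) for t, s in zip(q_teacher, q_student))
    return alpha * ce + (1 - alpha) * temperature ** 2 * kl
```

When the student already matches the teacher exactly, the KL term vanishes and only the hard-label term remains; a disagreeing teacher adds a positive penalty pulling the student toward the teacher's soft predictions.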

Related Papers

- LINR-PCGC: Lossless Implicit Neural Representations for Point Cloud Geometry Compression (2025-07-21)
- Visual-Language Model Knowledge Distillation Method for Image Quality Assessment (2025-07-21)
- Automatic Classification and Segmentation of Tunnel Cracks Based on Deep Learning and Visual Explanations (2025-07-18)
- Adversarial attacks to image classification systems using evolutionary algorithms (2025-07-17)
- Efficient Adaptation of Pre-trained Vision Transformer underpinned by Approximately Orthogonal Fine-Tuning Strategy (2025-07-17)
- Federated Learning for Commercial Image Sources (2025-07-17)
- MUPAX: Multidimensional Problem Agnostic eXplainable AI (2025-07-17)
- SCORE: Scene Context Matters in Open-Vocabulary Remote Sensing Instance Segmentation (2025-07-17)