Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

PODNet: Pooled Outputs Distillation for Small-Tasks Incremental Learning

Arthur Douillard, Matthieu Cord, Charles Ollion, Thomas Robert, Eduardo Valle

Published 2020-04-28 · ECCV 2020
Tasks: Continual Learning · Representation Learning · Class Incremental Learning · Incremental Learning
Links: Paper · PDF · Code (official) · Code

Abstract

Lifelong learning has attracted much attention, but existing works still struggle to fight catastrophic forgetting and accumulate knowledge over long stretches of incremental learning. In this work, we propose PODNet, a model inspired by representation learning. By carefully balancing the compromise between remembering the old classes and learning new ones, PODNet fights catastrophic forgetting, even over very long runs of small incremental tasks, a setting so far unexplored by current works. PODNet innovates on existing art with an efficient spatial-based distillation loss applied throughout the model and a representation comprising multiple proxy vectors for each class. We validate those innovations thoroughly, comparing PODNet with three state-of-the-art models on three datasets: CIFAR100, ImageNet100, and ImageNet1000. Our results showcase a significant advantage of PODNet over existing art, with accuracy gains of 12.10, 6.51, and 2.85 percentage points, respectively. Code is available at https://github.com/arthurdouillard/incremental_learning.pytorch
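The "spatial-based distillation loss" mentioned above pools intermediate feature maps before comparing the old (frozen) and new models. The sketch below is a hedged, simplified NumPy reading of that idea, not the official PyTorch implementation: each (C, H, W) feature map is pooled along the width and height axes separately, the pooled statistics are concatenated and L2-normalised, and the distance between the two models' vectors is averaged over the network stages. The function name `pod_spatial_loss` and all shapes are illustrative assumptions.

```python
import numpy as np

def pod_spatial_loss(old_feats, new_feats):
    """Simplified sketch of a POD-style spatial distillation loss.

    old_feats / new_feats: lists of feature maps of shape (C, H, W),
    one per network stage, from the frozen old model and the model
    currently being trained. (Illustrative, not the official code.)
    """
    total = 0.0
    for old, new in zip(old_feats, new_feats):
        # Pool activations over width (axis 2) and height (axis 1)
        # separately, then concatenate the flattened statistics.
        old_pooled = np.concatenate([old.sum(axis=2).ravel(),
                                     old.sum(axis=1).ravel()])
        new_pooled = np.concatenate([new.sum(axis=2).ravel(),
                                     new.sum(axis=1).ravel()])
        # L2-normalise so the comparison ignores overall activation scale.
        old_pooled /= np.linalg.norm(old_pooled) + 1e-8
        new_pooled /= np.linalg.norm(new_pooled) + 1e-8
        # Euclidean distance between the pooled, normalised descriptors.
        total += np.linalg.norm(old_pooled - new_pooled)
    return total / len(old_feats)
```

Pooling along one spatial axis at a time is less restrictive than distilling raw feature maps element-wise: it constrains where activation mass sits along each axis while leaving the model freedom to rearrange individual pixels, which is what lets the backbone keep adapting to new classes.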

Results

All rows report the Average Incremental Accuracy (%) on incremental learning benchmarks.

Dataset | Protocol | Avg. Incremental Accuracy | Model
--------|----------|---------------------------|------
CIFAR-100 | 50 classes + 5 steps of 10 classes | 64.83 | PODNet (CNN)
CIFAR-100 | 50 classes + 10 steps of 5 classes | 63.19 | PODNet (CNN)
CIFAR-100 | 50 classes + 25 steps of 2 classes | 60.72 | PODNet
CIFAR-100 | 50 classes + 50 steps of 1 class | 57.98 | PODNet
CIFAR-100-B0 | 5 steps of 20 classes | 66.7 | PODNet
ImageNet-100 | 50 classes + 5 steps of 10 classes | 75.82 | PODNet
ImageNet-100 | 50 classes + 10 steps of 5 classes | 73.14 | PODNet
ImageNet-100 | 50 classes + 25 steps of 2 classes | 67.28 | PODNet
ImageNet-100 | 50 classes + 50 steps of 1 class | 62.08 | PODNet
ImageNet | 500 classes + 5 steps of 100 classes | 66.95 | PODNet
ImageNet | 500 classes + 10 steps of 50 classes | 64.13 | PODNet

Related Papers

- Touch in the Wild: Learning Fine-Grained Manipulation with a Portable Visuo-Tactile Gripper (2025-07-20)
- Spectral Bellman Method: Unifying Representation and Exploration in RL (2025-07-17)
- Boosting Team Modeling through Tempo-Relational Representation Learning (2025-07-17)
- RegCL: Continual Adaptation of Segment Anything Model via Model Merging (2025-07-16)
- Information-Theoretic Generalization Bounds of Replay-based Continual Learning (2025-07-16)
- PROL: Rehearsal Free Continual Learning in Streaming Data via Prompt Online Learning (2025-07-16)
- Similarity-Guided Diffusion for Contrastive Sequential Recommendation (2025-07-16)
- Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)