Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

FOSTER: Feature Boosting and Compression for Class-Incremental Learning

Fu-Yun Wang, Da-Wei Zhou, Han-Jia Ye, De-Chuan Zhan

Published: 2022-04-10 · Tasks: Class-Incremental Learning, Incremental Learning
Paper · PDF · Code (official)

Abstract

The ability to learn new concepts continually is necessary in this ever-changing world. However, deep neural networks suffer from catastrophic forgetting when learning new categories. Many works have been proposed to alleviate this phenomenon, but most of them either fall into the stability-plasticity dilemma or incur excessive computation or storage overhead. Inspired by gradient boosting, which gradually fits the residuals between the target and the previous ensemble model, we propose FOSTER, a novel two-stage learning paradigm that empowers the model to learn new categories adaptively. Specifically, we first dynamically expand new modules to fit the residuals between the target and the output of the original model. We then remove redundant parameters and feature dimensions through an effective distillation strategy, so that a single backbone model is maintained. We validate FOSTER on CIFAR-100 and ImageNet-100/1000 under different settings, and experimental results show that it achieves state-of-the-art performance. Code is available at: https://github.com/G-U-N/ECCV22-FOSTER.
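The abstract describes a two-stage paradigm: a boosting stage that adds a new trainable module to fit the residuals of the frozen old model, and a compression stage that distills the two-branch ensemble back into a single backbone. Below is a minimal PyTorch sketch of that structure, under our own assumptions: the names (`BoostedModel`, `compression_loss`) are hypothetical, both backbones are assumed to emit `feat_dim`-dimensional features, and the paper's actual losses and training schedule live in the official repository linked above.

```python
# Hypothetical sketch of FOSTER's two stages; not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class BoostedModel(nn.Module):
    """Stage 1 (feature boosting): freeze the old backbone and add a new
    trainable module; their features are concatenated for classification."""
    def __init__(self, old_backbone: nn.Module, new_backbone: nn.Module,
                 feat_dim: int, num_classes: int):
        super().__init__()
        self.old_backbone = old_backbone.eval()
        for p in self.old_backbone.parameters():
            p.requires_grad_(False)        # old features stay fixed
        self.new_backbone = new_backbone   # trained to fit the residuals
        self.fc = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x):
        with torch.no_grad():
            f_old = self.old_backbone(x)
        f_new = self.new_backbone(x)
        return self.fc(torch.cat([f_old, f_new], dim=1))

def compression_loss(student_logits, teacher_logits, T: float = 2.0):
    """Stage 2 (compression): distill the boosted two-branch teacher back
    into a single-backbone student via KD on temperature-softened logits."""
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * T * T
```

After stage 2, the distilled student replaces the ensemble, so model size stays constant across incremental steps instead of growing with each new module.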

Results

Task                 | Dataset                                           | Metric                       | Value | Model
Incremental Learning | CIFAR-100 - 50 classes + 10 steps of 5 classes    | Average Incremental Accuracy | 67.95 | FOSTER
Incremental Learning | ImageNet-100 - 50 classes + 25 steps of 2 classes | Average Incremental Accuracy | 69.34 | FOSTER
Incremental Learning | CIFAR-100 - 50 classes + 5 steps of 10 classes    | Average Incremental Accuracy | 69.46 | FOSTER
Incremental Learning | ImageNet-100 - 20 steps                           | Average Incremental Accuracy | 74.49 | FOSTER
Incremental Learning | CIFAR-100 B0 (10 steps of 10 classes)             | Average Incremental Accuracy | 72.9  | FOSTER
Incremental Learning | ImageNet - 10 steps                               | Average Incremental Accuracy | 68.34 | FOSTER
Incremental Learning | ImageNet-100 - 10 steps                           | Average Incremental Accuracy | 77.75 | FOSTER
Incremental Learning | CIFAR-100 B0 (20 steps of 5 classes)              | Average Incremental Accuracy | 70.65 | FOSTER
Incremental Learning | CIFAR-100 - 50 classes + 25 steps of 2 classes    | Average Incremental Accuracy | 63.83 | FOSTER
Incremental Learning | ImageNet-100 - 50 classes + 10 steps of 5 classes | Average Incremental Accuracy | 77.54 | FOSTER
Incremental Learning | ImageNet-100 - 50 classes + 5 steps of 10 classes | Average Incremental Accuracy | 80.22 | FOSTER
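All results report average incremental accuracy, commonly computed as the mean of the top-1 accuracies measured after each incremental step. A minimal illustration (the per-step accuracies below are made up, not taken from the paper):

```python
# Hypothetical per-step accuracies over a 6-step class-incremental run.
step_accuracies = [82.1, 78.4, 74.0, 70.3, 66.5, 63.4]
avg_incremental_acc = sum(step_accuracies) / len(step_accuracies)
print(f"Average incremental accuracy: {avg_incremental_acc:.2f}")  # 72.45
```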

Related Papers

The Bayesian Approach to Continual Learning: An Overview (2025-07-11)
Balancing the Past and Present: A Coordinated Replay Framework for Federated Class-Incremental Learning (2025-07-10)
Addressing Imbalanced Domain-Incremental Learning through Dual-Balance Collaborative Experts (2025-07-09)
DuET: Dual Incremental Object Detection via Exemplar-Free Task Arithmetic (2025-06-26)
Class-Incremental Learning for Honey Botanical Origin Classification with Hyperspectral Images: A Study with Continual Backpropagation (2025-06-12)
Hyperbolic Dual Feature Augmentation for Open-Environment (2025-06-10)
L3A: Label-Augmented Analytic Adaptation for Multi-Label Class Incremental Learning (2025-06-01)
CL-LoRA: Continual Low-Rank Adaptation for Rehearsal-Free Class-Incremental Learning (2025-05-30)