Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Large Scale Incremental Learning

Yue Wu, Yinpeng Chen, Lijuan Wang, Yuancheng Ye, Zicheng Liu, Yandong Guo, Yun Fu

2019-05-30 · CVPR 2019 · Tasks: Class-Incremental Learning, Incremental Learning

Abstract

Modern machine learning suffers from catastrophic forgetting when learning new classes incrementally: performance dramatically degrades because data for the old classes is missing. Incremental learning methods have been proposed to retain the knowledge acquired from the old classes, by using knowledge distillation and keeping a few exemplars from the old classes. However, these methods struggle to scale up to a large number of classes. We believe this is because of the combination of two factors: (a) the data imbalance between the old and new classes, and (b) the increasing number of visually similar classes. Distinguishing between an increasing number of visually similar classes is particularly challenging when the training data is unbalanced. We propose a simple and effective method to address this data imbalance issue. We found that the last fully connected layer has a strong bias towards the new classes, and this bias can be corrected by a linear model. With two bias parameters, our method performs remarkably well on two large datasets: ImageNet (1000 classes) and MS-Celeb-1M (10000 classes), outperforming the state-of-the-art algorithms by 11.1% and 13.2% respectively.
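The correction described in the abstract is a two-parameter linear rescaling applied only to the logits of the newly added classes, leaving old-class logits untouched. A minimal sketch of this idea (the function name, toy values, and the mask-based API are illustrative, not the authors' released code; in the paper the two scalars are learned on a small balanced validation set):

```python
import numpy as np

def bias_correct(logits, new_class_mask, alpha, beta):
    """BiC-style bias correction: old-class logits pass through unchanged,
    new-class logits are rescaled as alpha * logit + beta, where alpha and
    beta are the two scalar bias parameters."""
    corrected = logits.copy()
    corrected[..., new_class_mask] = alpha * logits[..., new_class_mask] + beta
    return corrected

# Toy example: 4 old classes, 2 new classes whose logits are biased high.
logits = np.array([1.0, 0.5, 0.2, 0.3, 2.5, 2.0])
mask = np.zeros(6, dtype=bool)
mask[4:] = True  # last two entries are the new classes

print(bias_correct(logits, mask, alpha=0.6, beta=-0.4))
# -> [1.  0.5 0.2 0.3 1.1 0.8]
```

After correction, a softmax over the adjusted logits is used for prediction; since only two scalars are fitted, the validation set needed can be very small.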

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Incremental Learning | CIFAR-100 - 50 classes + 10 steps of 5 classes | Average Incremental Accuracy | 53.21 | BiC |
| Incremental Learning | CIFAR-100 - 50 classes + 5 steps of 10 classes | Average Incremental Accuracy | 56.86 | BiC |
| Incremental Learning | CIFAR-100-B0 (5 steps of 20 classes) | Average Incremental Accuracy | 73.1 | BiC |
| Incremental Learning | ImageNet - 10 steps | # Params (M) | 11.68 | BiC |
| Incremental Learning | ImageNet - 10 steps | Average Incremental Accuracy (Top-5) | 84 | BiC |
| Incremental Learning | ImageNet - 10 steps | Final Accuracy (Top-5) | 73.2 | BiC |
| Incremental Learning | ImageNet100 - 10 steps | # Params (M) | 11.22 | BiC |
| Incremental Learning | ImageNet100 - 10 steps | Average Incremental Accuracy (Top-5) | 90.6 | BiC |
| Incremental Learning | ImageNet100 - 10 steps | Final Accuracy (Top-5) | 84.4 | BiC |
| Incremental Learning | CIFAR-100 - 50 classes + 25 steps of 2 classes | Average Incremental Accuracy | 48.96 | BiC |
| Incremental Learning | CIFAR-100 - 50 classes + 50 steps of 1 class | Average Incremental Accuracy | 47.09 | BiC |
| Incremental Learning | ImageNet-100 - 50 classes + 50 steps of 1 class | Average Incremental Accuracy | 46.49 | BiC |
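The Average Incremental Accuracy reported above is conventionally the mean of the test accuracies measured after each incremental step. A minimal sketch with hypothetical per-step numbers (the values below are illustrative, not taken from the paper):

```python
import numpy as np

# Hypothetical test accuracy after each of five incremental steps.
step_accuracies = [0.72, 0.65, 0.58, 0.52, 0.47]

# Average Incremental Accuracy: mean over all steps.
avg_incremental_accuracy = float(np.mean(step_accuracies))
print(round(avg_incremental_accuracy, 4))  # -> 0.588
```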

Related Papers

- The Bayesian Approach to Continual Learning: An Overview (2025-07-11)
- Balancing the Past and Present: A Coordinated Replay Framework for Federated Class-Incremental Learning (2025-07-10)
- Addressing Imbalanced Domain-Incremental Learning through Dual-Balance Collaborative Experts (2025-07-09)
- DuET: Dual Incremental Object Detection via Exemplar-Free Task Arithmetic (2025-06-26)
- Class-Incremental Learning for Honey Botanical Origin Classification with Hyperspectral Images: A Study with Continual Backpropagation (2025-06-12)
- Hyperbolic Dual Feature Augmentation for Open-Environment (2025-06-10)
- L3A: Label-Augmented Analytic Adaptation for Multi-Label Class Incremental Learning (2025-06-01)
- CL-LoRA: Continual Low-Rank Adaptation for Rehearsal-Free Class-Incremental Learning (2025-05-30)