Papers With Code

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Constrained Few-shot Class-incremental Learning

Michael Hersche, Geethan Karunaratne, Giovanni Cherubini, Luca Benini, Abu Sebastian, Abbas Rahimi

Published 2022-03-30 · CVPR 2022
Tasks: Few-Shot Class-Incremental Learning · Class-Incremental Learning · Incremental Learning
Links: Paper · PDF · Code (official)

Abstract

Continually learning new classes from fresh data without forgetting previous knowledge of old classes is a very challenging research problem. Moreover, it is imperative that such learning respect certain memory and computational constraints: (i) training samples are limited to only a few per class, (ii) the computational cost of learning a novel class remains constant, and (iii) the memory footprint of the model grows at most linearly with the number of classes observed. To meet the above constraints, we propose C-FSCIL, which is architecturally composed of a frozen meta-learned feature extractor, a trainable fixed-size fully connected layer, and a rewritable dynamically growing memory that stores as many vectors as the number of encountered classes. C-FSCIL provides three update modes that offer a trade-off between accuracy and the compute-memory cost of learning novel classes. C-FSCIL exploits hyperdimensional embedding, which allows it to continually express many more classes than the fixed dimensions in the vector space, with minimal interference. The quality of class vector representations is further improved by aligning them quasi-orthogonally to each other by means of novel loss functions. Experiments on the CIFAR100, miniImageNet, and Omniglot datasets show that C-FSCIL outperforms the baselines with remarkable accuracy and compression. It also scales up to the largest problem size ever tried in this few-shot setting by learning 423 novel classes on top of 1200 base classes with less than 1.6% accuracy drop. Our code is available at https://github.com/IBM/constrained-FSCIL.
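To make the architecture described in the abstract concrete, the following is a minimal, illustrative sketch of the prototype-based update mode (the cheapest of the three): a frozen encoder, a fixed-size fully connected layer mapping into a hyperdimensional space, and a class-vector memory that grows by one normalized prototype per novel class. The random-projection "encoder" stands in for the paper's meta-learned CNN backbone, and all names and dimensions here are illustrative assumptions, not the official implementation (which is at the repository linked above).

```python
import numpy as np

rng = np.random.default_rng(0)

class CFSCILSketch:
    """Illustrative sketch of C-FSCIL's averaged-prototype update mode.

    NOTE: the frozen meta-learned feature extractor is replaced here by a
    fixed random projection (an assumption for self-containedness); in the
    paper it is a trained CNN backbone.
    """

    def __init__(self, feat_dim, hd_dim):
        # Frozen "feature extractor": fixed random linear map + nonlinearity.
        self.encoder = rng.standard_normal((feat_dim, hd_dim)) / np.sqrt(feat_dim)
        # Fixed-size fully connected layer projecting into the
        # hyperdimensional embedding space (trainable in the paper).
        self.fc = rng.standard_normal((hd_dim, hd_dim)) / np.sqrt(hd_dim)
        # Rewritable memory: one prototype vector per class seen so far,
        # so memory grows at most linearly with the number of classes.
        self.prototypes = []

    def embed(self, x):
        h = np.tanh(x @ self.encoder)   # frozen features
        v = np.tanh(h @ self.fc)        # hyperdimensional embedding
        return v / np.linalg.norm(v)

    def learn_class(self, support_x):
        # The class prototype is the re-normalized mean of the few-shot
        # support embeddings: constant compute cost per novel class.
        emb = np.stack([self.embed(x) for x in support_x])
        proto = emb.mean(axis=0)
        self.prototypes.append(proto / np.linalg.norm(proto))

    def predict(self, x):
        # Classify by cosine similarity against all stored prototypes.
        v = self.embed(x)
        sims = np.stack(self.prototypes) @ v
        return int(np.argmax(sims))
```

In this sketch, high-dimensional random embeddings are nearly orthogonal by default, which loosely mirrors why the hyperdimensional space can host many class vectors with minimal interference; the paper goes further and explicitly pushes prototypes toward quasi-orthogonality with dedicated loss functions.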

Results

Task                       | Dataset       | Metric           | Value | Model
---------------------------|---------------|------------------|-------|--------
Continual Learning         | CIFAR-100     | Average Accuracy | 61.67 | C-FSCIL
Continual Learning         | CIFAR-100     | Last Accuracy    | 50.47 | C-FSCIL
Continual Learning         | mini-ImageNet | Average Accuracy | 61.61 | C-FSCIL
Continual Learning         | mini-ImageNet | Last Accuracy    | 51.41 | C-FSCIL
Class Incremental Learning | CIFAR-100     | Average Accuracy | 61.67 | C-FSCIL
Class Incremental Learning | CIFAR-100     | Last Accuracy    | 50.47 | C-FSCIL
Class Incremental Learning | mini-ImageNet | Average Accuracy | 61.61 | C-FSCIL
Class Incremental Learning | mini-ImageNet | Last Accuracy    | 51.41 | C-FSCIL

Related Papers

- The Bayesian Approach to Continual Learning: An Overview (2025-07-11)
- Balancing the Past and Present: A Coordinated Replay Framework for Federated Class-Incremental Learning (2025-07-10)
- Addressing Imbalanced Domain-Incremental Learning through Dual-Balance Collaborative Experts (2025-07-09)
- DuET: Dual Incremental Object Detection via Exemplar-Free Task Arithmetic (2025-06-26)
- Class-Incremental Learning for Honey Botanical Origin Classification with Hyperspectral Images: A Study with Continual Backpropagation (2025-06-12)
- Hyperbolic Dual Feature Augmentation for Open-Environment (2025-06-10)
- L3A: Label-Augmented Analytic Adaptation for Multi-Label Class Incremental Learning (2025-06-01)
- CL-LoRA: Continual Low-Rank Adaptation for Rehearsal-Free Class-Incremental Learning (2025-05-30)