Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Self-Supervised Prototypical Transfer Learning for Few-Shot Classification

Carlos Medina, Arnout Devos, Matthias Grossglauser

Published: 2020-06-19

Tasks: Few-Shot Learning · Meta-Learning · Unsupervised Few-Shot Image Classification · Self-Supervised Learning · Transfer Learning · General Classification · Classification

Abstract

Most approaches in few-shot learning rely on costly annotated data related to the goal task domain during (pre-)training. Recently, unsupervised meta-learning methods have exchanged the annotation requirement for a reduction in few-shot classification performance. Simultaneously, in settings with realistic domain shift, common transfer learning has been shown to outperform supervised meta-learning. Building on these insights and on advances in self-supervised learning, we propose a transfer learning approach which constructs a metric embedding that clusters unlabeled prototypical samples and their augmentations closely together. This pre-trained embedding is a starting point for few-shot classification by summarizing class clusters and fine-tuning. We demonstrate that our self-supervised prototypical transfer learning approach ProtoTransfer outperforms state-of-the-art unsupervised meta-learning methods on few-shot tasks from the mini-ImageNet dataset. In few-shot experiments with domain shift, our approach even has comparable performance to supervised methods, but requires orders of magnitude fewer labels.
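The few-shot classification step the abstract describes — summarize each class by the mean of its labeled support embeddings (its "prototype"), then assign queries to the nearest prototype — can be sketched as below. This is a minimal NumPy illustration of the prototypical-classification idea only, not the paper's implementation: it omits the self-supervised pre-training of the embedding and the fine-tuning stage, and the function names are illustrative.

```python
import numpy as np

def prototypes(embeddings, labels):
    """Compute one prototype per class: the mean of that class's support embeddings."""
    classes = np.unique(labels)
    protos = np.stack([embeddings[labels == c].mean(axis=0) for c in classes])
    return classes, protos

def classify(queries, protos, classes):
    """Assign each query embedding to the class of its nearest prototype
    (squared Euclidean distance in the embedding space)."""
    dists = ((queries[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return classes[dists.argmin(axis=1)]

# Toy 2-D "embeddings": a 2-way, 2-shot support set with well-separated classes.
support = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
support_y = np.array([0, 0, 1, 1])

classes, protos = prototypes(support, support_y)
pred = classify(np.array([[0.05, 0.1], [4.9, 5.2]]), protos, classes)
# pred -> [0, 1]
```

In the actual method, `embeddings` would come from the metric embedding pre-trained with self-supervision; only this final prototype step needs labels, which is why so few are required.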

Results

Task                          | Dataset                       | Metric   | Value | Model
------------------------------|-------------------------------|----------|-------|---------------
Image Classification          | Mini-Imagenet 5-way (1-shot)  | Accuracy | 45.67 | ProtoTransfer
Image Classification          | Mini-Imagenet 5-way (5-shot)  | Accuracy | 62.99 | ProtoTransfer
Few-Shot Image Classification | Mini-Imagenet 5-way (1-shot)  | Accuracy | 45.67 | ProtoTransfer
Few-Shot Image Classification | Mini-Imagenet 5-way (5-shot)  | Accuracy | 62.99 | ProtoTransfer

Related Papers

RaMen: Multi-Strategy Multi-Modal Learning for Bundle Construction (2025-07-18)
GLAD: Generalizable Tuning for Vision-Language Models (2025-07-17)
A Semi-Supervised Learning Method for the Identification of Bad Exposures in Large Imaging Surveys (2025-07-17)
Disentangling coincident cell events using deep transfer learning and compressive sensing (2025-07-17)
Adversarial attacks to image classification systems using evolutionary algorithms (2025-07-17)
Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)
Imbalanced Regression Pipeline Recommendation (2025-07-16)
CLID-MU: Cross-Layer Information Divergence Based Meta Update Strategy for Learning with Noisy Labels (2025-07-16)