Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Transductive Information Maximization For Few-Shot Learning

Malik Boudiaf, Ziko Imtiaz Masud, Jérôme Rony, José Dolz, Pablo Piantanida, Ismail Ben Ayed

2020-08-25 · Few-Shot Learning · Meta-Learning · Few-Shot Image Classification

Paper · PDF · Code (official) · Code

Abstract

We introduce Transductive Information Maximization (TIM) for few-shot learning. Our method maximizes the mutual information between the query features and their label predictions for a given few-shot task, in conjunction with a supervision loss based on the support set. Furthermore, we propose a new alternating-direction solver for our mutual-information loss, which substantially speeds up transductive-inference convergence over gradient-based optimization, while yielding similar accuracy. TIM inference is modular: it can be used on top of any base-training feature extractor. Following standard transductive few-shot settings, our comprehensive experiments demonstrate that TIM outperforms state-of-the-art methods significantly across various datasets and networks, while used on top of a fixed feature extractor trained with simple cross-entropy on the base classes, without resorting to complex meta-learning schemes. It consistently brings between 2% and 5% improvement in accuracy over the best performing method, not only on all the well-established few-shot benchmarks but also on more challenging scenarios, with domain shifts and larger numbers of classes.
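The objective described above combines a supervised cross-entropy loss on the support set with a mutual-information term on the query set, where I(X; Y) is estimated as the marginal entropy of the average query prediction minus the mean conditional entropy of individual predictions. The following is a minimal NumPy sketch of that structure, not the authors' implementation: the single `alpha` weight and the function names are illustrative simplifications (the paper weights the loss terms with its own hyperparameters).

```python
import numpy as np

def softmax(z, axis=1):
    # Numerically stable softmax over class logits.
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def tim_loss(support_logits, support_labels, query_logits, alpha=1.0):
    """Sketch of a TIM-style objective: support cross-entropy plus a
    query mutual-information term, I(X;Y) ~= H(Y) - H(Y|X)."""
    eps = 1e-12

    # Supervision: cross-entropy on the labelled support set.
    p_s = softmax(support_logits)
    ce = -np.mean(np.log(p_s[np.arange(len(support_labels)), support_labels] + eps))

    p_q = softmax(query_logits)                       # [n_query, n_way]
    # Conditional entropy H(Y|X): mean entropy of each query prediction.
    cond_ent = -np.mean(np.sum(p_q * np.log(p_q + eps), axis=1))
    # Marginal entropy H(Y): entropy of the averaged query prediction.
    marg = p_q.mean(axis=0)
    marg_ent = -np.sum(marg * np.log(marg + eps))

    # Maximizing mutual information = minimizing (cond_ent - marg_ent).
    return ce + alpha * (cond_ent - marg_ent)
```

Minimizing this loss pushes each query prediction toward confidence (low conditional entropy) while keeping predicted classes balanced across the query set (high marginal entropy); in the paper this term is optimized transductively at inference time on top of a frozen feature extractor, by gradient descent (TIM-GD) or the alternating-direction solver (TIM-ADM).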

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Image Classification | CUB 200 5-way 5-shot | Accuracy | 90.8 | TIM-GD |
| Image Classification | Mini-Imagenet 20-way (1-shot) | Accuracy | 39.3 | TIM-GD |
| Image Classification | Mini-Imagenet 10-way (5-shot) | Accuracy | 72.8 | TIM-GD |
| Image Classification | Mini-Imagenet 20-way (5-shot) | Accuracy | 59.5 | TIM-GD |
| Image Classification | Mini-Imagenet 5-way (1-shot) | Accuracy | 77.8 | TIM-GD |
| Image Classification | Mini-Imagenet 10-way (1-shot) | Accuracy | 56.1 | TIM-GD |
| Image Classification | Mini-ImageNet to CUB - 5 shot learning | Accuracy | 71 | TIM-GD |
| Image Classification | Tiered ImageNet 5-way (1-shot) | Accuracy | 82.1 | TIM-GD |
| Image Classification | Tiered ImageNet 5-way (5-shot) | Accuracy | 89.8 | TIM-GD |
| Few-Shot Image Classification | CUB 200 5-way 5-shot | Accuracy | 90.8 | TIM-GD |
| Few-Shot Image Classification | Mini-Imagenet 20-way (1-shot) | Accuracy | 39.3 | TIM-GD |
| Few-Shot Image Classification | Mini-Imagenet 10-way (5-shot) | Accuracy | 72.8 | TIM-GD |
| Few-Shot Image Classification | Mini-Imagenet 20-way (5-shot) | Accuracy | 59.5 | TIM-GD |
| Few-Shot Image Classification | Mini-Imagenet 5-way (1-shot) | Accuracy | 77.8 | TIM-GD |
| Few-Shot Image Classification | Mini-Imagenet 10-way (1-shot) | Accuracy | 56.1 | TIM-GD |
| Few-Shot Image Classification | Mini-ImageNet to CUB - 5 shot learning | Accuracy | 71 | TIM-GD |
| Few-Shot Image Classification | Tiered ImageNet 5-way (1-shot) | Accuracy | 82.1 | TIM-GD |
| Few-Shot Image Classification | Tiered ImageNet 5-way (5-shot) | Accuracy | 89.8 | TIM-GD |

Related Papers

GLAD: Generalizable Tuning for Vision-Language Models (2025-07-17)
Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)
Imbalanced Regression Pipeline Recommendation (2025-07-16)
CLID-MU: Cross-Layer Information Divergence Based Meta Update Strategy for Learning with Noisy Labels (2025-07-16)
Mixture of Experts in Large Language Models (2025-07-15)
Iceberg: Enhancing HLS Modeling with Synthetic Data (2025-07-14)
Meta-Reinforcement Learning for Fast and Data-Efficient Spectrum Allocation in Dynamic Wireless Networks (2025-07-13)
ViT-ProtoNet for Few-Shot Image Classification: A Multi-Benchmark Evaluation (2025-07-12)