Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Few-Shot Learning as Domain Adaptation: Algorithm and Analysis

Jiechao Guan, Zhiwu Lu, Tao Xiang, Ji-Rong Wen

Published: 2020-02-06
Tasks: Few-Shot Learning, Meta-Learning, Few-Shot Image Classification, Domain Adaptation
Paper · PDF

Abstract

To recognize unseen classes with only a few samples, few-shot learning (FSL) uses prior knowledge learned from the seen classes. A major challenge for FSL is that the distribution of the unseen classes differs from that of the seen classes, resulting in poor generalization even when a model is meta-trained on the seen classes. This class-difference-caused distribution shift can be considered a special case of domain shift. In this paper, for the first time, we propose a domain adaptation prototypical network with attention (DAPNA) to explicitly tackle such a domain shift problem in a meta-learning framework. Specifically, armed with a set-transformer-based attention module, we construct each episode with two sub-episodes without class overlap on the seen classes to simulate the domain shift between the seen and unseen classes. To align the feature distributions of the two sub-episodes with limited training samples, a feature transfer network is employed together with a margin disparity discrepancy (MDD) loss. Importantly, theoretical analysis is provided to give the learning bound of our DAPNA. Extensive experiments show that our DAPNA outperforms the state-of-the-art FSL alternatives, often by significant margins.
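The abstract's central construction is an episode split into two sub-episodes with no class overlap, so that one sub-episode can stand in for the "seen" domain and the other for the "unseen" domain during meta-training. The minimal sketch below illustrates that class-disjoint split only; the function name, arguments, and structure are assumptions for illustration, not the authors' code (the attention module, feature transfer network, and MDD loss are omitted).

```python
import random

def make_sub_episodes(class_ids, n_way=5, seed=0):
    """Sample 2 * n_way seen classes and split them into two
    class-disjoint sub-episodes, simulating the domain shift
    between seen and unseen classes (illustrative sketch only).
    """
    rng = random.Random(seed)
    sampled = rng.sample(class_ids, 2 * n_way)
    source_classes = sampled[:n_way]   # plays the "seen" domain
    target_classes = sampled[n_way:]   # plays the "unseen" domain
    # The two sub-episodes must share no classes.
    assert not set(source_classes) & set(target_classes)
    return source_classes, target_classes

# Example: 64 meta-training classes (as in Mini-ImageNet's train split),
# 5-way sub-episodes.
src, tgt = make_sub_episodes(list(range(64)), n_way=5)
```

In the paper's framework, features from the two sub-episodes would then be aligned (via the feature transfer network and MDD loss) before prototypical-network classification; this sketch covers only the sampling step.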

Results

Task                          | Dataset                          | Metric   | Value | Model
------------------------------|----------------------------------|----------|-------|------
Image Classification          | Mini-ImageNet-CUB 5-way (5-shot) | Accuracy | 68.33 | DAPNA
Image Classification          | Mini-ImageNet-CUB 5-way (1-shot) | Accuracy | 49.44 | DAPNA
Image Classification          | Mini-ImageNet 5-way (5-shot)     | Accuracy | 84.07 | DAPNA
Image Classification          | Mini-ImageNet 5-way (1-shot)     | Accuracy | 71.88 | DAPNA
Image Classification          | Tiered ImageNet 5-way (1-shot)   | Accuracy | 69.14 | DAPNA
Image Classification          | Tiered ImageNet 5-way (5-shot)   | Accuracy | 85.82 | DAPNA
Few-Shot Image Classification | Mini-ImageNet-CUB 5-way (5-shot) | Accuracy | 68.33 | DAPNA
Few-Shot Image Classification | Mini-ImageNet-CUB 5-way (1-shot) | Accuracy | 49.44 | DAPNA
Few-Shot Image Classification | Mini-ImageNet 5-way (5-shot)     | Accuracy | 84.07 | DAPNA
Few-Shot Image Classification | Mini-ImageNet 5-way (1-shot)     | Accuracy | 71.88 | DAPNA
Few-Shot Image Classification | Tiered ImageNet 5-way (1-shot)   | Accuracy | 69.14 | DAPNA
Few-Shot Image Classification | Tiered ImageNet 5-way (5-shot)   | Accuracy | 85.82 | DAPNA

Related Papers

GLAD: Generalizable Tuning for Vision-Language Models (2025-07-17)
A Privacy-Preserving Semantic-Segmentation Method Using Domain-Adaptation Technique (2025-07-17)
Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)
Imbalanced Regression Pipeline Recommendation (2025-07-16)
CLID-MU: Cross-Layer Information Divergence Based Meta Update Strategy for Learning with Noisy Labels (2025-07-16)
Mixture of Experts in Large Language Models (2025-07-15)
Iceberg: Enhancing HLS Modeling with Synthetic Data (2025-07-14)
Domain Borders Are There to Be Crossed With Federated Few-Shot Adaptation (2025-07-14)