Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Bayesian Meta-Learning for the Few-Shot Setting via Deep Kernels

Massimiliano Patacchiola, Jack Turner, Elliot J. Crowley, Michael O'Boyle, Amos Storkey

2019-10-11 · NeurIPS 2020

Topics: Uncertainty Quantification, Few-Shot Learning, Meta-Learning, Gaussian Processes, Few-Shot Image Classification, Bayesian Inference, Domain Adaptation

Abstract

Recently, different machine learning methods have been introduced to tackle the challenging few-shot learning scenario, that is, learning from a small labeled dataset related to a specific task. Common approaches have taken the form of meta-learning: learning to learn on the new problem given the old. Following the recognition that meta-learning implements learning in a multi-level model, we present a Bayesian treatment for the meta-learning inner loop through the use of deep kernels. As a result we can learn a kernel that transfers to new tasks; we call this Deep Kernel Transfer (DKT). This approach has many advantages: it is straightforward to implement as a single optimizer, it provides uncertainty quantification, and it does not require estimation of task-specific parameters. We empirically demonstrate that DKT outperforms several state-of-the-art algorithms in few-shot classification, and is the state of the art for cross-domain adaptation and regression. We conclude that complex meta-learning routines can be replaced by a simpler Bayesian model without loss of accuracy.
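The core idea in the abstract, a kernel computed on top of learned features, trained as a single Gaussian-process objective with no task-specific parameters, can be sketched in a few lines. The following is a minimal illustration in plain NumPy; the one-layer feature map, RBF base kernel, and toy data are assumptions for exposition, not the paper's actual architecture (the official code builds on deep networks and GP libraries).

```python
# Minimal sketch of the Deep Kernel Transfer (DKT) idea: an RBF kernel
# applied to learned features, trained by the GP marginal likelihood.
# The feature map and data here are illustrative, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

def features(X, W):
    # Hypothetical "deep" feature extractor: a single tanh layer.
    return np.tanh(X @ W)

def kernel(X1, X2, W, lengthscale=1.0):
    # Deep kernel: RBF distance measured in feature space.
    F1, F2 = features(X1, W), features(X2, W)
    d2 = ((F1[:, None, :] - F2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def neg_log_marginal_likelihood(X, y, W, noise=0.1):
    # Single objective shared across tasks; only the kernel (feature
    # weights W) is learned, with no per-task parameters to estimate.
    K = kernel(X, X, W) + noise ** 2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (0.5 * y @ alpha
            + np.log(np.diag(L)).sum()
            + 0.5 * len(X) * np.log(2 * np.pi))

# Toy "task": a small regression support set, as in the few-shot setting.
X = rng.normal(size=(10, 3))
y = np.sin(X[:, 0])
W = rng.normal(size=(3, 5)) * 0.5

nll = neg_log_marginal_likelihood(X, y, W)
print(f"negative log marginal likelihood: {nll:.3f}")
```

In the full method this objective would be minimized over W by gradient descent across many sampled tasks, so the learned kernel, rather than any inner-loop adaptation, carries the transfer.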

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Image Classification | CUB 200 5-way 5-shot | Accuracy | 85.64 | DKT + BNCosSim |
| Image Classification | Mini-ImageNet-CUB 5-way (5-shot) | Accuracy | 56.4 | DKT + BNCosSim |
| Image Classification | CUB 200 5-way 1-shot | Accuracy | 72.27 | DKT + BNCosSim |
| Image Classification | Mini-ImageNet-CUB 5-way (1-shot) | Accuracy | 40.22 | DKT + CosSim |
| Image Classification | Mini-Imagenet 5-way (5-shot) | Accuracy | 64 | DKT + BNCosSim |
| Image Classification | Mini-Imagenet 5-way (1-shot) | Accuracy | 62.96 | DKT + BNCosSim |
| Image Classification | OMNIGLOT-EMNIST 5-way (1-shot) | Accuracy | 75.4 | DKT + BNCosSim |
| Image Classification | OMNIGLOT-EMNIST 5-way (5-shot) | Accuracy | 90.3 | DKT + BNCosSim |
| Few-Shot Image Classification | CUB 200 5-way 5-shot | Accuracy | 85.64 | DKT + BNCosSim |
| Few-Shot Image Classification | Mini-ImageNet-CUB 5-way (5-shot) | Accuracy | 56.4 | DKT + BNCosSim |
| Few-Shot Image Classification | CUB 200 5-way 1-shot | Accuracy | 72.27 | DKT + BNCosSim |
| Few-Shot Image Classification | Mini-ImageNet-CUB 5-way (1-shot) | Accuracy | 40.22 | DKT + CosSim |
| Few-Shot Image Classification | Mini-Imagenet 5-way (5-shot) | Accuracy | 64 | DKT + BNCosSim |
| Few-Shot Image Classification | Mini-Imagenet 5-way (1-shot) | Accuracy | 62.96 | DKT + BNCosSim |
| Few-Shot Image Classification | OMNIGLOT-EMNIST 5-way (1-shot) | Accuracy | 75.4 | DKT + BNCosSim |
| Few-Shot Image Classification | OMNIGLOT-EMNIST 5-way (5-shot) | Accuracy | 90.3 | DKT + BNCosSim |

Related Papers

- GLAD: Generalizable Tuning for Vision-Language Models (2025-07-17)
- A Privacy-Preserving Semantic-Segmentation Method Using Domain-Adaptation Technique (2025-07-17)
- Distributional Reinforcement Learning on Path-dependent Options (2025-07-16)
- Are encoders able to learn landmarkers for warm-starting of Hyperparameter Optimization? (2025-07-16)
- Imbalanced Regression Pipeline Recommendation (2025-07-16)
- CLID-MU: Cross-Layer Information Divergence Based Meta Update Strategy for Learning with Noisy Labels (2025-07-16)
- Interpretable Bayesian Tensor Network Kernel Machines with Automatic Rank and Feature Selection (2025-07-15)
- Joint space-time wind field data extrapolation and uncertainty quantification using nonparametric Bayesian dictionary learning (2025-07-15)