Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


On Feature Collapse and Deep Kernel Learning for Single Forward Pass Uncertainty

Joost van Amersfoort, Lewis Smith, Andrew Jesson, Oscar Key, Yarin Gal

2021-02-22 · Gaussian Processes · General Classification
Paper · PDF · Code (official)

Abstract

Inducing point Gaussian process approximations are often considered a gold standard in uncertainty estimation since they retain many of the properties of the exact GP and scale to large datasets. A major drawback is that they have difficulty scaling to high dimensional inputs. Deep Kernel Learning (DKL) promises a solution: a deep feature extractor transforms the inputs over which an inducing point Gaussian process is defined. However, DKL has been shown to provide unreliable uncertainty estimates in practice. We study why, and show that with no constraints, the DKL objective pushes "far-away" data points to be mapped to the same features as those of training-set points. With this insight we propose to constrain DKL's feature extractor to approximately preserve distances through a bi-Lipschitz constraint, resulting in a feature space favorable to DKL. We obtain a model, DUE, which demonstrates uncertainty quality outperforming previous DKL and other single forward pass uncertainty methods, while maintaining the speed and accuracy of standard neural networks.
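The bi-Lipschitz idea in the abstract can be illustrated numerically. Below is a minimal sketch (not the authors' implementation): a residual map `f(x) = x + g(x)`, where `g` is a spectrally normalized linear layer followed by `tanh`, so `g` is a contraction with Lipschitz constant at most `coeff < 1`. By the triangle inequality, `f` then satisfies `(1 - coeff)‖x - y‖ ≤ ‖f(x) - f(y)‖ ≤ (1 + coeff)‖x - y‖`, i.e. distances in feature space are approximately preserved. The power-iteration normalizer, the value `coeff = 0.9`, and the layer sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def spectral_normalize(W, coeff=0.9, n_iters=50):
    """Rescale W so its largest singular value is at most `coeff` (< 1).
    Uses power iteration to estimate the spectral norm; `coeff` is an
    assumed hyperparameter, not a value from the paper's experiments."""
    u = rng.normal(size=W.shape[0])
    for _ in range(n_iters):
        v = W.T @ u
        v /= np.linalg.norm(v)
        u = W @ v
        u /= np.linalg.norm(u)
    sigma = float(u @ W @ v)  # estimated largest singular value
    return W * (coeff / sigma) if sigma > coeff else W

W = spectral_normalize(rng.normal(size=(16, 16)))

def layer(x):
    # Residual map: identity plus a contraction (tanh is 1-Lipschitz,
    # W has spectral norm <= 0.9), hence bi-Lipschitz with
    # constants 0.1 and 1.9.
    return x + np.tanh(W @ x)

x, y = rng.normal(size=16), rng.normal(size=16)
ratio = np.linalg.norm(layer(x) - layer(y)) / np.linalg.norm(x - y)
print(round(ratio, 3))  # always within [0.1, 1.9]
```

Without the spectral-norm constraint, nothing stops the map from sending far-apart inputs to nearly identical features (the "feature collapse" the paper analyzes); the lower Lipschitz bound is what rules that out.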

Related Papers

- Fast Gaussian Processes under Monotonicity Constraints (2025-07-09)
- MathOptAI.jl: Embed trained machine learning predictors into JuMP models (2025-07-03)
- Scalable Machine Learning Algorithms using Path Signatures (2025-06-21)
- Gaussian Processes and Reproducing Kernels: Connections and Equivalences (2025-06-20)
- Effect Decomposition of Functional-Output Computer Experiments via Orthogonal Additive Gaussian Processes (2025-06-15)
- Statistical Machine Learning for Astronomy -- A Textbook (2025-06-13)
- Accurate and Uncertainty-Aware Multi-Task Prediction of HEA Properties Using Prior-Guided Deep Gaussian Processes (2025-06-13)
- Tailored Architectures for Time Series Forecasting: Evaluating Deep Learning Models on Gaussian Process-Generated Data (2025-06-10)