Out-of-Distribution Detection Using Outlier Detection Methods
Jan Diers, Christian Pigorsch
Abstract
Out-of-distribution (OOD) detection deals with anomalous input to neural networks. In the past, specialized methods have been proposed to reject predictions on anomalous input. Similarly, feature extraction models combined with outlier detection algorithms have been shown to be well suited to detecting anomalous input. We use outlier detection algorithms to detect anomalous input as reliably as specialized methods from the field of OOD detection. No adaptation of the neural network is required; detection is based on the model's softmax score. Our approach works unsupervised using an Isolation Forest and can be further improved by a supervised learning method such as Gradient Boosting.
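The unsupervised variant described in the abstract can be sketched as follows: fit an Isolation Forest on the softmax vectors of in-distribution inputs only, then score new inputs, flagging low scores as OOD. This is a minimal illustration with synthetic logits standing in for a real classifier's outputs; the data generation and all parameter values are assumptions, not the paper's exact setup.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable row-wise softmax.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Synthetic stand-ins for classifier logits (assumption, not real model output):
# in-distribution inputs yield confident, peaked logits; OOD inputs yield diffuse ones.
logits_in = rng.normal(0.0, 1.0, (500, 10))
logits_in[np.arange(500), rng.integers(0, 10, 500)] += 6.0
logits_ood = rng.normal(0.0, 1.0, (200, 10))

s_in, s_ood = softmax(logits_in), softmax(logits_ood)

# Fit on in-distribution softmax vectors only -- no OOD labels needed.
iso = IsolationForest(n_estimators=200, random_state=0).fit(s_in)

# Higher score_samples means "more normal"; OOD inputs should score lower on average.
print(iso.score_samples(s_in).mean() > iso.score_samples(s_ood).mean())
```

In practice `s_in` would be the softmax outputs of a trained network (e.g. an EfficientNet) on held-out in-distribution data, and thresholding `score_samples` yields the accept/reject decision.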
Results
| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Out-of-Distribution Detection | CIFAR-10 vs CIFAR-100 | AUROC | 91.95 | Isolation Forest on EfficientNet Softmax values |
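The supervised improvement mentioned in the abstract can be sketched in the same spirit: if labeled OOD examples are available, a Gradient Boosting classifier can be trained on softmax vectors to separate in-distribution from OOD inputs, evaluated by AUROC as in the table above. The synthetic data and hyperparameters below are illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def softmax(z):
    # Numerically stable row-wise softmax.
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Synthetic stand-ins: peaked softmax (in-distribution) vs. diffuse softmax (OOD).
logits_in = rng.normal(0.0, 1.0, (500, 10))
logits_in[np.arange(500), rng.integers(0, 10, 500)] += 6.0
logits_ood = rng.normal(0.0, 1.0, (500, 10))

X = np.vstack([softmax(logits_in), softmax(logits_ood)])
y = np.r_[np.zeros(500), np.ones(500)]  # 1 = OOD

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Supervised detector on softmax vectors; AUROC measures in/out separation.
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
auroc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(auroc > 0.9)
```

On this toy data the classes are nearly separable, so the AUROC is close to 1; real benchmarks such as CIFAR-10 vs. CIFAR-100 are harder, as the table's 91.95 AUROC reflects.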