ZClassifier: Temperature Tuning and Manifold Approximation via KL Divergence on Logit Space

Shim Soon Yong

Abstract

We introduce a novel classification framework, ZClassifier, that replaces conventional deterministic logits with diagonal Gaussian-distributed logits. Our method simultaneously addresses temperature scaling and manifold approximation by minimizing the Kullback-Leibler (KL) divergence between the predicted Gaussian distributions and a unit isotropic Gaussian. This unifies uncertainty calibration and latent control in a principled probabilistic manner, enabling a natural interpretation of class confidence and geometric consistency. Experiments on CIFAR-10 show that ZClassifier improves over softmax classifiers in robustness, calibration, and latent separation.
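The core quantities described in the abstract can be sketched concretely. Below is a minimal illustration, not the authors' implementation: Gaussian logits parameterized by a mean and log-variance per class, the closed-form KL divergence to a unit isotropic Gaussian, and a softmax over a reparameterized logit sample. All function names and shapes are assumptions for illustration.

```python
import numpy as np

def kl_to_standard_normal(mu, log_var):
    # Closed-form KL( N(mu, diag(exp(log_var))) || N(0, I) ),
    # summed over the logit dimension:
    # 0.5 * sum( sigma^2 + mu^2 - 1 - log sigma^2 )
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

def sample_logits(mu, log_var, rng):
    # Reparameterized draw of Gaussian logits: z = mu + sigma * eps.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def softmax(z):
    # Numerically stable softmax over the class dimension.
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical usage: a batch of 2 examples, 10 classes (as in CIFAR-10).
rng = np.random.default_rng(0)
mu = rng.standard_normal((2, 10))
log_var = np.zeros((2, 10))
probs = softmax(sample_logits(mu, log_var, rng))
kl = kl_to_standard_normal(mu, log_var)  # per-example regularization term
```

Driving the KL term toward zero pushes the logit distribution toward N(0, I), which plays the role of temperature scaling (bounded logit magnitudes) while also shaping the latent geometry.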

Results

Task                 | Dataset  | Metric | Value  | Model
Image Classification | CIFAR-10 | AUROC  | 0.9994 | ZClassifier