Easy Batch Normalization
Arip Asadulaev, Alexander Panfilov, Andrey Filchenkov
2022-07-18 · Object Recognition
Abstract
Adversarial examples have been shown to improve object recognition. But what about their opposite, easy examples? Easy examples are samples that a machine learning model classifies correctly with high confidence. In this paper, we take a first step toward exploring the potential benefits of using easy examples in the training procedure of neural networks. We propose routing easy examples through an auxiliary batch normalization layer to improve both standard and robust accuracy.
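The core mechanism can be illustrated with a small sketch: a normalization layer that keeps two separate sets of batch statistics, one for regular batches and an auxiliary one for batches of easy examples. This is a minimal NumPy illustration of the general auxiliary-BN idea; the class and parameter names are ours, not from the paper's code, and the paper's actual architecture and training details may differ.

```python
import numpy as np

class AuxiliaryBatchNorm:
    """Batch normalization with two sets of running statistics:
    a 'main' branch for regular batches and an 'easy' branch for
    batches of easy examples. Names are illustrative only."""

    def __init__(self, num_features, momentum=0.1, eps=1e-5):
        self.eps = eps
        self.momentum = momentum
        # Separate running statistics per branch.
        self.running_mean = {b: np.zeros(num_features) for b in ("main", "easy")}
        self.running_var = {b: np.ones(num_features) for b in ("main", "easy")}
        # Affine parameters are shared here; they could also be per-branch.
        self.gamma = np.ones(num_features)
        self.beta = np.zeros(num_features)

    def __call__(self, x, branch="main", training=True):
        if training:
            # Normalize with the current batch statistics and update
            # only the selected branch's running estimates.
            mean = x.mean(axis=0)
            var = x.var(axis=0)
            m = self.momentum
            self.running_mean[branch] = (1 - m) * self.running_mean[branch] + m * mean
            self.running_var[branch] = (1 - m) * self.running_var[branch] + m * var
        else:
            # At inference time, use the stored statistics of the branch.
            mean = self.running_mean[branch]
            var = self.running_var[branch]
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta

# Usage sketch: easy batches update only the auxiliary statistics,
# so they do not perturb the statistics of regular training batches.
bn = AuxiliaryBatchNorm(num_features=3)
easy_batch = np.random.randn(16, 3) * 2.0 + 5.0
out = bn(easy_batch, branch="easy", training=True)
```

Keeping the statistics separate is the point of the auxiliary branch: easy (or, in prior work, adversarial) examples come from a different input distribution, and mixing their moments into the main running estimates would bias inference-time normalization for clean data.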