Description
Auxiliary Batch Normalization is a regularization technique used in adversarial training schemes. The idea is that adversarial examples should be routed through separate batch normalization components from the clean examples, because the two sets of inputs have different underlying statistics; mixing them in one normalizer would corrupt the estimated mean and variance for both.
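A minimal sketch of the idea, assuming a NumPy-style batch norm with shared affine parameters and one set of running statistics per branch (the class name, `branch` keys, and hyperparameters here are illustrative, not from any specific paper's implementation):

```python
import numpy as np

class AuxiliaryBatchNorm:
    """Batch normalization keeping two sets of running statistics:
    one for clean inputs and one for adversarial inputs.
    The affine parameters (gamma, beta) are shared across branches."""

    def __init__(self, num_features, momentum=0.9, eps=1e-5):
        self.eps = eps
        self.momentum = momentum
        # Shared learnable scale and shift
        self.gamma = np.ones(num_features)
        self.beta = np.zeros(num_features)
        # Separate running statistics per branch ("clean" vs. "adv")
        self.running_mean = {"clean": np.zeros(num_features),
                             "adv": np.zeros(num_features)}
        self.running_var = {"clean": np.ones(num_features),
                            "adv": np.ones(num_features)}

    def __call__(self, x, branch="clean", training=True):
        if training:
            # Normalize with the current batch's statistics and
            # update only the chosen branch's running estimates.
            mean = x.mean(axis=0)
            var = x.var(axis=0)
            m = self.momentum
            self.running_mean[branch] = m * self.running_mean[branch] + (1 - m) * mean
            self.running_var[branch] = m * self.running_var[branch] + (1 - m) * var
        else:
            # At inference, use the stored statistics for that branch.
            mean = self.running_mean[branch]
            var = self.running_var[branch]
        x_hat = (x - mean) / np.sqrt(var + self.eps)
        return self.gamma * x_hat + self.beta
```

During training, a clean minibatch is passed with `branch="clean"` and its adversarial counterpart with `branch="adv"`, so each normalizer tracks only its own distribution; at test time, clean inputs use the clean statistics.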
Papers Using This Method
Toward Improving Synthetic Audio Spoofing Detection Robustness via Meta-Learning and Disentangled Training With Adversarial Examples (2024-08-23)
EntProp: High Entropy Propagation for Improving Accuracy and Robustness (2024-05-29)
Diversify, Don't Fine-Tune: Scaling Up Visual Recognition Training with Synthetic Images (2023-12-04)
Improving Model Generalization by On-manifold Adversarial Augmentation in the Frequency Domain (2023-02-28)
Easy Batch Normalization (2022-07-18)
How explainable are adversarially-robust CNNs? (2022-05-25)
Fast AdvProp (2022-04-21)
Pyramid Adversarial Training Improves ViT Performance (2021-11-30)
3D Point Cloud Completion with Geometric-Aware Adversarial Augmentation (2021-09-21)
ALFA: Adversarial Feature Augmentation for Enhanced Image Recognition (2021-01-01)
Advanced Graph and Sequence Neural Networks for Molecular Property Prediction and Drug Discovery (2020-12-02)
Adversarial Examples Improve Image Recognition (2019-11-21)