Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


BaLeNAS: Differentiable Architecture Search via the Bayesian Learning Rule

Miao Zhang, Jilin Hu, Steven Su, Shirui Pan, Xiaojun Chang, Bin Yang, Gholamreza Haffari

2021-11-25 · CVPR 2022 · Neural Architecture Search · Variational Inference

Abstract

Differentiable Architecture Search (DARTS) has received massive attention in recent years, mainly because it significantly reduces the computational cost through weight sharing and continuous relaxation. However, more recent works find that existing differentiable NAS techniques struggle to outperform naive baselines, yielding progressively worse architectures as the search proceeds. Rather than directly optimizing the architecture parameters, this paper formulates neural architecture search as a distribution learning problem by relaxing the architecture weights into Gaussian distributions. By leveraging natural-gradient variational inference (NGVI), the architecture distribution can be easily optimized on top of existing codebases without incurring additional memory or computational cost. We demonstrate how differentiable NAS benefits from Bayesian principles, which enhance exploration and improve stability. Experimental results on the NAS-Bench-201 and NAS-Bench-1shot1 benchmark datasets confirm the significant improvements the proposed framework can make. In addition, instead of simply applying argmax to the learned parameters, we further leverage recently proposed training-free proxies in NAS to select the optimal architecture from a group of architectures drawn from the optimized distribution, achieving state-of-the-art results on the NAS-Bench-201 and NAS-Bench-1shot1 benchmarks. Our best architecture in the DARTS search space also obtains competitive test errors of 2.37%, 15.72%, and 24.2% on the CIFAR-10, CIFAR-100, and ImageNet datasets, respectively.
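The selection step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the Gaussian parameters `mu` and `sigma` stand in for a posterior already learned via NGVI, and `proxy_score` is a placeholder for a real training-free proxy (e.g. a zero-cost metric); all names and sizes here are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical search space: each edge chooses one of K candidate operations.
# DARTS keeps a point estimate alpha per edge; here alpha is Gaussian.
num_edges, num_ops = 6, 5
mu = rng.normal(size=(num_edges, num_ops))   # assumed: learned posterior means
sigma = np.full((num_edges, num_ops), 0.3)   # assumed: learned posterior std devs

def sample_architecture(mu, sigma, rng):
    """Draw architecture weights from the Gaussian distribution and
    discretize each edge to its highest-weight operation."""
    alpha = rng.normal(mu, sigma)
    return alpha.argmax(axis=1)              # one op index per edge

def proxy_score(arch, rng):
    """Placeholder for a training-free proxy; a real implementation
    would score the architecture without training it."""
    return rng.normal()

# Instead of a single argmax on mu, sample a group of architectures
# from the distribution and keep the one with the best proxy score.
candidates = [sample_architecture(mu, sigma, rng) for _ in range(10)]
best = max(candidates, key=lambda a: proxy_score(a, rng))
```

Sampling before discretization is what distinguishes this from plain DARTS selection: the learned variance lets the search explore architectures near, but not exactly at, the mode of the distribution.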

Results

| Task | Dataset | Metric | Value | Model |
|---|---|---|---|---|
| Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | Accuracy (Test) | 46.54 | BaLeNAS-TF |
| Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | Accuracy (Val) | 46.14 | BaLeNAS-TF |
| Neural Architecture Search | NAS-Bench-201, CIFAR-10 | Accuracy (Test) | 94.33 | BaLeNAS-TF |
| Neural Architecture Search | NAS-Bench-201, CIFAR-10 | Accuracy (Val) | 91.52 | BaLeNAS-TF |
| Neural Architecture Search | NAS-Bench-201, CIFAR-100 | Accuracy (Test) | 72.95 | BaLeNAS-TF |
| Neural Architecture Search | NAS-Bench-201, CIFAR-100 | Accuracy (Val) | 72.67 | BaLeNAS-TF |
| AutoML | NAS-Bench-201, ImageNet-16-120 | Accuracy (Test) | 46.54 | BaLeNAS-TF |
| AutoML | NAS-Bench-201, ImageNet-16-120 | Accuracy (Val) | 46.14 | BaLeNAS-TF |
| AutoML | NAS-Bench-201, CIFAR-10 | Accuracy (Test) | 94.33 | BaLeNAS-TF |
| AutoML | NAS-Bench-201, CIFAR-10 | Accuracy (Val) | 91.52 | BaLeNAS-TF |
| AutoML | NAS-Bench-201, CIFAR-100 | Accuracy (Test) | 72.95 | BaLeNAS-TF |
| AutoML | NAS-Bench-201, CIFAR-100 | Accuracy (Val) | 72.67 | BaLeNAS-TF |

Related Papers

DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
Interpretable Bayesian Tensor Network Kernel Machines with Automatic Rank and Feature Selection (2025-07-15)
Scalable Bayesian Low-Rank Adaptation of Large Language Models via Stochastic Variational Subspace Inference (2025-06-26)
AnalogNAS-Bench: A NAS Benchmark for Analog In-Memory Computing (2025-06-23)
VHU-Net: Variational Hadamard U-Net for Body MRI Bias Field Correction (2025-06-23)
From Tiny Machine Learning to Tiny Deep Learning: A Survey (2025-06-21)
One-Shot Neural Architecture Search with Network Similarity Directed Initialization for Pathological Image Classification (2025-06-17)
DDS-NAS: Dynamic Data Selection within Neural Architecture Search via On-line Hard Example Mining applied to Image Classification (2025-06-17)