Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


IS-DARTS: Stabilizing DARTS through Precise Measurement on Candidate Importance

Hongyi He, Longjun Liu, Haonan Zhang, Nanning Zheng

2023-12-19 · Neural Architecture Search
Paper (PDF) · Code (official)

Abstract

Among existing Neural Architecture Search methods, DARTS is known for its efficiency and simplicity. This approach applies continuous relaxation of the network representation to construct a weight-sharing supernet and enables the identification of excellent subnets in just a few GPU days. However, performance collapse in DARTS produces deteriorating architectures filled with parameter-free operations and remains a great challenge to its robustness. To resolve this problem, we reveal through theoretical and experimental analysis that the fundamental reason is the biased estimation of candidate importance in the search space, and we select operations more precisely via information-based measurements. Furthermore, we demonstrate that excessive focus on the supernet and inefficient utilization of data in bi-level optimization also account for suboptimal results. We adopt a more realistic objective focusing on the performance of subnets and simplify it with the help of the information-based measurements. Finally, we explain theoretically why progressively shrinking the width of the supernet is necessary, reducing the approximation error of optimal weights in DARTS. Our proposed method, named IS-DARTS, comprehensively improves DARTS and resolves the aforementioned problems. Extensive experiments on NAS-Bench-201 and the DARTS-based search space demonstrate the effectiveness of IS-DARTS.
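The continuous relaxation the abstract refers to replaces a hard choice among candidate operations on each supernet edge with a softmax-weighted mixture. A minimal sketch of that idea, using NumPy and toy stand-in operations (real DARTS uses convolution and pooling candidates; the operation list and names here are illustrative assumptions, not the paper's code):

```python
import numpy as np

def softmax(alpha):
    """Numerically stable softmax over architecture parameters."""
    e = np.exp(alpha - alpha.max())
    return e / e.sum()

# Hypothetical candidate operations on one supernet edge.
# "none" and the skip connection are the parameter-free operations
# that dominate when DARTS's performance collapse occurs.
ops = [
    lambda x: np.zeros_like(x),  # "none" (parameter-free)
    lambda x: x,                 # skip connection (parameter-free)
    lambda x: 2.0 * x,           # stand-in for a parameterized op
]

def mixed_op(x, alpha):
    """Continuous relaxation: softmax-weighted sum of all candidates."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops))

# Uniform architecture parameters give each candidate weight 1/3.
x = np.ones(4)
alpha = np.zeros(3)
y = mixed_op(x, alpha)  # (0 + 1 + 2) / 3 = 1.0 per element

# Discretization keeps only the highest-weighted candidate; DARTS uses
# the softmax weights as the importance estimate, which IS-DARTS argues
# is biased and replaces with information-based measurements.
selected = int(np.argmax(alpha))
```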

Results

| Task | Dataset | Metric | Value | Model |
| --- | --- | --- | --- | --- |
| Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | Accuracy (Test) | 46.34 | IS-DARTS |
| Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | Accuracy (Val) | 46.37 | IS-DARTS |
| Neural Architecture Search | NAS-Bench-201, CIFAR-10 | Accuracy (Test) | 94.36 | IS-DARTS |
| Neural Architecture Search | NAS-Bench-201, CIFAR-10 | Accuracy (Val) | 91.55 | IS-DARTS |
| Neural Architecture Search | NAS-Bench-201, CIFAR-100 | Accuracy (Test) | 73.51 | IS-DARTS |
| Neural Architecture Search | NAS-Bench-201, CIFAR-100 | Accuracy (Val) | 73.49 | IS-DARTS |
| AutoML | NAS-Bench-201, ImageNet-16-120 | Accuracy (Test) | 46.34 | IS-DARTS |
| AutoML | NAS-Bench-201, ImageNet-16-120 | Accuracy (Val) | 46.37 | IS-DARTS |
| AutoML | NAS-Bench-201, CIFAR-10 | Accuracy (Test) | 94.36 | IS-DARTS |
| AutoML | NAS-Bench-201, CIFAR-10 | Accuracy (Val) | 91.55 | IS-DARTS |
| AutoML | NAS-Bench-201, CIFAR-100 | Accuracy (Test) | 73.51 | IS-DARTS |
| AutoML | NAS-Bench-201, CIFAR-100 | Accuracy (Val) | 73.49 | IS-DARTS |

Related Papers

- DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
- AnalogNAS-Bench: A NAS Benchmark for Analog In-Memory Computing (2025-06-23)
- From Tiny Machine Learning to Tiny Deep Learning: A Survey (2025-06-21)
- One-Shot Neural Architecture Search with Network Similarity Directed Initialization for Pathological Image Classification (2025-06-17)
- DDS-NAS: Dynamic Data Selection within Neural Architecture Search via On-line Hard Example Mining applied to Image Classification (2025-06-17)
- MARCO: Hardware-Aware Neural Architecture Search for Edge Devices with Multi-Agent Reinforcement Learning and Conformal Prediction Filtering (2025-06-16)
- Finding Optimal Kernel Size and Dimension in Convolutional Neural Networks: An Architecture Optimization Approach (2025-06-16)
- Directed Acyclic Graph Convolutional Networks (2025-06-13)