Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Differentiable Architecture Search with Random Features

Xuanyang Zhang, Yonggang Li, Xiangyu Zhang, Yongtao Wang, Jian Sun

2022-08-18 · CVPR 2023 · Neural Architecture Search
Paper · PDF

Abstract

Differentiable architecture search (DARTS) has significantly promoted the development of NAS techniques thanks to its high search efficiency and effectiveness, but it suffers from performance collapse. In this paper, we make efforts to alleviate the performance collapse problem for DARTS from two aspects. First, we investigate the expressive power of the supernet in DARTS and derive a new setup of the DARTS paradigm in which only BatchNorm is trained. Second, we theoretically find that random features dilute the auxiliary-connection role of the skip-connection in supernet optimization and enable the search algorithm to focus on fairer operation selection, thereby solving the performance collapse problem. We instantiate DARTS and PC-DARTS with random features to build an improved version of each, named RF-DARTS and RF-PCDARTS respectively. Experimental results show that RF-DARTS obtains 94.36% test accuracy on CIFAR-10 (the nearest-to-optimal result in NAS-Bench-201), and achieves a new state-of-the-art top-1 test error of 24.0% on ImageNet when transferring from CIFAR-10. Moreover, RF-DARTS performs robustly across three datasets (CIFAR-10, CIFAR-100, and SVHN) and four search spaces (S1-S4). RF-PCDARTS achieves even better results on ImageNet, namely 23.9% top-1 and 7.1% top-5 test error, surpassing representative methods of the single-path, training-free, and partial-channel paradigms searched directly on ImageNet.
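The core mechanism the abstract describes can be sketched in a few lines: DARTS relaxes the discrete choice of operation on an edge into a softmax-weighted mixture, and the random-features variant freezes the feature-extracting weights so that only architecture parameters (and, in the paper's setup, BatchNorm) are trained. Below is a minimal NumPy sketch of that continuous relaxation under illustrative assumptions; the operation set, dimensions, and names are hypothetical, and a frozen random projection stands in for a conv with random (untrained) weights.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(a):
    # Numerically stable softmax over architecture weights.
    e = np.exp(a - a.max())
    return e / e.sum()

d = 8  # feature dimension (illustrative)

# Frozen random projection: stands in for a conv whose weights are
# random features and are never updated during the search.
W_rand = rng.standard_normal((d, d)) / np.sqrt(d)

# A toy candidate-operation set for one DARTS edge.
ops = {
    "skip":   lambda x: x,                          # skip-connection
    "random": lambda x: np.maximum(W_rand @ x, 0),  # frozen random ReLU feature
    "zero":   lambda x: np.zeros_like(x),           # "none" operation
}

# Architecture parameters alpha: the only thing the search optimizes here.
alpha = np.zeros(len(ops))

def mixed_op(x, alpha):
    """DARTS continuous relaxation: softmax-weighted sum of candidate ops."""
    w = softmax(alpha)
    return sum(wi * op(x) for wi, op in zip(w, ops.values()))

x = rng.standard_normal(d)
y = mixed_op(x, alpha)  # with uniform alpha, each op contributes weight 1/3
```

After search, the operation with the largest softmax weight would be kept on each edge; the paper's argument is that with random (frozen) features, skip-connection no longer dominates this selection.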

Results

Task | Dataset | Metric | Value | Model
Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | Accuracy (Test) | 46.1 | RF-DARTS
Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | Accuracy (Val) | 46.4 | RF-DARTS
Neural Architecture Search | NAS-Bench-201, CIFAR-10 | Accuracy (Test) | 94.27 | RF-DARTS
Neural Architecture Search | NAS-Bench-201, CIFAR-10 | Accuracy (Val) | 91.3 | RF-DARTS
Neural Architecture Search | NAS-Bench-201, CIFAR-100 | Accuracy (Test) | 72.94 | RF-DARTS
Neural Architecture Search | NAS-Bench-201, CIFAR-100 | Accuracy (Val) | 72.95 | RF-DARTS
AutoML | NAS-Bench-201, ImageNet-16-120 | Accuracy (Test) | 46.1 | RF-DARTS
AutoML | NAS-Bench-201, ImageNet-16-120 | Accuracy (Val) | 46.4 | RF-DARTS
AutoML | NAS-Bench-201, CIFAR-10 | Accuracy (Test) | 94.27 | RF-DARTS
AutoML | NAS-Bench-201, CIFAR-10 | Accuracy (Val) | 91.3 | RF-DARTS
AutoML | NAS-Bench-201, CIFAR-100 | Accuracy (Test) | 72.94 | RF-DARTS
AutoML | NAS-Bench-201, CIFAR-100 | Accuracy (Val) | 72.95 | RF-DARTS

Related Papers

DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
AnalogNAS-Bench: A NAS Benchmark for Analog In-Memory Computing (2025-06-23)
From Tiny Machine Learning to Tiny Deep Learning: A Survey (2025-06-21)
One-Shot Neural Architecture Search with Network Similarity Directed Initialization for Pathological Image Classification (2025-06-17)
DDS-NAS: Dynamic Data Selection within Neural Architecture Search via On-line Hard Example Mining applied to Image Classification (2025-06-17)
MARCO: Hardware-Aware Neural Architecture Search for Edge Devices with Multi-Agent Reinforcement Learning and Conformal Prediction Filtering (2025-06-16)
Finding Optimal Kernel Size and Dimension in Convolutional Neural Networks An Architecture Optimization Approach (2025-06-16)
Directed Acyclic Graph Convolutional Networks (2025-06-13)