Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


PRE-NAS: Predictor-assisted Evolutionary Neural Architecture Search

Yameng Peng, Andy Song, Vic Ciesielski, Haytham M. Fayek, Xiaojun Chang

2022-04-27 · Neural Architecture Search

Abstract

Neural architecture search (NAS) aims to automate architecture engineering in neural networks. This often incurs a high computational overhead, because many candidate networks from the search space must be evaluated during the search. Predicting the networks' performance can alleviate this overhead by removing the need to evaluate every candidate. However, developing such a predictor typically requires a large number of evaluated architectures, which may be difficult to obtain. We address this challenge by proposing a novel evolutionary-based NAS strategy, Predictor-assisted E-NAS (PRE-NAS), which performs well even with an extremely small number of evaluated architectures. PRE-NAS leverages new evolutionary search strategies and integrates high-fidelity weight inheritance over generations. Unlike one-shot strategies, which may suffer from evaluation bias due to weight sharing, offspring candidates in PRE-NAS are topologically homogeneous, which circumvents this bias and leads to more accurate predictions. Extensive experiments on the NAS-Bench-201 and DARTS search spaces show that PRE-NAS can outperform state-of-the-art NAS methods. With only a single GPU searching for 0.6 days, PRE-NAS finds a competitive architecture that achieves 2.40% and 24% test error rates on CIFAR-10 and ImageNet, respectively.
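The core idea the abstract describes, using a surrogate predictor trained on a small set of truly evaluated architectures to rank evolutionary offspring so that only the most promising candidate is expensively evaluated each generation, can be sketched roughly as follows. This is an illustrative toy, not the paper's actual method: the operation set, one-hot encoding, linear predictor, and stand-in accuracy function are all assumptions, and PRE-NAS's weight inheritance is omitted.

```python
import random
import numpy as np

OPS = ["conv3x3", "conv1x1", "skip", "pool", "zero"]  # hypothetical op choices
EDGES = 6  # edges in a NAS-Bench-201-style cell

def random_arch():
    # an architecture is one op index per cell edge
    return tuple(random.randrange(len(OPS)) for _ in range(EDGES))

def mutate(arch):
    # change the operation on one randomly chosen edge
    child = list(arch)
    child[random.randrange(EDGES)] = random.randrange(len(OPS))
    return tuple(child)

def encode(arch):
    # one-hot encoding per edge, used as predictor features
    x = np.zeros(EDGES * len(OPS))
    for i, op in enumerate(arch):
        x[i * len(OPS) + op] = 1.0
    return x

def true_accuracy(arch):
    # stand-in for the expensive train-and-evaluate step
    return sum(1.0 * (op == 0) + 0.5 * (op == 2) for op in arch) + random.gauss(0, 0.01)

def fit_predictor(archs, scores):
    # cheap surrogate: linear least-squares fit on one-hot features
    X = np.stack([encode(a) for a in archs])
    w, *_ = np.linalg.lstsq(X, np.array(scores), rcond=None)
    return lambda a: float(encode(a) @ w)

random.seed(0)
# seed the predictor with a handful of truly evaluated architectures
history = [random_arch() for _ in range(10)]
scores = [true_accuracy(a) for a in history]
parent = history[int(np.argmax(scores))]

for gen in range(5):
    predict = fit_predictor(history, scores)
    # generate offspring, rank by predicted fitness, truly evaluate only the best
    offspring = [mutate(parent) for _ in range(20)]
    best = max(offspring, key=predict)
    history.append(best)
    scores.append(true_accuracy(best))
    parent = max(zip(history, scores), key=lambda p: p[1])[0]

print("best evaluated architecture:", parent)
```

The key property mirrored here is the evaluation budget: after the initial seed, each generation spends only one true evaluation, while the predictor filters the other 19 offspring for free.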

Results

Task                       | Dataset                         | Metric          | Value | Model
Neural Architecture Search | NAS-Bench-201, ImageNet-16-120  | Accuracy (Test) | 45.34 | PRE-NAS
Neural Architecture Search | NAS-Bench-201, CIFAR-10         | Accuracy (Test) | 94.04 | PRE-NAS
Neural Architecture Search | NAS-Bench-201, CIFAR-10         | Accuracy (Val)  | 91.37 | PRE-NAS
Neural Architecture Search | NAS-Bench-201, CIFAR-100        | Accuracy (Test) | 72.02 | PRE-NAS
Neural Architecture Search | NAS-Bench-201, CIFAR-100        | Accuracy (Val)  | 71.95 | PRE-NAS
AutoML                     | NAS-Bench-201, ImageNet-16-120  | Accuracy (Test) | 45.34 | PRE-NAS
AutoML                     | NAS-Bench-201, CIFAR-10         | Accuracy (Test) | 94.04 | PRE-NAS
AutoML                     | NAS-Bench-201, CIFAR-10         | Accuracy (Val)  | 91.37 | PRE-NAS
AutoML                     | NAS-Bench-201, CIFAR-100        | Accuracy (Test) | 72.02 | PRE-NAS
AutoML                     | NAS-Bench-201, CIFAR-100        | Accuracy (Val)  | 71.95 | PRE-NAS

Related Papers

DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
AnalogNAS-Bench: A NAS Benchmark for Analog In-Memory Computing (2025-06-23)
From Tiny Machine Learning to Tiny Deep Learning: A Survey (2025-06-21)
One-Shot Neural Architecture Search with Network Similarity Directed Initialization for Pathological Image Classification (2025-06-17)
DDS-NAS: Dynamic Data Selection within Neural Architecture Search via On-line Hard Example Mining applied to Image Classification (2025-06-17)
MARCO: Hardware-Aware Neural Architecture Search for Edge Devices with Multi-Agent Reinforcement Learning and Conformal Prediction Filtering (2025-06-16)
Finding Optimal Kernel Size and Dimension in Convolutional Neural Networks: An Architecture Optimization Approach (2025-06-16)
Directed Acyclic Graph Convolutional Networks (2025-06-13)