
Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.

AtomNAS: Fine-Grained End-to-End Neural Architecture Search

Jieru Mei, Yingwei Li, Xiaochen Lian, Xiaojie Jin, Linjie Yang, Alan Yuille, Jianchao Yang

2019-12-20 · ICLR 2020 · Neural Architecture Search
Paper · PDF · Code (official)

Abstract

Search space design is critical to neural architecture search (NAS) algorithms. We propose a fine-grained search space composed of atomic blocks, minimal search units that are much smaller than those used in recent NAS algorithms. This search space allows mixing operations by composing different types of atomic blocks, whereas the search spaces in previous methods only allow homogeneous operations. Based on this search space, we propose a resource-aware architecture search framework that automatically assigns computational resources (e.g., output channel numbers) to each operation by jointly considering performance and computational cost. In addition, to accelerate the search process, we propose a dynamic network shrinkage technique that prunes atomic blocks with negligible influence on outputs on the fly. Instead of a two-stage search-and-retrain paradigm, our method simultaneously searches and trains the target architecture. Our method achieves state-of-the-art performance under several FLOPs configurations on ImageNet with a small search cost. Our entire codebase is available at: https://github.com/meijieru/AtomNAS.
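The dynamic network shrinkage step described above can be sketched in a few lines. The sketch below is an illustrative simplification, not code from the AtomNAS repository: it assumes each atomic block's influence is read from the magnitude of its BatchNorm scale factor (gamma), and that blocks whose scale falls below a threshold are pruned during training. The function name, threshold, and gamma values are hypothetical.

```python
import numpy as np

def shrink_atomic_blocks(gammas, threshold=1e-2):
    """Illustrative sketch of dynamic network shrinkage.

    Each atomic block is treated as a single-channel search unit whose
    BatchNorm scale gamma gauges its influence on the layer output.
    Blocks with |gamma| below `threshold` are marked for pruning
    on the fly; the rest are kept and continue training.
    """
    gammas = np.asarray(gammas, dtype=float)
    return np.abs(gammas) >= threshold  # boolean keep-mask per block

# Hypothetical BatchNorm scales for 8 atomic blocks in one layer.
gammas = [0.8, 0.003, 0.5, 0.0005, 0.2, 0.9, 0.001, 0.4]
keep = shrink_atomic_blocks(gammas)
print(f"{keep.sum()} of {len(gammas)} atomic blocks kept")
```

Because pruning happens during training rather than after it, the surviving architecture is trained end to end, which is what lets the method avoid a separate retraining stage.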

Results

| Task                         | Dataset  | Metric           | Value | Model        |
|------------------------------|----------|------------------|-------|--------------|
| Neural Architecture Search   | ImageNet | Accuracy         | 77.6  | AtomNAS-C+†  |
| Neural Architecture Search   | ImageNet | Top-1 Error Rate | 22.4  | AtomNAS-C+†  |
| Neural Architecture Search   | ImageNet | Accuracy         | 77.2  | AtomNAS-B+†  |
| Neural Architecture Search   | ImageNet | Top-1 Error Rate | 22.8  | AtomNAS-B+†  |
| Neural Architecture Search   | ImageNet | Accuracy         | 76.3  | AtomNAS-A+†  |
| Neural Architecture Search   | ImageNet | Top-1 Error Rate | 23.7  | AtomNAS-A+†  |
| AutoML                       | ImageNet | Accuracy         | 77.6  | AtomNAS-C+†  |
| AutoML                       | ImageNet | Top-1 Error Rate | 22.4  | AtomNAS-C+†  |
| AutoML                       | ImageNet | Accuracy         | 77.2  | AtomNAS-B+†  |
| AutoML                       | ImageNet | Top-1 Error Rate | 22.8  | AtomNAS-B+†  |
| AutoML                       | ImageNet | Accuracy         | 76.3  | AtomNAS-A+†  |
| AutoML                       | ImageNet | Top-1 Error Rate | 23.7  | AtomNAS-A+†  |

Related Papers

DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
AnalogNAS-Bench: A NAS Benchmark for Analog In-Memory Computing (2025-06-23)
From Tiny Machine Learning to Tiny Deep Learning: A Survey (2025-06-21)
One-Shot Neural Architecture Search with Network Similarity Directed Initialization for Pathological Image Classification (2025-06-17)
DDS-NAS: Dynamic Data Selection within Neural Architecture Search via On-line Hard Example Mining applied to Image Classification (2025-06-17)
MARCO: Hardware-Aware Neural Architecture Search for Edge Devices with Multi-Agent Reinforcement Learning and Conformal Prediction Filtering (2025-06-16)
Finding Optimal Kernel Size and Dimension in Convolutional Neural Networks: An Architecture Optimization Approach (2025-06-16)
Directed Acyclic Graph Convolutional Networks (2025-06-13)