
Generalization Properties of NAS under Activation and Skip Connection Search

Zhenyu Zhu, Fanghui Liu, Grigorios G Chrysos, Volkan Cevher

2022-09-15 · Neural Architecture Search

Abstract

Neural Architecture Search (NAS) has fostered the automatic discovery of state-of-the-art neural architectures. Despite the progress achieved with NAS, little attention has so far been paid to its theoretical guarantees. In this work, we study the generalization properties of NAS under a unifying framework that enables (deep) layer skip connection search and activation function search. To this end, we derive lower (and upper) bounds on the minimum eigenvalue of the Neural Tangent Kernel (NTK) in the (in)finite-width regime for a search space that includes mixed activation functions, fully connected networks, and residual neural networks. We use the minimum eigenvalue to establish generalization error bounds for NAS under stochastic gradient descent training. Importantly, we show theoretically and experimentally how the derived results can guide NAS to select top-performing architectures even without training, leading to a train-free algorithm based on our theory. Accordingly, our numerical validation sheds light on the design of computationally efficient NAS methods. Our analysis is non-trivial due to the coupling of various architectures and activation functions under the unifying framework, and is of independent interest in providing a lower bound on the minimum eigenvalue of the NTK in deep learning theory.
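
The abstract's central algorithmic idea, ranking candidate architectures by the minimum eigenvalue of the NTK without any training, can be illustrated concretely. The sketch below is not the authors' EigenNas implementation; it is a minimal example of this style of train-free scoring, assuming the finite-width empirical NTK is computed on a small probe batch via PyTorch autograd. The Candidate class, its widths, and the probe batch size are hypothetical stand-ins for the paper's activation/skip search space.

```python
# Minimal sketch of train-free NAS scoring in the spirit of EigenNas:
# rank candidates by the minimum eigenvalue of the empirical NTK,
# K[i, j] = <grad_theta f(x_i), grad_theta f(x_j)>, on a small probe batch.
# Illustration based on the abstract, not the authors' implementation.
import torch
import torch.nn as nn

def empirical_ntk_min_eig(model: nn.Module, x: torch.Tensor) -> float:
    """Minimum eigenvalue of the finite-width empirical NTK Gram matrix."""
    params = [p for p in model.parameters() if p.requires_grad]
    grads = []
    for i in range(x.shape[0]):
        out = model(x[i : i + 1]).sum()          # scalar output per sample
        g = torch.autograd.grad(out, params)     # per-sample parameter gradient
        grads.append(torch.cat([gi.reshape(-1) for gi in g]))
    J = torch.stack(grads)                       # (n, num_params) Jacobian
    K = J @ J.T                                  # (n, n) empirical NTK
    return torch.linalg.eigvalsh(K).min().item()

# Hypothetical search space: one MLP skeleton with a searchable activation
# and an optional skip connection, echoing the paper's unifying framework.
class Candidate(nn.Module):
    def __init__(self, act: nn.Module, skip: bool, width: int = 64):
        super().__init__()
        self.skip = skip
        self.fc1, self.fc2 = nn.Linear(16, width), nn.Linear(width, width)
        self.head, self.act = nn.Linear(width, 1), act

    def forward(self, x):
        h = self.act(self.fc1(x))
        h2 = self.act(self.fc2(h))
        return self.head(h2 + h if self.skip else h2)

x = torch.randn(32, 16)  # probe batch; no labels or training required
candidates = {
    "relu+skip": Candidate(nn.ReLU(), skip=True),
    "relu":      Candidate(nn.ReLU(), skip=False),
    "tanh+skip": Candidate(nn.Tanh(), skip=True),
}
scores = {name: empirical_ntk_min_eig(m, x) for name, m in candidates.items()}
print(max(scores, key=scores.get), scores)  # keep the highest-scoring candidate
```

Because the score needs only forward passes and gradients at initialization, the whole search space can be ranked at a small fraction of the cost of training even one candidate, which is the computational appeal the abstract points to.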

Results

Task                       | Dataset                             | Metric        | Value | Model
Neural Architecture Search | NATS-Bench Topology, CIFAR-10       | Test Accuracy | 93.46 | EigenNas (Zhu et al., 2022)
Neural Architecture Search | NATS-Bench Topology, CIFAR-100      | Test Accuracy | 71.42 | EigenNas (Zhu et al., 2022)
Neural Architecture Search | NATS-Bench Topology, ImageNet16-120 | Test Accuracy | 45.54 | EigenNas (Zhu et al., 2022)
AutoML                     | NATS-Bench Topology, CIFAR-10       | Test Accuracy | 93.46 | EigenNas (Zhu et al., 2022)
AutoML                     | NATS-Bench Topology, CIFAR-100      | Test Accuracy | 71.42 | EigenNas (Zhu et al., 2022)
AutoML                     | NATS-Bench Topology, ImageNet16-120 | Test Accuracy | 45.54 | EigenNas (Zhu et al., 2022)

Related Papers

DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
AnalogNAS-Bench: A NAS Benchmark for Analog In-Memory Computing (2025-06-23)
From Tiny Machine Learning to Tiny Deep Learning: A Survey (2025-06-21)
One-Shot Neural Architecture Search with Network Similarity Directed Initialization for Pathological Image Classification (2025-06-17)
DDS-NAS: Dynamic Data Selection within Neural Architecture Search via On-line Hard Example Mining applied to Image Classification (2025-06-17)
MARCO: Hardware-Aware Neural Architecture Search for Edge Devices with Multi-Agent Reinforcement Learning and Conformal Prediction Filtering (2025-06-16)
Finding Optimal Kernel Size and Dimension in Convolutional Neural Networks: An Architecture Optimization Approach (2025-06-16)
Directed Acyclic Graph Convolutional Networks (2025-06-13)