Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


PNAS

Progressive Neural Architecture Search

General · Introduced 2017 · 16 papers
Source Paper

Description

Progressive Neural Architecture Search, or PNAS, is a method for learning the structure of convolutional neural networks (CNNs). It uses a sequential model-based optimization (SMBO) strategy, where we search the space of cell structures, starting with simple (shallow) models and progressing to complex ones, pruning out unpromising structures as we go.

At iteration $b$ of the algorithm, we have a set of $K$ candidate cells (each of size $b$ blocks), which we train and evaluate on a dataset of interest. Since this process is expensive, PNAS also learns a model, or surrogate function, which can predict the performance of a structure without needing to train it. We then expand the $K$ candidates of size $b$ into $K' \gg K$ children, each of size $b+1$. The surrogate function is used to rank all of the $K'$ children, pick the top $K$, and then train and evaluate them. We continue in this way until $b = B$, the maximum number of blocks we want to use in a cell.
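The progressive loop described above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the operation vocabulary, the stand-in accuracy function, and the toy prefix-based surrogate are all hypothetical placeholders. Real PNAS cells are built from pairs of inputs and operations, evaluation means actually training the network, and the surrogate is a learned predictor (an RNN or MLP over the cell encoding).

```python
import random

random.seed(0)

# Hypothetical operation vocabulary; real PNAS blocks pair inputs with ops.
OPS = ["sep3x3", "sep5x5", "maxpool", "identity"]


def expand(cell):
    """Expand a cell of b blocks into children of b+1 blocks."""
    return [cell + (op,) for op in OPS]


def true_accuracy(cell):
    """Stand-in for the expensive step: training and evaluating a cell."""
    return sum(len(op) for op in cell) / (10.0 * len(cell)) + random.random() * 0.01


def surrogate_score(cell, history):
    """Toy surrogate: reuse the measured accuracy of the cell's b-block prefix.

    PNAS instead trains a predictor on (cell, accuracy) pairs seen so far.
    """
    prefix = cell[:-1]
    seen = [acc for c, acc in history if c == prefix]
    return seen[0] if seen else 0.0


def pnas_search(B=3, K=4):
    # b = 1: enumerate and evaluate every one-block cell.
    candidates = [(op,) for op in OPS]
    history = [(c, true_accuracy(c)) for c in candidates]
    top = sorted(history, key=lambda x: -x[1])[:K]
    for b in range(2, B + 1):
        # Expand the K survivors into K' >> K children of size b.
        children = [child for c, _ in top for child in expand(c)]
        # Rank children with the cheap surrogate, keep the top K,
        # and only train/evaluate those.
        children.sort(key=lambda c: -surrogate_score(c, history))
        evaluated = [(c, true_accuracy(c)) for c in children[:K]]
        history.extend(evaluated)
        top = sorted(evaluated, key=lambda x: -x[1])[:K]
    return top[0]


best_cell, best_acc = pnas_search()
```

The key cost saving is visible in the loop: each iteration trains only $K$ cells, while the surrogate ranks the full set of $K'$ children for free.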

Papers Using This Method

- Barrier-Free Microhabitats: Self-Organized Seclusion in Microbial Communities (2025-03-27)
- Mixed Delay/Nondelay Embeddings Based Neuromorphic Computing with Patterned Nanomagnet Arrays (2024-12-05)
- Fluctuations and the limit of predictability in protein evolution (2024-12-02)
- From Complexity to Clarity: How AI Enhances Perceptions of Scientists and the Public's Understanding of Science (2024-04-23)
- Cell motility modes are selected by the interplay of mechanosensitive adhesion and membrane tension (2023-05-31)
- Convergence of the Backward Deep BSDE Method with Applications to Optimal Stopping Problems (2022-10-08)
- POPNASv2: An Efficient Multi-Objective Neural Architecture Search Technique (2022-10-06)
- Stochastic Gradient Descent and Anomaly of Variance-flatness Relation in Artificial Neural Networks (2022-07-11)
- Information Cocoons in Online Navigation (2021-09-14)
- Learning Interaction Kernels for Agent Systems on Riemannian Manifolds (2021-01-30)
- Stabilizing Deep Tomographic Reconstruction (2020-08-04)
- On the Transcriptomic Signature and General Stress State Associated with Aneuploidy (2020-07-28)
- Weak SINDy For Partial Differential Equations (2020-07-06)
- Sparse Identification of Nonlinear Dynamical Systems via Reweighted $\ell_1$-regularized Least Squares (2020-05-27)
- Auto-Keras: An Efficient Neural Architecture Search System (2018-06-27)
- Progressive Neural Architecture Search (2017-12-02)