Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


NSGANetV2: Evolutionary Multi-Objective Surrogate-Assisted Neural Architecture Search

Zhichao Lu, Kalyanmoy Deb, Erik Goodman, Wolfgang Banzhaf, Vishnu Naresh Boddeti

2020-07-20 · ECCV 2020
Tasks: Image Classification · Transfer Learning · Neural Architecture Search
Paper · PDF · Code (official)

Abstract

In this paper, we propose an efficient NAS algorithm for generating task-specific models that are competitive under multiple competing objectives. It comprises two surrogates: one at the architecture level to improve sample efficiency, and one at the weights level, through a supernet, to improve gradient-descent training efficiency. On standard benchmark datasets (CIFAR-10, CIFAR-100, ImageNet), the resulting models, dubbed NSGANetV2, either match or outperform models from existing approaches, with the search being orders of magnitude more sample-efficient. Furthermore, we demonstrate the effectiveness and versatility of the proposed method on six diverse non-standard datasets, e.g., STL-10, Flowers102, Oxford Pets, and FGVC Aircraft. In all cases, NSGANetV2 improves the state of the art (under the mobile setting), suggesting that NAS can be a viable alternative to conventional transfer-learning approaches in handling diverse scenarios such as small-scale or fine-grained datasets. Code is available at https://github.com/mikelzc1990/nsganetv2
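The abstract describes a search loop in which a cheap architecture-level surrogate screens many candidates so that expensive evaluations (via the supernet, in the paper) are spent only on the most promising few, while selection trades off multiple objectives via Pareto dominance. The following is a minimal toy sketch of that idea, not the paper's implementation: the 1-nearest-neighbour surrogate, the mutation operator, and the toy error/cost objectives are all illustrative assumptions.

```python
# Toy sketch of surrogate-assisted multi-objective NAS (illustrative only;
# the surrogate, objectives, and operators here are assumptions, not NSGANetV2's).
import random

random.seed(0)

ARCH_LEN = 6  # each architecture: 6 integer choices in {0, 1, 2, 3}

def true_eval(arch):
    """Stand-in for the expensive evaluation (supernet-based in the paper)."""
    error = 1.0 / (1.0 + sum(arch))   # toy error: larger nets do better
    cost = float(sum(arch))           # toy cost proxy (e.g. MAdds); to minimize
    return error, cost

def surrogate_error(arch, archive):
    """Cheap architecture-level surrogate: 1-NN lookup over evaluated archs."""
    nearest = min(archive, key=lambda a: sum((x - y) ** 2 for x, y in zip(a, arch)))
    return archive[nearest][0]

def mutate(arch):
    """Resample one choice of an architecture encoding."""
    a = list(arch)
    a[random.randrange(ARCH_LEN)] = random.randint(0, 3)
    return tuple(a)

def dominates(f, g):
    """Pareto dominance for minimization of both objectives."""
    return all(x <= y for x, y in zip(f, g)) and any(x < y for x, y in zip(f, g))

def search(n_init=8, iters=5, per_iter=4):
    # Bootstrap the archive with a few truly evaluated architectures.
    archive = {}
    while len(archive) < n_init:
        arch = tuple(random.randint(0, 3) for _ in range(ARCH_LEN))
        archive[arch] = true_eval(arch)
    for _ in range(iters):
        # Explore many candidates cheaply via the surrogate ...
        cands = {mutate(a) for a in list(archive) for _ in range(4)} - set(archive)
        ranked = sorted(cands, key=lambda a: (surrogate_error(a, archive), sum(a)))
        # ... but spend true evaluations only on the most promising few.
        for arch in ranked[:per_iter]:
            archive[arch] = true_eval(arch)
    # Report the nondominated set over everything truly evaluated.
    front = [a for a in archive
             if not any(dominates(archive[b], archive[a]) for b in archive if b != a)]
    return archive, front

archive, front = search()
```

The key efficiency lever mirrored here is the budget split: dozens of surrogate queries per iteration, but only `per_iter` true evaluations, so the expensive-evaluation count stays small while the Pareto front over error and cost is refined.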

Results

Task                       | Dataset  | Metric             | Value | Model
Neural Architecture Search | ImageNet | Accuracy           | 80.4  | NSGANetV2-xl
Neural Architecture Search | ImageNet | Top-1 Error Rate   | 19.6  | NSGANetV2-xl
Neural Architecture Search | ImageNet | Accuracy           | 79.1  | NSGANetV2-l
Neural Architecture Search | ImageNet | Top-1 Error Rate   | 20.9  | NSGANetV2-l
Neural Architecture Search | ImageNet | Accuracy           | 78.3  | NSGANetV2-m
Neural Architecture Search | ImageNet | Top-1 Error Rate   | 21.7  | NSGANetV2-m
Neural Architecture Search | ImageNet | Accuracy           | 77.4  | NSGANetV2-s
Neural Architecture Search | ImageNet | Top-1 Error Rate   | 22.6  | NSGANetV2-s
Image Classification       | STL-10   | Percentage correct | 92    | NSGANetV2
AutoML                     | ImageNet | Accuracy           | 80.4  | NSGANetV2-xl
AutoML                     | ImageNet | Top-1 Error Rate   | 19.6  | NSGANetV2-xl
AutoML                     | ImageNet | Accuracy           | 79.1  | NSGANetV2-l
AutoML                     | ImageNet | Top-1 Error Rate   | 20.9  | NSGANetV2-l
AutoML                     | ImageNet | Accuracy           | 78.3  | NSGANetV2-m
AutoML                     | ImageNet | Top-1 Error Rate   | 21.7  | NSGANetV2-m
AutoML                     | ImageNet | Accuracy           | 77.4  | NSGANetV2-s
AutoML                     | ImageNet | Top-1 Error Rate   | 22.6  | NSGANetV2-s

Related Papers

- Automatic Classification and Segmentation of Tunnel Cracks Based on Deep Learning and Visual Explanations (2025-07-18)
- RaMen: Multi-Strategy Multi-Modal Learning for Bundle Construction (2025-07-18)
- Adversarial attacks to image classification systems using evolutionary algorithms (2025-07-17)
- Efficient Adaptation of Pre-trained Vision Transformer underpinned by Approximately Orthogonal Fine-Tuning Strategy (2025-07-17)
- Federated Learning for Commercial Image Sources (2025-07-17)
- MUPAX: Multidimensional Problem Agnostic eXplainable AI (2025-07-17)
- Disentangling coincident cell events using deep transfer learning and compressive sensing (2025-07-17)
- DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)