Papers With Code 2

Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Shapley-NAS: Discovering Operation Contribution for Neural Architecture Search

Han Xiao, Ziwei Wang, Zheng Zhu, Jie Zhou, Jiwen Lu

2022-06-20 · CVPR 2022 · Neural Architecture Search
Paper · PDF · Code (official)

Abstract

In this paper, we propose a Shapley value based method to evaluate operation contribution (Shapley-NAS) for neural architecture search. Differentiable architecture search (DARTS) acquires the optimal architectures by optimizing the architecture parameters with gradient descent, which significantly reduces the search cost. However, the magnitude of architecture parameters updated by gradient descent fails to reveal the actual operation importance to the task performance and therefore harms the effectiveness of obtained architectures. By contrast, we propose to evaluate the direct influence of operations on validation accuracy. To deal with the complex relationships between supernet components, we leverage Shapley value to quantify their marginal contributions by considering all possible combinations. Specifically, we iteratively optimize the supernet weights and update the architecture parameters by evaluating operation contributions via Shapley value, so that the optimal architectures are derived by selecting the operations that contribute significantly to the tasks. Since the exact computation of Shapley value is NP-hard, a Monte-Carlo sampling based algorithm with early truncation is employed for efficient approximation, and a momentum update mechanism is adopted to alleviate fluctuation of the sampling process. Extensive experiments on various datasets and various search spaces show that our Shapley-NAS outperforms the state-of-the-art methods by a considerable margin with light search cost. The code is available at https://github.com/Euphoria16/Shapley-NAS.git.
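The abstract's core approximation step can be sketched in a few lines: the Shapley value of an operation is its average marginal contribution over random orderings, and early truncation stops a pass once the coalition's accuracy is already close to the full supernet's. The sketch below is illustrative only, not the authors' implementation; `accuracy_fn` and `n_ops` are hypothetical names standing in for the supernet's masked evaluation and its number of candidate operations.

```python
import random

def monte_carlo_shapley(n_ops, accuracy_fn, n_samples=100, trunc_eps=1e-3):
    """Estimate per-operation Shapley values by permutation sampling.

    accuracy_fn(mask) is assumed to return the validation accuracy of the
    supernet with only the masked-in operations enabled (hypothetical API).
    """
    shapley = [0.0] * n_ops
    full_acc = accuracy_fn([True] * n_ops)  # accuracy of the full supernet
    for _ in range(n_samples):
        perm = list(range(n_ops))
        random.shuffle(perm)               # one random coalition ordering
        mask = [False] * n_ops
        prev_acc = accuracy_fn(mask)       # accuracy of the empty coalition
        for op in perm:
            # Early truncation: once the coalition is nearly as accurate as
            # the full supernet, treat remaining marginals as zero.
            if full_acc - prev_acc < trunc_eps:
                break
            mask[op] = True
            cur_acc = accuracy_fn(mask)
            shapley[op] += cur_acc - prev_acc  # marginal contribution
            prev_acc = cur_acc
    return [s / n_samples for s in shapley]
```

In the paper these estimates would additionally be smoothed with a momentum term across search iterations before they replace gradient-based architecture parameters; that accumulation step is omitted here for brevity.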

Results

Task | Dataset | Metric | Value | Model
Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | Accuracy (Test) | 46.85 | Shapley-NAS
Neural Architecture Search | NAS-Bench-201, ImageNet-16-120 | Accuracy (Val) | 46.57 | Shapley-NAS
Neural Architecture Search | NAS-Bench-201, CIFAR-10 | Accuracy (Test) | 94.37 | Shapley-NAS
Neural Architecture Search | NAS-Bench-201, CIFAR-10 | Accuracy (Val) | 91.61 | Shapley-NAS
Neural Architecture Search | ImageNet | Top-1 Error Rate | 23.9 | Shapley-NAS
Neural Architecture Search | NAS-Bench-201, CIFAR-100 | Accuracy (Test) | 73.51 | Shapley-NAS
Neural Architecture Search | NAS-Bench-201, CIFAR-100 | Accuracy (Val) | 73.49 | Shapley-NAS
AutoML | NAS-Bench-201, ImageNet-16-120 | Accuracy (Test) | 46.85 | Shapley-NAS
AutoML | NAS-Bench-201, ImageNet-16-120 | Accuracy (Val) | 46.57 | Shapley-NAS
AutoML | NAS-Bench-201, CIFAR-10 | Accuracy (Test) | 94.37 | Shapley-NAS
AutoML | NAS-Bench-201, CIFAR-10 | Accuracy (Val) | 91.61 | Shapley-NAS
AutoML | ImageNet | Top-1 Error Rate | 23.9 | Shapley-NAS
AutoML | NAS-Bench-201, CIFAR-100 | Accuracy (Test) | 73.51 | Shapley-NAS
AutoML | NAS-Bench-201, CIFAR-100 | Accuracy (Val) | 73.49 | Shapley-NAS

Related Papers

DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
AnalogNAS-Bench: A NAS Benchmark for Analog In-Memory Computing (2025-06-23)
From Tiny Machine Learning to Tiny Deep Learning: A Survey (2025-06-21)
One-Shot Neural Architecture Search with Network Similarity Directed Initialization for Pathological Image Classification (2025-06-17)
DDS-NAS: Dynamic Data Selection within Neural Architecture Search via On-line Hard Example Mining applied to Image Classification (2025-06-17)
MARCO: Hardware-Aware Neural Architecture Search for Edge Devices with Multi-Agent Reinforcement Learning and Conformal Prediction Filtering (2025-06-16)
Finding Optimal Kernel Size and Dimension in Convolutional Neural Networks: An Architecture Optimization Approach (2025-06-16)
Directed Acyclic Graph Convolutional Networks (2025-06-13)