Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


ShuffleNet Block

General · Introduced 2017 · 51 papers
Source Paper

Description

A ShuffleNet Block is an image model block that utilises a channel shuffle operation, along with depthwise convolutions, for an efficient architectural design. It was proposed as part of the ShuffleNet architecture. The starting point is the Residual Block unit from ResNets, which is then modified with a pointwise group convolution and a channel shuffle operation.
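The channel shuffle operation at the heart of the block can be sketched in a few lines: reshape the channel axis into (groups, channels-per-group), transpose the two new axes, and flatten back, so that each group in the following group convolution sees channels from every preceding group. A minimal NumPy sketch (illustrative only; the function name and shapes are our own, not from the source):

```python
import numpy as np

def channel_shuffle(x, groups):
    """Channel shuffle as described in the ShuffleNet paper:
    split channels into groups, transpose, and flatten, so each
    output group mixes channels from every input group."""
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)  # swap the group and per-group axes
    return x.reshape(n, c, h, w)

# Example: 6 channels in 3 groups -> channel order becomes [0, 2, 4, 1, 3, 5]
x = np.arange(6, dtype=np.float32).reshape(1, 6, 1, 1)
shuffled = channel_shuffle(x, groups=3)
print(shuffled.flatten())  # [0. 2. 4. 1. 3. 5.]
```

Because the shuffle is just a fixed permutation of channels, it is differentiable and adds no parameters, which is why it fits an efficiency-oriented design built around pointwise group convolutions and depthwise convolutions.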

Papers Using This Method

- Deploying and Evaluating Multiple Deep Learning Models on Edge Devices for Diabetic Retinopathy Detection (2025-06-14)
- Comparison of Neural Models for X-ray Image Classification in COVID-19 Detection (2025-01-08)
- Advancing Green AI: Efficient and Accurate Lightweight CNNs for Rice Leaf Disease Identification (2024-08-03)
- Fragility, Robustness and Antifragility in Deep Learning (2023-12-15)
- Generalizability of CNN Architectures for Face Morph Presentation Attack (2023-10-17)
- A Non-monotonic Smooth Activation Function (2023-10-16)
- Multi-Transfer Learning Techniques for Detecting Auditory Brainstem Response (2023-08-29)
- PSDNet: Determination of Particle Size Distributions Using Synthetic Soil Images and Convolutional Neural Networks (2023-03-07)
- Use Cases for Time-Frequency Image Representations and Deep Learning Techniques for Improved Signal Classification (2023-02-22)
- QLABGrad: a Hyperparameter-Free and Convergence-Guaranteed Scheme for Deep Learning (2023-02-01)
- Predicting microsatellite instability and key biomarkers in colorectal cancer from H&E-stained images: Achieving SOTA predictive performance with fewer data using Swin Transformer (2022-08-22)
- Design and Analysis of Novel Bit-flip Attacks and Defense Strategies for DNNs (2022-06-24)
- Structured Pruning is All You Need for Pruning CNNs at Initialization (2022-03-04)
- Multimodal registration of FISH and nanoSIMS images using convolutional neural network models (2022-01-14)
- ThreshNet: An Efficient DenseNet Using Threshold Mechanism to Reduce Connections (2022-01-09)
- Smooth Maximum Unit: Smooth Activation Function for Deep Networks Using Smoothing Maximum Technique (2022-01-01)
- SMU: smooth activation function for deep networks using smoothing maximum technique (2021-11-08)
- Scaling-up Diverse Orthogonal Convolutional Networks by a Paraunitary Framework (2021-09-29)
- SAU: Smooth activation function using convolution with approximate identities (2021-09-27)
- ErfAct and Pserf: Non-monotonic Smooth Trainable Activation Functions (2021-09-09)