ShuffleNet

Computer Vision · Introduced 2017 · 51 papers

Description

ShuffleNet is a convolutional neural network designed specifically for mobile devices with very limited computing power. The architecture introduces two new operations, pointwise group convolution and channel shuffle, which greatly reduce computational cost while maintaining accuracy. Pointwise group convolution cuts the cost of 1×1 convolutions by splitting channels into groups, and channel shuffle mixes information across those groups so that successive group convolutions do not become isolated from one another.
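The channel shuffle operation itself is simple: reshape the channel dimension into (groups, channels_per_group), transpose those two axes, and flatten back. A minimal NumPy sketch (function name and NCHW layout are illustrative assumptions, not code from the original paper):

```python
import numpy as np

def channel_shuffle(x, groups):
    """Shuffle channels of an NCHW tensor across `groups` groups.

    Reshape C -> (groups, C // groups), swap the two axes, and
    flatten back, so channels from different groups interleave.
    """
    n, c, h, w = x.shape
    assert c % groups == 0, "channel count must be divisible by groups"
    x = x.reshape(n, groups, c // groups, h, w)
    x = x.transpose(0, 2, 1, 3, 4)  # swap group and per-group axes
    return x.reshape(n, c, h, w)

# Example: 6 channels, 3 groups -> channels [0,1,2,3,4,5]
# are reordered to [0,2,4,1,3,5].
x = np.arange(6, dtype=np.float32).reshape(1, 6, 1, 1)
y = channel_shuffle(x, groups=3)
print(y.flatten().tolist())  # [0.0, 2.0, 4.0, 1.0, 3.0, 5.0]
```

Because the operation is a pure permutation of channels, it adds no parameters and negligible compute, which is why it fits the mobile-efficiency goal of the architecture.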

Papers Using This Method

Deploying and Evaluating Multiple Deep Learning Models on Edge Devices for Diabetic Retinopathy Detection (2025-06-14)
Comparison of Neural Models for X-ray Image Classification in COVID-19 Detection (2025-01-08)
Advancing Green AI: Efficient and Accurate Lightweight CNNs for Rice Leaf Disease Identification (2024-08-03)
Fragility, Robustness and Antifragility in Deep Learning (2023-12-15)
Generalizability of CNN Architectures for Face Morph Presentation Attack (2023-10-17)
A Non-monotonic Smooth Activation Function (2023-10-16)
Multi-Transfer Learning Techniques for Detecting Auditory Brainstem Response (2023-08-29)
PSDNet: Determination of Particle Size Distributions Using Synthetic Soil Images and Convolutional Neural Networks (2023-03-07)
Use Cases for Time-Frequency Image Representations and Deep Learning Techniques for Improved Signal Classification (2023-02-22)
QLABGrad: a Hyperparameter-Free and Convergence-Guaranteed Scheme for Deep Learning (2023-02-01)
Predicting microsatellite instability and key biomarkers in colorectal cancer from H&E-stained images: Achieving SOTA predictive performance with fewer data using Swin Transformer (2022-08-22)
Design and Analysis of Novel Bit-flip Attacks and Defense Strategies for DNNs (2022-06-24)
Structured Pruning is All You Need for Pruning CNNs at Initialization (2022-03-04)
Multimodal registration of FISH and nanoSIMS images using convolutional neural network models (2022-01-14)
ThreshNet: An Efficient DenseNet Using Threshold Mechanism to Reduce Connections (2022-01-09)
Smooth Maximum Unit: Smooth Activation Function for Deep Networks Using Smoothing Maximum Technique (2022-01-01)
SMU: smooth activation function for deep networks using smoothing maximum technique (2021-11-08)
Scaling-up Diverse Orthogonal Convolutional Networks by a Paraunitary Framework (2021-09-29)
SAU: Smooth activation function using convolution with approximate identities (2021-09-27)
ErfAct and Pserf: Non-monotonic Smooth Trainable Activation Functions (2021-09-09)