Papers With Code

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Learning Implicitly Recurrent CNNs Through Parameter Sharing

Pedro Savarese, Michael Maire

2019-02-26 · ICLR 2019 · Image Classification · Neural Architecture Search

Paper · PDF · Code (official)

Abstract

We introduce a parameter sharing scheme, in which different layers of a convolutional neural network (CNN) are defined by a learned linear combination of parameter tensors from a global bank of templates. Restricting the number of templates yields a flexible hybridization of traditional CNNs and recurrent networks. Compared to traditional CNNs, we demonstrate substantial parameter savings on standard image classification tasks, while maintaining accuracy. Our simple parameter sharing scheme, though defined via soft weights, in practice often yields trained networks with near strict recurrent structure; with negligible side effects, they convert into networks with actual loops. Training these networks thus implicitly involves discovery of suitable recurrent architectures. Though considering only the design aspect of recurrent links, our trained networks achieve accuracy competitive with those built using state-of-the-art neural architecture search (NAS) procedures. Our hybridization of recurrent and convolutional networks may also represent a beneficial architectural bias. Specifically, on synthetic tasks which are algorithmic in nature, our hybrid networks both train faster and extrapolate better to test examples outside the span of the training set.
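The core idea in the abstract — each layer's weights as a learned linear combination of tensors from a global template bank — can be sketched in a few lines. This is a minimal NumPy illustration of the mechanism, not the paper's implementation; the class and function names (`TemplateBank`, `build_layer_weight`) and the softmax normalization of the mixing coefficients are assumptions for the sketch:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D coefficient vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

class TemplateBank:
    """Global bank of k parameter templates shared across all layers."""
    def __init__(self, k, shape, rng):
        self.templates = rng.standard_normal((k, *shape))

    def build_layer_weight(self, coeffs):
        # Each layer owns only a small coefficient vector; its weight
        # tensor is a soft (convex) combination of the shared templates.
        alpha = softmax(coeffs)
        return np.tensordot(alpha, self.templates, axes=1)

rng = np.random.default_rng(0)
bank = TemplateBank(k=4, shape=(3, 3), rng=rng)

# Two layers drawing from the same bank with different learned coefficients.
layer1_w = bank.build_layer_weight(np.array([0.1, -0.2, 0.3, 0.0]))
layer2_w = bank.build_layer_weight(np.array([2.0, 0.0, 0.0, 0.0]))
```

When training drives a layer's coefficients toward one-hot vectors, several layers end up reusing (almost) the same template, which is the "near strict recurrent structure" the abstract describes: those layers can then be folded into an actual loop over one shared weight tensor.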

Results

Task | Dataset | Metric | Value | Model
Neural Architecture Search | CIFAR-10 Image Classification | Params | 33.5 | Soft Parameter Sharing
Neural Architecture Search | CIFAR-10 Image Classification | Percentage error | 2.53 | Soft Parameter Sharing
Neural Architecture Search | CIFAR-10 | Search Time (GPU days) | 0.7 | Soft Parameter Sharing
Image Classification | CIFAR-10 | Percentage correct | 97.47 | Shared WRN
Image Classification | CIFAR-100 | Percentage correct | 82.57 | Shared WRN
AutoML | CIFAR-10 Image Classification | Params | 33.5 | Soft Parameter Sharing
AutoML | CIFAR-10 Image Classification | Percentage error | 2.53 | Soft Parameter Sharing
AutoML | CIFAR-10 | Search Time (GPU days) | 0.7 | Soft Parameter Sharing

Related Papers

- Automatic Classification and Segmentation of Tunnel Cracks Based on Deep Learning and Visual Explanations (2025-07-18)
- Adversarial attacks to image classification systems using evolutionary algorithms (2025-07-17)
- Efficient Adaptation of Pre-trained Vision Transformer underpinned by Approximately Orthogonal Fine-Tuning Strategy (2025-07-17)
- Federated Learning for Commercial Image Sources (2025-07-17)
- MUPAX: Multidimensional Problem Agnostic eXplainable AI (2025-07-17)
- DASViT: Differentiable Architecture Search for Vision Transformer (2025-07-17)
- Hashed Watermark as a Filter: Defeating Forging and Overwriting Attacks in Weight-based Neural Network Watermarking (2025-07-15)
- Transferring Styles for Reduced Texture Bias and Improved Robustness in Semantic Segmentation Networks (2025-07-14)