Description
NAS-FPN is a Feature Pyramid Network discovered via Neural Architecture Search in a novel scalable search space covering all cross-scale connections. The discovered architecture consists of a combination of top-down and bottom-up connections that fuse features across scales.
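A cross-scale connection merges two feature maps at different resolutions by resizing them to a common target scale and combining them with a binary operation (NAS-FPN's search space uses sum and a global-pooling attention op). The sketch below is illustrative only, assuming NumPy arrays in (C, H, W) layout and nearest-neighbor resizing; the function names are hypothetical, and the gating in `global_pool_merge` is a simplified stand-in for the paper's learned attention op.

```python
import numpy as np

def resize_nearest(x, out_h, out_w):
    # Nearest-neighbor resize of a (C, H, W) feature map to (C, out_h, out_w).
    c, h, w = x.shape
    rows = np.arange(out_h) * h // out_h
    cols = np.arange(out_w) * w // out_w
    return x[:, rows][:, :, cols]

def sum_merge(a, b, out_h, out_w):
    # "Sum" binary op: bring both inputs to the target resolution, then add.
    return resize_nearest(a, out_h, out_w) + resize_nearest(b, out_h, out_w)

def global_pool_merge(a, b, out_h, out_w):
    # Simplified "global pooling" binary op: gate one input by a
    # sigmoid of the other's per-channel global average, then add.
    # (The actual NAS-FPN op also involves learned convolutions.)
    a = resize_nearest(a, out_h, out_w)
    b = resize_nearest(b, out_h, out_w)
    gate = 1.0 / (1.0 + np.exp(-a.mean(axis=(1, 2), keepdims=True)))
    return a + gate * b
```

A merging cell picks two feature levels, one of the two binary ops, and a target resolution; stacking many such cells yields the discovered pyramid.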
Papers Using This Method
- SaRPFF: A Self-Attention with Register-based Pyramid Feature Fusion module for enhanced RLD detection (2024-02-26)
- SegNeXt: Rethinking Convolutional Attention Design for Semantic Segmentation (2022-09-18)
- OPANAS: One-Shot Path Aggregation Network Architecture Search for Object Detection (2021-03-08)
- Simple Copy-Paste is a Strong Data Augmentation Method for Instance Segmentation (2020-12-13)
- Rethinking Pre-training and Self-training (2020-06-11)
- SP-NAS: Serial-to-Parallel Backbone Search for Object Detection (2020-06-01)
- SpineNet: Learning Scale-Permuted Backbone for Recognition and Localization (2019-12-10)
- MnasFPN: Learning Latency-aware Pyramid Architecture for Object Detection on Mobile Devices (2019-12-02)
- Learning Data Augmentation Strategies for Object Detection (2019-06-26)
- NAS-FPN: Learning Scalable Feature Pyramid Architecture for Object Detection (2019-04-16)
- ShapeMask: Learning to Segment Novel Objects by Refining Shape Priors (2019-04-05)