Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Feedforward Network

General | Introduced 2000 | 1339 papers

Description

A Feedforward Network, or Multilayer Perceptron (MLP), is a neural network composed solely of densely connected layers. It is the classic neural network architecture of the literature: inputs x are passed through one or more layers of hidden units h to predict a target y. Activation functions are generally chosen to be non-linear so that the network can approximate flexible, non-linear functions.

Image Source: Deep Learning, Goodfellow et al
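The forward pass described above can be sketched in a few lines. This is an illustrative NumPy example, not code from the page; the layer sizes and the choice of ReLU as the non-linearity are assumptions for the sake of the demo.

```python
import numpy as np

def relu(z):
    # Non-linear activation; without it, stacked dense layers
    # collapse into a single linear map.
    return np.maximum(0.0, z)

def mlp_forward(x, weights, biases):
    """Forward pass through densely connected layers.

    x       : input vector, shape (d_in,)
    weights : list of weight matrices, one per layer
    biases  : list of bias vectors, one per layer
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)              # hidden units h
    W_out, b_out = weights[-1], biases[-1]
    return W_out @ h + b_out             # prediction y (linear output layer)

# Toy example: 3 inputs -> 4 hidden units -> 1 output
rng = np.random.default_rng(0)
weights = [rng.normal(size=(4, 3)), rng.normal(size=(1, 4))]
biases = [np.zeros(4), np.zeros(1)]
y = mlp_forward(rng.normal(size=3), weights, biases)
```

The output layer is left linear here, as is common for regression targets; a classification head would typically apply a softmax instead.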

Papers Using This Method

- Enhancing Expressivity of Quantum Neural Networks Based on the SWAP test (2025-06-20)
- DART: Differentiable Dynamic Adaptive Region Tokenizer for Vision Transformer and Mamba (2025-06-12)
- Probabilistic Variational Contrastive Learning (2025-06-11)
- scSSL-Bench: Benchmarking Self-Supervised Learning for Single-Cell Data (2025-06-10)
- Circumventing Backdoor Space via Weight Symmetry (2025-06-09)
- Weakly-supervised Localization of Manipulated Image Regions Using Multi-resolution Learned Features (2025-05-29)
- CF-DETR: Coarse-to-Fine Transformer for Real-Time Object Detection (2025-05-29)
- Hardware-Efficient Attention for Fast Decoding (2025-05-27)
- Improving Heart Rejection Detection in XPCI Images Using Synthetic Data Augmentation (2025-05-26)
- Is Attention Required for Transformer Inference? Explore Function-preserving Attention Replacement (2025-05-24)
- Hunyuan-TurboS: Advancing Large Language Models through Mamba-Transformer Synergy and Adaptive Chain-of-Thought (2025-05-21)
- SSPS: Self-Supervised Positive Sampling for Robust Self-Supervised Speaker Verification (2025-05-20)
- LiDAR MOT-DETR: A LiDAR-based Two-Stage Transformer for 3D Multiple Object Tracking (2025-05-19)
- SAINT: Attention-Based Modeling of Sub-Action Dependencies in Multi-Action Policies (2025-05-17)
- ToonifyGB: StyleGAN-based Gaussian Blendshapes for 3D Stylized Head Avatars (2025-05-15)
- Underwater object detection in sonar imagery with detection transformer and Zero-shot neural architecture search (2025-05-10)
- Attention Is Not All You Need: The Importance of Feedforward Networks in Transformer Models (2025-05-10)
- DiGIT: Multi-Dilated Gated Encoder and Central-Adjacent Region Integrated Decoder for Temporal Action Detection Transformer (2025-05-09)
- PIDiff: Image Customization for Personalized Identities with Diffusion Models (2025-05-08)
- Representation Learning via Non-Contrastive Mutual Information (2025-04-23)