Papers With Code

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


FcaNet

Frequency channel attention networks

General · Introduced 2020 · 2 papers
Source Paper

Description

FcaNet contains a novel multi-spectral channel attention module. Given an input feature map $X \in \mathbb{R}^{C \times H \times W}$, multi-spectral channel attention first splits $X$ into many parts $x^{i} \in \mathbb{R}^{C' \times H \times W}$. It then applies a 2D DCT to each part $x^{i}$. Note that the 2D DCT can reuse pre-processing results to reduce computation. After processing each part, all results are concatenated into a vector. Finally, fully connected layers, ReLU activation, and a sigmoid are used to obtain the attention vector, as in an SE block. This can be formulated as:

$$s = F_\text{fca}(X, \theta) = \sigma (W_{2} \, \delta (W_{1}[\text{DCT}(\text{Group}(X))]))$$

$$Y = s X$$

where $\text{Group}(\cdot)$ indicates dividing the input into many groups and $\text{DCT}(\cdot)$ is the 2D discrete cosine transform.
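The pipeline above (group, per-group 2D DCT, concatenate, SE-style excitation) can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the frequency pairs `freqs` and the weight matrices `W1`, `W2` are hypothetical inputs, and the DCT-II basis is computed directly from its cosine definition rather than from a library transform.

```python
import numpy as np

def dct2_basis(u, v, H, W):
    # 2D DCT-II basis for frequency (u, v) over an H x W spatial grid
    h = np.arange(H)[:, None]
    w = np.arange(W)[None, :]
    return np.cos(np.pi * u * (h + 0.5) / H) * np.cos(np.pi * v * (w + 0.5) / W)

def multi_spectral_attention(X, freqs, W1, W2):
    """X: (C, H, W) feature map; freqs: one (u, v) frequency per group;
    W1, W2: weights of the two fully connected layers (hypothetical shapes)."""
    C, H, W = X.shape
    n = len(freqs)
    parts = X.reshape(n, C // n, H, W)          # Group(X): n parts of C' channels
    feats = []
    for part, (u, v) in zip(parts, freqs):
        basis = dct2_basis(u, v, H, W)
        feats.append((part * basis).sum(axis=(1, 2)))  # one DCT coefficient per channel
    z = np.concatenate(feats)                   # (C,) multi-spectral vector
    # SE-style excitation: FC -> ReLU -> FC -> sigmoid
    s = 1.0 / (1.0 + np.exp(-(W2 @ np.maximum(W1 @ z, 0.0))))
    return s[:, None, None] * X                 # Y = s * X, channel-wise rescaling
```

With frequency $(0, 0)$ a group reduces to (unnormalized) global average pooling, which is how the multi-spectral module generalizes the SE block.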

Based on information compression and the discrete cosine transform, this work achieves excellent performance on the classification task.

Papers Using This Method

- OrthoNets: Orthogonal Channel Attention Networks (2023-11-06)
- FcaNet: Frequency Channel Attention Networks (2020-12-22)