Description
Bi-attention employs the attention-in-attention (AiA) mechanism to capture second-order statistical information: the outer point-wise channel attention vectors are computed from the output of the inner channel attention, so the attention applied to the features is itself conditioned on a first stage of attention.
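The source does not give an implementation, so the following is only a minimal NumPy sketch of the idea under stated assumptions: an inner channel attention reweights the channels, second-order statistics (a channel covariance) are taken from that inner output, and the outer point-wise channel attention vectors are derived from those statistics. The function name `aia_bi_attention`, the pooling choice, and the covariance step are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def aia_bi_attention(x):
    """Illustrative attention-in-attention (AiA) sketch.

    x: (C, N) feature map with C channels and N spatial positions.
    Returns a (C, N) output attended by outer channel-attention vectors
    that are computed from the inner channel attention's output.
    """
    # Inner channel attention: weight each channel by its pooled response
    # (global average pooling is an assumption for this sketch).
    inner = softmax(x.mean(axis=1))          # (C,) inner attention weights
    x_inner = x * inner[:, None]             # (C, N) reweighted features

    # Second-order statistics of the inner output: channel covariance.
    xc = x_inner - x_inner.mean(axis=1, keepdims=True)
    cov = xc @ xc.T / x_inner.shape[1]       # (C, C)

    # Outer point-wise channel attention: one attention vector per channel,
    # computed from the inner attention's output via its covariance.
    outer = softmax(cov, axis=1)             # (C, C), rows sum to 1
    return outer @ x                         # (C, N) attended output
```

Each row of `outer` is a point-wise channel-attention vector, so the output channel `c` is a convex combination of the input channels, with mixing weights shaped by the second-order statistics of the inner-attended features.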
Papers Using This Method
DiaMond: Dementia Diagnosis with Multi-Modal Vision Transformers Using MRI and PET (2024-10-30)
Elite360M: Efficient 360 Multi-task Learning via Bi-projection Fusion and Cross-task Collaboration (2024-08-18)
Elite360D: Towards Efficient 360 Depth Estimation via Semantic- and Distance-Aware Bi-Projection Fusion (2024-03-25)
Alzheimer's Disease Prediction via Brain Structural-Functional Deep Fusing Network (2023-09-28)
GraphCare: Enhancing Healthcare Predictions with Personalized Knowledge Graphs (2023-05-22)
Tri-Attention: Explicit Context-Aware Attention Mechanism for Natural Language Processing (2022-11-05)
Cross-Modal Transformer GAN: A Brain Structure-Function Deep Fusing Framework for Alzheimer's Disease (2022-06-20)
Compare learning: bi-attention network for few-shot learning (2022-03-25)
BiBERT: Accurate Fully Binarized BERT (2022-03-12)
Gated Convolutional Bidirectional Attention-based Model for Off-topic Spoken Response Detection (2020-04-20)
Bilinear Attention Networks for Person Retrieval (2019-10-01)