Papers With Code 2

A community resource for machine learning research: papers, code, benchmarks, and state-of-the-art results.


Data sourced from the PWC Archive (CC-BY-SA 4.0). Built by the community, for the community.


Set Transformer

General · Introduced 2018 · 20 papers
Source Paper: Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks (2018)

Description

Many machine learning tasks, such as multiple instance learning, 3D shape recognition, and few-shot image classification, are defined on sets of instances. Since solutions to such problems do not depend on the order of elements in the set, models used to address them should be permutation invariant. We present an attention-based neural network module, the Set Transformer, specifically designed to model interactions among elements of an input set. The model consists of an encoder and a decoder, both of which rely on attention mechanisms. To reduce computational complexity, we introduce an attention scheme inspired by inducing-point methods from the sparse Gaussian process literature, which reduces the computation time of self-attention from quadratic to linear in the number of elements in the set. We show that our model is theoretically attractive, and we evaluate it on a range of tasks, demonstrating state-of-the-art performance compared to recent methods for set-structured data.
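The key ideas above can be illustrated with a small sketch. This is not the authors' implementation: it uses single-head scaled dot-product attention and omits the residual connections, layer normalization, and feed-forward sublayers of the real Set Transformer. The names `isab` (Induced Set Attention Block) and `pma` (Pooling by Multihead Attention) follow the paper's terminology; the random "learned" parameters `I` (inducing points) and `S` (pooling seed) stand in for trained weights. The ISAB first lets the m inducing points attend to the n set elements and then lets the set attend back, so each attention map costs O(nm) rather than the O(n²) of full self-attention, and the seeded pooling makes the final output permutation invariant:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    """Single-head scaled dot-product attention (simplified)."""
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d, m, n = 8, 4, 50  # feature dim, num inducing points, set size

# Stand-ins for learned parameters (random here, trained in practice).
I = rng.normal(size=(m, d))   # inducing points
S = rng.normal(size=(1, d))   # PMA seed vector

def isab(X):
    """Induced Set Attention Block (sketch).

    Inducing points attend to the set, then the set attends to the
    result: two O(n*m) attention maps instead of one O(n^2) map.
    """
    H = attention(I, X, X)     # (m, d), summarizes the set
    return attention(X, H, H)  # (n, d), permutation equivariant

def pma(Z):
    """Pooling by Multihead Attention (sketch): a learned seed
    attends over the encoded set, yielding an order-invariant vector."""
    return attention(S, Z, Z)  # (1, d)

X = rng.normal(size=(n, d))
out1 = pma(isab(X))
out2 = pma(isab(X[rng.permutation(n)]))  # same set, shuffled order
assert np.allclose(out1, out2)           # output ignores element order
```

Shuffling the rows of `X` leaves the pooled output unchanged, which is the permutation-invariance property the description refers to; making `m` much smaller than `n` is what buys the linear (rather than quadratic) cost in the set size.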

Papers Using This Method

To Bin or not to Bin: Alternative Representations of Mass Spectra (2025-02-15)
Advances in Set Function Learning: A Survey of Techniques and Applications (2025-01-24)
Adaptive parameters identification for nonlinear dynamics using deep permutation invariant networks (2025-01-20)
Multiset Transformer: Advancing Representation Learning in Persistence Diagrams (2024-11-22)
Graph as Point Set (2024-05-05)
Associative Transformer (2023-09-22)
Pointersect: Neural Rendering with Cloud-Ray Intersection (2023-04-24)
Event Voxel Set Transformer for Spatiotemporal Representation Learning on Event Streams (2023-03-07)
Set Norm and Equivariant Skip Connections: Putting the Deep in Deep Sets (2022-06-23)
Permutation-Invariant Relational Network for Multi-person 3D Pose Estimation (2022-04-11)
Voxel Set Transformer: A Set-to-Set Approach to 3D Object Detection from Point Clouds (2022-03-19)
Automated Identification of Cell Populations in Flow Cytometry Data with Transformers (2021-08-23)
You are AllSet: A Multiset Function Framework for Hypergraph Neural Networks (2021-06-24)
SetVAE: Learning Hierarchical Composition for Generative Modeling of Set-Structured Data (2021-03-29)
Latent Variable Sequential Set Transformers For Joint Multi-Agent Motion Prediction (2021-02-19)
Gaining Insight into SARS-CoV-2 Infection and COVID-19 Severity Using Self-supervised Edge Features and Graph Neural Networks (2020-06-23)
Self-supervised edge features for improved Graph Neural Network training (2020-06-23)
Few-Shot Learning as Domain Adaptation: Algorithm and Analysis (2020-02-06)
Learning Set-equivariant Functions with SWARM Mappings (2019-06-22)
Set Transformer: A Framework for Attention-based Permutation-Invariant Neural Networks (2018-10-01)