The GatedTabTransformer. An enhanced deep learning architecture for tabular modeling
Radostin Cholakov, Todor Kolev
Abstract
There is increasing interest in the application of deep learning architectures to tabular data. One of the state-of-the-art solutions is TabTransformer, which incorporates an attention mechanism to better track relationships between categorical features and then makes use of a standard MLP to output its final logits. In this paper we propose multiple modifications to the original TabTransformer that perform better on binary classification tasks across three separate datasets, with AUROC gains of more than 1%. Inspired by gated MLP (gMLP), linear projections are implemented in the MLP block, and multiple activation functions are tested. We also evaluate the importance of specific hyperparameters during training.
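As a rough illustration of the gMLP-inspired modification the abstract refers to, the PyTorch sketch below implements a gated MLP block with a spatial gating unit: the hidden channels are split in half, one half is linearly projected across the token dimension, and the result gates the other half element-wise. This is a minimal sketch of the general gMLP block design, not the paper's exact implementation; the names (GatedMLPBlock, SpatialGatingUnit) and hyperparameters (d_ffn, seq_len) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class SpatialGatingUnit(nn.Module):
    """Gates half of the channels with a linear projection
    applied across the token (sequence) dimension."""
    def __init__(self, d_ffn: int, seq_len: int):
        super().__init__()
        self.norm = nn.LayerNorm(d_ffn // 2)
        # Token-axis projection; near-zero weights and unit bias keep the
        # unit close to identity at initialization, as in the gMLP paper.
        self.proj = nn.Linear(seq_len, seq_len)
        nn.init.normal_(self.proj.weight, std=1e-6)
        nn.init.constant_(self.proj.bias, 1.0)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        u, v = x.chunk(2, dim=-1)                      # split channels in half
        v = self.norm(v)
        v = self.proj(v.transpose(1, 2)).transpose(1, 2)  # mix across tokens
        return u * v                                   # element-wise gating

class GatedMLPBlock(nn.Module):
    """One gMLP block: channel projection -> activation ->
    spatial gating -> channel projection, with a residual connection."""
    def __init__(self, d_model: int, d_ffn: int, seq_len: int,
                 act: nn.Module = nn.GELU()):
        super().__init__()
        self.norm = nn.LayerNorm(d_model)
        self.proj_in = nn.Linear(d_model, d_ffn)
        self.act = act                                 # swappable activation
        self.sgu = SpatialGatingUnit(d_ffn, seq_len)
        self.proj_out = nn.Linear(d_ffn // 2, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        residual = x
        x = self.act(self.proj_in(self.norm(x)))
        x = self.sgu(x)
        return self.proj_out(x) + residual

# Example: a batch of 32 samples, 8 feature tokens, embedding size 64.
block = GatedMLPBlock(d_model=64, d_ffn=256, seq_len=8)
out = block(torch.randn(32, 8, 64))                    # -> (32, 8, 64)
```

The `act` argument reflects the abstract's note that multiple activation functions were tested; any `nn.Module` activation (e.g., `nn.ReLU()` or `nn.LeakyReLU()`) can be passed in its place.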