TeLU: A New Activation Function for Deep Learning
Marina Adriana Mercioni, Stefan Holban
2021-01-01 · Deep Learning
Abstract
In this paper, we propose two novel activation functions, which we call TeLU and TeLU Learnable. These proposals combine ReLU (Rectified Linear Unit), the hyperbolic tangent (tanh), and ELU (Exponential Linear Unit), without and with a learnable parameter, respectively. We show that TeLU and TeLU Learnable give better results than other popular activation functions, including ReLU, Mish, and TanhExp, on current architectures evaluated on Computer Vision datasets.
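To illustrate the idea, the following is a minimal PyTorch sketch of a self-gated activation built from the named components, assuming a composition of the form x · tanh(ELU(x)) and a trainable scalar in the ELU branch for the learnable variant; the exact formulas are those defined in the paper, and this illustration may differ from them.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TeLU(nn.Module):
    # Assumed form: self-gated activation x * tanh(ELU(x)),
    # combining the building blocks named in the abstract.
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * torch.tanh(F.elu(x))

class TeLULearnable(nn.Module):
    # Assumed learnable variant: a trainable scalar alpha scales the
    # negative (exponential) branch of ELU; its placement is illustrative.
    def __init__(self, alpha: float = 1.0):
        super().__init__()
        self.alpha = nn.Parameter(torch.tensor(float(alpha)))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        elu = torch.where(x > 0, x, self.alpha * (torch.exp(x) - 1.0))
        return x * torch.tanh(elu)

Either module can be dropped into a standard architecture in place of ReLU, e.g. nn.Sequential(nn.Linear(128, 128), TeLU()).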