Description
ADACT: LEARNING TO OPTIMIZE ACTIVATION FUNCTION CHOICE THROUGH ADAPTIVE ACTIVATION MODULES
This paper presents an approach to enhancing neural network performance through the development of an adaptive activation function, termed adaptive activation (AdAct). AdAct amalgamates several well-established and more recent activation functions into a single, learnable framework, allowing each network layer to adapt its activation dynamically. We explore the effectiveness of ReLU and its variants, including ELU, LReLU, PReLU, and RReLU, as well as more recent functions such as Swish and Mish, integrating them into the AdAct function. Employing ConvNet variants on the FMNIST, CIFAR10, SVHN, and FER datasets, our study empirically assesses each function's contribution and demonstrates AdAct's potential for optimizing neural networks, particularly in selecting suitable activation functions for diverse tasks.
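The abstract describes AdAct as a learnable combination of candidate activations. One common way to realize such a module is as a softmax-weighted mixture whose mixing logits are trained jointly with the network weights; the sketch below illustrates this idea in plain NumPy. The parameterization, function set, and names (`adact`, `CANDIDATES`) are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

# Candidate activations drawn from those named in the abstract.
def relu(x):
    return np.maximum(0.0, x)

def elu(x, a=1.0):
    return np.where(x > 0, x, a * (np.exp(x) - 1.0))

def lrelu(x, a=0.01):
    return np.where(x > 0, x, a * x)

def swish(x):
    return x / (1.0 + np.exp(-x))  # x * sigmoid(x)

def mish(x):
    return x * np.tanh(np.log1p(np.exp(x)))  # x * tanh(softplus(x))

CANDIDATES = [relu, elu, lrelu, swish, mish]

def adact(x, logits):
    """Softmax-weighted mixture of candidate activations.

    `logits` (one scalar per candidate, per layer) would be learned
    alongside the network weights, letting each layer settle on the
    activation mix that suits it best.
    """
    w = np.exp(logits - logits.max())
    w /= w.sum()
    return sum(wi * f(x) for wi, f in zip(w, CANDIDATES))

x = np.linspace(-2.0, 2.0, 5)
y = adact(x, np.zeros(len(CANDIDATES)))  # zero logits: uniform mixture
```

As the mixing logits concentrate on one candidate during training, the module collapses toward that single activation, which is one way such a framework can "select" an activation function per layer.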