Scientific journal publication

Sb-PiPLU: A Novel Parametric Activation Function for Deep Learning

Mondal, Ayan; Shrivastava, Vimal K.; Chatterjee, Ayan; Ramachandra, Raghavendra

Publication details

Journal: IEEE Access, vol. 13, pp. 72087–72103, 2025

DOI: doi.org/10.1109/ACCESS.2025.3561464

Abstract:
The choice of activation function, particularly a non-linear one, plays a vital role in enhancing the classification performance of deep neural networks. In recent years, a variety of non-linear activation functions have been proposed. However, many of these suffer from drawbacks that limit the effectiveness of deep learning models. Common issues include the dying neuron problem, bias shift, gradient explosion, and vanishing gradients. To address these challenges, we introduce a new activation function: the Softsign-based Piecewise Parametric Linear Unit (Sb-PiPLU). This function offers improved non-linear approximation capabilities for neural networks. Its piecewise, parametric design allows for greater adaptability and flexibility, which in turn enhances overall model performance. We evaluated Sb-PiPLU through a series of image classification experiments across various Convolutional Neural Network (CNN) architectures. Additionally, we assessed its memory usage and computational cost, demonstrating that Sb-PiPLU is both stable and efficient in practical applications. Our experimental results show that Sb-PiPLU consistently outperforms conventional activation functions in both classification accuracy and computational efficiency. It achieved higher accuracy on multiple benchmark datasets, including CIFAR-10, CINIC-10, MWD, Brain Tumor, and SVHN, surpassing widely used functions such as ReLU and Tanh. Due to its flexibility and robustness, Sb-PiPLU is particularly well-suited for complex image classification tasks.
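The abstract names the ingredients of Sb-PiPLU, a Softsign-shaped non-linearity combined with a piecewise, parametric (learnable) linear form, but does not state the exact formula. The following is a minimal illustrative sketch in PyTorch of what such a Softsign-based piecewise parametric unit could look like; the class name SbPiPLUSketch and the learnable parameters alpha and beta are hypothetical placeholders, not the paper's definition, which should be taken from the cited DOI.

import torch
import torch.nn as nn

class SbPiPLUSketch(nn.Module):
    # Hypothetical sketch only: combines the ingredients named in the
    # abstract (a Softsign-shaped branch and a learnable linear branch)
    # in a piecewise form. Not the published Sb-PiPLU equation.

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        super().__init__()
        # Assumed learnable slope for the positive (linear) piece.
        self.alpha = nn.Parameter(torch.tensor(alpha))
        # Assumed learnable scale for the Softsign-shaped negative piece.
        self.beta = nn.Parameter(torch.tensor(beta))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Positive inputs: parametric linear unit.
        pos = self.alpha * x
        # Negative inputs: bounded Softsign curve x / (1 + |x|),
        # scaled so the negative saturation level is learnable.
        neg = self.beta * x / (1.0 + torch.abs(x))
        return torch.where(x >= 0, pos, neg)

For example, SbPiPLUSketch()(torch.linspace(-3.0, 3.0, steps=7)) returns values that grow linearly for positive inputs while remaining bounded for negative ones, which illustrates how a piecewise parametric design can avoid dying neurons (non-zero negative gradients) while keeping the positive path gradient stable.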