A COMPARATIVE STUDY OF ACTIVATION FUNCTIONS IN DEEP LEARNING MODELS

Authors

  • Abbaz Primbetov
  • Navruz Akbarov

Keywords:

Activation Functions, Spline Activation, Convolutional Neural Networks, CIFAR-10, Image Classification, ReLU, Swish, Deep Learning

Abstract

Activation functions play a vital role in the training dynamics and generalization performance of deep learning models. This study presents a comparative analysis of ten widely used activation functions—ReLU, Sigmoid, Tanh, ELU, SELU, Softplus, Softsign, Swish, GELU, and a custom spline-based function—within a unified convolutional neural network (CNN) architecture. All models were trained and evaluated on the CIFAR-10 dataset under identical experimental settings, including fixed learning rate, batch size, number of epochs, and architecture configuration. 
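The functions compared above can be illustrated with a minimal NumPy sketch. This is only an illustration of several of the studied activations (the custom spline-based function and the full CNN training setup are not specified here, so they are omitted; all function names are ours, and the GELU uses the common tanh approximation):

```python
import numpy as np

def relu(x):
    # ReLU: max(0, x)
    return np.maximum(0.0, x)

def sigmoid(x):
    # Logistic sigmoid: 1 / (1 + exp(-x))
    return 1.0 / (1.0 + np.exp(-x))

def elu(x, alpha=1.0):
    # ELU: x for x > 0, alpha * (exp(x) - 1) otherwise
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def softplus(x):
    # Smooth approximation of ReLU: log(1 + exp(x))
    return np.log1p(np.exp(x))

def softsign(x):
    # Softsign: x / (1 + |x|)
    return x / (1.0 + np.abs(x))

def swish(x):
    # Swish: x * sigmoid(x)
    return x * sigmoid(x)

def gelu(x):
    # GELU, tanh approximation
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

# Evaluate each activation on the same sample inputs
x = np.linspace(-3.0, 3.0, 7)
for name, fn in [("ReLU", relu), ("Swish", swish), ("GELU", gelu)]:
    print(name, np.round(fn(x), 3))
```

In a unified CNN comparison like the one described, only the activation layer is swapped between runs while the architecture, learning rate, batch size, and epoch count stay fixed.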

References

Bouraya, S., & Belangour, A. (2024). A comparative analysis of activation functions in neural networks: unveiling categories. Bulletin of Electrical Engineering and Informatics, 13(5), 3301-3308.

Sharma, S., Sharma, S., & Athaiya, A. (2017). Activation functions in neural networks. Towards Data Science, 6(12), 310-316.

Dubey, S. R., Singh, S. K., & Chaudhuri, B. B. (2022). Activation functions in deep learning: A comprehensive survey and benchmark. Neurocomputing, 503, 92-108.

Rasamoelina, A. D., Adjailia, F., & Sinčák, P. (2020, January). A review of activation function for artificial neural network. In 2020 IEEE 18th world symposium on applied machine intelligence and informatics (SAMI) (pp. 281-286). IEEE.

Xu, B., Wang, N., Chen, T., & Li, M. (2015). Empirical evaluation of rectified activations in convolutional network. arXiv preprint arXiv:1505.00853.

He, K., Zhang, X., Ren, S., & Sun, J. (2015). Delving deep into rectifiers: Surpassing human-level performance on ImageNet classification. In Proceedings of the IEEE international conference on computer vision (pp. 1026-1034).

Scardapane, S., Scarpiniti, M., Comminiello, D., & Uncini, A. (2017, June). Learning activation functions from data using cubic spline interpolation. In Italian Workshop on Neural Nets (pp. 73-83). Cham: Springer International Publishing.

Bohra, P., Campos, J., Gupta, H., Aziznejad, S., & Unser, M. (2020). Learning activation functions in deep (spline) neural networks. IEEE Open Journal of Signal Processing, 1, 295-309.

Primbetov, A. (2025). Deepfake detection using a hybrid ResNeXt and LSTM architecture. Al-Farg'oniy avlodlari, 1(2), 87-94.

Primbetov, A. (2025). Deepfake technology: Threat landscape, detection techniques, and ethical governance. Al-Farg'oniy avlodlari, 1(2), 106-112.

Published

2025-10-03

How to Cite

Primbetov, A., & Akbarov, N. (2025). A comparative study of activation functions in deep learning models. Потомки Аль-Фаргани, (3), 80–84. Retrieved from https://al-fargoniy.uz/index.php/journal/article/view/893
