Publication:
ADLU: Adaptive double parametric activation functions

dc.contributor.author: Güney, Duman M.
dc.contributor.author: Koparal, S.
dc.contributor.author: Ömür, N.
dc.contributor.author: Ertürk, A.
dc.contributor.author: Aptoula, E.
dc.contributor.buuauthor: KOPARAL, SİBEL
dc.contributor.department: Fen Edebiyat Fakültesi
dc.contributor.department: Matematik Ana Bilim Dalı
dc.contributor.scopusid: 56437541100
dc.date.accessioned: 2025-11-28T08:02:12Z
dc.date.issued: 2026-01-01
dc.description.abstract: Activation functions are critical components of neural networks, introducing the nonlinearity needed to learn complex data relationships. While widely used functions such as ReLU and its variants have demonstrated notable success, they still suffer, to varying degrees, from limitations such as vanishing gradients, dead neurons, and limited adaptability. This paper proposes two novel differentiable double-parameter activation functions (AdLU1 and AdLU2) designed to address these challenges. They incorporate tunable parameters to optimize gradient flow and enhance adaptability. Evaluations on the benchmark datasets MNIST, FMNIST, USPS, and CIFAR-10, using ResNet-18 and ResNet-50 architectures, demonstrate that the proposed functions consistently achieve high classification accuracy. Notably, AdLU1 improves accuracy by up to 5.5% over ReLU, particularly in deeper architectures and on more complex datasets. While they introduce some computational overhead, their performance gains establish them as competitive alternatives to both traditional and modern activation functions.
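The abstract describes differentiable activations whose tunable parameters shape the negative region to preserve gradient flow and avoid dead neurons. This record does not give the AdLU1/AdLU2 formulas, so the sketch below uses a hypothetical ELU-style two-parameter function (the names `two_param_act`, `alpha`, and `beta` are illustrative assumptions, not the paper's definitions) to show how such parameters keep gradients nonzero for negative inputs:

```python
import numpy as np

def two_param_act(x, alpha=1.0, beta=1.0):
    """Hypothetical two-parameter activation (NOT the paper's AdLU).

    alpha scales the negative-side saturation level, beta controls how
    sharply the curve bends; both could be learned per layer.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * (np.exp(beta * x) - 1.0))

def two_param_act_grad(x, alpha=1.0, beta=1.0):
    """Derivative of two_param_act.

    Unlike ReLU's hard zero for x < 0, the gradient stays strictly
    positive, so units cannot "die"; with alpha * beta == 1 it is
    also continuous at x == 0.
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, 1.0, alpha * beta * np.exp(beta * x))
```

With `alpha = beta = 1` the left and right derivatives at zero both equal 1, illustrating the differentiability the abstract emphasizes; tuning the two parameters trades saturation depth against gradient magnitude in the negative region.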
dc.identifier.doi: 10.1016/j.dsp.2025.105579
dc.identifier.issn: 1051-2004
dc.identifier.scopus: 2-s2.0-105015608453
dc.identifier.uri: https://hdl.handle.net/11452/56871
dc.identifier.volume: 168
dc.indexed.scopus: Scopus
dc.language.iso: en
dc.publisher: Elsevier Inc.
dc.relation.journal: Digital Signal Processing: A Review Journal
dc.rights: info:eu-repo/semantics/closedAccess
dc.subject: ResNet-50
dc.subject: ResNet-18
dc.subject: Deep neural networks
dc.subject: AdLU
dc.subject: Activation functions
dc.title: ADLU: Adaptive double parametric activation functions
dc.type: Article
dspace.entity.type: Publication
local.contributor.department: Fen Edebiyat Fakültesi/Matematik Ana Bilim Dalı
local.indexed.at: Scopus
relation.isAuthorOfPublication: 8cb7f10d-dcea-4fcd-90bb-650fdf67a97c
relation.isAuthorOfPublication.latestForDiscovery: 8cb7f10d-dcea-4fcd-90bb-650fdf67a97c

Files