
Self-Gated Rectified Linear Unit for Performance Improvement of Deep Neural Networks


dc.contributor.author Jahan, Israt
dc.contributor.author Ahmed, Md. Faisal
dc.contributor.author Ali, Md. Osman
dc.contributor.author Jang, Yeong Min
dc.date.accessioned 2024-08-27T07:51:28Z
dc.date.available 2024-08-27T07:51:28Z
dc.date.issued 2022-01-03
dc.identifier.uri http://dspace.daffodilvarsity.edu.bd:8080/handle/123456789/13221
dc.description.abstract This technical paper proposes an activation function, the self-gated rectified linear unit (SGReLU), to achieve high classification accuracy, low loss, and low computational time. The vanishing gradient problem, dying ReLU, and noise vulnerability are also addressed by the proposed SGReLU function. SGReLU's performance is evaluated on the MNIST, Fashion-MNIST, and ImageNet datasets and compared with seven highly effective activation functions. The proposed SGReLU outperformed the other activation functions in most cases on VGG16, Inception v3, and ResNet50. On VGG16 and Inception v3, it achieved accuracies of 90.87% and 95.01%, respectively, exceeding the other functions while recording the second-fastest computing time on these networks. en_US
dc.language.iso en_US en_US
dc.publisher Elsevier en_US
dc.subject Classification en_US
dc.subject Neural networks en_US
dc.subject Linear programming en_US
dc.subject Computing en_US
dc.title Self-Gated Rectified Linear Unit for Performance Improvement of Deep Neural Networks en_US
dc.type Article en_US
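
Note: the record above does not reproduce the SGReLU definition itself. The Python sketch below is only a minimal illustration of a self-gated, ReLU-style activation in the spirit the abstract describes (rectification combined with a sigmoid self-gate); the function name, the alpha slope parameter, and the exact gating form are assumptions made for illustration and are not the authors' formulation, which is given in the published article.

import numpy as np

def self_gated_relu(x, alpha=0.01):
    # Illustrative self-gated, ReLU-style activation (not the paper's exact
    # SGReLU formula): positive inputs are passed through scaled by a sigmoid
    # gate on the input itself, while negative inputs keep a small gated slope
    # (alpha) so units do not "die" and gradients do not vanish entirely.
    gate = 1.0 / (1.0 + np.exp(-x))      # self-gate driven by the input
    return np.where(x >= 0, x * gate, alpha * x * gate)

# Example: apply the activation to a small batch of pre-activations.
z = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print(self_gated_relu(z))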

