{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,5,8]],"date-time":"2026-05-08T22:14:28Z","timestamp":1778278468478,"version":"3.51.4"},"reference-count":30,"publisher":"MDPI AG","issue":"5","license":[{"start":{"date-parts":[[2022,5,17]],"date-time":"2022-05-17T00:00:00Z","timestamp":1652745600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"University of Nottingham","award":["I03211200008"],"award-info":[{"award-number":["I03211200008"]}]},{"name":"University of Nottingham","award":["72071116"],"award-info":[{"award-number":["72071116"]}]},{"name":"University of Nottingham","award":["2019B10026"],"award-info":[{"award-number":["2019B10026"]}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["I03211200008"],"award-info":[{"award-number":["I03211200008"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["72071116"],"award-info":[{"award-number":["72071116"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["2019B10026"],"award-info":[{"award-number":["2019B10026"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"name":"Ningbo Municipal Bureau Science and Technology","award":["I03211200008"],"award-info":[{"award-number":["I03211200008"]}]},{"name":"Ningbo Municipal Bureau Science and Technology","award":["72071116"],"award-info":[{"award-number":["72071116"]}]},{"name":"Ningbo Municipal Bureau Science and Technology","award":["2019B10026"],"award-info":[{"award-number":["2019B10026"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Symmetry"],"abstract":"<jats:p>As a key step to endow the neural network with nonlinear factors, the activation function is crucial to the performance of the network. This paper proposes an Efficient Asymmetric Nonlinear Activation Function (EANAF) for deep neural networks. Compared with existing activation functions, the proposed EANAF requires less computational effort, and it is self-regularized, asymmetric and non-monotonic. These desired characteristics facilitate the outstanding performance of the proposed EANAF. To demonstrate the effectiveness of this function in the field of object detection, the proposed activation function is compared with several state-of-the-art activation functions on the typical backbone networks such as ResNet and DSPDarkNet. The experimental results demonstrate the superior performance of the proposed EANAF.<\/jats:p>","DOI":"10.3390\/sym14051027","type":"journal-article","created":{"date-parts":[[2022,5,17]],"date-time":"2022-05-17T08:34:29Z","timestamp":1652776469000},"page":"1027","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":21,"title":["An Efficient Asymmetric Nonlinear Activation Function for Deep Neural Networks"],"prefix":"10.3390","volume":"14","author":[{"given":"Enhui","family":"Chai","sequence":"first","affiliation":[{"name":"Baotou Teachers College, Inner Mongolia University of Science and Technology, Baotou 014030, China"}]},{"given":"Wei","family":"Yu","sequence":"additional","affiliation":[{"name":"School of Business, Ningbo University, Ningbo 315000, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0102-2581","authenticated-orcid":false,"given":"Tianxiang","family":"Cui","sequence":"additional","affiliation":[{"name":"School of Computer Science, University of Nottingham Ningbo China, Ningbo 315000, China"}]},{"given":"Jianfeng","family":"Ren","sequence":"additional","affiliation":[{"name":"School of Computer Science, University of Nottingham Ningbo China, Ningbo 315000, China"}]},{"given":"Shusheng","family":"Ding","sequence":"additional","affiliation":[{"name":"School of Business, Ningbo University, Ningbo 315000, China"}]}],"member":"1968","published-online":{"date-parts":[[2022,5,17]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"2278","DOI":"10.1109\/5.726791","article-title":"Gradient-based learning applied to document recognition","volume":"86","author":"Lecun","year":"1998","journal-title":"Proc. IEEE"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Gatys, L.A., Ecker, A.S., and Bethge, M. (2016, January 27\u201330). Image Style Transfer Using Convolutional Neural Networks. Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.265"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Cheng, J., Dong, L., and Lapata, M. (2016). Long Short-Term Memory-Networks for Machine Reading. arXiv.","DOI":"10.18653\/v1\/D16-1053"},{"key":"ref_4","unstructured":"Bishop, C.M. (1993). Neural Networks for Pattern Recognition. Advances in Computers, Clarendon Press."},{"key":"ref_5","first-page":"49","article-title":"The piecewise non-linear approximation of the sigmoid function and its implementation in FPGA","volume":"43","author":"Yukun","year":"2017","journal-title":"Appl. Electron. Technol."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"14","DOI":"10.1016\/j.neunet.2021.01.026","article-title":"A survey on modern trainable activation functions","volume":"138","author":"Apicella","year":"2021","journal-title":"Neural Netw."},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Szandaa, T. (2020). Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks. Bio-Inspired Neurocomputing, Springer.","DOI":"10.1007\/978-981-15-5495-7_11"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"48","DOI":"10.1016\/j.neunet.2022.01.001","article-title":"Discovering Parametric Activation Functions","volume":"148","author":"Bingham","year":"2022","journal-title":"Neural Netw."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (2016). Deep Residual Learning for Image Recognition. arXiv.","DOI":"10.1109\/CVPR.2016.90"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"318","DOI":"10.1109\/TPAMI.2018.2858826","article-title":"Focal Loss for Dense Object Detection","volume":"42","author":"Lin","year":"2017","journal-title":"Trans. Pattern Anal. Mach. Intell."},{"key":"ref_11","unstructured":"Bochkovskiy, A., Wang, C.Y., and Liao, H. (2020). YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Lin, T.Y., Maire, M., Belongie, S., Hays, J., Perona, P., Ramanan, D., Doll\u00e1r, P., and Zitnick, C.L. (2014, January 6\u201312). Microsoft COCO: Common Objects in Context. Proceedings of the European Conference on Computer Vision 2014, Zurich, Switzerland.","DOI":"10.1007\/978-3-319-10602-1_48"},{"key":"ref_13","first-page":"416","article-title":"Generalized Fuzzy Hyperbolic Model: A Universal Approximator","volume":"30","author":"Huaguang","year":"2004","journal-title":"J. Autom. Sin."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Chang, C.H., Zhang, E.H., and Huang, S.H. (2019, January 3\u20136). Softsign Function Hardware Implementation Using Piecewise Linear Approximation. Proceedings of the 2019 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS), Taipei, Taiwan.","DOI":"10.1109\/ISPACS48206.2019.8986274"},{"key":"ref_15","unstructured":"Krizhevsky, A., Sutskever, I., and Hinton, G.E. (2012). Imagenet classification with deep convolutional neural networks. Advances in Neural Information Processing Systems, Association for Computing Machinery."},{"key":"ref_16","unstructured":"Nair, V., and Hinton, G.E. (2010, January 21\u201324). Rectified linear units improve restricted boltzmann machines. Proceedings of the 27th International Conference on International Conference on Machine Learning, Haifa, Israel."},{"key":"ref_17","unstructured":"Howard, A.G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., and Adam, H. (2017). MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications. arXiv."},{"key":"ref_18","unstructured":"Maas, A.L., Hannun, A.Y., and Ng, A.Y. (2013, January 16\u201321). Rectifier nonlinearities improve neural network acoustic models. Proceedings of the ICML, Atlanta, GA, USA."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (2015). Delving deep into rectifiers: Surpassing human-level performance on imagenet classification. arXiv.","DOI":"10.1109\/ICCV.2015.123"},{"key":"ref_20","unstructured":"Xu, B., Wang, N., Chen, T., and Li, M. (2015). Empirical evaluation of rectified activations in convolutional network. arXiv."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"241","DOI":"10.1007\/s11760-020-01746-9","article-title":"Parametric rectified nonlinear unit (PRenu) for convolution neural networks","volume":"15","author":"Ellahyani","year":"2021","journal-title":"Signal Image Video Process."},{"key":"ref_22","unstructured":"Clevert, D., Unterthiner, T., and Hochreiter, S. (2015). Fast and accurate deep network learning by exponential linear units. arXiv."},{"key":"ref_23","unstructured":"Barron, J.T. (2017). Continuously Differentiable Exponential Linear Units. arXiv."},{"key":"ref_24","unstructured":"Klambauer, G., Unterthiner, T., Mayr, A., and Hochreiter, S. (2017). Self-Normalizing Neural Networks. arXiv."},{"key":"ref_25","unstructured":"Hendrycks, D., and Gimpel, K. (2016). Gaussian error linear units (gelus). arXiv."},{"key":"ref_26","unstructured":"Chao, Y., and Su, Z. (2019). Symmetrical Gaussian Error Linear Units (SGELUs). arXiv."},{"key":"ref_27","unstructured":"Dugas, C., Bengio, Y., Belisle, F., and Nadeau, C. (2000). Incorporating second order functional knowledge into learning algorithms. Advances in Neural Information Processing Systems 13, Proceedings of the 2000 Neural Information Processing Systems (NIPS) Conference, Denver, CO, USA, 28\u201330 November 2000, MIT Press."},{"key":"ref_28","unstructured":"Ramachandran, P., Zoph, B., and Le, Q.V. (2017). Searching for activation functions. arXiv."},{"key":"ref_29","unstructured":"Misra, D. (2020). Mish: A Self Regularized Non-Monotonic Neural Activation Function. arXiv."},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Howard, A., Sandler, M., Chu, G., Chen, L., Chen, B., Tan, M., Wang, W., Zhu, Y., Pang, R., and Vasudevan, V. (2019). Searching for MobileNetV3. arXiv.","DOI":"10.1109\/ICCV.2019.00140"}],"container-title":["Symmetry"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2073-8994\/14\/5\/1027\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T23:13:35Z","timestamp":1760138015000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2073-8994\/14\/5\/1027"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,5,17]]},"references-count":30,"journal-issue":{"issue":"5","published-online":{"date-parts":[[2022,5]]}},"alternative-id":["sym14051027"],"URL":"https:\/\/doi.org\/10.3390\/sym14051027","relation":{},"ISSN":["2073-8994"],"issn-type":[{"value":"2073-8994","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,5,17]]}}}