{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,23]],"date-time":"2026-03-23T19:19:43Z","timestamp":1774293583194,"version":"3.50.1"},"reference-count":25,"publisher":"Springer Science and Business Media LLC","issue":"2","license":[{"start":{"date-parts":[[2025,3,18]],"date-time":"2025-03-18T00:00:00Z","timestamp":1742256000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2025,3,18]],"date-time":"2025-03-18T00:00:00Z","timestamp":1742256000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Neural Process Lett"],"abstract":"<jats:title>Abstract<\/jats:title>\n          <jats:p>Convolutional Neural Networks (CNNs) experience performance and training efficiency changes according to the selection of correct hyperparameters. The research presents WACSO which combines Crow Search Optimization with Grey Wolf Optimizer to improve Convolutional Neural Networks hyperparameter selection through a hybrid metaheuristic algorithm. The hybrid algorithm WACSO uses exploration parts from CSO together with GWO exploitation mechanics to obtain optimized performance. WACSO reaches higher classification accuracy than traditional optimization algorithms when performing tests on the MNIST and CIFAR-10 datasets along with Random Search and particle swarm optimization and genetic algorithms and standalone CSO and standalone GWO. The best classification results reached 98.9% accuracy levels on MNIST along with 91.5% accuracy levels on CIFAR-10. The final outcomes of this system depend on the combination of model structure along with dataset challenges and available computational power. 
The study demonstrates that hybridizing nature-inspired algorithms can yield effective CNN hyperparameter optimization. The promising results of WACSO are nevertheless subject to computational cost, sensitivity to parameter settings, and the generalizability of results across datasets and network configurations. Future work on WACSO should include longer evaluations across multiple datasets and model architectures to confirm its broader applicability.<\/jats:p>","DOI":"10.1007\/s11063-025-11740-2","type":"journal-article","created":{"date-parts":[[2025,3,18]],"date-time":"2025-03-18T11:07:47Z","timestamp":1742296067000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":5,"title":["WACSO: Wolf Crow Search Optimizer for Convolutional Neural Network Hyperparameter Optimization"],"prefix":"10.1007","volume":"57","author":[{"given":"Rahul Rajendra","family":"Papalkar","sequence":"first","affiliation":[]},{"given":"Jayendra","family":"Jadhav","sequence":"additional","affiliation":[]},{"given":"Tareek","family":"Pattewar","sequence":"additional","affiliation":[]},{"given":"Vivek","family":"Thorat","sequence":"additional","affiliation":[]},{"given":"Pallavi","family":"Morey","sequence":"additional","affiliation":[]},{"given":"Mayur","family":"Deshmukh","sequence":"additional","affiliation":[]},{"given":"Rajkumar","family":"Jagdale","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2025,3,18]]},"reference":[{"issue":"8","key":"11740_CR1","first-page":"3512","volume":"29","author":"RJ Williams","year":"2018","unstructured":"Williams RJ (2018) A classification of optimization techniques in deep learning. 
IEEE Trans Neural Netw Learn Syst 29(8):3512\u20133523","journal-title":"IEEE Trans Neural Netw Learn Syst"},{"issue":"3","key":"11740_CR2","first-page":"543","volume":"269","author":"Y Nesterov","year":"1983","unstructured":"Nesterov Y (1983) A method of solving the convex programming problem with convergence rate O(1\/k^2). Dokl Math 269(3):543\u2013547","journal-title":"Dokl Math"},{"key":"11740_CR3","unstructured":"D. P. Kingma and J. Ba, (2015) \u201cAdam: A method for stochastic optimization,\u201d International conference on learning representations (ICLR),"},{"key":"11740_CR4","unstructured":"D. P. Kingma, (2014) \u201cAdam: A method for stochastic optimization,\u201d arXiv:1412.6980,"},{"key":"11740_CR5","first-page":"1929","volume":"15","author":"N Srivastava","year":"2014","unstructured":"Srivastava N et al (2014) Dropout: a simple way to prevent neural networks from overfitting. J Mach Learn Res 15:1929\u20131958","journal-title":"J Mach Learn Res"},{"key":"11740_CR6","unstructured":"D. P. Kingma and M. Welling, (2014) \u201cAuto-Encoding Variational Bayes,\u201d International conference on learning representations (ICLR)"},{"key":"11740_CR7","unstructured":"S. Ioffe and C. Szegedy, 2015 \u201cBatch normalization: Accelerating deep network training by reducing internal covariate shift,\u201d Proceedings of the 32nd international conference on machine learning (ICML)"},{"key":"11740_CR8","doi-asserted-by":"crossref","unstructured":"K. He et al., 2016 \u201cDeep residual learning for image recognition,\u201d IEEE conference on computer vision and pattern recognition (CVPR),","DOI":"10.1109\/CVPR.2016.90"},{"key":"11740_CR9","unstructured":"M. Tan and Q. V. 
Le, 2019 \u201cEfficientNet: Rethinking model scaling for convolutional neural networks,\u201d International conference on machine learning (ICML)"},{"key":"11740_CR10","volume-title":"Adaptation in natural and artificial systems","author":"J Holland","year":"1975","unstructured":"Holland J (1975) Adaptation in natural and artificial systems. University of Michigan Press, Ann Arbor"},{"issue":"6","key":"11740_CR11","doi-asserted-by":"publisher","first-page":"539","DOI":"10.1002\/int.4550080406","volume":"8","author":"X Yao","year":"1993","unstructured":"Yao X (1993) Evolutionary artificial neural networks. Int J Intell Syst 8(6):539\u2013551","journal-title":"Int J Intell Syst"},{"key":"11740_CR12","doi-asserted-by":"crossref","unstructured":"J Kennedy R Eberhart 1995 Particle swarm optimization Proceedings of the IEEE International conference on neural networks 4: pp 1942-1948","DOI":"10.1109\/ICNN.1995.488968"},{"key":"11740_CR13","first-page":"1","volume":"28","author":"AS Al-Ani","year":"2019","unstructured":"Al-Ani AS, Sulaiman HAN (2019) Optimization of deep learning models using PSO. J Comput Sci Technol 28:1\u201310","journal-title":"J Comput Sci Technol"},{"issue":"1","key":"11740_CR14","doi-asserted-by":"publisher","first-page":"53","DOI":"10.1109\/4235.585892","volume":"1","author":"M Dorigo","year":"1997","unstructured":"Dorigo M, Gambardella LM (1997) Ant colony system: a cooperative learning approach to the traveling salesman problem. IEEE Trans Evol Comput 1(1):53\u201366","journal-title":"IEEE Trans Evol Comput"},{"issue":"2","key":"11740_CR15","first-page":"1063","volume":"66","author":"KV Shyamala","year":"2021","unstructured":"Shyamala KV, Baskar R (2021) Ant colony optimization based feature selection for deep learning in classification tasks. 
Comput, Mater Contin 66(2):1063\u20131079","journal-title":"Comput, Mater Contin"},{"issue":"6","key":"11740_CR16","doi-asserted-by":"crossref","first-page":"1629","DOI":"10.1007\/s00500-016-2454-x","volume":"22","author":"X Zhang","year":"2018","unstructured":"Zhang X et al (2018) Hybrid genetic algorithm and particle swarm optimization for deep neural network optimization. Soft Comput 22(6):1629\u20131640","journal-title":"Soft Comput"},{"issue":"4","key":"11740_CR17","first-page":"1045","volume":"38","author":"L Li","year":"2020","unstructured":"Li L, Liu Y (2020) Hybridizing genetic algorithm and ant colony optimization for CNN optimization. Comput Intell 38(4):1045\u20131058","journal-title":"Comput Intell"},{"issue":"1","key":"11740_CR18","doi-asserted-by":"publisher","first-page":"3061","DOI":"10.1149\/10701.3061ecst","volume":"107","author":"RR Papalkar","year":"2022","unstructured":"Papalkar RR, Alvi AS (2022) Analysis of defense techniques for DDos attacks in IoT\u2013A review. ECS Trans 107(1):3061","journal-title":"ECS Trans"},{"key":"11740_CR19","volume-title":"Review of unknown attack detection with deep learning techniques In Artificial Intelligence Blockchain Computing and Security","author":"RR Papalkar","year":"2023","unstructured":"Papalkar RR, Alvi AS (2023) Review of unknown attack detection with deep learning techniques In Artificial Intelligence Blockchain Computing and Security. CRC Press, Boca Raton"},{"key":"11740_CR20","volume-title":"An optimized feature selection guided light-weight machine learning models for DDoS attacks detection in cloud computing In Artificial Intelligence Blockchain Computing and Security","author":"RR Papalkar","year":"2023","unstructured":"Papalkar RR, Alvi AS, Ali S, Awasthy M, Kanse R (2023) An optimized feature selection guided light-weight machine learning models for DDoS attacks detection in cloud computing In Artificial Intelligence Blockchain Computing and Security. 
CRC Press, Boca Raton"},{"issue":"1","key":"11740_CR21","first-page":"24","volume":"2","author":"MA Pund","year":"2011","unstructured":"Pund MA, Jadhao SR, Thakare PD (2011) A role of query optimization in relational database. Int J Sci Eng Res 2(1):24\u201333","journal-title":"Int J Sci Eng Res"},{"issue":"6","key":"11740_CR22","first-page":"10","volume":"11","author":"RR Papalkar","year":"2024","unstructured":"Papalkar RR, Alvi ASA (2024) Hybrid CNN approach for unknown attack detection in edge-based IoT networks. EAI Endorsed Scal Inf Syst 11(6):10","journal-title":"EAI Endorsed Scal Inf Syst"},{"key":"11740_CR23","doi-asserted-by":"publisher","first-page":"2431","DOI":"10.1007\/s00500-023-08449-6","volume":"28","author":"LK Singh","year":"2024","unstructured":"Singh LK, Khanna M, Garg H et al (2024) Emperor penguin optimization algorithm- and bacterial foraging optimization algorithm-based novel feature selection approach for glaucoma classification from fundus images. Soft Comput 28:2431\u20132467. https:\/\/doi.org\/10.1007\/s00500-023-08449-6","journal-title":"Soft Comput"},{"key":"11740_CR24","doi-asserted-by":"publisher","first-page":"122349","DOI":"10.1016\/j.eswa.2023.122349","volume":"239","author":"Z Yang","year":"2024","unstructured":"Yang Z (2024) Competing leaders grey wolf optimizer and its application for training multi-layer perceptron classifier. Expert Syst Appl 239:122349. https:\/\/doi.org\/10.1016\/j.eswa.2023.122349","journal-title":"Expert Syst Appl"},{"key":"11740_CR25","doi-asserted-by":"publisher","first-page":"2525","DOI":"10.1007\/s10586-023-04099-3","volume":"27","author":"AN Malti","year":"2024","unstructured":"Malti AN, Hakem M, Benmammar B (2024) A new hybrid multi-objective optimization algorithm for task scheduling in cloud systems. Cluster Comput 27:2525\u20132548. 
https:\/\/doi.org\/10.1007\/s10586-023-04099-3","journal-title":"Cluster Comput"}],"container-title":["Neural Processing Letters"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11063-025-11740-2.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s11063-025-11740-2\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11063-025-11740-2.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,4,23]],"date-time":"2025-04-23T16:58:15Z","timestamp":1745427495000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s11063-025-11740-2"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,3,18]]},"references-count":25,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2025,4]]}},"alternative-id":["11740"],"URL":"https:\/\/doi.org\/10.1007\/s11063-025-11740-2","relation":{},"ISSN":["1573-773X"],"issn-type":[{"value":"1573-773X","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,3,18]]},"assertion":[{"value":"15 February 2025","order":1,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"18 March 2025","order":2,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare no competing interests.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interests"}},{"value":"\u201cThe present study does not engage live people or animals as subjects; rather, it hinges on computational processes and accessible 
data.\u201d","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Human or Animal Rights"}},{"value":"Not applicable","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Informed Consent"}}],"article-number":"31"}}