{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,12,3]],"date-time":"2025-12-03T05:25:24Z","timestamp":1764739524568,"version":"3.46.0"},"reference-count":31,"publisher":"MDPI AG","issue":"12","license":[{"start":{"date-parts":[[2025,12,1]],"date-time":"2025-12-01T00:00:00Z","timestamp":1764547200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"US National Academy of Sciences","award":["STCU-7125"],"award-info":[{"award-number":["STCU-7125"]}]},{"DOI":"10.13039\/100018227","name":"National Research Foundation of Ukraine","doi-asserted-by":"crossref","award":["2025.06\/0100"],"award-info":[{"award-number":["2025.06\/0100"]}],"id":[{"id":"10.13039\/100018227","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Computation"],"abstract":"<jats:p>Purpose: While hybrid quantum\u2013classical neural networks (HNNs) are a promising avenue for quantum advantage, the critical influence of the classical backbone architecture on their performance remains poorly understood. This study investigates the role of lightweight convolutional neural network architectures, focusing on LCNet, in determining the stability, generalization, and effectiveness of hybrid models augmented with quantum layers for medical applications. The objective is to clarify the architectural compatibility between quantum and classical components and provide guidelines for backbone selection in hybrid designs. Methods: We constructed HNNs by integrating a four-qubit quantum circuit (with trainable rotations) into scaled versions of LCNet (050, 075, 100, 150, 200). These models were rigorously evaluated on CIFAR-10 and MedMNIST using stratified 5-fold cross-validation, assessing accuracy, AUC, and robustness metrics. 
Performance was assessed with accuracy, macro- and micro-averaged area under the ROC curve (AUC), per-class accuracy, and out-of-fold (OoF) predictions to ensure unbiased generalization. In addition, training dynamics, confusion matrices, and performance stability across folds were analyzed to capture both predictive accuracy and robustness. Results: The experiments revealed a strong dependence of hybrid network performance on both backbone architecture and model scale. Across all tests, LCNet-based hybrids achieved the most consistent benefits, particularly at compact and medium configurations. From LCNet050 to LCNet100, hybrid models maintained high macro-AUC values exceeding 0.95 and delivered higher mean accuracies with lower variance across folds, confirming enhanced stability and generalization through quantum integration. On the DermaMNIST dataset, these hybrids achieved accuracy gains of up to seven percentage points and improved AUC by more than three points, demonstrating their robustness in imbalanced medical settings. However, as backbone complexity increased (LCNet150 and LCNet200), the classical architectures regained superiority, indicating that the advantages of quantum layers diminish with scale. The most consistent gains were observed at smaller and medium LCNet scales, where hybridization improved accuracy and stability across folds. This divergence indicates that hybrid networks do not necessarily follow the \u201cbigger is better\u201d paradigm of classical deep learning. Per-class analysis further showed that hybrids improved recognition in challenging categories, narrowing the gap between easy and difficult classes. Conclusions: The study demonstrates that the performance and stability of hybrid quantum\u2013classical neural networks are fundamentally determined by the characteristics of their classical backbones. 
Across extensive experiments on CIFAR-10 and DermaMNIST, LCNet-based hybrids consistently outperformed or matched their classical counterparts at smaller and medium scales, achieving higher accuracy and AUC along with notably reduced variability across folds. These improvements highlight the role of quantum layers as implicit regularizers that enhance learning stability and generalization\u2014particularly in data-limited or imbalanced medical settings. However, the observed benefits diminished with increasing backbone complexity, as larger classical models regained superiority in both accuracy and convergence reliability. This indicates that hybrid architectures do not follow the conventional \u201clarger-is-better\u201d paradigm of classical deep learning. Overall, the results establish that architectural compatibility and model scale are decisive factors for effective quantum\u2013classical integration. Lightweight backbones such as LCNet offer a robust foundation for realizing the advantages of hybridization in practical, resource-constrained medical applications, paving the way for future studies on scalable, hardware-efficient, and clinically reliable hybrid neural networks.<\/jats:p>","DOI":"10.3390\/computation13120278","type":"journal-article","created":{"date-parts":[[2025,12,2]],"date-time":"2025-12-02T15:31:17Z","timestamp":1764689477000},"page":"278","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":0,"title":["Impact of Scaling Classic Component on Performance of Hybrid Multi-Backbone Quantum\u2013Classic Neural Networks for Medical Applications"],"prefix":"10.3390","volume":"13","author":[{"ORCID":"https:\/\/orcid.org\/0009-0009-1631-8172","authenticated-orcid":false,"given":"Arsenii","family":"Khmelnytskyi","sequence":"first","affiliation":[{"name":"Faculty of Informatics and Computer Science, National Technical University of Ukraine \u201cIgor Sikorsky Kyiv Polytechnic Institute\u201d, 
03056 Kyiv, Ukraine"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2682-4668","authenticated-orcid":false,"given":"Yuri","family":"Gordienko","sequence":"additional","affiliation":[{"name":"Faculty of Informatics and Computer Science, National Technical University of Ukraine \u201cIgor Sikorsky Kyiv Polytechnic Institute\u201d, 03056 Kyiv, Ukraine"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-9395-8685","authenticated-orcid":false,"given":"Sergii","family":"Stirenko","sequence":"additional","affiliation":[{"name":"Faculty of Informatics and Computer Science, National Technical University of Ukraine \u201cIgor Sikorsky Kyiv Polytechnic Institute\u201d, 03056 Kyiv, Ukraine"}]}],"member":"1968","published-online":{"date-parts":[[2025,12,1]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"436","DOI":"10.1038\/nature14539","article-title":"Deep learning","volume":"521","author":"LeCun","year":"2015","journal-title":"Nature"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"117","DOI":"10.1088\/0034-4885\/61\/2\/002","article-title":"Quantum computing","volume":"61","author":"Steane","year":"1998","journal-title":"Rep. Prog. Phys."},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Arthur, D., and Date, P. (2022, January 18\u201323). Hybrid quantum-classical neural networks. Proceedings of the 2022 IEEE International Conference on Quantum Computing and Engineering (QCE), Broomfield, CO, USA.","DOI":"10.1109\/QCE53715.2022.00023"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"2631","DOI":"10.1038\/s41467-021-22539-9","article-title":"Power of data in quantum machine learning","volume":"12","author":"Huang","year":"2021","journal-title":"Nat. 
Commun."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"195","DOI":"10.1038\/nature23474","article-title":"Quantum machine learning","volume":"549","author":"Biamonte","year":"2017","journal-title":"Nature"},{"key":"ref_6","first-page":"100736","article-title":"Quantum machine learning: Classifications, challenges, and solutions","volume":"42","author":"Lu","year":"2024","journal-title":"J. Ind. Inf. Integr."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"255","DOI":"10.1126\/science.270.5234.255","article-title":"Quantum computation","volume":"270","author":"DiVincenzo","year":"1995","journal-title":"Science"},{"key":"ref_8","unstructured":"Arrazola, J.M., Jahangiri, S., Delgado, A., Ceroni, J., Izaac, J., Sz\u00e1va, A., Azad, U., Lang, R.A., Niu, Z., and Di Matteo, O. (2021). Differentiable quantum computational chemistry with PennyLane. arXiv."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"290311","DOI":"10.1007\/s11433-021-1734-3","article-title":"Hybrid quantum-classical convolutional neural networks","volume":"64","author":"Liu","year":"2021","journal-title":"Sci. China Phys. Mech. Astron."},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Long, C., Huang, M., Ye, X., Futamura, Y., and Sakurai, T. (2025). Hybrid quantum-classical-quantum convolutional neural networks. Sci. Rep., 15.","DOI":"10.1038\/s41598-025-13417-1"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Zhao, J. (2024, January 7\u201313). Towards an architecture description language for hybrid quantum-classical systems. Proceedings of the 2024 IEEE International Conference on Quantum Software (QSW), Shenzhen, China.","DOI":"10.1109\/QSW62656.2024.00016"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Zaman, K., Ahmed, T., Hanif, M.A., Marchisio, A., and Shafique, M. (2024, January 22\u201325). A comparative analysis of hybrid-quantum classical neural networks. 
Proceedings of the World Congress in Computer Science, Computer Engineering & Applied Computing, Las Vegas, NV, USA.","DOI":"10.1007\/978-3-031-85884-0_9"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"043001","DOI":"10.1088\/2058-9565\/ab4eb5","article-title":"Parameterized quantum circuits as machine learning models","volume":"4","author":"Benedetti","year":"2019","journal-title":"Quantum Sci. Technol."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Qazi, H.M., and Banka, A.A. (2025). Quantum-Classical Hybrid Architectures and How They Are Solving Scientific Problems\u2014A Short Review. SSRN Electron. J.","DOI":"10.2139\/ssrn.5190836"},{"key":"ref_15","unstructured":"Suchara, M., Alexeev, Y., Chong, F., Finkel, H., Hoffmann, H., Larson, J., Osborn, J., and Smith, G. (2018, January 11). Hybrid quantum-classical computing architectures. Proceedings of the 3rd International Workshop on Post-Moore Era Supercomputing, Dallas, TX, USA."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"24","DOI":"10.1016\/j.isprsjprs.2020.12.010","article-title":"Review on Convolutional Neural Networks (CNN) in vegetation remote sensing","volume":"173","author":"Kattenborn","year":"2021","journal-title":"ISPRS J. Photogramm. Remote Sens."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Chauhan, R., Ghanshala, K.K., and Joshi, R.C. (2018, January 15\u201317). Convolutional neural network (CNN) for image detection and recognition. Proceedings of the 2018 First International Conference on Secure Cyber Computing and Communication (ICSCCC), Jalandhar, India.","DOI":"10.1109\/ICSCCC.2018.8703316"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Koonce, B. (2021). ResNet 50. 
Convolutional Neural Networks with Swift for Tensorflow: Image Recognition and Dataset Categorization, Springer.","DOI":"10.1007\/978-1-4842-6168-2"},{"key":"ref_19","unstructured":"Iandola, F., Moskewicz, M., Karayev, S., Girshick, R., Darrell, T., and Keutzer, K. (2014). Densenet: Implementing efficient convnet descriptor pyramids. arXiv."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"27","DOI":"10.1007\/s43673-022-00058-z","article-title":"NISQ computing: Where are we and where do we go?","volume":"32","author":"Lau","year":"2022","journal-title":"AAPPS Bull."},{"key":"ref_21","unstructured":"Fahim, J.K., Paul, P.C., Hossain, M.R., Ahmed, M.T., and Chakraborty, D. (2025). HQCNN: A Hybrid Quantum-Classical Neural Network for Medical Image Classification. arXiv."},{"key":"ref_22","unstructured":"Khmelnytskyi, A., and Gordienko, Y. (Inf. Comput. Intell. Syst. J., 2025). Comparative Study of MobileNetV3 and LCNet-Based Hybrid Quantum-Classical Neural Networks for Image Classification, Inf. Comput. Intell. Syst. J., accepted for publication."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Yang, J., Shi, R., and Ni, B. (2021, January 13\u201316). Medmnist classification decathlon: A lightweight automl benchmark for medical image analysis. Proceedings of the 2021 IEEE 18th International Symposium on Biomedical Imaging (ISBI), Nice, France.","DOI":"10.1109\/ISBI48211.2021.9434062"},{"key":"ref_24","unstructured":"Bergholm, V., Izaac, J., Schuld, M., Gogolin, C., Ahmed, S., Ajith, V., Alam, M.S., Alonso-Linaje, G., AkashNarayanan, B., and Asadi, A. (2018). Pennylane: Automatic differentiation of hybrid quantum-classical computations. arXiv."},{"key":"ref_25","unstructured":"Krizhevsky, A., and Hinton, G. (2024, October 10). Learning Multiple Layers of Features from Tiny Images. 
Available online: https:\/\/www.cs.toronto.edu\/~kriz\/learning-features-2009-TR.pdf."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Henderson, M., Shakya, S., Pradhan, S., and Cook, T. (2019). Quanvolutional Neural Networks: Powering Image Recognition with Quantum Circuits. arXiv.","DOI":"10.1007\/s42484-020-00012-y"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Yu, H., and Zhang, L. (2020, January 6\u201312). LCNet: A light-weight network for object counting. Proceedings of the International Conference on Neural Information Processing, Vancouver, BC, Canada.","DOI":"10.1007\/978-3-030-63830-6_35"},{"key":"ref_28","unstructured":"Gordienko, Y., Trochun, Y., and Khmelnytskyi, A. (2024, October 10). Quantum-Preprocessed CIFAR-10 and MedMnist Datasets. Available online: https:\/\/www.kaggle.com\/datasets\/yoctoman\/qnn-cifar10-medmnist."},{"key":"ref_29","unstructured":"Khmelnytskyi, A., and Gordienko, Y. (2025, October 30). Example of Baseline Model Based on LCNet Used for Experiments. Available online: https:\/\/www.kaggle.com\/code\/arseniykhmelnitskiy\/cifar10-classic-q5-w4-lcnet050-mb1-batch-64-t."},{"key":"ref_30","unstructured":"Khmelnytskyi, A., and Gordienko, Y. (2025, October 30). Example of Implementation of HNN-QC5: Quantum Transformation as Data Augmentation Technique in Hybrid Neural Network Setup with Multiple Backbones and Quantum and Original Channel Inputs for Multiclass Image Classification. Available online: https:\/\/www.kaggle.com\/code\/arseniykhmelnitskiy\/cifar10-full-q5-w4-lcnet050-mb1-batch-64-trial11."},{"key":"ref_31","unstructured":"Khmelnytskyi, A., and Gordienko, Y. (2025, October 30). Example of Implementation of HNN-QC5: Quantum Transformation as Data Augmentation Technique in Hybrid Neural Network Setup with Multiple Backbones and Quantum and Original Channel Inputs for Multiclass Image Classification for Medical Application. 
Available online: https:\/\/www.kaggle.com\/code\/arseniykhmelnitskiy\/dermamnist-full-q5-w4-lcnet050-mb1-batch-64-t."}],"container-title":["Computation"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2079-3197\/13\/12\/278\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,12,3]],"date-time":"2025-12-03T05:20:25Z","timestamp":1764739225000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2079-3197\/13\/12\/278"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,12,1]]},"references-count":31,"journal-issue":{"issue":"12","published-online":{"date-parts":[[2025,12]]}},"alternative-id":["computation13120278"],"URL":"https:\/\/doi.org\/10.3390\/computation13120278","relation":{},"ISSN":["2079-3197"],"issn-type":[{"type":"electronic","value":"2079-3197"}],"subject":[],"published":{"date-parts":[[2025,12,1]]}}}