{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,2,21]],"date-time":"2025-02-21T10:58:37Z","timestamp":1740135517319,"version":"3.37.3"},"reference-count":42,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2022,7,30]],"date-time":"2022-07-30T00:00:00Z","timestamp":1659139200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,7,30]],"date-time":"2022-07-30T00:00:00Z","timestamp":1659139200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100004735","name":"Natural Science Foundation of\u00a0Hunan Province","doi-asserted-by":"publisher","award":["2022JJ30625"],"award-info":[{"award-number":["2022JJ30625"]}],"id":[{"id":"10.13039\/501100004735","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["BMC Med Imaging"],"published-print":{"date-parts":[[2022,12]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:sec><jats:title>Background<\/jats:title><jats:p>Corona Virus Disease 2019 (COVID-19) first appeared in December 2019 and spread rapidly around the world. COVID-19 is a pneumonia caused by infection with the novel coronavirus, and it is highly infectious and transmissible. By 7 May 2021, the cumulative number of deaths had reached 3,259,033. To diagnose infected patients in time and prevent the spread of the virus, an effective diagnosis method for COVID-19 is extremely important. 
To solve this problem, this paper introduces a Multi-Level Enhanced Sensation (MLES) module and proposes a new convolutional neural network model, MLES-Net, based on this module.<\/jats:p><\/jats:sec><jats:sec><jats:title>Methods<\/jats:title><jats:p>Attention automatically focuses on the key points in the input, and it can be computed in parallel, so it can replace recurrent neural networks to a certain extent and improve the efficiency of the model. We used the correlation between global and local features to generate the attention mask. First, the feature map was divided into multiple groups, and the initial attention mask was obtained as the dot product of each feature group and the globally pooled feature. The attention masks were then normalized; each group had two scaling and shifting parameters so that the normalization operation could be reversed. Finally, the final attention mask was obtained through the sigmoid function and used to scale the feature at each location in the original feature group. Meanwhile, we used different classifiers on network models with different numbers of layers.<\/jats:p><\/jats:sec><jats:sec><jats:title>Results<\/jats:title><jats:p>The network uses three classifiers to improve recognition efficiency: the FC module (fully connected layer), the GAP module (global average pooling layer) and the GAPFC module (global average pooling layer followed by a fully connected layer). Comparing the number of parameters, the amount of computation and the detection accuracy, GAPFC achieves the best overall result as a classifier. 
The experimental results show that MLES-Net56-GAPFC achieves the best overall accuracy (95.27%) and the best recognition rate for the COVID-19 category (100%).<\/jats:p><\/jats:sec><jats:sec><jats:title>Conclusions<\/jats:title><jats:p>MLES-Net56-GAPFC classifies COVID-19 X-Ray images well despite their characteristic high similarity between categories and low intra-category variability. Considering accuracy, the number of network model parameters and the amount of computation, we believe that the MLES-Net56-GAPFC network model is the most practical.<\/jats:p><\/jats:sec>","DOI":"10.1186\/s12880-022-00861-y","type":"journal-article","created":{"date-parts":[[2022,7,30]],"date-time":"2022-07-30T08:06:51Z","timestamp":1659168411000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":3,"title":["Detecting COVID-19 patients via MLES-Net deep learning models from X-Ray images"],"prefix":"10.1186","volume":"22","author":[{"given":"Wei","family":"Wang","sequence":"first","affiliation":[]},{"given":"Yongbin","family":"Jiang","sequence":"additional","affiliation":[]},{"given":"Xin","family":"Wang","sequence":"additional","affiliation":[]},{"given":"Peng","family":"Zhang","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5990-8396","authenticated-orcid":false,"given":"Ji","family":"Li","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,7,30]]},"reference":[{"key":"861_CR1","doi-asserted-by":"publisher","first-page":"A1","DOI":"10.1016\/j.physio.2020.03.003","volume":"107","author":"B Mdlma","year":"2020","unstructured":"Mdlma B, Lg C, Apm D, et al. Early reflection on the global impact of COVID19, and implications for physiotherapy. Physiotherapy. 
2020;107:A1\u20133.","journal-title":"Physiotherapy"},{"issue":"4","key":"861_CR2","first-page":"040901","volume":"58","author":"W Wang","year":"2019","unstructured":"Wang W, Yang Y, Wang X, et al. Development of convolutional neural network and its application in image classification: a survey. Opt Eng. 2019;58(4):040901.","journal-title":"Opt Eng"},{"issue":"11","key":"861_CR3","doi-asserted-by":"publisher","first-page":"2278","DOI":"10.1109\/5.726791","volume":"86","author":"Y Lecun","year":"1998","unstructured":"Lecun Y, Bottou L. Gradient-based learning applied to document recognition. Proc IEEE. 1998;86(11):2278\u2013324.","journal-title":"Proc IEEE"},{"key":"861_CR4","first-page":"1097","volume":"25","author":"A Krizhevsky","year":"2012","unstructured":"Krizhevsky A, Sutskever I, Hinton GE. ImageNet Classification with Deep Convolutional Neural Networks. Adv Neural Inf Process Syst. 2012;25:1097\u2013105.","journal-title":"Adv Neural Inf Process Syst"},{"key":"861_CR5","unstructured":"Simonyan K, Zisserman A. Very Deep Convolutional Networks for Large-Scale Image Recognition. Computer science. arXiv: 1409.1556 [Preprint]. 2014"},{"key":"861_CR6","doi-asserted-by":"crossref","unstructured":"He K, Zhang X, Ren S, et al. Identity mappings in deep residual networks. European conference on computer vision. Springer, Cham. arXiv: 1603.05027 [Preprint] 2016.","DOI":"10.1007\/978-3-319-46493-0_38"},{"key":"861_CR7","unstructured":"Bahdanau D, Cho K, Bengio Y, et al. Neural machine translation by jointly learning to align and translate. Computer Science. 2014, arXiv: 1409.0473 [Preprint]."},{"key":"861_CR8","doi-asserted-by":"crossref","unstructured":"Huang G, Liu Z, Laurens V, et al. Densely connected convolutional networks. Proceedings of the IEEE conference on computer vision and pattern recognition. 2017, 4700\u20134708.","DOI":"10.1109\/CVPR.2017.243"},{"key":"861_CR9","doi-asserted-by":"crossref","unstructured":"Chen L, Zhang H, Xiao J, et al. 
SCA-CNN: Spatial and channel-wise attention in convolutional networks for image captioning. Proceedings of the IEEE conference on computer vision and pattern recognition. 2017, 5659\u20135667.","DOI":"10.1109\/CVPR.2017.667"},{"key":"861_CR10","doi-asserted-by":"crossref","unstructured":"Szegedy, Liu W, Jia Y, et al. Going deeper with convolutions. Proceedings of the IEEE conference on computer vision and pattern recognition. 2015, 1\u20139.","DOI":"10.1109\/CVPR.2015.7298594"},{"key":"861_CR11","unstructured":"Ioffe S, Szegedy C. Batch normalization: accelerating deep network training by reducing internal covariate shift. International conference on machine learning. 2015, pp. 448\u2013456."},{"key":"861_CR12","doi-asserted-by":"crossref","unstructured":"Howard A, Sandler M, Chen B, et al. Searching for MobileNetV3. Proceedings of the IEEE\/CVF international conference on computer vision. 2019, pp. 1314\u20131324.","DOI":"10.1109\/ICCV.2019.00140"},{"key":"861_CR13","doi-asserted-by":"crossref","unstructured":"Szegedy C, Ioffe S, Vanhoucke V, et al. Inception-v4, Inception-ResNet and the impact of residual connections on learning. Proceedings of the AAAI conference on artificial intelligence. 2017, 31(1).","DOI":"10.1609\/aaai.v31i1.11231"},{"key":"861_CR14","doi-asserted-by":"crossref","unstructured":"He K, Zhang X, Ren S, et al. Deep Residual Learning for Image Recognition. Proceedings of the IEEE conference on computer vision and pattern recognition. 2016, pp. 770\u2013778.","DOI":"10.1109\/CVPR.2016.90"},{"key":"861_CR15","unstructured":"Howard A G, Zhu M, Chen B, et al. MobileNets: efficient convolutional neural networks for mobile vision applications. arXiv: 1704.04861 [Preprint] 2017."},{"key":"861_CR16","first-page":"1","volume":"2020","author":"W Wang","year":"2020","unstructured":"Wang W, Li Y, Zou T, et al. A novel image classification approach via Dense-MobileNet models. Mob Inf Syst. 
2020;2020:1\u20138.","journal-title":"Mob Inf Syst"},{"issue":"9","key":"861_CR17","doi-asserted-by":"publisher","first-page":"4446","DOI":"10.1109\/TIP.2017.2710620","volume":"26","author":"SSS Kruthiventi","year":"2017","unstructured":"Kruthiventi SSS, Ayush K, Babu RV. Deepfix: a fully convolutional neural network for predicting human eye fixations. IEEE Trans Image Process. 2017;26(9):4446\u201356.","journal-title":"IEEE Trans Image Process"},{"issue":"2","key":"861_CR18","doi-asserted-by":"publisher","first-page":"1592","DOI":"10.2991\/ijcis.d.191209.001","volume":"12","author":"W Wang","year":"2019","unstructured":"Wang W, Jiang Y, Luo Y, Li J, Wang X, Zhang T. An advanced deep residual dense network (DRDN) approach for image super-resolution. Int J Comput Intell Syst. 2019;12(2):1592\u2013601.","journal-title":"Int J Comput Intell Syst"},{"issue":"1","key":"861_CR19","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1186\/s12880-019-0399-0","volume":"20","author":"W Wang","year":"2020","unstructured":"Wang W, Tian J, Zhang C, Luo Y, Wang X, Li J. An improved deep learning approach and its applications on colonic polyp images detection. BMC Med Imaging. 2020;20(1):1\u201314.","journal-title":"BMC Med Imaging"},{"issue":"11","key":"861_CR20","doi-asserted-by":"publisher","first-page":"2207","DOI":"10.1109\/TUFFC.2020.3005512","volume":"67","author":"L Carrer","year":"2020","unstructured":"Carrer L, Donini E, Marinelli D, et al. Automatic pleural line extraction and COVID-19 scoring from lung ultrasound data. IEEE Trans Ultrason Ferroelectr Freq Control. 2020;67(11):2207\u201317.","journal-title":"IEEE Trans Ultrason Ferroelectr Freq Control"},{"key":"861_CR21","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2020.3001973","author":"MB Jamshidi","year":"2020","unstructured":"Jamshidi MB, Lalbakhsh A, Talla J, et al. Artificial intelligence and COVID-19: deep learning approaches for diagnosis and treatment. IEEE Access. 2020. 
https:\/\/doi.org\/10.1109\/ACCESS.2020.3001973.","journal-title":"IEEE Access"},{"issue":"10","key":"861_CR22","doi-asserted-by":"publisher","first-page":"1439","DOI":"10.7150\/ijms.46684","volume":"17","author":"S Albahli","year":"2020","unstructured":"Albahli S. Efficient GAN-based chest radiographs (CXR) augmentation to diagnose coronavirus disease pneumonia. Int J Med Sci. 2020;17(10):1439\u201348.","journal-title":"Int J Med Sci"},{"key":"861_CR23","doi-asserted-by":"publisher","DOI":"10.1109\/TMI.2020.2995508","author":"X Ouyang","year":"2020","unstructured":"Ouyang X, Huo J, Xia L, et al. Dual-sampling attention network for diagnosis of COVID-19 from community acquired pneumonia. IEEE Trans Med Imaging. 2020. https:\/\/doi.org\/10.1109\/TMI.2020.2995508.","journal-title":"IEEE Trans Med Imaging"},{"issue":"1","key":"861_CR24","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1007\/s42979-020-00401-x","volume":"2","author":"S Ullah","year":"2021","unstructured":"Ullah S, Islam MM, Mahmud S, et al. Scalable telehealth services to combat novel coronavirus (COVID-19) pandemic. SN Comput Sci. 2021;2(1):1\u20138.","journal-title":"SN Comput Sci."},{"issue":"6","key":"861_CR25","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1007\/s42979-020-00335-4","volume":"1","author":"MM Islam","year":"2020","unstructured":"Islam MM, Mahmud S, Muhammad LJ, et al. Wearable technology to assist the patients infected with novel coronavirus (COVID-19). SN Comput Sci. 2020;1(6):1\u20139.","journal-title":"SN Comput Sci"},{"key":"861_CR26","doi-asserted-by":"publisher","DOI":"10.1007\/s42979-020-00300-1","author":"MM Islam","year":"2020","unstructured":"Islam MM, Ullah S, Mahmud S, et al. Breathing aid devices to support novel coronavirus (COVID-19) infected patients. SN Comput Sci. 2020. 
https:\/\/doi.org\/10.1007\/s42979-020-00300-1.","journal-title":"SN Comput Sci"},{"key":"861_CR27","doi-asserted-by":"crossref","unstructured":"Rahman M M, Manik M M H, Islam M M, et al. An Automated System to Limit COVID-19 Using Facial Mask Detection in Smart City Network. 2020 IEEE international IOT, electronics and mechatronics conference (IEMTRONICS). 2020. pp. 1\u20135","DOI":"10.1109\/IEMTRONICS51293.2020.9216386"},{"key":"861_CR28","doi-asserted-by":"publisher","first-page":"30551","DOI":"10.1109\/ACCESS.2021.3058537","volume":"9","author":"MM Islam","year":"2021","unstructured":"Islam MM, Karray F, Alhajj R, et al. A review on deep learning techniques for the diagnosis of novel coronavirus (COVID-19). IEEE Access. 2021;9:30551\u201372.","journal-title":"IEEE Access"},{"issue":"4","key":"861_CR29","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1007\/s42979-020-00216-w","volume":"1","author":"LJ Muhammad","year":"2020","unstructured":"Muhammad LJ, Islam MM, Usman SS, et al. Predictive data mining models for novel coronavirus (COVID-19) infected patients recovery. SN Compu Sci. 2020;1(4):1\u20137.","journal-title":"SN Compu Sci"},{"issue":"6","key":"861_CR30","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1007\/s42979-020-00383-w","volume":"1","author":"A Asraf","year":"2020","unstructured":"Asraf A, Islam MZ, Haque MR, et al. Deep learning applications to combat novel coronavirus (COVID-19) pandemic. SN Comput Sci. 2020;1(6):1\u20137.","journal-title":"SN Comput Sci"},{"issue":"1","key":"861_CR31","doi-asserted-by":"publisher","first-page":"199","DOI":"10.2991\/ijcis.d.201123.001","volume":"14","author":"W Wang","year":"2020","unstructured":"Wang W, Liu H, Li J, et al. Using CFW-Net deep learning models for X-Ray images to detect COIVD-19 patients. Int J Comput Intel Syst. 
2020;14(1):199\u2013207.","journal-title":"Int J Comput Intel Syst."},{"key":"861_CR32","doi-asserted-by":"publisher","first-page":"103792","DOI":"10.1016\/j.compbiomed.2020.103792","volume":"121","author":"T Ozturk","year":"2020","unstructured":"Ozturk T, Talo M, Yildirim A, et al. Automated detection of COVID-19 cases using Deep Neural Networks with X-ray images. Comput Biol Med. 2020;121:103792.","journal-title":"Comput Biol Med"},{"key":"861_CR33","doi-asserted-by":"publisher","first-page":"103869","DOI":"10.1016\/j.compbiomed.2020.103869","volume":"122","author":"T Mahmud","year":"2020","unstructured":"Mahmud T, Rahman MA, Fattah SA. CovXNet: a multi-dilation convolutional neural network for automatic COVID-19 and other pneumonia detection from chest X-ray images with transferable multi-receptive feature optimization. Comput Biol Med. 2020;122:103869.","journal-title":"Comput Biol Med"},{"key":"861_CR34","doi-asserted-by":"publisher","first-page":"115041","DOI":"10.1109\/ACCESS.2020.3003810","volume":"8","author":"S Rajaraman","year":"2020","unstructured":"Rajaraman S, Siegelman J, Alderson PO, et al. Iteratively pruned deep learning ensembles for COVID-19 detection in chest X-rays. IEEE Access. 2020;8:115041\u201350.","journal-title":"IEEE Access"},{"issue":"6","key":"861_CR35","doi-asserted-by":"publisher","first-page":"e0235187","DOI":"10.1371\/journal.pone.0235187","volume":"15","author":"MA Elaziz","year":"2020","unstructured":"Elaziz MA, Hosny KM, Salah A, et al. New machine learning method for image-based diagnosis of COVID-19. PLoS ONE. 2020;15(6):e0235187.","journal-title":"PLoS ONE"},{"key":"861_CR36","unstructured":"Mnih V, Heess N, Graves A, et al. Recurrent models of visual attention. arXiv: 1406.6247 [Preprint] 2014."},{"key":"861_CR37","doi-asserted-by":"crossref","unstructured":"Xie S, Girshick R, Doll\u00e1r P, et al. Aggregated residual transformations for deep neural networks. 
Proceedings of the IEEE conference on computer vision and pattern recognition. 2017, 1492\u20131500.","DOI":"10.1109\/CVPR.2017.634"},{"key":"861_CR38","doi-asserted-by":"crossref","unstructured":"Luong M T, Pham H, Manning C D. Effective Approaches to Attention-based Neural Machine Translation. arXiv: 1508.04025 [Preprint] 2015.","DOI":"10.18653\/v1\/D15-1166"},{"key":"861_CR39","doi-asserted-by":"publisher","first-page":"100412","DOI":"10.1016\/j.imu.2020.100412","volume":"20","author":"MZ Islam","year":"2020","unstructured":"Islam MZ, Islam MM, Asraf A. A combined deep CNN-LSTM network for the detection of novel coronavirus (COVID-19) using X-ray images. Inf Med Unlocked. 2020;20:100412.","journal-title":"Inf Med Unlocked"},{"issue":"8","key":"861_CR40","doi-asserted-by":"publisher","first-page":"2101","DOI":"10.1049\/ipr2.12474","volume":"16","author":"W Wang","year":"2022","unstructured":"Wang W, Huang W, Wang X, Zhang P, Zhang N. A COVID-19 CXR image recognition method based on MSA-DDCovidNet. IET Image Process. 2022;16(8):2101\u201313.","journal-title":"IET Image Process"},{"key":"861_CR41","doi-asserted-by":"publisher","first-page":"100505","DOI":"10.1016\/j.imu.2020.100505","volume":"22","author":"P Saha","year":"2021","unstructured":"Saha P, Sadi MS, Islam MM. EMCNet: automated COVID-19 diagnosis from X-ray images using convolutional neural network and ensemble of machine learning classifiers. Inf Med Unlocked. 2021;22:100505.","journal-title":"Inf Med Unlocked"},{"key":"861_CR42","unstructured":"Islam M M, Islam M Z, Asraf A, et al. Diagnosis of COVID-19 from X-rays using combined CNN-RNN architecture with transfer learning. medRxiv. 
2020."}],"container-title":["BMC Medical Imaging"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1186\/s12880-022-00861-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1186\/s12880-022-00861-y\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1186\/s12880-022-00861-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,9,30]],"date-time":"2024-09-30T07:11:28Z","timestamp":1727680288000},"score":1,"resource":{"primary":{"URL":"https:\/\/bmcmedimaging.biomedcentral.com\/articles\/10.1186\/s12880-022-00861-y"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,7,30]]},"references-count":42,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2022,12]]}},"alternative-id":["861"],"URL":"https:\/\/doi.org\/10.1186\/s12880-022-00861-y","relation":{},"ISSN":["1471-2342"],"issn-type":[{"type":"electronic","value":"1471-2342"}],"subject":[],"published":{"date-parts":[[2022,7,30]]},"assertion":[{"value":"26 October 2020","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"22 July 2022","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"30 July 2022","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"All data sets we used in experiments are public data sets. The chest X-Ray dataset of COVID-19 comes from GitHub. The second chest X-Ray dataset comes from Kaggle. 
Therefore, no ethics approval or consent to participate was required.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics approval and consent to participate"}},{"value":"Not applicable.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent to publish"}},{"value":"The authors declare that they have no competing interests.","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Competing interests"}}],"article-number":"135"}}