{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,7]],"date-time":"2026-03-07T08:03:28Z","timestamp":1772870608415,"version":"3.50.1"},"reference-count":37,"publisher":"MDPI AG","issue":"19","license":[{"start":{"date-parts":[[2022,9,26]],"date-time":"2022-09-26T00:00:00Z","timestamp":1664150400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["61901500"],"award-info":[{"award-number":["61901500"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["62001486"],"award-info":[{"award-number":["62001486"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["61901481"],"award-info":[{"award-number":["61901481"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["2020TQ0082"],"award-info":[{"award-number":["2020TQ0082"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100002858","name":"China Postdoctoral Science Foundation","doi-asserted-by":"publisher","award":["61901500"],"award-info":[{"award-number":["61901500"]}],"id":[{"id":"10.13039\/501100002858","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100002858","name":"China Postdoctoral Science 
Foundation","doi-asserted-by":"publisher","award":["62001486"],"award-info":[{"award-number":["62001486"]}],"id":[{"id":"10.13039\/501100002858","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100002858","name":"China Postdoctoral Science Foundation","doi-asserted-by":"publisher","award":["61901481"],"award-info":[{"award-number":["61901481"]}],"id":[{"id":"10.13039\/501100002858","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100002858","name":"China Postdoctoral Science Foundation","doi-asserted-by":"publisher","award":["2020TQ0082"],"award-info":[{"award-number":["2020TQ0082"]}],"id":[{"id":"10.13039\/501100002858","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>At present, deep learning is widely used in SAR ship target detection, but accurate, real-time detection of multi-scale targets still faces tough challenges. CNN-based SAR ship detectors struggle to meet real-time requirements because of their large number of parameters. In this paper, we propose a lightweight, single-stage SAR ship target detection model called the YOLO-based lightweight multi-scale ship detector (LMSD-YOLO), with better multi-scale adaptation capabilities. The proposed LMSD-YOLO consists of a depthwise separable convolution, batch normalization and Activate or Not (ACON) activation function (DBA) module, a Mobilenet with stem block (S-Mobilenet) backbone module, a depthwise adaptively spatial feature fusion (DSASFF) neck module and the SCYLLA-IoU (SIoU) loss function. Firstly, the DBA module is proposed as a general lightweight convolution unit to construct the whole lightweight model. Secondly, the improved S-Mobilenet module is designed as the backbone feature extraction network to enhance feature extraction ability without adding additional calculations. 
Then, the DSASFF module is proposed to achieve adaptive fusion of multi-scale features with fewer parameters. Finally, SIoU is used as the loss function to accelerate model convergence and improve detection accuracy. The effectiveness of LMSD-YOLO is validated on the SSDD, HRSID and GFSDD datasets, respectively, and the experimental results show that our proposed model has a smaller model volume and higher detection accuracy, and can accurately detect multi-scale targets in more complex scenes. The model volume of LMSD-YOLO is only 7.6 MB (52.77% of the model size of YOLOv5s), and its detection speed on the NVIDIA AGX Xavier development board reaches 68.3 FPS (32.7 FPS higher than the YOLOv5s detector), indicating that LMSD-YOLO can be easily deployed on mobile platforms for real-time applications.<\/jats:p>","DOI":"10.3390\/rs14194801","type":"journal-article","created":{"date-parts":[[2022,9,28]],"date-time":"2022-09-28T03:30:37Z","timestamp":1664335837000},"page":"4801","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":110,"title":["LMSD-YOLO: A Lightweight YOLO Algorithm for Multi-Scale SAR Ship Detection"],"prefix":"10.3390","volume":"14","author":[{"given":"Yue","family":"Guo","sequence":"first","affiliation":[{"name":"National Key Laboratory of Science and Technology on Automatic Target Recognition, College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, China"}]},{"given":"Shiqi","family":"Chen","sequence":"additional","affiliation":[{"name":"National Key Laboratory of Science and Technology on Automatic Target Recognition, College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6799-620X","authenticated-orcid":false,"given":"Ronghui","family":"Zhan","sequence":"additional","affiliation":[{"name":"National Key Laboratory of Science and 
Technology on Automatic Target Recognition, College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, China"}]},{"given":"Wei","family":"Wang","sequence":"additional","affiliation":[{"name":"National Key Laboratory of Science and Technology on Automatic Target Recognition, College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, China"}]},{"given":"Jun","family":"Zhang","sequence":"additional","affiliation":[{"name":"National Key Laboratory of Science and Technology on Automatic Target Recognition, College of Electronic Science and Engineering, National University of Defense Technology, Changsha 410073, China"}]}],"member":"1968","published-online":{"date-parts":[[2022,9,26]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"3019","DOI":"10.1109\/TGRS.2008.923026","article-title":"Wide-Area Traffic Monitoring With the SAR\/GMTI System PAMIR","volume":"46","author":"Klare","year":"2008","journal-title":"IEEE Trans. Geosci. Remote Sens."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"6014","DOI":"10.1109\/ACCESS.2016.2611492","article-title":"Automatic Target Recognition in Synthetic Aperture Radar Imagery: A State-of-the-Art Review","volume":"4","author":"Gill","year":"2016","journal-title":"IEEE Access"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Wang, X., Cheng, P., Liu, X., and Uzochukwu, B. (2018, January 20\u201323). Fast and accurate, convolutional neural network based approach for object detection from UAV. Proceedings of the 44th Annual Conference of the IEEE Industrial Electronics Society, IECON 2018, Washington, DC, USA.","DOI":"10.1109\/IECON.2018.8592805"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"1536","DOI":"10.1109\/LGRS.2015.2412174","article-title":"A Bilateral CFAR Algorithm for Ship Detection in SAR Images","volume":"12","author":"Leng","year":"2015","journal-title":"IEEE Geosci. 
Remote Sens. Lett."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Gong, B., Wang, Y., Cui, L., Xu, L., Tao, M., Wang, H., and Hou, Y. (2018, January 10\u201312). On the Ship Wake Simulation for Multi-Frequncy and Mutli-Polarization SAR Imaging. Proceedings of the 2018 China International SAR Symposium (CISS), Shanghai, China.","DOI":"10.1109\/SARS.2018.8552036"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Girshick, R., Donahue, J., Darrell, T., and Malik, J. (2014, January 23\u201328). Rich feature hierarchies for accurate object detection and semantic segmentation. Proceedings of the 27th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2014, Columbus, OH, USA.","DOI":"10.1109\/CVPR.2014.81"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Girshick, R. (2015, January 7\u201313). Fast R-CNN. Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Washington, DC, USA.","DOI":"10.1109\/ICCV.2015.169"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"1137","DOI":"10.1109\/TPAMI.2016.2577031","article-title":"Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks","volume":"39","author":"Ren","year":"2017","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Liu, W., Anguelov, D., Erhan, D., Szegedy, C., Reed, S., Fu, C.-Y., and Berg, A.C. (2016, January 8\u201316). SSD: Single shot multibox detector. Proceedings of the 14th European Conference on Computer Vision, ECCV 2016, Amsterdam, The Netherlands.","DOI":"10.1007\/978-3-319-46448-0_2"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Redmon, J., Divvala, S., Girshick, R., and Farhadi, A. (July, January 26). You only look once: Unified, real-time object detection. 
Proceedings of the 29th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2016, Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.91"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Tian, Z., Shen, C., Chen, H., and He, T. (November, January 27). FCOS: Fully convolutional one-stage object detection. Proceedings of the 17th IEEE\/CVF International Conference on Computer Vision, ICCV 2019, Seoul, Korea.","DOI":"10.1109\/ICCV.2019.00972"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Li, P., and Che, C. (2021, January 18\u201322). SeMo-YOLO: A Multiscale Object Detection Network in Satellite Remote Sensing Images. Proceedings of the 2021 International Joint Conference on Neural Networks (IJCNN), Shenzhen, China.","DOI":"10.1109\/IJCNN52387.2021.9534343"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Yu, J., Wu, T., Zhang, X., and Zhang, W. (2022). An Efficient Lightweight SAR Ship Target Detection Network with Improved Regression Loss Function and Enhanced Feature Information Expression. Sensors, 22.","DOI":"10.3390\/s22093447"},{"key":"ref_14","first-page":"5217712","article-title":"A Robust One-Stage Detector for Multiscale Ship Detection with Complex Background in Massive SAR Images","volume":"60","author":"Yang","year":"2022","journal-title":"IEEE Trans. Geosci. Remote Sens."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"107787","DOI":"10.1016\/j.patcog.2020.107787","article-title":"A CenterNet++ model for ship detection in SAR images","volume":"112","author":"Guo","year":"2021","journal-title":"Pattern Recognit."},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Sun, Z., Leng, X., Lei, Y., Xiong, B., Ji, K., and Kuang, G. (2021). BiFA-YOLO: A Novel YOLO-Based Method for Arbitrary-Oriented Ship Detection in High-Resolution SAR Images. 
Remote Sens., 13.","DOI":"10.3390\/rs13214209"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"5215212","DOI":"10.1109\/TGRS.2021.3113919","article-title":"Spatial Singularity-Exponent-Domain Multiresolution Imaging-Based SAR Ship Target Detection Method","volume":"60","author":"Xiong","year":"2022","journal-title":"IEEE Trans. Geosci. Remote Sens."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Sun, K., Liang, Y., Ma, X., Huai, Y., and Xing, M. (2021). DSDet: A Lightweight Densely Connected Sparsely Activated Detector for Ship Target Detection in High-Resolution SAR Images. Remote Sens., 13.","DOI":"10.3390\/rs13142743"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"4667","DOI":"10.1109\/JSTARS.2022.3180159","article-title":"An Improved Lightweight RetinaNet for Ship Detection in SAR Images","volume":"15","author":"Miao","year":"2022","journal-title":"IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"1267","DOI":"10.1109\/JSTARS.2020.3041783","article-title":"Learning Slimming SAR Ship Object Detector Through Network Pruning and Knowledge Distillation","volume":"14","author":"Chen","year":"2021","journal-title":"IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Fu, Z., Cao, W., and Li, S. (2022, January 14\u201316). A Lightweight SAR Image Recognition Algorithm Based on Deep Convolutional Neural Network. Proceedings of the 2022 2nd International Conference on Consumer Electronics and Computer Engineering (ICCECE), Guangzhou, China.","DOI":"10.1109\/ICCECE54139.2022.9712814"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Huang, Y., Wang, Z., Wang, Z., Zheng, Y., and Jiao, M. (2021, January 3\u20135). Target feature extraction algorithm for SAR images of complex background based on corner estimation. 
Proceedings of the 2021 2nd China International SAR Symposium (CISS), Shanghai, China.","DOI":"10.23919\/CISS51089.2021.9652285"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Wen, G., Cao, P., Wang, H., Chen, H., Liu, X., Xu, J., and Zaiane, O. (2022). MS-SSD: Multi-scale single shot detector for ship detection in remote sensing images. Appl. Intell., 1\u201319.","DOI":"10.1007\/s10489-022-03549-6"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Zheng, X., Feng, Y., Shi, H., Zhang, B., and Chen, L. (2020, January 4\u20136). Lightweight convolutional neural network for false alarm elimination in SAR ship detection. In Proceedings of IET International Radar Conference (IET IRC 2020), Chongqing, China.","DOI":"10.1049\/icp.2021.0801"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"85","DOI":"10.1016\/j.infsof.2015.10.001","article-title":"ACon: A learning-based approach to deal with uncertainty in contextual requirements at runtime","volume":"70","author":"Knauss","year":"2016","journal-title":"Inf. Softw. Technol."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Ioannou, Y., Robertson, D., Cipolla, R., and Criminisi, A. (2017, January 21\u201326). Deep Roots: Improving CNN Efficiency with Hierarchical Filter Groups. In Proceedings of 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu, Hawaii.","DOI":"10.1109\/CVPR.2017.633"},{"key":"ref_27","first-page":"787","article-title":"Pointwise CNN for 3D Object Classification on Point Cloud","volume":"17","author":"Song","year":"2021","journal-title":"J. Inf. Process. Syst."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Khalid, M., Baber, J., Kasi, M.K., Bakhtyar, M., Devi, V., and Sheikh, N. (2020, January 7\u20139). Empirical Evaluation of Activation Functions in Deep Convolution Neural Network for Facial Expression Recognition. 
In Proceedings of the 43rd International Conference on Telecommunications and Signal Processing (TSP), Milan, Italy.","DOI":"10.1109\/TSP49548.2020.9163446"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Zhu, H., Xie, Y., Huang, H., Jing, C., Rong, Y., and Wang, C. (2021). DB-YOLO: A Duplicate Bilateral YOLO Network for Multi-Scale Ship Detection in SAR Images. Sensors, 21.","DOI":"10.3390\/s21238146"},{"key":"ref_30","unstructured":"Howard, A.G., Zhu, M., Chen, B., Kalenichenko, D., Wang, W., Weyand, T., Andreetto, M., and Adam, H. (2017). MobileNets: Efficient convolutional neural networks for mobile vision applications. arXiv, Available online: https:\/\/arxiv.org\/abs\/1704.04861."},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Howard, A., Sandler, M., Chen, B., Wang, W., Chen, L.-C., Tan, M., Chu, G., Vasudevan, V., Zhu, Y., and Pang, R. (November, January 27). Searching for mobileNetV3. Proceedings of the 17th IEEE\/CVF International Conference on Computer Vision, ICCV 2019, Seoul, Korea.","DOI":"10.1109\/ICCV.2019.00140"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Yu, J., Zhou, G., Zhou, S., and Qin, M. (2021). A Fast and Lightweight Detection Network for Multi-Scale SAR Ship Detection under Complex Backgrounds. Remote Sens., 14.","DOI":"10.3390\/rs14010031"},{"key":"ref_33","unstructured":"Gevorgyan, Z. (2022). SIoU Loss: More Powerful Learning for Bounding Box Regression. arXiv."},{"key":"ref_34","unstructured":"Liu, S., Huang, D., and Wang, Y. (2019). Learning spatial fusion for single-shot object detection. arXiv, Available online: https:\/\/arxiv.org\/abs\/1911.09516."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Xu, X., Zhang, X., and Zhang, T. (2022). Lite-YOLOv5: A Lightweight Deep Learning Detector for On-Board Ship Detection in Large-Scene Sentinel-1 SAR Images. 
Remote Sens., 14.","DOI":"10.3390\/rs14041018"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Zhang, T., Zhang, X., Li, J., Xu, X., Wang, B., Zhan, X., Xu, Y., Ke, X., Zeng, T., and Su, H. (2021). SAR Ship Detection Dataset (SSDD): Official Release and Comprehensive Data Analysis. Remote Sens., 13.","DOI":"10.3390\/rs13183690"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"120234","DOI":"10.1109\/ACCESS.2020.3005861","article-title":"HRSID: A High-Resolution SAR Images Dataset for Ship Detection and Instance Segmentation","volume":"8","author":"Wei","year":"2020","journal-title":"IEEE Access"}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/19\/4801\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T00:39:43Z","timestamp":1760143183000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/19\/4801"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,9,26]]},"references-count":37,"journal-issue":{"issue":"19","published-online":{"date-parts":[[2022,10]]}},"alternative-id":["rs14194801"],"URL":"https:\/\/doi.org\/10.3390\/rs14194801","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,9,26]]}}}