{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,14]],"date-time":"2026-04-14T21:42:40Z","timestamp":1776202960245,"version":"3.50.1"},"reference-count":51,"publisher":"MDPI AG","issue":"18","license":[{"start":{"date-parts":[[2021,9,16]],"date-time":"2021-09-16T00:00:00Z","timestamp":1631750400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["61936003"],"award-info":[{"award-number":["61936003"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100003453","name":"Natural Science Foundation of Guangdong Province","doi-asserted-by":"publisher","award":["2017A030312006"],"award-info":[{"award-number":["2017A030312006"]}],"id":[{"id":"10.13039\/501100003453","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>Landing an unmanned aerial vehicle (UAV) autonomously and safely is a challenging task. Although the existing approaches have resolved the problem of precise landing by identifying a specific landing marker using the UAV\u2019s onboard vision system, the vast majority of these works are conducted in either daytime or well-illuminated laboratory environments. In contrast, very few researchers have investigated the possibility of landing in low-illumination conditions by employing various active light sources to lighten the markers. In this paper, a novel vision system design is proposed to tackle UAV landing in outdoor extreme low-illumination environments without the need to apply an active light source to the marker. 
We use a model-based enhancement scheme to improve the quality and brightness of the onboard captured images, then present a hierarchical-based method consisting of a decision tree with an associated light-weight convolutional neural network (CNN) for coarse-to-fine landing marker localization, where the key information of the marker is extracted and reserved for post-processing, such as pose estimation and landing control. Extensive evaluations have been conducted to demonstrate the robustness, accuracy, and real-time performance of the proposed vision system. Field experiments across a variety of outdoor nighttime scenarios with an average luminance of 5 lx at the marker locations have proven the feasibility and practicability of the system.<\/jats:p>","DOI":"10.3390\/s21186226","type":"journal-article","created":{"date-parts":[[2021,9,22]],"date-time":"2021-09-22T03:47:35Z","timestamp":1632282455000},"page":"6226","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":38,"title":["Real-Time Monocular Vision System for UAV Autonomous Landing in Outdoor Low-Illumination Environments"],"prefix":"10.3390","volume":"21","author":[{"given":"Shanggang","family":"Lin","sequence":"first","affiliation":[{"name":"School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510640, China"},{"name":"South China University of Technology-Zhuhai Institute of Modern Industrial Innovation, Zhuhai 519175, China"}]},{"given":"Lianwen","family":"Jin","sequence":"additional","affiliation":[{"name":"School of Electronic and Information Engineering, South China University of Technology, Guangzhou 510640, China"},{"name":"South China University of Technology-Zhuhai Institute of Modern Industrial Innovation, Zhuhai 519175, China"}]},{"given":"Ziwei","family":"Chen","sequence":"additional","affiliation":[{"name":"School of Electronic and Information Engineering, South China University of 
Technology, Guangzhou 510640, China"},{"name":"South China University of Technology-Zhuhai Institute of Modern Industrial Innovation, Zhuhai 519175, China"}]}],"member":"1968","published-online":{"date-parts":[[2021,9,16]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"832","DOI":"10.1002\/rob.21436","article-title":"Collaborative mapping of an earthquake-damaged building via ground and aerial robots","volume":"29","author":"Michael","year":"2012","journal-title":"J. Field Robot."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"1450","DOI":"10.1002\/rob.21723","article-title":"Radiation search operations using scene understanding with autonomous UAV and UGV","volume":"34","author":"Christie","year":"2017","journal-title":"J. Field Robot."},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Kerle, N., Nex, F., Gerke, M., Duarte, D., and Vetrivel, A. (2020). UAV-based structural damage mapping: A review. Int. J. Geo-Inf., 9.","DOI":"10.3390\/ijgi9010014"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Guo, Y., Guo, J., Liu, C., Xiong, H., Chai, L., and He, D. (2020). Precision landing test and simulation of the agricultural UAV on apron. Sensors, 20.","DOI":"10.3390\/s20123369"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"368","DOI":"10.1016\/j.ast.2018.07.026","article-title":"Intelligent GNSS\/INS integrated navigation system for a commercial UAV flight control system","volume":"80","author":"Zhang","year":"2018","journal-title":"Aerosp. Sci. Technol."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"645","DOI":"10.1007\/s10846-018-0933-2","article-title":"A vision-based approach for unmanned aerial vehicle landing","volume":"95","author":"Patruno","year":"2019","journal-title":"J. Intell. Robot. 
Syst."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"499","DOI":"10.1007\/s10846-012-9749-7","article-title":"An onboard monocular vision system for autonomous takeoff, hovering and landing of a micro aerial vehicle","volume":"69","author":"Yang","year":"2013","journal-title":"J. Intell. Robot. Syst."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"369","DOI":"10.1007\/s10846-016-0399-z","article-title":"Vision based autonomous landing of multirotor UAV on moving platform","volume":"85","author":"Araar","year":"2017","journal-title":"J. Intell. Robot. Syst."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"874","DOI":"10.1002\/rob.21858","article-title":"Autonomous landing on a moving vehicle with an unmanned aerial vehicle","volume":"36","author":"Baca","year":"2019","journal-title":"J. Field Robot."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"6","DOI":"10.1002\/rob.21814","article-title":"Ellipse proposal and convolutional neural network discriminant for autonomous landing marker detection","volume":"36","author":"Jin","year":"2019","journal-title":"J. Field Robot."},{"key":"ref_11","unstructured":"(2021, September 02). Global Drone Regulations Database. Available online: https:\/\/droneregulations.info\/."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Chen, X., Phang, S.K., and Chen, B.M. (2017, January 1\u20134). System integration of a vision-guided UAV for autonomous tracking on moving platform in low illumination condition. Proceedings of the ION 2017 Pacific PNT Meeting, Honolulu, HI, USA.","DOI":"10.33012\/2017.15022"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"474","DOI":"10.1016\/j.ast.2018.12.030","article-title":"A visual\/inertial integrated landing guidance method for UAV landing on the ship","volume":"85","author":"Meng","year":"2019","journal-title":"Aerosp. Sci. 
Technol."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"881","DOI":"10.1007\/s10514-016-9564-2","article-title":"Monocular vision-based real-time target recognition and tracking for autonomously landing an UAV in a cluttered shipboard environment","volume":"41","author":"Lin","year":"2017","journal-title":"Auton. Robot."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"371","DOI":"10.1109\/TRA.2003.810239","article-title":"Visually guided landing of an unmanned aerial vehicle","volume":"19","author":"Saripalli","year":"2003","journal-title":"IEEE Trans. Robot. Autom."},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Lee, D., Ryan, T., and Kim, H.J. (2012, January 14\u201318). Autonomous landing of a VTOL UAV on a moving platform using image-based visual servoing. Proceedings of the 2012 IEEE International Conference on Robotics and Automation (ICRA), Saint Paul, MN, USA.","DOI":"10.1109\/ICRA.2012.6224828"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"1524","DOI":"10.1109\/TRO.2016.2604495","article-title":"Landing of a quadrotor on a moving target using dynamic image-based visual servo control","volume":"32","author":"Serra","year":"2016","journal-title":"IEEE Trans. Robot."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Wu, Y., Niu, X., Du, J., Chang, L., Tang, H., and Zhang, H. (2019). Artificial marker and MEMS IMU-based pose estimation method to meet multirotor UAV landing requirements. Sensors, 19.","DOI":"10.3390\/s19245428"},{"key":"ref_19","unstructured":"Masselli, A., and Zell, A. (2012, January 13). A novel marker based tracking method for position and attitude control of MAVs. 
Proceedings of the International Micro Air Vehicle Conference and Flight Competition, Braunschweig, Germany."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"667","DOI":"10.1002\/rob.21467","article-title":"Automated vision-based recovery of a rotary wing unmanned aerial vehicle onto a moving platform","volume":"30","author":"Richardson","year":"2013","journal-title":"J. Field Robot."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"34","DOI":"10.1002\/rob.21815","article-title":"Fast vision-based autonomous detection of moving cooperative target for unmanned aerial vehicle landing","volume":"36","author":"Li","year":"2019","journal-title":"J. Field Robot."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"252","DOI":"10.1002\/rob.21850","article-title":"Vision techniques for on-board detection, following, and mapping of moving targets","volume":"36","author":"Stepan","year":"2019","journal-title":"J. Field Robot."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"49","DOI":"10.1002\/rob.21821","article-title":"Fully autonomous micro air vehicle flight and landing on a moving target using visual\u2013inertial estimation and model-predictive control","volume":"36","author":"Tzoumanikas","year":"2019","journal-title":"J. Field Robot."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Horla, D., Giernacki, W., Cie\u015blak, J., and Campoy, P. (2021). Altitude measurement-based optimization of the landing process of UAVs. Sensors, 21.","DOI":"10.3390\/s21041151"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Nguyen, P.H., Arsalan, M., Koo, J.H., Naqvi, R.A., Truong, N.Q., and Park, K.R. (2018). LightDenseYOLO: A fast and accurate marker tracker for autonomous UAV landing by visible light camera sensor on drone. 
Sensors, 18.","DOI":"10.3390\/s18061703"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"171","DOI":"10.1177\/1756829318757470","article-title":"Deep learning for vision-based micro aerial vehicle autonomous landing","volume":"10","author":"Yu","year":"2018","journal-title":"Int. J. Micro Air Veh."},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Abu-Jbara, K., Alheadary, W., Sundaramorthi, G., and Claudel, C. (2015, January 9\u201312). A robust vision-based runway detection and tracking algorithm for automatic UAV landing. Proceedings of the 2015 International Conference on Unmanned Aircraft Systems (ICUAS), Denver, CO, USA.","DOI":"10.1109\/ICUAS.2015.7152407"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"215","DOI":"10.1134\/S2075108719040084","article-title":"Optical aircraft positioning for monitoring of the integrated navigation system during landing approach","volume":"10","author":"Hecker","year":"2019","journal-title":"Gyroscopy Navig."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Hiba, A., G\u00e1ti, A., and Manecy, A. (2021). Optical navigation sensor for runway relative positioning of aircraft during final approach. Sensors, 21.","DOI":"10.3390\/s21062203"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Kong, W., Zhou, D., Zhang, Y., Zhang, D., Wang, X., Zhao, B., Yan, C., Shen, L., and Zhang, J. (2014, January 14\u201318). A ground-based optical system for autonomous landing of a fixed wing UAV. Proceedings of the 2014 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Chicago, IL, USA.","DOI":"10.1109\/IROS.2014.6943244"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Kong, W., Hu, T., Zhang, D., Shen, L., and Zhang, J. (2017). Localization framework for real-time UAV autonomous landing: An on-ground deployed visual approach. 
Sensors, 17.","DOI":"10.3390\/s17061437"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Yang, T., Li, G., Li, J., Zhang, Y., Zhang, X., Zhang, Z., and Li, Z. (2016). A ground-based near infrared camera array system for UAV auto-landing in GPS-denied environment. Sensors, 16.","DOI":"10.3390\/s16091393"},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.ast.2016.09.005","article-title":"A UWB positioning network enabling unmanned aircraft systems auto land","volume":"58","author":"Kim","year":"2016","journal-title":"Aerosp. Sci. Technol."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Tiemann, J., and Wietfeld, C. (2017, January 18\u201321). Scalable and precise multi-UAV indoor navigation using TDOA-based UWB localization. Proceedings of the 2017 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sapporo, Japan.","DOI":"10.1109\/IPIN.2017.8115937"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Pavlenko, T., Sch\u00fctz, M., Vossiek, M., Walter, T., and Montenegro, S. (2019, January 19\u201321). Wireless local positioning system for controlled UAV landing in GNSS-denied environment. Proceedings of the 2019 IEEE 5th International Workshop on Metrology for AeroSpace (MetroAeroSpace), Turin, Italy.","DOI":"10.1109\/MetroAeroSpace.2019.8869587"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Silva, J., Mendonca, R., Marques, F., Rodrigues, P., Santana, P.S., and Barata, J. (2014, January 5\u201310). Saliency-based cooperative landing of a multirotor aerial vehicle on an autonomous surface vehicle. 
Proceedings of the 2014 IEEE International Conference on Robotics and Biomimetics (ROBIO), Bali, Indonesia.","DOI":"10.1109\/ROBIO.2014.7090550"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"113","DOI":"10.1007\/s10846-013-9926-3","article-title":"An approach toward visual autonomous ship board landing of a VTOL UAV","volume":"74","author":"Pestana","year":"2014","journal-title":"J. Intell. Robot. Syst."},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"125","DOI":"10.1007\/s10846-017-0757-5","article-title":"Quadrotor autonomous approaching and landing on a vessel deck","volume":"92","author":"Wang","year":"2018","journal-title":"J. Intell. Robot. Syst."},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Xu, Z.C., Hu, B.B., Liu, B., Wang, X., and Zhang, H.T. (2020, January 27\u201329). Vision-based autonomous landing of unmanned aerial vehicle on a motional unmanned surface vessel. Proceedings of the 2020 39th Chinese Control Conference (CCC), Shenyang, China.","DOI":"10.23919\/CCC50068.2020.9188979"},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Wu, S., Zhang, K., Li, S., and Yan, J. (2020). Learning to track aircraft in infrared imagery. Remote Sens., 12.","DOI":"10.3390\/rs12233995"},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Hr\u00faz, M., Bugaj, M., Nov\u00e1k, A., Kandera, B., and Bad\u00e1nik, B. (2021). The use of UAV with infrared camera and RFID for airframe condition monitoring. Appl. Sci., 11.","DOI":"10.3390\/app11093737"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"1498","DOI":"10.1016\/j.cja.2013.07.049","article-title":"Use of land\u2019s cooperative object to estimate UAV\u2019s pose for autonomous landing","volume":"26","author":"Xu","year":"2013","journal-title":"Chin. J. Aeronaut."},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Kalinov, I., Safronov, E., Agishev, R., Kurenkov, M., and Tsetserukou, D. (May, January 28). 
High-precision UAV localization system for landing on a mobile collaborative robot based on an IR marker pattern recognition. Proceedings of the 2019 IEEE 89th Vehicular Technology Conference (VTC2019-Spring), Kuala Lumpur, Malaysia.","DOI":"10.1109\/VTCSpring.2019.8746668"},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"197","DOI":"10.1007\/s10846-013-9819-5","article-title":"Airborne vision-based navigation method for UAV accuracy landing using infrared lamps","volume":"72","author":"Gui","year":"2013","journal-title":"J. Intell. Robot. Syst."},{"key":"ref_45","unstructured":"Dong, X., Wang, G., Pang, Y., Li, W., Wen, J., Meng, W., and Lu, Y. (2011, January 11\u201315). Fast efficient algorithm for enhancement of low lighting video. Proceedings of the 2011 IEEE International Conference on Multimedia and Expo, Barcelona, Spain."},{"key":"ref_46","doi-asserted-by":"crossref","first-page":"2341","DOI":"10.1109\/TPAMI.2010.168","article-title":"Single image haze removal using dark channel prior","volume":"33","author":"He","year":"2011","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"He, K., Zhang, X., Ren, S., and Sun, J. (2016, January 27\u201330). Deep residual learning for image recognition. Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA.","DOI":"10.1109\/CVPR.2016.90"},{"key":"ref_48","unstructured":"Simonyan, K., and Zisserman, A. (2015, January 7\u20139). Very deep convolutional networks for large-scale image recognition. Proceedings of the 3rd International Conference on Learning Representations, San Diego, CA, USA."},{"key":"ref_49","unstructured":"Iandola, F.N., Han, S., Moskewicz, M.W., Ashraf, K., Dally, W.J., and Keutzer, K. (2016). SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5MB model size. arXiv."},{"key":"ref_50","unstructured":"Redmon, J., and Farhadi, A. (2018). 
YOLOv3: An incremental improvement. arXiv."},{"key":"ref_51","doi-asserted-by":"crossref","unstructured":"Sandler, M., Howard, A., Zhu, M., Zhmoginov, A., and Chen, L.C. (2018). MobileNetV2: Inverted residuals and linear bottlenecks. arXiv.","DOI":"10.1109\/CVPR.2018.00474"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/21\/18\/6226\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T07:00:58Z","timestamp":1760166058000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/21\/18\/6226"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,9,16]]},"references-count":51,"journal-issue":{"issue":"18","published-online":{"date-parts":[[2021,9]]}},"alternative-id":["s21186226"],"URL":"https:\/\/doi.org\/10.3390\/s21186226","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,9,16]]}}}