{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,2]],"date-time":"2026-04-02T02:35:57Z","timestamp":1775097357155,"version":"3.50.1"},"reference-count":35,"publisher":"MDPI AG","issue":"21","license":[{"start":{"date-parts":[[2023,10,24]],"date-time":"2023-10-24T00:00:00Z","timestamp":1698105600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"National Natural Science Foundation of China","award":["52304183","52274159"],"award-info":[{"award-number":["52304183"]},{"award-number":["52274159"]}]},{"name":"Joint Open Fund of State Key Laboratory of Robotics","award":["2022-KF-22-05"],"award-info":[{"award-number":["2022-KF-22-05"]}]},{"name":"Natural Science Foundation of Jiangsu Province for Youths","award":["BK20210497"],"award-info":[{"award-number":["BK20210497"]}]},{"name":"China Postdoctoral Science Foundation","award":["2022M723392"],"award-info":[{"award-number":["2022M723392"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>Mobile robots in complex underground coal mine environments are still unable to achieve accurate pose estimation and the real-time reconstruction of scenes with absolute geographic information. Integrated terrestrial-underground localization and mapping technologies have still not been effectively developed. This paper proposes a multimodal robust SLAM method based on wireless beacon-assisted geographic information transmission and a lidar-IMU-UWB elastic fusion mechanism (LIU-SLAM). In order to obtain pose estimation and scene models consistent with the geographic information, two kinds of absolute geographic information constraints based on UWB beacons are proposed. 
An elastic multimodal fusion state estimation mechanism is designed based on incremental factor graph optimization. A tightly coupled lidar-inertial odometry is first designed to construct the lidar-inertial local transformation constraints, which are further integrated with the absolute geographic constraints from UWB anchors through a loosely coupled approach. Extensive field tests with coal mine robots have been conducted in scenarios such as underground garages and underground coal mine laneways. The results show that the proposed geodesic-coordinate-driven multimodal robust SLAM method can achieve absolute localization accuracy within 25 cm with practical robustness and real-time performance in different underground application scenarios. The wireless beacon-assisted geodesic-coordinate transmission strategy can provide a plug-and-play customized solution for precise positioning and scene modeling in complex scenarios of various coal mine robot applications.<\/jats:p>","DOI":"10.3390\/rs15215093","type":"journal-article","created":{"date-parts":[[2023,10,24]],"date-time":"2023-10-24T11:34:47Z","timestamp":1698147287000},"page":"5093","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":12,"title":["A Multimodal Robust Simultaneous Localization and Mapping Approach Driven by Geodesic Coordinates for Coal Mine Mobile Robots"],"prefix":"10.3390","volume":"15","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-2395-9543","authenticated-orcid":false,"given":"Menggang","family":"Li","sequence":"first","affiliation":[{"name":"School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China"},{"name":"Jiangsu Collaborative Innovation Center of Intelligent Mining Equipment, China University of Mining and Technology, Xuzhou 221008, China"},{"name":"State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, 
China"}]},{"given":"Kun","family":"Hu","sequence":"additional","affiliation":[{"name":"School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China"}]},{"given":"Yuwang","family":"Liu","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China"}]},{"given":"Eryi","family":"Hu","sequence":"additional","affiliation":[{"name":"Information Institute, Ministry of Emergency Management of the People\u2019s Republic of China, Beijing 100029, China"}]},{"given":"Chaoquan","family":"Tang","sequence":"additional","affiliation":[{"name":"School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China"}]},{"given":"Hua","family":"Zhu","sequence":"additional","affiliation":[{"name":"School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9759-1895","authenticated-orcid":false,"given":"Gongbo","family":"Zhou","sequence":"additional","affiliation":[{"name":"School of Mechatronic Engineering, China University of Mining and Technology, Xuzhou 221116, China"}]}],"member":"1968","published-online":{"date-parts":[[2023,10,24]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1104","DOI":"10.1109\/TRO.2012.2200990","article-title":"Zebedee: Design of a spring-mounted 3-D range sensor with application to mobile mapping","volume":"28","author":"Bosse","year":"2012","journal-title":"IEEE Trans. Robot."},{"key":"ref_2","unstructured":"Huber, D.F., and Vandapel, N. (2003). Field and Service Robotics, Springer."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"79","DOI":"10.1109\/MRA.2004.1371614","article-title":"Autonomous exploration and mapping of abandoned mines","volume":"11","author":"Thrun","year":"2004","journal-title":"IEEE Robot. Autom. 
Mag."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"758","DOI":"10.1002\/rob.21504","article-title":"Efficient Large-scale Three-dimensional Mobile Mapping for Underground Mines","volume":"31","author":"Zlot","year":"2010","journal-title":"J. Field Robot."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1074","DOI":"10.1002\/rob.21871","article-title":"Ground robotics in tunnels: Keys and lessons learned after 10 years of research and experiments","volume":"36","author":"Tardioli","year":"2019","journal-title":"J. Field Robot."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Kasper, M., McGuire, S., and Heckman, C. (2019, January 4\u20138). A Benchmark for Visual-Inertial Odometry Systems Employing Onboard Illumination. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems, Macau, China.","DOI":"10.1109\/IROS40897.2019.8968554"},{"key":"ref_7","unstructured":"Ebadi, K., Chang, Y., Palieri, M., Stephens, A., Hatteland, A., Heiden, E., and Thakur, A. (August, January 31). LAMP: Large-Scale Autonomous Mapping and Positioning for Exploration of Perceptually-Degraded Subterranean Environments. Proceedings of the IEEE International Conference on Robotics and Automation, Paris, France."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"1004","DOI":"10.1109\/LRA.2021.3056380","article-title":"Unified Multi-Modal Landmark Tracking for Tightly Coupled Lidar-Visual-Inertial Odometry","volume":"6","author":"Wisth","year":"2021","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_9","first-page":"2193","article-title":"Research on depth vision based mobile robot autonomous navigation in underground coal mine","volume":"45","author":"Ma","year":"2020","journal-title":"J. 
China Coal Soc."},{"key":"ref_10","first-page":"2182","article-title":"Development of millimeter wave radar imaging and SLAM in underground coal mine environment","volume":"45","author":"Chen","year":"2020","journal-title":"J. China Coal Soc."},{"key":"ref_11","first-page":"2045","article-title":"Research progress of autonomous perception and control technology for intelligent heading","volume":"45","author":"Yang","year":"2020","journal-title":"J. China Coal Soc."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"14124","DOI":"10.1109\/ACCESS.2018.2889304","article-title":"Efficient Laser-Based 3D SLAM for Coal Mine Rescue Robots","volume":"7","author":"Li","year":"2019","journal-title":"IEEE Access"},{"key":"ref_13","first-page":"997","article-title":"Intelligent mining technology and its equipment for medium thickness thin seam","volume":"45","author":"Gao","year":"2020","journal-title":"J. China Coal Soc."},{"key":"ref_14","first-page":"1","article-title":"LOAM: Lidar Odometry and Mapping in Real-time","volume":"Volume 2","author":"Zhang","year":"2014","journal-title":"Robotics: Science and Systems"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Shan, T., and Englot, B. (2018, January 1\u20135). LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8594299"},{"key":"ref_16","unstructured":"Qin, C., Ye, H., Pranata, C.E., Han, J., Zhang, S., and Liu, M. (August, January 31). LINS: A Lidar-Inertial State Estimator for Robust and Efficient Navigation. 
Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Paris, France."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"4861","DOI":"10.1109\/LRA.2022.3152830","article-title":"Faster-LIO: Lightweight Tightly Coupled Lidar-Inertial Odometry Using Parallel Sparse Incremental Voxels","volume":"7","author":"Chunge","year":"2022","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Ye, H., Chen, Y., and Liu, M. (2019, January 20\u201324). Tightly Coupled 3D Lidar Inertial Odometry and Mapping. Proceedings of the International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada.","DOI":"10.1109\/ICRA.2019.8793511"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Shan, T., Englot, B., Meyers, D., Wang, W., Ratti, C., and Rus, D. (2020, January 25\u201329). LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping. Proceedings of the 2020 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA.","DOI":"10.1109\/IROS45743.2020.9341176"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Shan, T., Englot, B., Ratti, C., and Rus, D. (June, January 30). LVI-SAM: Tightly-coupled Lidar-Visual-Inertial Odometry via Smoothing and Mapping. Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi\u2019an, China.","DOI":"10.1109\/ICRA48506.2021.9561996"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"7469","DOI":"10.1109\/LRA.2021.3095515","article-title":"R2-LIVE: A Robust, Real-Time, LiDAR-Inertial-Visual Tightly-Coupled State Estimator and Mapping","volume":"6","author":"Lin","year":"2021","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Lin, J., and Zhang, F. (2022, January 23\u201327). 
R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package. Proceedings of the 2022 International Conference on Robotics and Automation (ICRA), Philadelphia, PA, USA.","DOI":"10.1109\/ICRA46639.2022.9811935"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Zuo, X., Geneva, P., Lee, W., Liu, Y., and Huang, G. (2019, January 3\u20138). LIC-Fusion: LiDAR-Inertial-Camera Odometry. Proceedings of the 2019 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China.","DOI":"10.1109\/IROS40897.2019.8967746"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Zuo, X., Yulin, Y., Patrick, G., Jiajun, L., Yong, L., Guoquan, H., and Marc, P. (2020, January 25\u201329). LIC-Fusion 2.0: LiDAR-Inertial-Camera Odometry with Sliding-Window Plane-Feature Tracking. Proceedings of the 2020 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA.","DOI":"10.1109\/IROS45743.2020.9340704"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"504","DOI":"10.1109\/LRA.2022.3226074","article-title":"mVIL-Fusion: Monocular Visual-Inertial-LiDAR Simultaneous Localization and Mapping in Challenging Environments","volume":"8","author":"Wang","year":"2023","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Zhang, J., Kaess, M., and Singh, S. (2016, January 16\u201321). On degeneracy of optimization-based state estimation problems. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden.","DOI":"10.1109\/ICRA.2016.7487211"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"1242","DOI":"10.1002\/rob.21809","article-title":"Laser-visual-inertial odometry and mapping with high robustness and low drift","volume":"35","author":"Zhang","year":"2018","journal-title":"J. 
Field Robot."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Shao, W., Vijayarangan, S., Li, C., and Kantor, G. (2019, January 3\u20138). Stereo Visual Inertial LiDAR Simultaneous Localization and Mapping. Proceedings of the 2019 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China.","DOI":"10.1109\/IROS40897.2019.8968012"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Zhen, W., and Scherer, S. (2019, January 20\u201324). Estimating the Localizability in Tunnel-like Environments using LiDAR and UWB. Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada.","DOI":"10.1109\/ICRA.2019.8794167"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"61","DOI":"10.1109\/TRO.2011.2170332","article-title":"Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions","volume":"28","author":"Lupton","year":"2011","journal-title":"IEEE Trans. Robot."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1109\/TRO.2016.2597321","article-title":"On-Manifold Preintegration for Real-Time Visual\u2013Inertial Odometry","volume":"33","author":"Forster","year":"2016","journal-title":"IEEE Trans. Robot."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"563","DOI":"10.1177\/0278364919835021","article-title":"Closed-form preintegration methods for graph-based visual\u2013inertial navigation","volume":"38","author":"Eckenhoff","year":"2019","journal-title":"Int. J. Robot. Res."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"1004","DOI":"10.1109\/TRO.2018.2853729","article-title":"Vins-mono: A robust and versatile monocular visual-inertial state estimator","volume":"34","author":"Qin","year":"2018","journal-title":"IEEE Trans. 
Robot."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"6652","DOI":"10.1109\/JSEN.2020.2976097","article-title":"UWB-Based Localization System Aided with Inertial Sensor for Underground Coal Mine Applications","volume":"20","author":"Li","year":"2020","journal-title":"IEEE Sens. J."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"1349","DOI":"10.1109\/TRO.2018.2858229","article-title":"Resource-aware large-scale cooperative three-dimensional mapping using multiple mobile devices","volume":"34","author":"Guo","year":"2018","journal-title":"IEEE Trans. Robot."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/15\/21\/5093\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T21:11:05Z","timestamp":1760130665000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/15\/21\/5093"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,10,24]]},"references-count":35,"journal-issue":{"issue":"21","published-online":{"date-parts":[[2023,11]]}},"alternative-id":["rs15215093"],"URL":"https:\/\/doi.org\/10.3390\/rs15215093","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,10,24]]}}}