{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,4]],"date-time":"2026-02-04T17:18:26Z","timestamp":1770225506954,"version":"3.49.0"},"reference-count":42,"publisher":"MDPI AG","issue":"8","license":[{"start":{"date-parts":[[2022,4,15]],"date-time":"2022-04-15T00:00:00Z","timestamp":1649980800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["51975468"],"award-info":[{"award-number":["51975468"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["50674075"],"award-info":[{"award-number":["50674075"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>Realizing robust six degrees of freedom (6DOF) state estimation and high-performance simultaneous localization and mapping (SLAM) for perceptually degraded scenes (such as underground tunnels, corridors, and roadways) is a challenge in robotics. To solve these problems, we propose a SLAM algorithm based on tightly coupled LiDAR-IMU fusion, which consists of two parts: front end iterative Kalman filtering and back end pose graph optimization. Firstly, on the front end, an iterative Kalman filter is established to construct a tightly coupled LiDAR-Inertial Odometry (LIO). The state propagation process for the a priori position and attitude of a robot, which uses predictions and observations, increases the accuracy of the attitude and enhances the system robustness. Second, on the back end, we deploy a keyframe selection strategy to meet the real-time requirements of large-scale scenes. Moreover, loop detection and ground constraints are added to the tightly coupled framework, thereby further improving the overall accuracy of the 6DOF state estimation. Finally, the performance of the algorithm is verified using a public dataset and the dataset we collected. The experimental results show that for perceptually degraded scenes, compared with existing LiDAR-SLAM algorithms, our proposed algorithm grants the robot higher accuracy, real-time performance and robustness, effectively reducing the cumulative error of the system and ensuring the global consistency of the constructed maps.<\/jats:p>","DOI":"10.3390\/s22083063","type":"journal-article","created":{"date-parts":[[2022,4,19]],"date-time":"2022-04-19T02:39:31Z","timestamp":1650335971000},"page":"3063","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":14,"title":["A Tightly Coupled LiDAR-Inertial SLAM for Perceptually Degraded Scenes"],"prefix":"10.3390","volume":"22","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-8234-7512","authenticated-orcid":false,"given":"Lin","family":"Yang","sequence":"first","affiliation":[{"name":"School of Mechanical Engineering, Xi\u2019an University of Science and Technology, Xi\u2019an 710054, China"},{"name":"Shaanxi Key Laboratory of Mine Electromechanical Equipment Intelligent Monitoring, Xi\u2019an 710054, China"}]},{"given":"Hongwei","family":"Ma","sequence":"additional","affiliation":[{"name":"School of Mechanical Engineering, Xi\u2019an University of Science and Technology, Xi\u2019an 710054, China"},{"name":"Shaanxi Key Laboratory of Mine Electromechanical Equipment Intelligent Monitoring, Xi\u2019an 710054, China"}]},{"given":"Yan","family":"Wang","sequence":"additional","affiliation":[{"name":"School of Mechanical Engineering, Xi\u2019an University of Science and Technology, Xi\u2019an 710054, China"},{"name":"Shaanxi Key Laboratory of Mine Electromechanical Equipment Intelligent Monitoring, Xi\u2019an 710054, China"}]},{"given":"Jing","family":"Xia","sequence":"additional","affiliation":[{"name":"School of Mechanical Engineering, Xi\u2019an University of Science and Technology, Xi\u2019an 710054, China"},{"name":"Shaanxi Key Laboratory of Mine Electromechanical Equipment Intelligent Monitoring, Xi\u2019an 710054, China"}]},{"given":"Chuanwei","family":"Wang","sequence":"additional","affiliation":[{"name":"School of Mechanical Engineering, Xi\u2019an University of Science and Technology, Xi\u2019an 710054, China"},{"name":"Shaanxi Key Laboratory of Mine Electromechanical Equipment Intelligent Monitoring, Xi\u2019an 710054, China"}]}],"member":"1968","published-online":{"date-parts":[[2022,4,15]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Smith, R., Self, M., and Cheeseman, P. (1990). Estimating uncertain spatial relationships in robotics. Autonomous Robot Vehicles, Springer.","DOI":"10.1007\/978-1-4613-8997-2_14"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Anderson, S., and Barfoot, T.D. (2013, January 3\u20137). RANSAC for motion-distorted 3D visual sensors. Proceedings of the 2013 IEEE\/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan.","DOI":"10.1109\/IROS.2013.6696649"},{"key":"ref_3","first-page":"59","article-title":"Efficient Surfel-Based SLAM using 3D Laser Range Data in Urban Environments","volume":"2018","author":"Behley","year":"2018","journal-title":"Robot. Sci. Syst."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"965","DOI":"10.1109\/LRA.2018.2793349","article-title":"Robust stereo visual inertial odometry for fast autonomous flight","volume":"3","author":"Sun","year":"2018","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1004","DOI":"10.1109\/TRO.2018.2853729","article-title":"Vins-mono: A robust and versatile monocular visual-inertial state estimator","volume":"34","author":"Qin","year":"2018","journal-title":"IEEE Trans. Robot."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Qin, C., Ye, H., Pranata, C.E., Han, J., Zhang, S., and Liu, M. (August, January 31). Lins: A lidar-inertial state estimator for robust and efficient navigation. Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France.","DOI":"10.1109\/ICRA40945.2020.9197567"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Ye, H., Chen, Y., and Liu, M. (2019, January 20\u201324). Tightly coupled 3d lidar inertial odometry and mapping. Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada.","DOI":"10.1109\/ICRA.2019.8793511"},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Barfoot, T.D., McManus, C., and Anderson, S. (2016). Into darkness: Visual navigation based on a lidar-intensity-image pipeline. Robotics Research, Springer.","DOI":"10.1007\/978-3-319-28872-7_28"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1109\/TIM.2020.2987049","article-title":"An integrated GNSS\/LiDAR-SLAM pose estimation framework for large-scale map building in partially GNSS-denied environments","volume":"70","author":"He","year":"2020","journal-title":"IEEE Trans. Instrum. Meas."},{"key":"ref_10","unstructured":"Besl, P.J., and McKay, N.D. (1992, January 14\u201315). Method for registration of 3-D shapes. Sensor fusion IV: Control paradigms and data structures. Proceedings of the Sensor Fusion IV: Control Paradigms and Data Structures, Boston, MA, USA."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Segal, A., Haehnel, D., and Thrun, S. (2009). Generalized-ICP. Robotics: Science and Systems, University of Washington.","DOI":"10.15607\/RSS.2009.V.021"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Grant, W.S., Voorhies, R.C., and Itti, L. (2013, January 3\u20137). Finding planes in LiDAR point clouds for real-time registration. Proceedings of the 2013 IEEE\/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan.","DOI":"10.1109\/IROS.2013.6696980"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"401","DOI":"10.1007\/s10514-016-9548-2","article-title":"Low-drift and Real-time Lidar Odometry and. Mapping","volume":"41","author":"Zhang","year":"2017","journal-title":"Auton. Robots"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Shan, T., and Englot, B. (2018, January 1\u20135). Lego-loam: Lightweight and ground-optimized lidar odometry and mapping on variable terrain. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8594299"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Lynen, S., Achtelik, M.W., Weiss, S., Chli, M., and Siegwart, R. (2013, January 3\u20137). A robust and modular multi-sensor fusion approach applied to MAV navigation. Proceedings of the 2013 IEEE\/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan.","DOI":"10.1109\/IROS.2013.6696917"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Yang, S., Zhu, X., Nian, X., Feng, L., Qu, X., and Ma, T. (2018, January 1\u20135). A robust pose graph approach for city scale LiDAR mapping. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8593754"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Demir, M., and Fujimura, K. (2019, January 27\u201330). Robust localization with low-mounted multiple lidars in urban environments. Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand.","DOI":"10.1109\/ITSC.2019.8916995"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"23286","DOI":"10.3390\/s150923286","article-title":"INS\/GPS\/LiDAR integrated navigation system for urban and indoor environments using hybrid scan matching algorithm","volume":"15","author":"Gao","year":"2015","journal-title":"Sensors"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Hening, S., Ippolito, C.A., Krishnakumar, K.S., Stepanyan, V., and Teodorescu, M. (2017). 3D LiDAR SLAM Integration with GPS\/INS for UAVs in Urban GPS-Degraded Environments, AIAA Information Systems-AIAA Infotech@ Aerospace.","DOI":"10.2514\/6.2017-0448"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Zhang, S., Guo, Y., Zhu, Q., and Liu, Z. (2019, January 3\u20135). Lidar-IMU and wheel odometer based autonomous vehicle localization system. Proceedings of the 2019 Chinese Control and Decision Conference (CCDC), Nanchang, China.","DOI":"10.1109\/CCDC.2019.8832695"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Hess, W., Kohler, D., Rapp, H., and Andor, D. (2016, January 16\u201321). Real-time loop closure in 2D LIDAR SLAM. Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden.","DOI":"10.1109\/ICRA.2016.7487258"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Li, M., Kim, B.H., and Mourikis, A.I. (2013, January 6\u201310). Real-time motion tracking on a cellphone using inertial sensing and a rolling-shutter camera. Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany.","DOI":"10.1109\/ICRA.2013.6631248"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Huai, Z., and Huang, G. (2018, January 1\u20135). Robocentric visual-inertial odometry. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8593643"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Shan, T., Englot, B., Meyers, D., Wang, W., Ratti, C., and Rus, D. (January, January 24). Lio-sam: Tightly-coupled lidar inertial odometry via smoothing and mapping. Proceedings of the 2020 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA.","DOI":"10.1109\/IROS45743.2020.9341176"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"1226","DOI":"10.1109\/TRO.2013.2267991","article-title":"A quadratic-complexity observability-constrained unscented Kalman filter for SLAM","volume":"29","author":"Huang","year":"2013","journal-title":"IEEE Trans. Robot."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Qin, C., Ye, H., Pranata, C.E., Han, J., Zhang, S., and Liu, M. (2019). R-lins: A robocentric lidar-inertial state estimator for robust and efficient navigation. arXiv.","DOI":"10.1109\/ICRA40945.2020.9197567"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"3317","DOI":"10.1109\/LRA.2021.3064227","article-title":"Fast-lio: A fast, robust lidar-inertial odometry package by tightly-coupled iterated kalman filter","volume":"6","author":"Xu","year":"2021","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Xu, W., Cai, Y., He, D., Lin, J., and Zhang, F. (2021). Fast-lio2: Fast direct lidar-inertial odometry. arXiv.","DOI":"10.1109\/TRO.2022.3141876"},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"1036","DOI":"10.1109\/TRO.2007.903811","article-title":"Convergence and consistency analysis for extended Kalman filter based SLAM","volume":"23","author":"Huang","year":"2007","journal-title":"IEEE Trans. Robot."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"294","DOI":"10.1109\/9.250476","article-title":"The iterated Kalman filter update as a Gauss-Newton method","volume":"38","author":"Bell","year":"1993","journal-title":"IEEE Trans. Autom. Control"},{"key":"ref_31","unstructured":"Sola, J. (2017). Quaternion kinematics for the error-state Kalman filter. arXiv."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Geneva, P., Eckenhoff, K., Yang, Y., and Huang, G. (2018, January 1\u20135). Lips: Lidar-inertial 3D plane slam. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8594463"},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"381","DOI":"10.1145\/358669.358692","article-title":"Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography","volume":"24","author":"Fischler","year":"1981","journal-title":"Commun. ACM"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Ma, L., Kerl, C., St\u00fcckler, J., and Cremers, D. (2016, January 16\u201321). CPA-SLAM: Consistent plane-model alignment for direct RGB-D SLAM. Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden.","DOI":"10.1109\/ICRA.2016.7487260"},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"609","DOI":"10.1002\/rob.20345","article-title":"1-Point RANSAC for extended Kalman filtering: Application to real-time structure from motion and visual odometry","volume":"27","author":"Civera","year":"2010","journal-title":"J. Field Robot."},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"1053","DOI":"10.1177\/0278364917728574","article-title":"Iterated extended Kalman filter based visual-inertial odometry using direct photometric feedback","volume":"36","author":"Bloesch","year":"2017","journal-title":"Int. J. Robot. Res."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"61","DOI":"10.1109\/TRO.2011.2170332","article-title":"Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions","volume":"28","author":"Lupton","year":"2011","journal-title":"IEEE Trans. Robot."},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Ren, Z., Wang, L., and Bi, L. (2019). Robust GICP-based 3D LiDAR SLAM for underground mining environment. Sensors, 19.","DOI":"10.3390\/s19132915"},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"189","DOI":"10.1007\/s10514-012-9321-0","article-title":"OctoMap: An efficient probabilistic 3D mapping framework based on octrees","volume":"34","author":"Hornung","year":"2013","journal-title":"Auton. Robot."},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Yang, J.-C., Lin, C.-J., You, B.-Y., Yan, Y.-L., and Cheng, T.-H. (2021). Rtlio: Real-time lidar-inertial odometry and mapping for UAVS. Sensors, 21.","DOI":"10.3390\/s21123955"},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Hartley, R., and Zisserman, A. (2003). Multiple View Geometry in Computer Vision, Cambridge University Press.","DOI":"10.1017\/CBO9780511811685"},{"key":"ref_42","unstructured":"K\u00fcmmerle, R., Grisetti, G., Strasdat, H., Konolige, K., and Burgard, W. (2011, January 9\u201313). g2o: A general framework for graph optimization. Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/22\/8\/3063\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T22:55:13Z","timestamp":1760136913000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/22\/8\/3063"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,4,15]]},"references-count":42,"journal-issue":{"issue":"8","published-online":{"date-parts":[[2022,4]]}},"alternative-id":["s22083063"],"URL":"https:\/\/doi.org\/10.3390\/s22083063","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,4,15]]}}}