{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,4]],"date-time":"2026-03-04T05:41:50Z","timestamp":1772602910196,"version":"3.50.1"},"reference-count":34,"publisher":"MDPI AG","issue":"15","license":[{"start":{"date-parts":[[2023,7,31]],"date-time":"2023-07-31T00:00:00Z","timestamp":1690761600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"National Natural Science Foundation Project of China","award":["72088101"],"award-info":[{"award-number":["72088101"]}]},{"name":"National Natural Science Foundation Project of China","award":["52274163"],"award-info":[{"award-number":["52274163"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>The demand for autonomous exploration and mapping of underground environments has significantly increased in recent years. However, accurately localizing and mapping robots in subterranean settings presents notable challenges. This paper presents a tightly coupled LiDAR-Inertial odometry system that combines the NanoGICP point cloud registration method with IMU pre-integration using incremental smoothing and mapping. Specifically, dust particles are first filtered out of the point cloud, which is then separated into ground and non-ground points (for ground vehicles). To maintain accuracy in environments with spatial variations, an adaptive voxel filter is employed, which reduces computation time while preserving accuracy. The estimated motion derived from IMU pre-integration is utilized to correct point cloud distortion and provide an initial estimation for LiDAR odometry. Subsequently, a scan-to-map point cloud registration is executed using NanoGICP to obtain a more refined pose estimation. The resulting LiDAR odometry is then employed to estimate the bias of the IMU. We comprehensively evaluated our system on established subterranean datasets. These datasets were collected by two separate teams using different platforms during the DARPA Subterranean (SubT) Challenge. The experimental results demonstrate that our system achieved performance enhancements as high as 50\u201360% in terms of root mean square error (RMSE).<\/jats:p>","DOI":"10.3390\/s23156834","type":"journal-article","created":{"date-parts":[[2023,8,1]],"date-time":"2023-08-01T09:24:24Z","timestamp":1690881864000},"page":"6834","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":12,"title":["Tightly Coupled LiDAR-Inertial Odometry and Mapping for Underground Environments"],"prefix":"10.3390","volume":"23","author":[{"given":"Jianhong","family":"Chen","sequence":"first","affiliation":[{"name":"School of Resources and Safety Engineering, Central South University, Changsha 410083, China"}]},{"given":"Hongwei","family":"Wang","sequence":"additional","affiliation":[{"name":"School of Resources and Safety Engineering, Central South University, Changsha 410083, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8249-0183","authenticated-orcid":false,"given":"Shan","family":"Yang","sequence":"additional","affiliation":[{"name":"School of Resources and Safety Engineering, Central South University, Changsha 410083, China"}]}],"member":"1968","published-online":{"date-parts":[[2023,7,31]]},"reference":[{"key":"ref_1","unstructured":"Ebadi, K., Bernreiter, L., Biggie, H., Catt, G., Chang, Y., Chatterjee, A., Denniston, C.E., Desch\u00eanes, S.-P., Harlow, K., and Khattak, S. (2022). Present and Future of SLAM in Extreme Underground Environments. arXiv."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"4755","DOI":"10.1109\/JSEN.2021.3112547","article-title":"Underground Mine Positioning: A Review","volume":"22","author":"Seguel","year":"2022","journal-title":"IEEE Sens. J."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"5","DOI":"10.1002\/rob.21978","article-title":"What localizes beneath: A metric multisensor localization and mapping system for autonomous underground mining vehicles","volume":"38","author":"Jacobson","year":"2021","journal-title":"J. Field Robot."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"1074","DOI":"10.1002\/rob.21871","article-title":"Ground robotics in tunnels: Keys and lessons learned after 10 years of research and experiments","volume":"36","author":"Tardioli","year":"2019","journal-title":"J. Field Robot."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1255","DOI":"10.1109\/TRO.2017.2705103","article-title":"ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras","volume":"33","year":"2017","journal-title":"IEEE Trans. Robot."},{"key":"ref_6","unstructured":"Thrun, S., Hahnel, D., Ferguson, D., Montemerlo, M., Triebel, R., Burgard, W., Baker, C., Omohundro, Z., Thayer, S., and Whittaker, W. (2003, January 14\u201319). A system for volumetric robotic mapping of abandoned mines. Proceedings of the 2003 IEEE International Conference on Robotics and Automation (ICRA), Taipei, Taiwan."},{"key":"ref_7","unstructured":"Tardioli, D., Sicignano, D., Riazuelo, L., Villarroel, J., and Montano, L. (February, January 31). Robot teams for exploration in underground environments. Proceedings of the 2012 XV Jornadas de Tiempo Real (JTR), Santander, Spain."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Yoshida, K., and Tadokoro, S. (2014). Efficient Large-Scale 3D Mobile Mapping and Surface Reconstruction of an Underground Mine. Field and Service Robotics, Springer.","DOI":"10.1007\/978-3-642-40686-7"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Zhang, J., and Singh, S. (2014, January 30). LOAM: Lidar Odometry and Mapping in Real-time. Proceedings of the Robotics: Science and Systems, Berkeley, CA, USA.","DOI":"10.15607\/RSS.2014.X.007"},{"key":"ref_10","unstructured":"Qin, T., and Cao, S. (2019, March 06). A-loam: Advanced Implementation of Loam. Available online: https:\/\/github.com\/HKUST-Aerial-Robotics\/A-LOAM."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Wang, H., Wang, C., Chen, C., and Xie, L. (October, January 27). F-LOAM: Fast LiDAR Odometry and Mapping. Proceedings of the 2021 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic.","DOI":"10.1109\/IROS51168.2021.9636655"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"1911","DOI":"10.1109\/TIV.2022.3151665","article-title":"E-LOAM: LiDAR Odometry and Mapping with Expanded Local Structural Information","volume":"8","author":"Guo","year":"2023","journal-title":"IEEE Trans. Intell. Veh."},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Shan, T., and Englot, B. (2018, January 1\u20135). LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8594299"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"2000","DOI":"10.1109\/LRA.2022.3142739","article-title":"Direct LiDAR Odometry: Fast Localization with Dense Point Clouds","volume":"7","author":"Chen","year":"2022","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_15","unstructured":"Blanco, J.L., and Rai, P.K. (2014, May 02). Nanoflann: A C++ Header-Only Fork of FLANN, a Library for Nearest Neighbor (NN) with KD-Trees. Available online: https:\/\/github.com\/jlblancoc\/nanoflann."},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Koide, K., Yokozuka, M., Oishi, S., and Banno, A. (June, January 30). Voxelized gicp for fast and accurate 3D point cloud registration. Proceedings of the 2021 International Conference on Robotics and Automation (ICRA), Xi\u2019an, China.","DOI":"10.1109\/ICRA48506.2021.9560835"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Ye, H., Chen, Y., and Liu, M. (2019, January 20\u201324). Tightly Coupled 3D Lidar Inertial Odometry and Mapping. Proceedings of the 2019 IEEE International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada.","DOI":"10.1109\/ICRA.2019.8793511"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Shan, T., Englot, B., Meyers, D., Wang, W., Ratti, C., and Rus, D. (January, January 24). LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping. Proceedings of the 2020 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA.","DOI":"10.1109\/IROS45743.2020.9341176"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"216","DOI":"10.1177\/0278364911430419","article-title":"iSAM2: Incremental smoothing and mapping using the Bayes tree","volume":"31","author":"Kaess","year":"2012","journal-title":"Int. J. Robot. Res."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"5167","DOI":"10.1109\/LRA.2021.3070251","article-title":"Towards High-Performance Solid-State-LiDAR-Inertial Odometry and Mapping","volume":"6","author":"Li","year":"2021","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Qin, C., Ye, H., Pranata, C.E., Han, J., Zhang, S., and Liu, M. (August, January 31). LINS: A Lidar-Inertial State Estimator for Robust and Efficient Navigation. Proceedings of the 2020 IEEE International Conference on Robotics and Automation (ICRA), Paris, France.","DOI":"10.1109\/ICRA40945.2020.9197567"},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"3317","DOI":"10.1109\/LRA.2021.3064227","article-title":"FAST-LIO: A Fast, Robust LiDAR-inertial Odometry Package by Tightly-Coupled Iterated Kalman Filter","volume":"6","author":"Xu","year":"2021","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"2053","DOI":"10.1109\/TRO.2022.3141876","article-title":"FAST-LIO2: Fast Direct LiDAR-Inertial Odometry","volume":"38","author":"Xu","year":"2022","journal-title":"IEEE Trans. Robot."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"4861","DOI":"10.1109\/LRA.2022.3152830","article-title":"Faster-LIO: Lightweight Tightly Coupled Lidar-Inertial Odometry Using Parallel Sparse Incremental Voxels","volume":"7","author":"Bai","year":"2022","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Bogoslavskyi, I., and Stachniss, C. (2016, January 9\u201314). Fast range image-based segmentation of sparse 3D laser scans for online operation. Proceedings of the 2016 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, South Korea.","DOI":"10.1109\/IROS.2016.7759050"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1109\/TRO.2016.2597321","article-title":"On-Manifold Preintegration for Real-Time Visual-Inertial Odometry","volume":"33","author":"Forster","year":"2017","journal-title":"IEEE Trans. Robot."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"61","DOI":"10.1109\/TRO.2011.2170332","article-title":"Visual-Inertial-Aided Navigation for High-Dynamic Motion in Built Environments Without Initial Conditions","volume":"28","author":"Lupton","year":"2012","journal-title":"IEEE Trans. Robot."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Segal, A.V., Hahnel, D., and Thrun, S. (2009, January 25\u201328). Generalized-ICP. Proceedings of the 2009 Robotics: Science and Systems (RSS), Seattle, WA, USA.","DOI":"10.15607\/RSS.2009.V.021"},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"8518","DOI":"10.1109\/LRA.2022.3187250","article-title":"Efficient and Probabilistic Adaptive Voxel Mapping for Accurate Online LiDAR Odometry","volume":"7","author":"Yuan","year":"2022","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"abp9742","DOI":"10.1126\/scirobotics.abp9742","article-title":"CERBERUS in the DARPA Subterranean Challenge","volume":"7","author":"Tranzatto","year":"2022","journal-title":"Sci. Robot."},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Zhao, S., Zhang, H., Wang, P., Nogueira, L., and Scherer, S. (October, January 27). Super Odometry: IMU-Centric LiDAR-Visual-Inertial Estimator for Challenging Environments. Proceedings of the 2021 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic.","DOI":"10.1109\/IROS51168.2021.9635862"},{"key":"ref_32","unstructured":"Dellaert, F. (2021, December 21). GTSAM. Available online: https:\/\/github.com\/borglab\/gtsam."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Zhang, Z., and Scaramuzza, D. (2018, January 1\u20135). A Tutorial on Quantitative Trajectory Evaluation for Visual(-Inertial) Odometry. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8593941"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Khattak, S., Huan, N., Mascarich, F., Dang, T., and Alexis, K. (2020, January 1\u20134). Complementary multimodal sensor fusion for resilient robot pose estimation in subterranean environments. Proceedings of the 2020 IEEE International Conference on Unmanned Aircraft Systems (ICUAS), Athens, Greece.","DOI":"10.1109\/ICUAS48674.2020.9213865"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/15\/6834\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T20:23:30Z","timestamp":1760127810000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/15\/6834"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,7,31]]},"references-count":34,"journal-issue":{"issue":"15","published-online":{"date-parts":[[2023,8]]}},"alternative-id":["s23156834"],"URL":"https:\/\/doi.org\/10.3390\/s23156834","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,7,31]]}}}