{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,16]],"date-time":"2025-10-16T00:39:09Z","timestamp":1760575149946,"version":"build-2065373602"},"reference-count":19,"publisher":"Emerald","issue":"5","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2025,10,16]]},"abstract":"<jats:sec>\n                  <jats:title>Purpose<\/jats:title>\n                  <jats:p>Simultaneous localization and mapping (SLAM) is widely used in autonomous robotics. Although LiDAR and vision-based methods have been widely deployed in large-scale environments, their submeter accuracy is insufficient for operational scenarios. This paper aims to present a tightly integrated SLAM framework combining point clouds, visual landmarks and IMU data to enhance localization accuracy.<\/jats:p>\n               <\/jats:sec>\n               <jats:sec>\n                  <jats:title>Design\/methodology\/approach<\/jats:title>\n                  <jats:p>A novel method to resolve planar marker pose ambiguity is proposed. Edge constraints between the point cloud and marker correct marker\u2019 pose and distortions. A tightly coupled coarse-to-fine optimization approach integrates point cloud features and marker corners during tracking and localization. Pose constraints are added to the objective function to improve accuracy, and bundle adjustment is applied to key frame and marker poses in the common view to prevent registration errors.<\/jats:p>\n               <\/jats:sec>\n               <jats:sec>\n                  <jats:title>Findings<\/jats:title>\n                  <jats:p>The proposed method is evaluated in real-world environments. Experimental results demonstrate that it achieves superior localization and mapping accuracy compared to existing methods, thus validating the effectiveness of the framework.<\/jats:p>\n               <\/jats:sec>\n               <jats:sec>\n                  <jats:title>Originality\/value<\/jats:title>\n                  <jats:p>This paper introduces a tightly integrated framework that combines point clouds, visual landmarks and IMU measurements to improve small-scale localization accuracy in operational scenario.<\/jats:p>\n               <\/jats:sec>","DOI":"10.1108\/ir-10-2024-0485","type":"journal-article","created":{"date-parts":[[2025,5,29]],"date-time":"2025-05-29T02:20:43Z","timestamp":1748485243000},"page":"760-769","source":"Crossref","is-referenced-by-count":0,"title":["Tightly-coupled lidar-inertial localization with visual landmarks in operational scenarios"],"prefix":"10.1108","volume":"52","author":[{"given":"Gongcheng","family":"Wang","sequence":"first","affiliation":[{"name":"Harbin Institute of Technology Key State Laboratory of Robotics and System, , Harbin,","place":["China"]}]},{"given":"Zexin","family":"Cao","sequence":"additional","affiliation":[{"name":"Harbin Institute of Technology Key State Laboratory of Robotics and System, , Harbin,","place":["China"]}]},{"given":"Xin","family":"Lv","sequence":"additional","affiliation":[{"name":"Harbin Institute of Technology Key State Laboratory of Robotics and System, , Harbin,","place":["China"]}]},{"given":"Han","family":"Wang","sequence":"additional","affiliation":[{"name":"Harbin Institute of Technology Key State Laboratory of Robotics and System, , Harbin,","place":["China"]}]},{"given":"Chong","family":"Wang","sequence":"additional","affiliation":[{"name":"Harbin Institute of Technology Key 
State Laboratory of Robotics and System, Harbin,","place":["China"]}]},{"given":"Pengchao","family":"Ding","sequence":"additional","affiliation":[{"name":"Harbin Institute of Technology Key State Laboratory of Robotics and System, Harbin,","place":["China"]}]},{"given":"Weidong","family":"Wang","sequence":"additional","affiliation":[{"name":"Harbin Institute of Technology Key State Laboratory of Robotics and System, Harbin,","place":["China"]}]}],"member":"140","published-online":{"date-parts":[[2025,5,30]]},"reference":[{"issue":"6","key":"2025101507170355000_ref001","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1109\/tro.2021.3075644","article-title":"ORB-SLAM3: an accurate open-source library for visual, visual\u2013inertial, and multimap SLAM","volume":"37","author":"Campos","year":"2021","journal-title":"IEEE Transactions on Robotics"},{"key":"2025101507170355000_ref002","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1109\/tim.2021.3053066","article-title":"An uncertainty-driven and observability-based state estimator for nonholonomic robots","volume":"70","author":"Fontanelli","year":"2021","journal-title":"IEEE Transactions on Instrumentation and Measurement"},{"issue":"6","key":"2025101507170355000_ref003","doi-asserted-by":"publisher","first-page":"2280","DOI":"10.1016\/j.patcog.2014.01.005","article-title":"Automatic generation and detection of highly reliable fiducial markers under occlusion","volume":"47","author":"Garrido-Jurado","year":"2014","journal-title":"Pattern Recognition"},{"key":"2025101507170355000_ref004","doi-asserted-by":"publisher","DOI":"10.15607\/rss.2013.ix.029","volume-title":"CiteSeer X (the PA State University)","author":"Levinson","year":"2013"},{"issue":"4","key":"2025101507170355000_ref005","doi-asserted-by":"publisher","first-page":"7469","DOI":"10.1109\/lra.2021.3095515","article-title":"R2LIVE: a robust, real-time, LiDAR-inertial-visual tightly-coupled state estimator and mapping","volume":"6","author":"Lin","year":"2021","journal-title":"IEEE Robotics and Automation Letters"},{"issue":"6","key":"2025101507170355000_ref006","doi-asserted-by":"publisher","first-page":"3734","DOI":"10.1109\/tro.2022.3174476","article-title":"Observability-aware intrinsic and extrinsic calibration of LiDAR-IMU systems","volume":"38","author":"Lv","year":"2022","journal-title":"IEEE Transactions on Robotics"},{"key":"2025101507170355000_ref007","doi-asserted-by":"publisher","first-page":"107193","DOI":"10.1016\/j.patcog.2019.107193","article-title":"UcoSLAM: simultaneous localization and mapping by fusion of keypoints and squared planar markers","volume":"101","author":"Mu\u00f1oz-Salinas","year":"2020","journal-title":"Pattern Recognition"},{"key":"2025101507170355000_ref008","doi-asserted-by":"publisher","first-page":"156","DOI":"10.1016\/j.patcog.2018.09.003","article-title":"SPM-SLAM: simultaneous localization and mapping with squared planar markers","volume":"86","author":"Mu\u00f1oz-Salinas","year":"2019","journal-title":"Pattern Recognition"},{"key":"2025101507170355000_ref009","doi-asserted-by":"publisher","first-page":"158","DOI":"10.1016\/j.patcog.2017.08.010","article-title":"Mapping and localization from planar markers","volume":"73","author":"Mu\u00f1oz-Salinas","year":"2018","journal-title":"Pattern Recognition"},{"issue":"5","key":"2025101507170355000_ref010","doi-asserted-by":"publisher","first-page":"1255","DOI":"10.1109\/tro.2017.2705103","article-title":"ORB-SLAM2: an Open-Source SLAM system for monocular, stereo, and 
RGB-D cameras","volume":"33","author":"Mur-Artal","year":"2017","journal-title":"IEEE Transactions on Robotics"},{"issue":"6","key":"2025101507170355000_ref011","doi-asserted-by":"publisher","first-page":"2588","DOI":"10.1109\/tmech.2017.2762598","article-title":"Indoor localization of mobile robots through QR code detection and dead reckoning data fusion","volume":"22","author":"Nazemzadeh","year":"2017","journal-title":"IEEE\/ASME Transactions on Mechatronics"},{"key":"2025101507170355000_ref012","doi-asserted-by":"publisher","DOI":"10.48550\/arxiv.1507.02081","volume-title":"An Open Source, Fiducial Based, Visual-Inertial Motion Capture System","author":"Neunert","year":"2015"},{"issue":"4","key":"2025101507170355000_ref013","doi-asserted-by":"publisher","first-page":"1004","DOI":"10.1109\/tro.2018.2853729","article-title":"VINS-Mono: a robust and versatile monocular visual-inertial state estimator","volume":"34","author":"Qin","year":"2018","journal-title":"IEEE Transactions on Robotics"},{"key":"2025101507170355000_ref014","doi-asserted-by":"publisher","first-page":"5692","DOI":"10.1109\/icra48506.2021.9561996","article-title":"LVI-SAM: tightly-coupled lidar-visual-inertial odometry via smoothing and mapping","volume-title":"2021 IEEE International Conference on Robotics and Automation (ICRA)","author":"Shan","year":"2021"},{"key":"2025101507170355000_ref015","doi-asserted-by":"publisher","first-page":"5135","DOI":"10.1109\/iros45743.2020.9341176","article-title":"LIO-SAM: tightly-coupled lidar inertial odometry via smoothing and mapping","volume-title":"2020 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS)","author":"Shan","year":"2020"},{"issue":"3","key":"2025101507170355000_ref016","doi-asserted-by":"publisher","first-page":"1406","DOI":"10.1109\/tase.2020.3006724","article-title":"An RFID-Based mobile robot localization method combining phase difference and readability","volume":"18","author":"Tao","year":"2021","journal-title":"IEEE Transactions on Automation Science and Engineering"},{"issue":"2","key":"2025101507170355000_ref017","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1109\/lra.2021.3064227","article-title":"FAST-LIO: a fast, robust LiDAR-inertial odometry package by tightly-coupled iterated Kalman filter","volume":"6","author":"Xu","year":"2021","journal-title":"IEEE Robotics and Automation Letters"},{"issue":"4","key":"2025101507170355000_ref018","doi-asserted-by":"publisher","first-page":"2053","DOI":"10.1109\/tro.2022.3141876","article-title":"FAST-LIO2: fast direct LiDAR-inertial odometry","volume":"38","author":"Xu","year":"2022","journal-title":"IEEE Transactions on Robotics"},{"key":"2025101507170355000_ref019","doi-asserted-by":"publisher","first-page":"5562","DOI":"10.1109\/iros.2018.8593660","article-title":"Automatic extrinsic calibration of a camera and a 3D LiDAR using line and plane correspondences","volume-title":"2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS)","author":"Zhou","year":"2018"}],"container-title":["Industrial Robot: the international journal of robotics research and 
application"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.emerald.com\/insight\/content\/doi\/10.1108\/IR-10-2024-0485\/full\/xml","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/www.emerald.com\/ir\/article-pdf\/52\/5\/760\/10355690\/ir-10-2024-0485en.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},{"URL":"https:\/\/www.emerald.com\/ir\/article-pdf\/52\/5\/760\/10355690\/ir-10-2024-0485en.pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,15]],"date-time":"2025-10-15T11:17:08Z","timestamp":1760527028000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.emerald.com\/ir\/article\/52\/5\/760\/1256548\/Tightly-coupled-lidar-inertial-localization-with"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,5,30]]},"references-count":19,"journal-issue":{"issue":"5","published-print":{"date-parts":[[2025,10,16]]}},"URL":"https:\/\/doi.org\/10.1108\/ir-10-2024-0485","relation":{},"ISSN":["0143-991X","1758-5791"],"issn-type":[{"type":"print","value":"0143-991X"},{"type":"electronic","value":"1758-5791"}],"subject":[],"published":{"date-parts":[[2025,5,30]]}}}