{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,12,9]],"date-time":"2025-12-09T08:25:07Z","timestamp":1765268707597,"version":"build-2065373602"},"reference-count":36,"publisher":"MDPI AG","issue":"4","license":[{"start":{"date-parts":[[2021,2,20]],"date-time":"2021-02-20T00:00:00Z","timestamp":1613779200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["41874006"],"award-info":[{"award-number":["41874006"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"name":"Key Laboratory of Surveying and Mapping Science and Geospatial Information Technology of Ministry of Natural Resources","award":["2020-1-7"],"award-info":[{"award-number":["2020-1-7"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>Simultaneous Localization and Mapping (SLAM) has been a focus of robot navigation for decades and has become a research hotspot in recent years. Because a vision-based SLAM system is vulnerable to environmental illumination and texture, the problem of initial scale ambiguity still exists in monocular SLAM. Fusing a monocular camera with an inertial measurement unit (IMU) can effectively solve the scale ambiguity problem, improve the robustness of the system, and achieve higher positioning accuracy. Based on the monocular visual-inertial navigation system (VINS-mono), a state-of-the-art fusion of monocular vision and IMU, this paper designs a new initialization scheme that estimates the accelerometer bias as a variable during the initialization process, so that it can be applied to low-cost IMU sensors. 
In addition, to obtain better initialization accuracy, a feature-point-based visual matching positioning method is used to assist the initialization process. After initialization, the system switches to an optical-flow-tracking visual positioning mode to reduce computational complexity, thereby combining the advantages of the feature point method and the optical flow method. This paper, the first to use both the feature point method and the optical flow method, achieves better overall positioning accuracy and robustness with low-cost sensors. Experiments conducted on the EuRoC dataset and in a campus environment show that the initial values obtained through the initialization process can be efficiently used to launch the nonlinear visual-inertial state estimator, and the positioning accuracy of the improved VINS-mono is about 10% higher than that of VINS-mono.<\/jats:p>","DOI":"10.3390\/rs13040772","type":"journal-article","created":{"date-parts":[[2021,2,19]],"date-time":"2021-02-19T23:40:38Z","timestamp":1613778038000},"page":"772","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":25,"title":["Robust Visual-Inertial Navigation System for Low Precision Sensors under Indoor and Outdoor Environments"],"prefix":"10.3390","volume":"13","author":[{"given":"Changhui","family":"Xu","sequence":"first","affiliation":[{"name":"Chinese Academy of Surveying &amp; Mapping, Beijing 100036, China"},{"name":"Key Laboratory of Surveying and Mapping Science and Geospatial Information Technology, Ministry of Natural Resources, Beijing 100036, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-0237-3663","authenticated-orcid":false,"given":"Zhenbin","family":"Liu","sequence":"additional","affiliation":[{"name":"School of Environment Science and Spatial Informatics, China University of Mining and Technology, Xuzhou 221116, 
China"}]},{"given":"Zengke","family":"Li","sequence":"additional","affiliation":[{"name":"School of Environment Science and Spatial Informatics, China University of Mining and Technology, Xuzhou 221116, China"}]}],"member":"1968","published-online":{"date-parts":[[2021,2,20]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"99","DOI":"10.1109\/MRA.2006.1638022","article-title":"Simultaneous Localization and Mapping: Part I","volume":"13","author":"Durrant","year":"2006","journal-title":"IEEE Robot. Autom. Mag."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"108","DOI":"10.1109\/MRA.2006.1678144","article-title":"Simultaneous localization and mapping (SLAM): Part II","volume":"13","author":"Bailey","year":"2006","journal-title":"IEEE Robot. Autom. Mag."},{"key":"ref_3","first-page":"843","article-title":"Three-Dimensional Indoor Mobile Mapping with Fusion of Two-Dimensional Laser Scanner and RGB-D Camera Data","volume":"11","author":"Wen","year":"2013","journal-title":"IEEE Geosci. Remote. Sens. Lett"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Hsu, L.T., and Wen, W. (2020). New Integrated Navigation Scheme for the Level 4 Autonomous Vehicles in Dense Urban Areas, IEEE Symposium on Position Location and Navigation (PLANS).","DOI":"10.1109\/PLANS46316.2020.9109962"},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Chiang, K.-W., Le, D.T., Duong, T.T., and Sun, R. (2020). The Performance Analysis of INS\/GNSS\/V-SLAM Integration Scheme Using Smartphone Sensors for Land Vehicle Navigation Applications in GNSS-Challenging Environments. Remote Sens., 12.","DOI":"10.3390\/rs12111732"},{"key":"ref_6","unstructured":"Jung, S.H., and Taylor, C.J. (2001). Camera Trajectory Estimation Using Inertial Sensor Measurements and Structure from Motion Results, IEEE Computer Society Conference on Computer Vision & Pattern Recognition."},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Klein, G., and Murray, D. 
(2007). Parallel Tracking and Mapping for Small AR Workspaces, International Symposium on Mixed and Augmented Reality (ISMAR).","DOI":"10.1109\/ISMAR.2007.4538852"},{"key":"ref_8","unstructured":"Forster, C., Pizzoli, M., and Scaramuzza, D. (June, January 31). SVO: Fast semi-direct monocular visual odometry. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Engel, J., Thomas, S., and Cremers, D. (2014, January 6\u201312). LSD-SLAM: Large-scale direct monocular SLAM. Proceedings of the European Conference on Computer Vision, Cham, Switzerland.","DOI":"10.1007\/978-3-319-10605-2_54"},{"key":"ref_10","first-page":"1147","article-title":"ORB-SLAM: A Versatile and Accurate Monocular SLAM System","volume":"31","author":"Montiel","year":"2017","journal-title":"IEEE Trans. Robot."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"1309","DOI":"10.1109\/TRO.2016.2624754","article-title":"SLAM for dummies. A Tutorial Approach to Simultaneous Localization and Mapping: Toward the robust-perception age","volume":"32","author":"Riisgaard","year":"2016","journal-title":"IEEE Trans. Robot."},{"key":"ref_12","first-page":"855","article-title":"Summary of Simultaneous Location and Mapping Methods Based on Monocular Vision","volume":"25","author":"Liu","year":"2016","journal-title":"J. Comput. Aided Des. Graph."},{"key":"ref_13","first-page":"768","article-title":"Overview of Visual SLAM","volume":"11","author":"Quan","year":"2016","journal-title":"J. Intell. Syst."},{"key":"ref_14","unstructured":"Gao, X., and Zhang, T. (2017). Visual SLAM 14 Lectures: From Theory to Practice, Electronic Industry Press. [2nd ed.]."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Weiss, S.M., Achtelik, W.S., and Lynen, M. (2013, January 6\u201310). Real-Time onboard visual-inertial state estimation and self-calibration of mavs in unknown environments. 
Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany.","DOI":"10.1109\/ICRA.2012.6225147"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Lynen, S.M., Achtelik, W.S., and Weiss, M. (2013, January 3\u20138). A robust and modular multi-sensor fusion approach applied to mav navigation. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Tokyo, Japan.","DOI":"10.1109\/IROS.2013.6696917"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Mourikis, A.I., and Roumeliotis, S.I. (2007, January 10\u201314). A multi-state constraint Kalman filter for vision-aided inertial navigation. Proceedings of the IEEE International Conference on Robotics and Automation, Roma, Italy.","DOI":"10.1109\/ROBOT.2007.364024"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"690","DOI":"10.1177\/0278364913481251","article-title":"High-precision, consistent EKF-based visual-inertial odometry","volume":"32","author":"Li","year":"2013","journal-title":"Int. J. Robot. Res."},{"key":"ref_19","unstructured":"Bloesch, M.S., Omari, M., and Siegwart, R. (October, January 28). Robust visual inertial odometry using a direct ekf-based approach. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots & Systems, Hamburg, Germany."},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Shen, S., Michael, N., and Kumar, V. (2015, January 26\u201330). Tightly-coupled monocular visual-inertial fusion for autonomous flight of rotorcraft MAVs. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA.","DOI":"10.1109\/ICRA.2015.7139939"},{"key":"ref_21","first-page":"1","article-title":"Monocular Visual-Inertial State Estimation with Online Initialization and Camera-IMU Extrinsic Calibration","volume":"14","author":"Yang","year":"2016","journal-title":"IEEE Autom. Sci. 
Eng."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"314","DOI":"10.1177\/0278364914554813","article-title":"Keyframe-based visual-inertial odometry using nonlinear optimization","volume":"34","author":"Leutenegger","year":"2015","journal-title":"Int. J. Robot. Res."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"796","DOI":"10.1109\/LRA.2017.2653359","article-title":"Visual-Inertial Monocular SLAM with Map Reuse","volume":"2","author":"Murartal","year":"2017","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"61","DOI":"10.1109\/TRO.2011.2170332","article-title":"Visual-Inertial-Aided Navigation for High-Dynamic Motion in Built Environments without Initial Conditions","volume":"28","author":"Lupton","year":"2012","journal-title":"IEEE Trans. Robot."},{"key":"ref_25","first-page":"112","article-title":"IMU pre-integration on manifold for efficient visual-inertial maximum-a-posteriori estimation","volume":"33","author":"Forster","year":"2015","journal-title":"Georgia Tech"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"1052","DOI":"10.1109\/TPAMI.2007.1049","article-title":"Monoslam: Real-Time Single Camera SLAM","volume":"29","author":"Davison","year":"2007","journal-title":"IEEE Pattern. Anal."},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Newcombe, R.A., Lovegrove, S.J., and Davison, A.J. (2011, January 6\u201313). Dtam: Dense tracking and mapping in real-time. Proceedings of the IEEE International Conference on Computer Vision (ICCV), Cham, Switzerland.","DOI":"10.1109\/ICCV.2011.6126513"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"611","DOI":"10.1109\/TPAMI.2017.2658577","article-title":"Direct Sparse Odometry","volume":"40","author":"Engel","year":"2017","journal-title":"IEEE Pattern. 
Anal."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"98","DOI":"10.1109\/TRO.2018.2853729","article-title":"VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator","volume":"34","author":"Qin","year":"2018","journal-title":"IEEE Trans. Robot."},{"key":"ref_30","unstructured":"Qin, T., and Pan, J. (2019). A General Optimization-based Framework for Local Odometry Estimation with Multiple Sensors. arXiv."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"381","DOI":"10.1145\/358669.358692","article-title":"Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography","volume":"24","author":"Fischler","year":"1981","journal-title":"Commun. ACM"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Hartley, R., and Zisserman, A. (2004). Multiple View Geometry in Computer Vision, Cambridge University Press.","DOI":"10.1017\/CBO9780511811685"},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"2","DOI":"10.1080\/07468342.1996.11973744","article-title":"A Singularly Valuable Decomposition: The SVD of a Matrix","volume":"27","author":"Kalman","year":"1996","journal-title":"Coll. Math. J."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"1157","DOI":"10.1177\/0278364915620033","article-title":"The EuRoC micro aerial vehicle datasets","volume":"35","author":"Burri","year":"2016","journal-title":"Int. J. Robot. Res."},{"key":"ref_35","unstructured":"Joern, R., Janosch, N., and Thomas, S. (2016, January 16\u201321). Extending kalibr: Calibrating the extrinsics of multiple IMUs and of individual axes. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden."},{"key":"ref_36","unstructured":"J\u00fcrgen, S., Engelhard, N., and Endres, F. (2012, January 7\u201312). A benchmark for the evaluation of RGB-D SLAM systems. 
Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems, Vilamoura, Portugal."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/13\/4\/772\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T05:26:35Z","timestamp":1760160395000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/13\/4\/772"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,2,20]]},"references-count":36,"journal-issue":{"issue":"4","published-online":{"date-parts":[[2021,2]]}},"alternative-id":["rs13040772"],"URL":"https:\/\/doi.org\/10.3390\/rs13040772","relation":{},"ISSN":["2072-4292"],"issn-type":[{"type":"electronic","value":"2072-4292"}],"subject":[],"published":{"date-parts":[[2021,2,20]]}}}