{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,27]],"date-time":"2026-01-27T09:29:03Z","timestamp":1769506143818,"version":"3.49.0"},"reference-count":24,"publisher":"MDPI AG","issue":"20","license":[{"start":{"date-parts":[[2023,10,23]],"date-time":"2023-10-23T00:00:00Z","timestamp":1698019200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["52101358"],"award-info":[{"award-number":["52101358"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["41906154"],"award-info":[{"award-number":["41906154"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["BE2022783"],"award-info":[{"award-number":["BE2022783"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["SH2022013"],"award-info":[{"award-number":["SH2022013"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"name":"Jiangsu Provincial Key Research and Development Program Social Development Project","award":["52101358"],"award-info":[{"award-number":["52101358"]}]},{"name":"Jiangsu Provincial Key Research and Development Program Social Development Project","award":["41906154"],"award-info":[{"award-number":["41906154"]}]},{"name":"Jiangsu Provincial Key Research and Development Program Social Development 
Project","award":["BE2022783"],"award-info":[{"award-number":["BE2022783"]}]},{"name":"Jiangsu Provincial Key Research and Development Program Social Development Project","award":["SH2022013"],"award-info":[{"award-number":["SH2022013"]}]},{"name":"Zhenjiang Key Research and Development Plan (Social Development)","award":["52101358"],"award-info":[{"award-number":["52101358"]}]},{"name":"Zhenjiang Key Research and Development Plan (Social Development)","award":["41906154"],"award-info":[{"award-number":["41906154"]}]},{"name":"Zhenjiang Key Research and Development Plan (Social Development)","award":["BE2022783"],"award-info":[{"award-number":["BE2022783"]}]},{"name":"Zhenjiang Key Research and Development Plan (Social Development)","award":["SH2022013"],"award-info":[{"award-number":["SH2022013"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>In this paper, we propose a robust and integrated visual odometry framework that exploits both the optical flow and feature point methods, achieving faster pose estimation with considerable accuracy and robustness during the odometry process. Our method uses optical flow tracking to accelerate the feature point matching process. Two visual odometry methods are used: a global feature point method and a local feature point method. When optical flow tracking is good and enough key points are successfully matched by the optical flow, the local feature point method uses prior information from the optical flow to estimate the relative pose transformation. When optical flow tracking is poor and only a small number of key points are successfully matched, the feature point method with a filtering mechanism is used for pose estimation. By coupling and correlating these two methods, this visual odometry greatly reduces the computation time for relative pose estimation. 
It reduces the computation time of relative pose estimation to 40% of that of the ORB_SLAM3 front-end odometry, while remaining comparable to the ORB_SLAM3 front-end odometry in accuracy and robustness. The effectiveness of this method was validated and analyzed using the EuRoC dataset within the ORB_SLAM3 open-source framework. The experimental results support the efficacy of the proposed approach.<\/jats:p>","DOI":"10.3390\/s23208655","type":"journal-article","created":{"date-parts":[[2023,10,24]],"date-time":"2023-10-24T11:39:04Z","timestamp":1698147544000},"page":"8655","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":2,"title":["A Robust and Integrated Visual Odometry Framework Exploiting the Optical Flow and Feature Point Method"],"prefix":"10.3390","volume":"23","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-9058-2715","authenticated-orcid":false,"given":"Haiyang","family":"Qiu","sequence":"first","affiliation":[{"name":"School of Naval Architecture and Ocean Engineering, Guangzhou Maritime University, Guangzhou 510725, China"}]},{"given":"Xu","family":"Zhang","sequence":"additional","affiliation":[{"name":"School of Automation, Jiangsu University of Science and Technology, Zhenjiang 212013, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1863-4813","authenticated-orcid":false,"given":"Hui","family":"Wang","sequence":"additional","affiliation":[{"name":"School of Naval Architecture and Ocean Engineering, Guangzhou Maritime University, Guangzhou 510725, China"}]},{"given":"Dan","family":"Xiang","sequence":"additional","affiliation":[{"name":"School of Naval Architecture and Ocean Engineering, Guangzhou Maritime University, Guangzhou 510725, China"}]},{"given":"Mingming","family":"Xiao","sequence":"additional","affiliation":[{"name":"School of Naval Architecture and Ocean Engineering, Guangzhou Maritime University, Guangzhou 510725, 
China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4097-9172","authenticated-orcid":false,"given":"Zhiyu","family":"Zhu","sequence":"additional","affiliation":[{"name":"School of Automation, Jiangsu University of Science and Technology, Zhenjiang 212013, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1887-6222","authenticated-orcid":false,"given":"Lei","family":"Wang","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430072, China"}]}],"member":"1968","published-online":{"date-parts":[[2023,10,23]]},"reference":[{"key":"ref_1","first-page":"855","article-title":"A survey of monocular simultaneous localization and mapping","volume":"28","author":"Liu","year":"2016","journal-title":"J. Comput.-Aided Des. Comput. Graph."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"1052","DOI":"10.1109\/TPAMI.2007.1049","article-title":"MonoSLAM: Real-time single camera SLAM","volume":"29","author":"Davison","year":"2007","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_3","first-page":"298","article-title":"Bundle adjustment\u2014A modern synthesis","volume":"Volume 1883","author":"Triggs","year":"2000","journal-title":"Vision Algorithms: Theory and Practice"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"1181","DOI":"10.1177\/0278364906072768","article-title":"Square Root SAM: Simultaneous localization and mapping via square root information smoothing","volume":"25","author":"Dellaert","year":"2006","journal-title":"Int. J. Robot. Res."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"217","DOI":"10.1177\/0278364911430419","article-title":"iSAM2: Incremental smoothing and mapping using the Bayes tree","volume":"31","author":"Kaess","year":"2012","journal-title":"Int. J. Robot. Res."},{"key":"ref_6","unstructured":"Agarwal, A., Mierle, K., and The Ceres Solver Team (2022, May 20). Ceres Solver. 
Available online: http:\/\/ceres-solver.org."},{"key":"ref_7","unstructured":"Kummerle, R., Grisetti, G., Strasdat, H., Konolige, K., and Burgard, W. (2011, January 9\u201313). g2o: A general framework for graph optimization. Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Bar-Shalom, Y., Li, X.R., and Kirubarajan, T. (2001). Estimation with Applications to Tracking and Navigation, John Wiley and Sons.","DOI":"10.1002\/0471221279"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"502","DOI":"10.1177\/0278364909353640","article-title":"Observability-based rules for designing consistent EKF SLAM estimators","volume":"29","author":"Huang","year":"2010","journal-title":"Int. J. Robot. Res."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"80","DOI":"10.1109\/MRA.2011.943233","article-title":"Visual odometry [tutorial]. Part I: The first 30 years and fundamentals","volume":"18","author":"Scaramuzza","year":"2011","journal-title":"IEEE Robot. Autom. Mag."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Klein, G., and Murray, D. (2007, January 13\u201316). Parallel tracking and mapping for small ar workspaces. Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Nara, Japan.","DOI":"10.1109\/ISMAR.2007.4538852"},{"key":"ref_12","first-page":"1255","article-title":"ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo and RGB-D Cameras","volume":"33","author":"Tardos","year":"2016","journal-title":"IEEE Trans. Robot."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"1874","DOI":"10.1109\/TRO.2021.3075644","article-title":"ORB-SLAM3: An Accurate Open-Source Library for Visual, Visual-Inertial and Multi-Map SLAM","volume":"37","author":"Campos","year":"2020","journal-title":"IEEE Trans. 
Robot."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"381","DOI":"10.1145\/358669.358692","article-title":"Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography","volume":"24","author":"Fischler","year":"1981","journal-title":"Commun. ACM"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"MacTavish, K., and Barfoot, T.D. (2015, January 3\u20135). At all costs: A comparison of robust cost functions for camera correspondence outliers. Proceedings of the 2015 12th Conference on Computer and Robot Vision, Halifax, NS, Canada.","DOI":"10.1109\/CRV.2015.52"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Irani, M., and Anandan, P. (1999, January 21\u201322). All about direct methods. Proceedings of the International Workshop on Vision Algorithms, Corfu, Greece.","DOI":"10.1007\/3-540-44480-7_18"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Kerl, C., Sturm, J., and Cremers, D. (2013, January 3\u20137). Dense visual slam for rgb-d cameras. Proceedings of the 2013 IEEE\/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan.","DOI":"10.1109\/IROS.2013.6696650"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Engel, J., Schops, T., and Cremers, D. (2014, January 6\u201312). Lsd-slam: Large-scale direct monocular slam. Proceedings of the European Conference on Computer Vision, Zurich, Switzerland.","DOI":"10.1007\/978-3-319-10605-2_54"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Gao, X., Wang, R., Demmel, N., and Cremers, D. (2018, January 1\u20135). Ldso: Direct sparse odometry with loop closure. 
Proceedings of the International Conference on Intelligent Robots and Systems, Madrid, Spain.","DOI":"10.1109\/IROS.2018.8593376"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"249","DOI":"10.1109\/TRO.2016.2623335","article-title":"SVO: Semidirect Visual Odometry for Monocular and Multicamera Systems","volume":"33","author":"Forster","year":"2017","journal-title":"IEEE Trans. Robot."},{"key":"ref_21","unstructured":"Rosten, E. (2006). Proceedings of the European Conference on Computer Vision, Springer."},{"key":"ref_22","unstructured":"Lucas, B.D., and Kanade, T. (1997). Proceedings of the 7th International Joint Conference on Artificial Intelligence, Morgan Kaufmann Publishers Inc."},{"key":"ref_23","unstructured":"Muja, M. (2009, January 5\u20138). Fast approximate nearest neighbors with automatic algorithm configuration. Proceedings of the VISSAPP, Lisboa, Portugal."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"1157","DOI":"10.1177\/0278364915620033","article-title":"The EuRoC micro aerial vehicle datasets","volume":"35","author":"Burri","year":"2016","journal-title":"Int. J. Robot. Res."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/20\/8655\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T21:10:33Z","timestamp":1760130633000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/20\/8655"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,10,23]]},"references-count":24,"journal-issue":{"issue":"20","published-online":{"date-parts":[[2023,10]]}},"alternative-id":["s23208655"],"URL":"https:\/\/doi.org\/10.3390\/s23208655","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,10,23]]}}}