{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,30]],"date-time":"2025-10-30T07:12:11Z","timestamp":1761808331242,"version":"build-2065373602"},"reference-count":37,"publisher":"MDPI AG","issue":"24","license":[{"start":{"date-parts":[[2019,12,15]],"date-time":"2019-12-15T00:00:00Z","timestamp":1576368000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Applied Sciences"],"abstract":"<jats:p>The main task when developing a mobile robot is to achieve accurate and robust navigation in a given environment. To achieve such a goal, the ability of the robot to localize itself is crucial. In outdoor environments, namely agricultural ones, this task becomes a real challenge because odometry is not always usable and global navigation satellite system (GNSS) signals are blocked or significantly degraded. To answer this challenge, this work presents a solution for outdoor localization based on an omnidirectional visual odometry technique fused with a gyroscope and a low cost planar light detection and ranging (LIDAR) sensor, optimized to run on a low cost graphics processing unit (GPU). This solution, named FAST-FUSION, offers three core contributions to the scientific community. The first contribution is an extension of the state-of-the-art monocular visual odometry library (Libviso2) to work with omnidirectional cameras and a single-axis gyroscope, increasing the system accuracy. The second contribution is an algorithm that uses low cost LIDAR data to estimate the motion scale and overcome the limitations of monocular visual odometry systems. Finally, we propose a heterogeneous computing optimization that uses the Raspberry Pi GPU to improve the visual odometry runtime performance on low cost platforms. 
To test and evaluate FAST-FUSION, we created three open-source datasets in an outdoor environment. Results show that FAST-FUSION runs in real time on low cost hardware and outperforms the original Libviso2 approach in terms of time performance and motion estimation accuracy.<\/jats:p>","DOI":"10.3390\/app9245516","type":"journal-article","created":{"date-parts":[[2019,12,16]],"date-time":"2019-12-16T05:19:38Z","timestamp":1576473578000},"page":"5516","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":7,"title":["FAST-FUSION: An Improved Accuracy Omnidirectional Visual Odometry System with Sensor Fusion and GPU Optimization for Embedded Low Cost Hardware"],"prefix":"10.3390","volume":"9","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-6909-0209","authenticated-orcid":false,"given":"Andr\u00e9","family":"Aguiar","sequence":"first","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8486-6113","authenticated-orcid":false,"given":"Filipe","family":"Santos","sequence":"additional","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0317-4714","authenticated-orcid":false,"given":"Armando Jorge","family":"Sousa","sequence":"additional","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, Portugal"},{"name":"Faculty of Engineering, University of Porto, 4200-465 Porto, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0255-5005","authenticated-orcid":false,"given":"Lu\u00eds","family":"Santos","sequence":"additional","affiliation":[{"name":"INESC TEC\u2014INESC Technology and Science, 4200-465 Porto, 
Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2019,12,15]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"263","DOI":"10.1007\/s10846-008-9235-4","article-title":"Visual Navigation for Mobile Robots: A Survey","volume":"53","author":"Ortiz","year":"2008","journal-title":"J. Intell. Robot. Syst."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"449","DOI":"10.1177\/0278364906065543","article-title":"Toward Reliable Off Road Autonomous Vehicles Operating in Challenging Environments","volume":"25","author":"Kelly","year":"2006","journal-title":"Int. J. Robot. Res."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1897","DOI":"10.1186\/s40064-016-3573-7","article-title":"Review of visual odometry: Types, approaches, challenges, and applications","volume":"5","author":"Aqel","year":"2016","journal-title":"SpringerPlus"},{"key":"ref_4","unstructured":"Nister, D., Naroditsky, O., and Bergen, J. (July, January 27). Visual odometry. Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, CVPR 2004, Washington, DC, USA."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"80","DOI":"10.1109\/MRA.2011.943233","article-title":"Visual Odometry [Tutorial]","volume":"18","author":"Scaramuzza","year":"2011","journal-title":"IEEE Robot. Autom. Mag."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Gr\u00e4ter, J., Schwarze, T., and Lauer, M. (July, January 28). Robust scale estimation for monocular visual odometry using structure from motion and vanishing points. Proceedings of the 2015 IEEE Intelligent Vehicles Symposium (IV), Seoul, Korea.","DOI":"10.1109\/IVS.2015.7225730"},{"key":"ref_7","unstructured":"Zhang, Z., Rebecq, H., Forster, C., and Scaramuzza, D. (2016, January 16\u201321). Benefit of large field-of-view cameras for visual odometry. 
Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"18","DOI":"10.1109\/2.214439","article-title":"Heterogeneous computing: Challenges and opportunities","volume":"26","author":"Khokhar","year":"1993","journal-title":"Computer"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"69:1","DOI":"10.1145\/2788396","article-title":"A Survey of CPU-GPU Heterogeneous Computing Techniques","volume":"47","author":"Mittal","year":"2015","journal-title":"ACM Comput. Surv."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"66","DOI":"10.1109\/MCSE.2010.69","article-title":"OpenCL: A Parallel Programming Standard for Heterogeneous Computing Systems","volume":"12","author":"Stone","year":"2010","journal-title":"Comput. Sci. Eng."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Geiger, A., Ziegler, J., and Stiller, C. (2011, January 5\u20139). StereoScan: Dense 3d reconstruction in real-time. Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IV), Baden-Baden, Germany.","DOI":"10.1109\/IVS.2011.5940405"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"611","DOI":"10.1109\/TPAMI.2017.2658577","article-title":"Direct Sparse Odometry","volume":"40","author":"Engel","year":"2018","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"3693","DOI":"10.1109\/LRA.2018.2855443","article-title":"Omnidirectional DSO: Direct Sparse Odometry with Fisheye Cameras","volume":"3","author":"Matsuki","year":"2018","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Fleet, D., Pajdla, T., Schiele, B., and Tuytelaars, T. (2014). LSD-SLAM: Large-Scale Direct Monocular SLAM. 
Computer Vision\u2014ECCV 2014, Springer International Publishing.","DOI":"10.1007\/978-3-319-10599-4"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Caruso, D., Engel, J., and Cremers, D. (October, January 28). Large-scale direct SLAM for omnidirectional cameras. Proceedings of the 2015 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany.","DOI":"10.1109\/IROS.2015.7353366"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Forster, C., Pizzoli, M., and Scaramuzza, D. (June, January 31). SVO: Fast semi-direct monocular visual odometry. Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China.","DOI":"10.1109\/ICRA.2014.6906584"},{"key":"ref_17","unstructured":"Corke, P., Strelow, D., and Singh, S. (October, January 28). Omnidirectional visual odometry for a planetary rover. Proceedings of the 2004 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No.04CH37566), Sendai, Japan."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"1015","DOI":"10.1109\/TRO.2008.2004490","article-title":"Appearance-Guided Monocular Omnidirectional Visual Odometry for Outdoor Ground Vehicles","volume":"24","author":"Scaramuzza","year":"2008","journal-title":"IEEE Trans. Robot."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Tardif, J.P., Pavlidis, Y., and Daniilidis, K. (2008, January 22\u201326). Monocular visual odometry in urban environments using an omnidirectional camera. Proceedings of the 2008 IEEE\/RSJ International Conference on Intelligent Robots and Systems, Nice, France.","DOI":"10.1109\/IROS.2008.4651205"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Valiente, D., Gil, A., Reinoso, \u00d3., Juli\u00e1, M., and Holloway, M. (2017). Improved Omnidirectional Odometry for a View-Based Mapping Approach. 
Sensors, 17.","DOI":"10.3390\/s17020325"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Valiente, D., Gil, A., Pay\u00e1, L., Sebasti\u00e1n, J., and Reinoso, \u00d3. (2017). Robust Visual Localization with Dynamic Uncertainty Management in Omnidirectional SLAM. Appl. Sci., 7.","DOI":"10.3390\/app7121294"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Li, J., Wang, X., and Li, S. (2018). Spherical-Model-Based SLAM on Full-View Images for Indoor Environments. Appl. Sci., 8.","DOI":"10.3390\/app8112268"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"1157","DOI":"10.1177\/0278364904045593","article-title":"Motion Estimation from Image and Inertial Measurements","volume":"23","author":"Strelow","year":"2004","journal-title":"Int. J. Robot. Res."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Kaneko, M., and Nakamura, Y. (2011). Large-Scale Visual Odometry for Rough Terrain. Robotics Research, Springer.","DOI":"10.1007\/978-3-642-14743-2"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Usenko, V.C., Engel, J., St\u00fcckler, J., and Cremers, D. (2016, January 16\u201321). Direct visual-inertial odometry with stereo cameras. Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden.","DOI":"10.1109\/ICRA.2016.7487335"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Kneip, L., Chli, M., and Siegwart, R. (September, January 29). Robust Real-Time Visual Odometry with a Single Camera and an IMU. Proceedings of the British Machine Vision Conference 2011, Dundee, UK.","DOI":"10.5244\/C.25.16"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"287","DOI":"10.1007\/s10846-010-9490-z","article-title":"Fusion of IMU and Vision for Absolute Scale Estimation in Monocular SLAM","volume":"61","author":"Weiss","year":"2011","journal-title":"J. Intell. Robot. 
Syst."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Frost, D.P., Kahler, O., and Murray, D.W. (2016, January 16\u201321). Object-aware bundle adjustment for correcting monocular scale drift. Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA), Stockholm, Sweden.","DOI":"10.1109\/ICRA.2016.7487680"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Gr\u00e4ter, J., Wilczynski, A., and Lauer, M. (2018, January 1\u20135). LIMO: Lidar-Monocular Visual Odometry. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8594394"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"4981","DOI":"10.3390\/s140304981","article-title":"Enhanced Monocular Visual Odometry Integrated with Laser Distance Meter for Astronaut Navigation","volume":"14","author":"Wu","year":"2014","journal-title":"Sensors"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Giubilato, R., Chiodini, S., Pertile, M., and Debei, S. (2018, January 1\u20135). Scale Correct Monocular Visual Odometry Using a LiDAR Altimeter. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8594096"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Zhang, H., and Martin, F. (2013, January 22\u201323). CUDA accelerated robot localization and mapping. Proceedings of the 2013 IEEE Conference on Technologies for Practical Robot Applications (TePRA), Woburn, MA, USA.","DOI":"10.1109\/TePRA.2013.6556350"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Vargas, J.A.D., and Kurka, P.R.G. (2015, January 22\u201325). The Use of a Graphic Processing Unit (GPU) in a Real Time Visual Odometry Application. 
Proceedings of the 2015 IEEE International Conference on Dependable Systems and Networks Workshops, Rio de Janeiro, Brazil.","DOI":"10.1109\/DSN-W.2015.32"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Scaramuzza, D., Martinelli, A., and Siegwart, R. (2006, January 4\u20137). A Flexible Technique for Accurate Omnidirectional Camera Calibration and Structure from Motion. Proceedings of the Fourth IEEE International Conference on Computer Vision Systems (ICVS\u201906), New York, NY, USA.","DOI":"10.1109\/ICVS.2006.3"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Scaramuzza, D., Martinelli, A., and Siegwart, R. (2006, January 9\u201315). A Toolbox for Easily Calibrating Omnidirectional Cameras. Proceedings of the 2006 IEEE\/RSJ International Conference on Intelligent Robots and Systems, Beijing, China.","DOI":"10.1109\/IROS.2006.282372"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Harker, M., and O\u2019Leary, P. (2006, January 4\u20137). First Order Geometric Distance (The Myth of Sampsonus). Proceedings of the BMVC, Edinburgh, UK.","DOI":"10.5244\/C.20.10"},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Kohlbrecher, S., Meyer, J., von Stryk, O., and Klingauf, U. (2011, January 1\u20135). A Flexible and Scalable SLAM System with Full 3D Motion Estimation. 
Proceedings of the IEEE International Symposium on Safety, Security and Rescue Robotics (SSRR), Kyoto, Japan.","DOI":"10.1109\/SSRR.2011.6106777"}],"container-title":["Applied Sciences"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2076-3417\/9\/24\/5516\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T13:42:31Z","timestamp":1760190151000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2076-3417\/9\/24\/5516"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2019,12,15]]},"references-count":37,"journal-issue":{"issue":"24","published-online":{"date-parts":[[2019,12]]}},"alternative-id":["app9245516"],"URL":"https:\/\/doi.org\/10.3390\/app9245516","relation":{},"ISSN":["2076-3417"],"issn-type":[{"type":"electronic","value":"2076-3417"}],"subject":[],"published":{"date-parts":[[2019,12,15]]}}}