{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,2]],"date-time":"2026-02-02T12:39:14Z","timestamp":1770035954545,"version":"3.49.0"},"reference-count":33,"publisher":"SAGE Publications","issue":"3","license":[{"start":{"date-parts":[[2021,2,1]],"date-time":"2021-02-01T00:00:00Z","timestamp":1612137600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"published-print":{"date-parts":[[2021,10,14]]},"abstract":"<jats:p>If robot uses 2D lidar or binocular camera to locate obstacles, there will be some problems such as missing obstacle information or inaccurate obstacle locating, which will affect the normal work of the robot. In order to obtain accurate 3D obstacle information, this paper proposes an algorithm for fusing 2D lidar and binocular vision to complete the obstacle location. In this paper, the depth value of the 2D lidar point cloud is used as a benchmark. By fitting the error equation of the binocular camera point cloud depth value, the depth value of the camera point cloud is modified to obtain an accurate 3D camera point cloud, thereby obtaining an accurate 3D obstacle information. Many experiments have proved that the fusion algorithm of 2D lidar and binocular vision can obtain accurate 3D obstacle information. The method of fusion 2D lidar and binocular vision can approximately achieve the measurement effect of 3D lidar, and the point cloud of obstacles is relatively dense, so the accurate 3D obstacle information can be obtained. 
This method reduces the robot's dependence on a single sensor when locating obstacles, enabling accurate obstacle localization, which is of practical significance for robot navigation.<\/jats:p>","DOI":"10.3233\/jifs-189698","type":"journal-article","created":{"date-parts":[[2021,2,2]],"date-time":"2021-02-02T18:16:18Z","timestamp":1612289778000},"page":"4387-4394","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":6,"title":["Application of fusion 2D lidar and binocular vision in robot locating obstacles"],"prefix":"10.1177","volume":"41","author":[{"given":"Weiwei","family":"Shao","sequence":"first","affiliation":[{"name":"Anhui University of Technology, Maanshan, China"}]},{"given":"Handong","family":"Zhang","sequence":"additional","affiliation":[{"name":"Anhui University of Technology, Maanshan, China"}]},{"given":"Yuxiu","family":"Wu","sequence":"additional","affiliation":[{"name":"Anhui University of Technology, Maanshan, China"}]},{"given":"Na","family":"Sheng","sequence":"additional","affiliation":[{"name":"Maanshan University, Maanshan, China"}]}],"member":"179","published-online":{"date-parts":[[2021,2]]},"reference":[{"key":"e_1_3_1_2_2","doi-asserted-by":"publisher","DOI":"10.1109\/TITS.2017.2690577"},{"key":"e_1_3_1_3_2","doi-asserted-by":"publisher","DOI":"10.1109\/LRA.2020.2996795"},{"key":"e_1_3_1_4_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2019.2956563"},{"key":"e_1_3_1_5_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2019.2952173"},{"key":"e_1_3_1_6_2","doi-asserted-by":"publisher","DOI":"10.1587\/transfun.E100.A.510"},{"key":"e_1_3_1_7_2","doi-asserted-by":"publisher","DOI":"10.1108\/IR-06-2018-0113"},{"key":"e_1_3_1_8_2","doi-asserted-by":"publisher","DOI":"10.1109\/LRA.2019.2928261"},{"key":"e_1_3_1_9_2","first-page":"93733","article-title":"Omni-Directional Obstacle Detection for Vehicles Based on Depth Camera","volume":"8","author":"Zhao 
X.","year":"2020","unstructured":"ZhaoX., WuH., XuZ., MinH., Omni-Directional Obstacle Detection for Vehicles Based on Depth Camera, IEEE Robot Autom Lett8 (2020), 93733\u201393748.","journal-title":"IEEE Robot Autom Lett"},{"key":"e_1_3_1_10_2","doi-asserted-by":"publisher","DOI":"10.1049\/trit.2017.0020"},{"key":"e_1_3_1_11_2","doi-asserted-by":"publisher","DOI":"10.1109\/LRA.2020.2974654"},{"key":"e_1_3_1_12_2","doi-asserted-by":"publisher","DOI":"10.1109\/TIV.2019.2938110"},{"key":"e_1_3_1_13_2","doi-asserted-by":"publisher","DOI":"10.1109\/TRO.2019.2899783"},{"key":"e_1_3_1_14_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2020.2988464"},{"key":"e_1_3_1_15_2","doi-asserted-by":"publisher","DOI":"10.1109\/TGRS.2019.2923551"},{"key":"e_1_3_1_16_2","doi-asserted-by":"publisher","DOI":"10.1109\/TIE.2016.2521346"},{"key":"e_1_3_1_17_2","doi-asserted-by":"publisher","DOI":"10.5302\/J.ICROS.2019.18.0128"},{"key":"e_1_3_1_18_2","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2020.2982681"},{"key":"e_1_3_1_19_2","doi-asserted-by":"publisher","DOI":"10.1109\/TVT.2019.2929560"},{"key":"e_1_3_1_20_2","doi-asserted-by":"publisher","DOI":"10.1109\/TITS.2018.2840822"},{"key":"e_1_3_1_21_2","doi-asserted-by":"publisher","DOI":"10.4028\/www.scientific.net\/KEM.693.1397"},{"issue":"3","key":"e_1_3_1_22_2","first-page":"21","article-title":"Data fusion, a multidisciplinary technique","volume":"10","author":"Baeza M.","year":"2019","unstructured":"BaezaM., ReyesleonP., RojomendezJ., RodriguezfloresJ., PerezperezE., Data fusion, a multidisciplinary technique, Int J Comb Optim Probl Inform10(3) (2019), 21\u201332.","journal-title":"Int J Comb Optim Probl Inform"},{"key":"e_1_3_1_23_2","doi-asserted-by":"publisher","DOI":"10.1109\/TITS.2019.2891788"},{"key":"e_1_3_1_24_2","doi-asserted-by":"publisher","DOI":"10.1109\/LRA.2019.2922618"},{"issue":"1","key":"e_1_3_1_25_2","first-page":"131","article-title":"Mapping and Localization in 3D Environments Using a 2D Laser 
Scanner and a Stereo Camera","volume":"28","author":"Lin K.","year":"2012","unstructured":"LinK., ChangC., DopferA., WangC., Mapping and Localization in 3D Environments Using a 2D Laser Scanner and a Stereo Camera, J Inf Sci Eng28(1) (2012), 131\u2013144.","journal-title":"J Inf Sci Eng"},{"key":"e_1_3_1_26_2","doi-asserted-by":"publisher","DOI":"10.1142\/S2301385019410012"},{"key":"e_1_3_1_27_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.cviu.2019.04.002"},{"key":"e_1_3_1_28_2","doi-asserted-by":"publisher","DOI":"10.1109\/LRA.2019.2913077"},{"key":"e_1_3_1_29_2","doi-asserted-by":"publisher","DOI":"10.1109\/TPAMI.2016.2592904"},{"issue":"1","key":"e_1_3_1_30_2","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1109\/JPHOT.2017.2784958","article-title":"Analysis on Location Accuracy for the Binocular Stereo Vision System","volume":"10","author":"Yang L.","year":"2018","unstructured":"YangL., WangB., ZhangR., ZhouH., WangR., Analysis on Location Accuracy for the Binocular Stereo Vision System, IEEE Photonics J10(1) (2018), 1\u201316.","journal-title":"IEEE Photonics J"},{"key":"e_1_3_1_31_2","doi-asserted-by":"crossref","unstructured":"WangC. ZouX. TangY. LuoL. FengW. Localisation of litchi in an unstructured environment using binocular stereo vision 145 (2016) 39\u201351.","DOI":"10.1016\/j.biosystemseng.2016.02.004"},{"key":"e_1_3_1_32_2","doi-asserted-by":"crossref","unstructured":"IsaM.A. Sims-WaterhouseD. PianoS. LeachR. 
Volumetric error modelling of a stereo vision system for error correction in photogrammetric three-dimensional coordinate metrology 58 (2020) 188\u2013199.","DOI":"10.1016\/j.precisioneng.2020.04.010"},{"key":"e_1_3_1_33_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.optlaseng.2018.11.005"},{"issue":"3","key":"e_1_3_1_34_2","first-page":"148","article-title":"Double Sparse Representation for Point Cloud Registration","volume":"7","author":"Sun L.","year":"2019","unstructured":"SunL., ManabeY., YataN., Double Sparse Representation for Point Cloud Registration, Multimed Tools Appl7(3) (2019), 148\u2013158.","journal-title":"Multimed Tools Appl"}],"container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-189698","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.3233\/JIFS-189698","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-189698","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,2,2]],"date-time":"2026-02-02T02:16:13Z","timestamp":1769998573000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.3233\/JIFS-189698"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,2]]},"references-count":33,"journal-issue":{"issue":"3","published-print":{"date-parts":[[2021,10,14]]}},"alternative-id":["10.3233\/JIFS-189698"],"URL":"https:\/\/doi.org\/10.3233\/jifs-189698","relation":{},"ISSN":["1064-1246","1875-8967"],"issn-type":[{"value":"1064-1246","type":"print"},{"value":"1875-8967","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,2]]}}}