{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,8]],"date-time":"2026-04-08T16:54:41Z","timestamp":1775667281097,"version":"3.50.1"},"reference-count":18,"publisher":"Emerald","issue":"6","license":[{"start":{"date-parts":[[2021,6,4]],"date-time":"2021-06-04T00:00:00Z","timestamp":1622764800000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/www.emerald.com\/insight\/site-policies"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["IR"],"published-print":{"date-parts":[[2021,11,16]]},"abstract":"<jats:sec>\n<jats:title content-type=\"abstract-subheading\">Purpose<\/jats:title>\n<jats:p>To the industrial application of intelligent and connected vehicles (ICVs), the robustness and accuracy of environmental perception are critical in challenging conditions. However, the accuracy of perception is closely related to the performance of sensors configured on the vehicle. To enhance sensors\u2019 performance further to improve the accuracy of environmental perception, this paper aims to introduce an obstacle detection method based on the depth fusion of lidar and radar in challenging conditions, which could reduce the false rate resulting from sensors\u2019 misdetection.<\/jats:p>\n<\/jats:sec>\n<jats:sec>\n<jats:title content-type=\"abstract-subheading\">Design\/methodology\/approach<\/jats:title>\n<jats:p>Firstly, a multi-layer self-calibration method is proposed based on the spatial and temporal relationships. Next, a depth fusion model is proposed to improve the performance of obstacle detection in challenging conditions. 
Finally, tests are carried out in challenging conditions, including a straight unstructured road, an unstructured road with a rough surface and an unstructured road with heavy dust or mist.<\/jats:p>\n<\/jats:sec>\n<jats:sec>\n<jats:title content-type=\"abstract-subheading\">Findings<\/jats:title>\n<jats:p>The experimental tests in challenging conditions demonstrate that the depth fusion model, compared with the use of a single sensor, can filter out false alarms from the radar and point clouds of dust or mist received by the lidar. Thus, the accuracy of object detection is also improved under challenging conditions.<\/jats:p>\n<\/jats:sec>\n<jats:sec>\n<jats:title content-type=\"abstract-subheading\">Originality\/value<\/jats:title>\n<jats:p>A multi-layer self-calibration method is conducive to improving the accuracy of calibration and reducing the workload of manual calibration. In addition, a depth fusion model based on lidar and radar can effectively achieve high precision by filtering out false alarms from the radar and point clouds of dust or mist received by the lidar, which could improve ICVs\u2019 performance in challenging conditions.<\/jats:p>\n<\/jats:sec>","DOI":"10.1108\/ir-12-2020-0271","type":"journal-article","created":{"date-parts":[[2021,6,4]],"date-time":"2021-06-04T13:45:14Z","timestamp":1622814314000},"page":"792-802","source":"Crossref","is-referenced-by-count":18,"title":["Obstacle detection based on depth fusion of lidar and radar in challenging 
conditions"],"prefix":"10.1108","volume":"48","author":[{"given":"Guotao","family":"Xie","sequence":"first","affiliation":[]},{"given":"Jing","family":"Zhang","sequence":"additional","affiliation":[]},{"given":"Junfeng","family":"Tang","sequence":"additional","affiliation":[]},{"given":"Hongfei","family":"Zhao","sequence":"additional","affiliation":[]},{"given":"Ning","family":"Sun","sequence":"additional","affiliation":[]},{"given":"Manjiang","family":"Hu","sequence":"additional","affiliation":[]}],"member":"140","published-online":{"date-parts":[[2021,6,4]]},"reference":[{"key":"key2021111307184852700_ref001","article-title":"Object existence probability fusion using Dempster-Shafer theory in a high-level sensor data fusion architecture","volume-title":"2011 IEEE Intelligent Vehicles Symposium (IV)","year":"2011"},{"key":"key2021111307184852700_ref002","article-title":"Seeing through fog without seeing fog: deep multimodal sensor fusion in unseen adverse weather","year":"2019"},{"key":"key2021111307184852700_ref003","article-title":"Deep learning for image and point cloud fusion in autonomous driving: a review\u201d, arXiv preprint arXiv:2004.05224","year":"2020"},{"issue":"3","key":"key2021111307184852700_ref004","doi-asserted-by":"crossref","first-page":"475","DOI":"10.1109\/TITS.2009.2018319","article-title":"Obstacle detection and tracking for the urban challenge","volume":"10","year":"2009","journal-title":"IEEE Transactions on Intelligent Transportation Systems"},{"issue":"8","key":"key2021111307184852700_ref005","doi-asserted-by":"crossref","first-page":"783","DOI":"10.1049\/iet-its.2017.0370","article-title":"An automotive radar system for multiple-vehicle detection and tracking in urban environments","volume":"12","year":"2018","journal-title":"IET Intelligent Transport Systems"},{"issue":"9","key":"key2021111307184852700_ref006","doi-asserted-by":"crossref","first-page":"4224","DOI":"10.1109\/TII.2018.2822828","article-title":"Object classification 
using CNN-Based fusion of vision and LIDAR in autonomous vehicle environment","volume":"14","year":"2018","journal-title":"IEEE Transactions on Industrial Informatics"},{"key":"key2021111307184852700_ref007","doi-asserted-by":"crossref","first-page":"407","DOI":"10.1109\/ICARA.2011.6144918","article-title":"Radar\/lidar sensor fusion for car-following on highways","volume-title":"The 5th International Conference on Automation, Robotics and Applications","year":"2011"},{"issue":"1","key":"key2021111307184852700_ref008","first-page":"89","article-title":"Predicting the influence of rain on LIDAR in ADAS","volume":"8","year":"2019","journal-title":"Electronics"},{"key":"key2021111307184852700_ref009","doi-asserted-by":"crossref","first-page":"2242","DOI":"10.1109\/ITSC.2016.7795918","article-title":"Test methodology for rain influence on automotive surround sensors","volume-title":"2016 IEEE 19th International Conference on Intelligent Transportation Systems (ITSC)","year":"2016"},{"key":"key2021111307184852700_ref010","article-title":"Real time lidar and radar high-level fusion for obstacle detection and tracking with evaluation on a ground truth","year":"2018"},{"issue":"4","key":"key2021111307184852700_ref011","doi-asserted-by":"crossref","first-page":"464","DOI":"10.1016\/j.eng.2018.07.015","article-title":"A hardware platform framework for an intelligent vehicle based on a driving brain","volume":"4","year":"2018","journal-title":"Engineering"},{"key":"key2021111307184852700_ref012","doi-asserted-by":"crossref","first-page":"6459","DOI":"10.1109\/IROS40897.2019.8968026","article-title":"Curved-voxel clustering for accurate segmentation of 3D LiDAR point clouds with real-time performance","volume-title":"2019 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS)","year":"2019"},{"key":"key2021111307184852700_ref013","doi-asserted-by":"crossref","first-page":"591","DOI":"10.1109\/IVS.2018.8500665","article-title":"Leveraging spatio-temporal 
evidence and independent vision channel to improve multi-sensor fusion for vehicle environmental perception","volume-title":"2018 IEEE Intelligent Vehicles Symposium (IV)","year":"2018"},{"issue":"9","key":"key2021111307184852700_ref014","doi-asserted-by":"crossref","first-page":"1582","DOI":"10.3390\/su9091582","article-title":"Situational assessments based on uncertainty-risk awareness in complex traffic scenarios","volume":"9","year":"2017","journal-title":"Sustainability"},{"issue":"4","key":"key2021111307184852700_ref015","first-page":"1390","article-title":"Cooperative method of traffic signal optimization and speed control of connected vehicles at isolated intersections","volume":"20","year":"2018","journal-title":"IEEE Transactions on Intelligent Transportation Systems"},{"key":"key2021111307184852700_ref016","doi-asserted-by":"crossref","first-page":"85","DOI":"10.1016\/j.isprsjprs.2018.04.022","article-title":"Fusion of images and point clouds for the semantic segmentation of large-scale 3D scenes based on deep learning","volume":"143","year":"2018","journal-title":"ISPRS Journal of Photogrammetry and Remote Sensing"},{"key":"key2021111307184852700_ref017","doi-asserted-by":"crossref","first-page":"54","DOI":"10.1109\/IVS.2017.7995698","article-title":"Efficient L-shape fitting for vehicle detection using laser scanners","volume-title":"2017 IEEE Intelligent Vehicles Symposium (IV)","year":"2017"},{"key":"key2021111307184852700_ref018","first-page":"1","article-title":"Optimal sensor data fusion architecture for object detection in adverse weather conditions","volume-title":"2018 21st International Conference on Information Fusion (FUSION)","year":"2018"}],"container-title":["Industrial Robot: the international journal of robotics research and 
application"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.emerald.com\/insight\/content\/doi\/10.1108\/IR-12-2020-0271\/full\/xml","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/www.emerald.com\/insight\/content\/doi\/10.1108\/IR-12-2020-0271\/full\/html","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,7,24]],"date-time":"2025-07-24T21:40:39Z","timestamp":1753393239000},"score":1,"resource":{"primary":{"URL":"http:\/\/www.emerald.com\/ir\/article\/48\/6\/792-802\/187404"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,6,4]]},"references-count":18,"journal-issue":{"issue":"6","published-online":{"date-parts":[[2021,6,4]]},"published-print":{"date-parts":[[2021,11,16]]}},"alternative-id":["10.1108\/IR-12-2020-0271"],"URL":"https:\/\/doi.org\/10.1108\/ir-12-2020-0271","relation":{},"ISSN":["0143-991X","0143-991X"],"issn-type":[{"value":"0143-991X","type":"print"},{"value":"0143-991X","type":"print"}],"subject":[],"published":{"date-parts":[[2021,6,4]]}}}