{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,26]],"date-time":"2026-03-26T15:44:00Z","timestamp":1774539840492,"version":"3.50.1"},"reference-count":40,"publisher":"MDPI AG","issue":"23","license":[{"start":{"date-parts":[[2022,11,30]],"date-time":"2022-11-30T00:00:00Z","timestamp":1669766400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>Recent advances in the fields of driverless cars, intelligent robots and remote-sensing measurement have shown that the use of LiDAR fused with cameras can provide more comprehensive and reliable sensing of surroundings. However, since it is difficult to extract features from sparse LiDAR data to create 3D\u20132D correspondences, finding a method for accurate external calibration of all types of LiDAR with cameras has become a research hotspot. To solve this problem, this paper proposes a method to directly obtain the 3D\u20132D correspondences of LiDAR\u2013camera systems to complete accurate calibration. In this method, a laser detector card is used as an auxiliary tool to directly obtain the correspondences between laser spots and image pixels, thus solving the problem of difficulty in extracting features from sparse LiDAR data. In addition, a two-stage framework from coarse to fine is designed in this paper, which not only can solve the perspective-n-point problem with observation errors, but also requires only four LiDAR data points and the corresponding pixel information for more accurate external calibration. 
Finally, extensive simulations and experimental results show that our method outperforms existing methods in both effectiveness and accuracy.<\/jats:p>","DOI":"10.3390\/rs14236082","type":"journal-article","created":{"date-parts":[[2022,12,1]],"date-time":"2022-12-01T02:24:46Z","timestamp":1669861486000},"page":"6082","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":12,"title":["Extrinsic Calibration for LiDAR\u2013Camera Systems Using Direct 3D\u20132D Correspondences"],"prefix":"10.3390","volume":"14","author":[{"given":"Hao","family":"Yi","sequence":"first","affiliation":[{"name":"Key Laboratory of Science and Technology on Space Optoelectronic Precision Measurement, Chinese Academy of Sciences, Chengdu 610209, China"},{"name":"Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu 610209, China"},{"name":"University of Chinese Academy of Sciences, Beijing 100049, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3084-0126","authenticated-orcid":false,"given":"Bo","family":"Liu","sequence":"additional","affiliation":[{"name":"Key Laboratory of Science and Technology on Space Optoelectronic Precision Measurement, Chinese Academy of Sciences, Chengdu 610209, China"},{"name":"Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu 610209, China"},{"name":"University of Chinese Academy of Sciences, Beijing 100049, 
China"}]},{"given":"Enhai","family":"Liu","sequence":"additional","affiliation":[{"name":"Key Laboratory of Science and Technology on Space Optoelectronic Precision Measurement, Chinese Academy of Sciences, Chengdu 610209, China"},{"name":"Institute of Optics and Electronics, Chinese Academy of Sciences, Chengdu 610209, China"},{"name":"University of Chinese Academy of Sciences, Beijing 100049, China"}]}],"member":"1968","published-online":{"date-parts":[[2022,11,30]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"881","DOI":"10.1007\/s10514-016-9564-2","article-title":"Monocular vision-based real-time target recognition and tracking for autonomously landing an UAV in a cluttered shipboard environment","volume":"41","author":"Lin","year":"2017","journal-title":"Auton. Robot."},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Li, Y.J., Zhang, Z., Luo, D., and Meng, G. (2015, January 8\u201310). Multi-sensor environmental perception and information fusion for vehicle safety. Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China.","DOI":"10.1109\/ICInfA.2015.7279770"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"4224","DOI":"10.1109\/TII.2018.2822828","article-title":"Object classification using CNN-based fusion of vision and LIDAR in autonomous vehicle environment","volume":"14","author":"Gao","year":"2018","journal-title":"IEEE Trans. Ind. Inform."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Geng, K., Dong, G., Yin, G., and Hu, J. (2020). Deep dual-modal traffic objects instance segmentation method using camera and lidar data for autonomous driving. Remote Sens., 12.","DOI":"10.3390\/rs12203274"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"2122","DOI":"10.1364\/OE.381176","article-title":"Geometric calibration for LiDAR-camera system fusing 3D-2D and 3D-3D point correspondences","volume":"28","author":"An","year":"2020","journal-title":"Opt. 
Express"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"394","DOI":"10.1016\/j.optlaseng.2012.11.015","article-title":"Extrinsic calibration of a 3D LIDAR and a camera using a trihedron","volume":"51","author":"Gong","year":"2013","journal-title":"Opt. Lasers Eng."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"19058","DOI":"10.1364\/OE.392414","article-title":"Approach for accurate calibration of RGB-D cameras using spheres","volume":"28","author":"Liu","year":"2020","journal-title":"Opt. Express"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"168917","DOI":"10.1016\/j.ijleo.2022.168917","article-title":"Improvement to LiDAR-camera extrinsic calibration by using 3D\u20133D correspondences","volume":"259","author":"Nguyen","year":"2022","journal-title":"Optik"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"16242","DOI":"10.1364\/OE.453449","article-title":"Laser reflectance feature assisted accurate extrinsic calibration for non-repetitive scanning LiDAR and camera systems","volume":"30","author":"Lai","year":"2022","journal-title":"Opt. Express"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"18261","DOI":"10.1364\/OE.394331","article-title":"LiDAR-camera system extrinsic calibration by establishing virtual point correspondences from pseudo calibration objects","volume":"28","author":"An","year":"2020","journal-title":"Opt. Express"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"1027","DOI":"10.1109\/TPAMI.2015.2469299","article-title":"Globally optimal hand-eye calibration using branch-and-bound","volume":"38","author":"Heller","year":"2015","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"1021","DOI":"10.1109\/LRA.2019.2893612","article-title":"General hand\u2013eye calibration based on reprojection error minimization","volume":"4","author":"Koide","year":"2019","journal-title":"IEEE Robot. Autom. 
Lett."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"696","DOI":"10.1002\/rob.21542","article-title":"Automatic extrinsic calibration of vision and lidar by maximizing mutual information","volume":"32","author":"Pandey","year":"2015","journal-title":"J. Field Robot."},{"key":"ref_14","unstructured":"Taylor, Z., and Nieto, J. (2013, January 6\u201310). Automatic calibration of lidar and camera images using normalized mutual information. Proceedings of the 2013 IEEE International Conference on Robotics and Automation (ICRA), Karlsruhe, Germany."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Xu, D., Anguelov, D., and Jain, A. (2018, January 18\u201323). Pointfusion: Deep sensor fusion for 3d bounding box estimation. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA.","DOI":"10.1109\/CVPR.2018.00033"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Iyer, G., Ram, R.K., Murthy, J.K., and Krishna, K.M. (2018, January 1\u20135). CalibNet: Geometrically supervised extrinsic calibration using 3D spatial transformer networks. Proceedings of the 2018 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain.","DOI":"10.1109\/IROS.2018.8593693"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Yin, L., Luo, B., Wang, W., Yu, H., Wang, C., and Li, C. (2020). CoMask: Corresponding Mask-Based End-to-End Extrinsic Calibration of the Camera and LiDAR. Remote Sens., 12.","DOI":"10.3390\/rs12121925"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"2301","DOI":"10.1109\/IROS.2004.1389752","article-title":"Extrinsic calibration of a camera and laser range finder (improves camera calibration)","volume":"Volume 3","author":"Zhang","year":"2004","journal-title":"Proceedings of the 2004 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE Cat. No. 
04CH37566)"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Geiger, A., Moosmann, F., Car, \u00d6., and Schuster, B. (2012, January 14\u201318). Automatic camera and range sensor calibration using a single shot. Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA.","DOI":"10.1109\/ICRA.2012.6224570"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"5333","DOI":"10.3390\/s140305333","article-title":"Calibration between color camera and 3D LIDAR instruments with a polygonal planar board","volume":"14","author":"Park","year":"2014","journal-title":"Sensors"},{"key":"ref_21","unstructured":"Dhall, A., Chelani, K., Radhakrishnan, V., and Krishna, K.M. (2017). LiDAR-camera calibration using 3D-3D point correspondences. arXiv."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Guindel, C., Beltr\u00e1n, J., Mart\u00edn, D., and Garc\u00eda, F. (2017, January 16\u201319). Automatic extrinsic calibration for lidar-stereo vehicle sensor setups. Proceedings of the 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), Yokohama, Japan.","DOI":"10.1109\/ITSC.2017.8317829"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Pusztai, Z., and Hajder, L. (2017, January 22\u201329). Accurate calibration of LiDAR-camera systems using ordinary boxes. Proceedings of the IEEE International Conference on Computer Vision Workshops, Venice, Italy.","DOI":"10.1109\/ICCVW.2017.53"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Lee, G.M., Lee, J.H., and Park, S.Y. (2017, January 16\u201318). Calibration of VLP-16 Lidar and multi-view cameras using a ball for 360 degree 3D color map acquisition. 
Proceedings of the 2017 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI), Daegu, Republic of Korea.","DOI":"10.1109\/MFI.2017.8170408"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Chai, Z., Sun, Y., and Xiong, Z. (2018, January 9\u201312). A novel method for LiDAR camera calibration by plane fitting. Proceedings of the 2018 IEEE\/ASME International Conference on Advanced Intelligent Mechatronics (AIM), Auckland, New Zealand.","DOI":"10.1109\/AIM.2018.8452339"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"1330","DOI":"10.1109\/34.888718","article-title":"A flexible new technique for camera calibration","volume":"22","author":"Zhang","year":"2000","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_27","unstructured":"Press, W.H., Teukolsky, S.A., Vetterling, W.T., and Flannery, B.P. (2007). Numerical Recipes 3rd Edition: The Art of Scientific Computing, Cambridge University Press."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"627","DOI":"10.1142\/S0218001411008774","article-title":"A stable direct solution of perspective-three-point problem","volume":"25","author":"Li","year":"2011","journal-title":"Int. J. Pattern Recognit. Artif. Intell."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"31","DOI":"10.1016\/j.patrec.2018.02.028","article-title":"A simple, robust and fast method for the perspective-n-point problem","volume":"108","author":"Wang","year":"2018","journal-title":"Pattern Recognit. Lett."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"376","DOI":"10.1109\/34.88573","article-title":"Least-squares estimation of transformation parameters between two point patterns","volume":"13","author":"Umeyama","year":"1991","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"F\u00f6rstner, W. (2010). 
Minimal representations for uncertainty and estimation in projective spaces. Asian Conference on Computer Vision, Springer.","DOI":"10.1007\/978-3-642-19309-5_48"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"610","DOI":"10.1109\/34.862199","article-title":"Fast and globally convergent pose estimation from video images","volume":"22","author":"Lu","year":"2000","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"155","DOI":"10.1007\/s11263-008-0152-6","article-title":"Epnp: An accurate o (n) solution to the pnp problem","volume":"81","author":"Lepetit","year":"2009","journal-title":"Int. J. Comput. Vis."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"1444","DOI":"10.1109\/TPAMI.2012.41","article-title":"A robust O (n) solution to the perspective-n-point problem","volume":"34","author":"Li","year":"2012","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Hesch, J.A., and Roumeliotis, S.I. (2011, January 6\u201313). A direct least-squares (DLS) method for PnP. Proceedings of the 2011 International Conference on Computer Vision, Barcelona, Spain.","DOI":"10.1109\/ICCV.2011.6126266"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Zheng, Y., Kuang, Y., Sugimoto, S., Astrom, K., and Okutomi, M. (2013, January 1\u20138). Revisiting the pnp problem: A fast, general and optimal solution. Proceedings of the IEEE International Conference on Computer Vision, Sydney, Australia.","DOI":"10.1109\/ICCV.2013.291"},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Urban, S., Leitloff, J., and Hinz, S. (2016). Mlpnp-a real-time maximum likelihood solution to the perspective-n-point problem. arXiv.","DOI":"10.5194\/isprs-annals-III-3-131-2016"},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Ferraz Colomina, L., Binefa, X., and Moreno-Noguer, F. (2014, January 1\u20135). 
Leveraging feature uncertainty in the PnP problem. Proceedings of the BMVC 2014 British Machine Vision Conference, Nottingham, UK.","DOI":"10.5244\/C.28.83"},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Pusztai, Z., Eichhardt, I., and Hajder, L. (2018). Accurate calibration of multi-lidar-multi-camera systems. Sensors, 18.","DOI":"10.3390\/s18072139"},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Fang, C., Ding, S., Dong, Z., Li, H., Zhu, S., and Tan, P. (October, January 27). Single-shot is enough: Panoramic infrastructure based calibration of multiple cameras and 3D LiDARs. Proceedings of the 2021 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Prague, Czech Republic.","DOI":"10.1109\/IROS51168.2021.9636767"}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/23\/6082\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T01:30:44Z","timestamp":1760146244000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/14\/23\/6082"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,11,30]]},"references-count":40,"journal-issue":{"issue":"23","published-online":{"date-parts":[[2022,12]]}},"alternative-id":["rs14236082"],"URL":"https:\/\/doi.org\/10.3390\/rs14236082","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,11,30]]}}}