{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,24]],"date-time":"2026-02-24T17:12:27Z","timestamp":1771953147033,"version":"3.50.1"},"reference-count":77,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2018,2,22]],"date-time":"2018-02-22T00:00:00Z","timestamp":1519257600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100014219","name":"National Science Fund for Distinguished Young Scholars","doi-asserted-by":"publisher","award":["41725005"],"award-info":[{"award-number":["41725005"]}],"id":[{"id":"10.13039\/501100014219","id-type":"DOI","asserted-by":"publisher"}]},{"name":"National Natural Science Foundation Project","award":["41701530"],"award-info":[{"award-number":["41701530"]}]},{"name":"National Natural Science Foundation Project","award":["41531177"],"award-info":[{"award-number":["41531177"]}]},{"name":"National Natural Science Foundation Project","award":["41371431"],"award-info":[{"award-number":["41371431"]}]},{"name":"China Postdoctoral Science Found","award":["2016M600614"],"award-info":[{"award-number":["2016M600614"]}]},{"name":"Southern Power Grid Corporation Science and Technology Project","award":["GD-KJXM201509"],"award-info":[{"award-number":["GD-KJXM201509"]}]},{"name":"Key Laboratory of Spatial Data Mining &amp; Information Sharing of Ministry of Education, Fuzhou University","award":["2018LSDMIS06"],"award-info":[{"award-number":["2018LSDMIS06"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],"abstract":"<jats:p>Traditional indoor laser scanning trolley\/backpacks with multi-laser scanner, panorama cameras, and an inertial measurement unit (IMU) installed are a popular solution to the 3D indoor mapping problem. 
However, such mapping suites are quite expensive and can hardly be replicated with consumer electronic components. The consumer RGB-Depth (RGB-D) camera (e.g., Kinect V2) is a low-cost option for gathering 3D point clouds. However, because of its narrow field of view (FOV), its collection efficiency and data coverage are lower than those of laser scanners. Additionally, the limited FOV increases the scanning workload, the data processing burden, and the risk of visual odometry (VO)\/simultaneous localization and mapping (SLAM) failure. To collect 3D point cloud data with auxiliary information (i.e., color) for indoor mapping in an efficient and low-cost way, in this paper we present a prototype indoor mapping solution built upon the calibration of multiple RGB-D sensors to construct an array with a large FOV. Three time-of-flight (ToF)-based Kinect V2 RGB-D cameras are mounted on a rig with different view directions to form a large field of view. The three RGB-D data streams are synchronized and gathered by the OpenKinect driver. The intrinsic calibration, which involves the geometric and depth calibration of each RGB-D camera, is solved by a homography-based method with ray correction, followed by range bias correction based on pixel-wise spline functions. The extrinsic calibration is achieved through a coarse-to-fine scheme that solves the initial exterior orientation parameters (EoPs) from sparse control markers and further refines them with an iterative closest point (ICP) variant that minimizes the distance between the RGB-D point clouds and the reference laser point clouds. The effectiveness and accuracy of the proposed prototype and calibration method are evaluated by comparing the point clouds derived from the prototype with ground truth data collected by a terrestrial laser scanner (TLS). 
The overall analysis of the results shows that the proposed method achieves the seamless integration of multiple point clouds from three Kinect V2 cameras collected at 30 frames per second, resulting in low-cost, efficient, and high-coverage 3D color point cloud collection for indoor mapping applications.<\/jats:p>","DOI":"10.3390\/rs10020328","type":"journal-article","created":{"date-parts":[[2018,2,22]],"date-time":"2018-02-22T12:57:16Z","timestamp":1519304236000},"page":"328","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":47,"title":["Calibrate Multiple Consumer RGB-D Cameras for Low-Cost and Efficient 3D Indoor Mapping"],"prefix":"10.3390","volume":"10","author":[{"given":"Chi","family":"Chen","sequence":"first","affiliation":[{"name":"State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China"},{"name":"Engineering Research Center for Spatio-Temporal Data Smart Acquisition and Application, Ministry of Education of China, Wuhan University, Wuhan 430079, China"},{"name":"Key Laboratory of Spatial Data Mining &amp; Information Sharing of Ministry of Education, Fuzhou University, Fuzhou 350000, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-7736-0803","authenticated-orcid":false,"given":"Bisheng","family":"Yang","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China"},{"name":"Engineering Research Center for Spatio-Temporal Data Smart Acquisition and Application, Ministry of Education of China, Wuhan University, Wuhan 430079, China"}]},{"given":"Shuang","family":"Song","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China"},{"name":"Engineering Research Center for Spatio-Temporal Data 
Smart Acquisition and Application, Ministry of Education of China, Wuhan University, Wuhan 430079, China"}]},{"given":"Mao","family":"Tian","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China"},{"name":"Engineering Research Center for Spatio-Temporal Data Smart Acquisition and Application, Ministry of Education of China, Wuhan University, Wuhan 430079, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4813-5126","authenticated-orcid":false,"given":"Jianping","family":"Li","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China"},{"name":"Engineering Research Center for Spatio-Temporal Data Smart Acquisition and Application, Ministry of Education of China, Wuhan University, Wuhan 430079, China"}]},{"given":"Wenxia","family":"Dai","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China"},{"name":"Engineering Research Center for Spatio-Temporal Data Smart Acquisition and Application, Ministry of Education of China, Wuhan University, Wuhan 430079, China"}]},{"given":"Lina","family":"Fang","sequence":"additional","affiliation":[{"name":"Key Laboratory of Spatial Data Mining &amp; Information Sharing of Ministry of Education, Fuzhou University, Fuzhou 350000, China"}]}],"member":"1968","published-online":{"date-parts":[[2018,2,22]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1560","DOI":"10.1109\/TCYB.2013.2271112","article-title":"Depth-Color Fusion Strategy for 3-D Scene Modeling With Kinect","volume":"43","author":"Camplani","year":"2013","journal-title":"IEEE Trans. 
Cybern."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"409","DOI":"10.1109\/JSTSP.2014.2381153","article-title":"Fast, Automated, Scalable Generation of Textured 3D Models of Indoor Environments","volume":"9","author":"Turner","year":"2015","journal-title":"IEEE J. Sel. Top. Signal Process."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1320","DOI":"10.1177\/0278364912455256","article-title":"Estimation, planning, and mapping for autonomous flight using an RGB-D camera in GPS-denied environments","volume":"31","author":"Bachrach","year":"2012","journal-title":"Int. J. Robot. Res."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.robot.2015.11.001","article-title":"Living with robots: Interactive environmental knowledge acquisition","volume":"78","author":"Gemignani","year":"2016","journal-title":"Robot. Auton. Syst."},{"key":"ref_5","unstructured":"(2017, March 28). Trimble Indoor Mapping Solution. Available online: http:\/\/www.trimble.com\/Indoor-Mobile-Mapping-Solution\/Indoor-Mapping.aspx."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"1318","DOI":"10.1109\/TCYB.2013.2265378","article-title":"Enhanced Computer Vision with Microsoft Kinect Sensor: A Review","volume":"43","author":"Han","year":"2013","journal-title":"IEEE Trans. Cybern."},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Grzegorzek, M., Theobalt, C., Koch, R., and Kolb, A. (2013). Technical Foundation and Calibration Methods for Time-of-Flight Cameras. Time-of-Flight and Depth Imaging. 
Sensors, Algorithms, and Applications: Dagstuhl 2012 Seminar on Time-of-Flight Imaging and GCPR 2013 Workshop on Imaging New Modalities, Springer.","DOI":"10.1007\/978-3-642-44964-2"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"262","DOI":"10.1109\/LGRS.2015.2508880","article-title":"Mapping Indoor Spaces by Adaptive Coarse-to-Fine Registration of RGB-D Data","volume":"13","author":"Basso","year":"2016","journal-title":"IEEE Geosci. Remote Sens. Lett."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"13070","DOI":"10.3390\/rs71013070","article-title":"Assessment and Calibration of a RGB-D Camera (Kinect V2 Sensor) Towards a Potential Use for Close-Range 3D Modeling","volume":"7","author":"Lachat","year":"2015","journal-title":"Remote Sens."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"247","DOI":"10.3390\/robotics3030247","article-title":"IMU and Multiple RGB-D Camera Fusion for Assisting Indoor Stop-and-Go 3D Terrestrial Laser Scanning","volume":"3","author":"Chow","year":"2014","journal-title":"Robotics"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"96","DOI":"10.1016\/j.isprsjprs.2014.12.014","article-title":"Automatic registration of unordered point clouds acquired by Kinect sensors using an overlap heuristic","volume":"102","author":"Weber","year":"2015","journal-title":"ISPRS J. Photogramm. Remote Sens."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Yang, S., Yi, X., Wang, Z., Wang, Y., and Yang, X. (2015, January 6\u20139). Visual SLAM using multiple RGB-D cameras. Proceedings of the IEEE International Conference on Robotics and Biomimetic (ROBIO), Zhuhai, China.","DOI":"10.1109\/ROBIO.2015.7418965"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Xiao, J., Owens, A., and Torralba, A. (2013, January 3\u20136). Sun3d: A database of big spaces reconstructed using sfm and object labels. 
Proceedings of the IEEE International Conference on Computer Vision, Sydney, Australia.","DOI":"10.1109\/ICCV.2013.458"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"3","DOI":"10.1002\/rob.20103","article-title":"Visual odometry for ground vehicle applications","volume":"23","author":"Naroditsky","year":"2006","journal-title":"J. Field Robot."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Christensen, H.I., and Khatib, O. (2017). Visual odometry and mapping for autonomous flight using an RGB-D camera. Robotics Research: The 15th International Symposium ISRR, Springer International Publishing.","DOI":"10.1007\/978-3-319-29363-9"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"598","DOI":"10.1177\/0278364914551008","article-title":"Real-time large-scale dense RGB-D SLAM with volumetric fusion","volume":"34","author":"Whelan","year":"2015","journal-title":"Int. J. Robot. Res."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"571","DOI":"10.1016\/j.robot.2015.09.026","article-title":"Dense RGB-D visual odometry using inverse depth","volume":"75","author":"Guerrero","year":"2016","journal-title":"Robot. Auton. Syst."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Endres, F., Hess, J., Engelhard, N., Sturm, J., Cremers, D., and Burgard, W. (2012, January 14\u201318). An Evaluation of the RGB-D SLAM System. Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2012), Saint Paul, MN, USA.","DOI":"10.1109\/ICRA.2012.6225199"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Davison, A.J., Cid, A.G., and Kita, N. (2004, January 5\u20137). Real-time 3D SLAM with wide-angle vision. 
Proceedings of the IFAC Symposium on Intelligent Autonomous Vehicles, Lisbon, Portugal.","DOI":"10.1016\/S1474-6670(17)32089-X"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"234","DOI":"10.1007\/s11263-016-0935-0","article-title":"MultiCol Bundle Adjustment: A Generic Method for Pose Estimation, Simultaneous Self-Calibration and Reconstruction for Arbitrary Multi-Camera Systems","volume":"121","author":"Urban","year":"2017","journal-title":"Int. J. Comput. Vis."},{"key":"ref_21","unstructured":"Blake, J., Martin, H., Machulis, K., Xiang, L., and Fisher, D. (2016, October 09). OpenKinect: Open Source Drivers for the Kinect for Windows V2 Device. Available online: https:\/\/github.com\/OpenKinect\/libfreenect2."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Izadi, S., Kim, D., Hilliges, O., Molyneaux, D., Newcombe, R., Kohli, P., Shotton, J., Hodges, S., Freeman, D., and Davison, A. (2011, January 16\u201319). KinectFusion: Real-time 3D reconstruction and interaction using a moving depth camera. Proceedings of the 24th Annual ACM Symposium on User Interface Software and Technology, Santa Barbara, CA, USA.","DOI":"10.1145\/2047196.2047270"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Newcombe, R.A., Izadi, S., Hilliges, O., Molyneaux, D., Kim, D., Davison, A.J., Kohli, P., Shotton, J., Hodges, S., and Fitzgibbon, A. (2011, January 26\u201329). KinectFusion: Real-time dense surface mapping and tracking. Proceedings of the IEEE International Symposium on Mixed and Augmented Reality, Basel, Switzerland.","DOI":"10.1109\/ISMAR.2011.6162880"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Rusu, R.B., and Cousins, S. (2011, January 9\u201313). 3D is here: Point Cloud Library (PCL). 
Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China.","DOI":"10.1109\/ICRA.2011.5980567"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"239","DOI":"10.1109\/34.121791","article-title":"A method for registration of 3-D shapes","volume":"14","author":"Besl","year":"1992","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Labbe, M., and Michaud, F. (2014, January 14\u201318). Online global loop closure detection for large-scale multi-session graph-based SLAM. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS 2014), Chicago, IL, USA.","DOI":"10.1109\/IROS.2014.6942926"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"647","DOI":"10.1177\/0278364911434148","article-title":"RGB-D mapping: Using Kinect-style depth cameras for dense 3D modeling of indoor environments","volume":"31","author":"Henry","year":"2012","journal-title":"Int. J. Robot. Res."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"8569","DOI":"10.1007\/s11042-015-2772-5","article-title":"Rotated top-bottom dual-kinect for improved field of view","volume":"75","author":"Song","year":"2016","journal-title":"Multimed. Tools Appl."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Tsai, C.-Y., and Huang, C.-H. (2017). Indoor Scene Point Cloud Registration Algorithm Based on RGB-D Camera Calibration. Sensors, 17.","DOI":"10.3390\/s17081874"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"91","DOI":"10.1016\/j.robot.2017.03.008","article-title":"Using extended measurements and scene merging for efficient and robust point cloud registration","volume":"92","author":"Serafin","year":"2017","journal-title":"Robot. Auton. Syst."},{"key":"ref_31","unstructured":"(2017, May 19). Matterport Pro2 3D Camera. 
Available online: https:\/\/matterport.com\/pro2\u20133d-camera\/."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"2058","DOI":"10.1109\/TPAMI.2012.125","article-title":"Joint Depth and Color Camera Calibration with Distortion Correction","volume":"34","author":"Daniel","year":"2012","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"4893","DOI":"10.1109\/TIP.2014.2352851","article-title":"Robust 3D reconstruction with an RGB-D camera","volume":"23","author":"Wang","year":"2014","journal-title":"IEEE Trans. Image Process."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Darwish, W., Tang, S., Li, W., and Chen, W. (2017). A New Calibration Method for Commercial RGB-D Sensors. Sensors, 17.","DOI":"10.3390\/s17061204"},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"584","DOI":"10.1016\/j.robot.2015.09.024","article-title":"A metrological characterization of the Kinect V2 time-of-flight camera","volume":"75","author":"Corti","year":"2016","journal-title":"Robot. Auton. Syst."},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.cviu.2015.05.006","article-title":"Kinect range sensing: Structured-light versus Time-of-Flight Kinect","volume":"139","author":"Sarbolandi","year":"2015","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"465","DOI":"10.1109\/ACCESS.2013.2271860","article-title":"Photogrammetric bundle adjustment with self-calibration of the PrimeSense 3D camera technology: Microsoft Kinect","volume":"1","author":"Chow","year":"2013","journal-title":"IEEE Access"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"15205","DOI":"10.1364\/OE.23.015205","article-title":"Improved camera calibration method based on perpendicularity compensation for binocular stereo vision measurement system","volume":"23","author":"Fan","year":"2015","journal-title":"Opt. 
Express"},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"9134","DOI":"10.1364\/OE.22.009134","article-title":"Precise calibration of binocular vision system used for vision measurement","volume":"22","author":"Cui","year":"2014","journal-title":"Opt. Express"},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"281","DOI":"10.1117\/1.2897237","article-title":"Easy calibration technique for stereo vision using a circle grid","volume":"47","author":"Luo","year":"2008","journal-title":"Opt. Eng."},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"1631","DOI":"10.1088\/0957-0233\/14\/9\/314","article-title":"Two-step calibration of a stereo camera system for measurements in large volumes","volume":"14","author":"Machacek","year":"2003","journal-title":"Meas. Sci. Technol."},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"3338","DOI":"10.1364\/AO.51.003338","article-title":"Binocular vision system calibration based on a one-dimensional target","volume":"51","author":"Zhao","year":"2012","journal-title":"Appl. Opt."},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"313","DOI":"10.1007\/s00138-011-0333-0","article-title":"Appearance-based parameter optimization for accurate stereo camera calibration","volume":"23","author":"Habe","year":"2012","journal-title":"Mach. Vis. Appl."},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"257","DOI":"10.1007\/s11263-009-0232-2","article-title":"Accurate Camera Calibration from Multi-View Stereo and Bundle Adjustment","volume":"84","author":"Furukawa","year":"2009","journal-title":"Int. J. Comput. 
Vis."},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"2716","DOI":"10.1016\/j.patcog.2007.01.008","article-title":"Self-calibration of a stereo rig using monocular epipolar geometries","volume":"40","author":"Dornaika","year":"2007","journal-title":"Pattern Recognit."},{"key":"ref_46","doi-asserted-by":"crossref","first-page":"1536","DOI":"10.1109\/TIP.2009.2017824","article-title":"Continuous Stereo Self-Calibration by Camera Parameter Tracking","volume":"18","author":"Dang","year":"2009","journal-title":"IEEE Trans. Image Process."},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"616","DOI":"10.1109\/TVCG.2013.33","article-title":"Immersive Group-to-Group Telepresence","volume":"19","author":"Beck","year":"2013","journal-title":"IEEE Trans. Vis. Comput. Graph."},{"key":"ref_48","unstructured":"Avetisyan, R., Willert, M., Ohl, S., and Staadt, O. (2014, January 12\u201313). Calibration of Depth Camera Arrays. Proceedings of the SIGRAD 2014, Visual Computing, G\u00f6teborg, Sweden."},{"key":"ref_49","doi-asserted-by":"crossref","first-page":"1318","DOI":"10.1016\/j.cviu.2009.11.002","article-title":"Time-of-Flight sensor calibration for accurate range sensing","volume":"114","author":"Lindner","year":"2010","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_50","doi-asserted-by":"crossref","first-page":"1501","DOI":"10.1109\/TPAMI.2014.2363827","article-title":"Time-of-Flight Sensor Calibration for a Color and Depth Camera Pair","volume":"37","author":"Jiyoung","year":"2015","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_51","doi-asserted-by":"crossref","unstructured":"Beck, S., and Froehlich, B. (2015, January 23\u201324). Volumetric calibration and registration of multiple RGBD-sensors into a joint coordinate system. Proceedings of the IEEE Symposium on 3D User Interfaces (3DUI), Arles, France.","DOI":"10.1109\/3DUI.2015.7131731"},{"key":"ref_52","unstructured":"Avetisyan, R., Rosenke, C., and Staadt, O. 
(June, January 30). Flexible Calibration of Color and Depth Camera Arrays. Proceedings of the WSCG2016\u201424th International Conference in Central Europe on Computer Graphics, Visualization and Computer Vision, Plzen, Czech Republic."},{"key":"ref_53","doi-asserted-by":"crossref","unstructured":"Kainz, B., Hauswiesner, S., Reitmayr, G., Steinberger, M., Grasset, R., Gruber, L., Veas, E., Kalkofen, D., Seichter, H., and Schmalstieg, D. (2012, January 10\u201312). OmniKinect: Real-time dense volumetric data acquisition and applications. Proceedings of the 18th ACM Symposium on Virtual Reality Software and Technology, Toronto, ON, Canada.","DOI":"10.1145\/2407336.2407342"},{"key":"ref_54","doi-asserted-by":"crossref","unstructured":"Fern\u00e1ndez-Moral, E., Gonz\u00e1lez-Jim\u00e9nez, J., Rives, P., and Ar\u00e9valo, V. (2014, January 14\u201318). Extrinsic calibration of a set of range cameras in 5 s without pattern. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems, Chicago, IL, USA.","DOI":"10.1109\/IROS.2014.6942595"},{"key":"ref_55","doi-asserted-by":"crossref","unstructured":"Heng, L., Li, B., and Pollefeys, M. (2013, January 3\u20137). Camodocal: Automatic intrinsic and extrinsic calibration of a rig with multiple generic cameras and odometry. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), Tokyo, Japan.","DOI":"10.1109\/IROS.2013.6696592"},{"key":"ref_56","doi-asserted-by":"crossref","unstructured":"Schneider, S., Luettel, T., and Wuensche, H.-J. (2013, January 3\u20137). Odometry-based online extrinsic sensor calibration. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS 2013), Tokyo, Japan.","DOI":"10.1109\/IROS.2013.6696515"},{"key":"ref_57","unstructured":"(2017, June 06). Microsoft Kinect V2 for Microsoft Windows. 
Available online: https:\/\/en.wikipedia.org\/wiki\/Kinect."},{"key":"ref_58","unstructured":"(2017, June 06). CubeEye 3D Depth Camera. Available online: http:\/\/www.cube-eye.co.kr\/."},{"key":"ref_59","unstructured":"(2017, June 06). PMD CamCube 3.0. Available online: http:\/\/www.pmdtec.com\/news_media\/video\/camcube.php."},{"key":"ref_60","doi-asserted-by":"crossref","unstructured":"Coughlan, J.M., and Yuille, A.L. (1999, January 20\u201327). Manhattan World: Compass direction from a single image by Bayesian inference. Proceedings of the Seventh IEEE International Conference on Computer Vision, Kerkyra, Greece.","DOI":"10.1109\/ICCV.1999.790349"},{"key":"ref_61","doi-asserted-by":"crossref","first-page":"27569","DOI":"10.3390\/s151127569","article-title":"Calibration of Kinect for Xbox One and Comparison between the Two Generations of Microsoft Sensors","volume":"15","author":"Pagliari","year":"2015","journal-title":"Sensors"},{"key":"ref_62","doi-asserted-by":"crossref","unstructured":"Fankhauser, P., Bloesch, M., Rodriguez, D., Kaestner, R., Hutter, M., and Siegwart, R. (2015, January 27\u201331). Kinect V2 for mobile robot navigation: Evaluation and modeling. Proceedings of the International Conference on Advanced Robotics (ICAR 2015), Istanbul, Turkey.","DOI":"10.1109\/ICAR.2015.7251485"},{"key":"ref_63","doi-asserted-by":"crossref","first-page":"44","DOI":"10.1109\/MM.2014.9","article-title":"The Xbox One System on a Chip and Kinect Sensor","volume":"34","author":"Sell","year":"2014","journal-title":"IEEE Micro"},{"key":"ref_64","doi-asserted-by":"crossref","unstructured":"Gui, P., Qin, Y., Hongmin, C., Tinghui, Z., and Chun, Y. (2014, January 11\u201314). Accurately calibrate kinect sensor using indoor control field. 
Proceedings of the 3rd International Workshop on Earth Observation and Remote Sensing Applications (EORSA 2014), Changsha, China.","DOI":"10.1109\/EORSA.2014.6927839"},{"key":"ref_65","doi-asserted-by":"crossref","first-page":"1330","DOI":"10.1109\/34.888718","article-title":"A flexible new technique for camera calibration","volume":"22","author":"Zhang","year":"2000","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_66","first-page":"444","article-title":"Decentering distortion of lenses","volume":"32","author":"Brown","year":"1966","journal-title":"Photogramm. Eng."},{"key":"ref_67","doi-asserted-by":"crossref","unstructured":"Lindner, M., and Kolb, A. (2006, January 6\u20138). Lateral and Depth Calibration of PMD-Distance Sensors. Proceedings of the 2nd International Symposium on Visual Computing, Lake Tahoe, NV, USA.","DOI":"10.1007\/11919629_53"},{"key":"ref_68","doi-asserted-by":"crossref","first-page":"214","DOI":"10.1111\/j.1467-8659.2007.01016.x","article-title":"Efficient RANSAC for point-cloud shape detection","volume":"26","author":"Schnabel","year":"2007","journal-title":"Comput. Graph. Forum"},{"key":"ref_69","doi-asserted-by":"crossref","first-page":"698","DOI":"10.1109\/TPAMI.1987.4767965","article-title":"Least-Squares Fitting of Two 3-D Point Sets","volume":"PAMI-9","author":"Arun","year":"1987","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_70","doi-asserted-by":"crossref","first-page":"6507","DOI":"10.1109\/JSEN.2015.2459139","article-title":"Analysis and Evaluation between the First and the Second Generation of RGB-D Sensors","volume":"15","author":"Diaz","year":"2015","journal-title":"IEEE Sens. J."},{"key":"ref_71","doi-asserted-by":"crossref","unstructured":"Jim\u00e9nez, D., Pizarro, D., Mazo, M., and Palazuelos, S. (2012, January 16\u201321). Modelling and correction of multipath interference in time of flight cameras. 
Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Providence, RI, USA.","DOI":"10.1109\/CVPR.2012.6247763"},{"key":"ref_72","unstructured":"Chen, Y., and Medioni, G. (1991, January 9\u201311). Object modeling by registration of multiple range images. Proceedings of the IEEE International Conference on Robotics and Automation, Sacramento, CA, USA."},{"key":"ref_73","unstructured":"Chow, J.C.K., Ang, K.D., Lichti, D.D., and Teskey, W.F. (September, January 25). Performance analysis of a low-cost triangulation-based 3D camera: Microsoft Kinect system. Proceedings of the The 22nd Congress of the International Society for Photogrammetry and Remote Sensing, Melbourne, VIC, Australia."},{"key":"ref_74","doi-asserted-by":"crossref","unstructured":"Lachat, E., Macher, H., Mittet, M.A., Landes, T., and Grussenmeyer, P. (2015, January 25\u201327). First experiences with Kinect V2 sensor for close range 3D modelling. Proceedings of the 6th International Workshop on 3D Virtual Reconstruction and Visualization of Complex Architectures (3D-ARCH 2015), Avila, Spain.","DOI":"10.5194\/isprsarchives-XL-5-W4-93-2015"},{"key":"ref_75","doi-asserted-by":"crossref","first-page":"734","DOI":"10.1109\/TRO.2013.2242375","article-title":"Appearance-Based Loop Closure Detection for Online Large-Scale and Long-Term Operation","volume":"29","author":"Labbe","year":"2013","journal-title":"IEEE Trans. Robot."},{"key":"ref_76","doi-asserted-by":"crossref","unstructured":"Labbe, M., and Michaud, F. (2011, January 25\u201330). Memory management for real-time appearance-based loop closure detection. Proceedings of the IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS 2011), San Francisco, CA, USA.","DOI":"10.1109\/IROS.2011.6094602"},{"key":"ref_77","unstructured":"Born, M., and Wolf, E. (1999). Fraunhofer diffraction in optical instruments. Principles of Optics, Cambridge University Press. 
[7th ed.]."}],"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/10\/2\/328\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T14:55:55Z","timestamp":1760194555000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/10\/2\/328"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2018,2,22]]},"references-count":77,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2018,2]]}},"alternative-id":["rs10020328"],"URL":"https:\/\/doi.org\/10.3390\/rs10020328","relation":{},"ISSN":["2072-4292"],"issn-type":[{"value":"2072-4292","type":"electronic"}],"subject":[],"published":{"date-parts":[[2018,2,22]]}}}