{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,11,7]],"date-time":"2025-11-07T13:34:58Z","timestamp":1762522498632,"version":"build-2065373602"},"reference-count":20,"publisher":"MDPI AG","issue":"16","license":[{"start":{"date-parts":[[2020,8,12]],"date-time":"2020-08-12T00:00:00Z","timestamp":1597190400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Remote Sensing"],
"abstract":"<jats:p>The Zengwen desilting tunnel project installed an Elephant Trunk Steel Pipe (ETSP) at the bottom of the reservoir that is designed to connect the new bypass tunnel and reach downward to the sediment surface. Since the ETSP is huge and its underwater installation is an unprecedented construction method, there are several uncertainties in its dynamic motion during installation. To ensure construction safety, a 1:20 ETSP scale model was built to simulate the underwater installation procedure, and its six-degrees-of-freedom (6-DOF) motion parameters were monitored by offline underwater 3D rigid object tracking and photogrammetry. Three cameras were used to form a multicamera system, and several auxiliary devices\u2014such as waterproof housings, tripods, and a waterproof LED\u2014were adopted to protect the cameras and to obtain clear images in the underwater environment. However, since it is difficult for the divers to position the cameras and ensure that their fields of view overlap, the three cameras could only observe the head, middle, and tail parts of the ETSP, respectively, leaving only a small overlap area among the images. Therefore, the traditional approach of forward intersection from multiple images, in which the cameras\u2019 positions and orientations have to be calibrated and fixed in advance, cannot be applied. Instead, by tracking the 3D coordinates of the ETSP and obtaining the camera orientation information via space resection, we propose a multicamera coordinate transformation and adopt a single-camera relative orientation transformation to calculate the 6-DOF motion parameters. The offline procedure first acquires the 3D coordinates of the ETSP by taking multiposition images with a precalibrated camera in the air and then uses these 3D coordinates as control points to perform space resection of the calibrated underwater cameras. Finally, we calculate the 6-DOF motion parameters of the ETSP using the camera orientation information through both multi- and single-camera approaches. In this study, we show the results of camera calibration in the air and underwater, present the 6-DOF motion parameters of the ETSP underwater installation and the reconstructed 4D animation, and compare the differences between the multi- and single-camera approaches.<\/jats:p>",
"DOI":"10.3390\/rs12162600","type":"journal-article","created":{"date-parts":[[2020,8,13]],"date-time":"2020-08-13T02:58:02Z","timestamp":1597287482000},"page":"2600","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":6,"title":["Underwater 3D Rigid Object Tracking and 6-DOF Estimation: A Case Study of Giant Steel Pipe Scale Model Underwater Installation"],"prefix":"10.3390","volume":"12",
"author":[{"given":"Jyun-Ping","family":"Jhan","sequence":"first","affiliation":[{"name":"Department of Geomatics, National Cheng Kung University, Tainan 701, Taiwan"}]},{"given":"Jiann-Yeou","family":"Rau","sequence":"additional","affiliation":[{"name":"Department of Geomatics, National Cheng Kung University, Tainan 701, Taiwan"}]},{"given":"Chih-Ming","family":"Chou","sequence":"additional","affiliation":[{"name":"Kuo Toong International Co., Ltd., Kaohsiung 81357, Taiwan"}]}],"member":"1968","published-online":{"date-parts":[[2020,8,12]]},
"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"595","DOI":"10.1080\/02533839.2012.736778","article-title":"Typhoon Morakot meteorological analyses","volume":"37","author":"Yu","year":"2013","journal-title":"J. Chin. Inst. Eng."},
{"key":"ref_2","doi-asserted-by":"crossref","first-page":"558","DOI":"10.1080\/02533839.2012.736771","article-title":"Disaster investigation and analysis of Typhoon Morakot","volume":"37","author":"Li","year":"2013","journal-title":"J. Chin. Inst. Eng."},
{"key":"ref_3","doi-asserted-by":"crossref","first-page":"105227","DOI":"10.1016\/j.enggeo.2019.105227","article-title":"Sediment Sluice Tunnel of Zengwen Reservoir and construction of section with huge underground excavation adjacent to neighboring slope","volume":"260","author":"Chang","year":"2019","journal-title":"Eng. Geol."},
{"key":"ref_4","first-page":"841","article-title":"4D animation reconstruction from multi-camera coordinates transformation","volume":"41","author":"Jhan","year":"2016","journal-title":"ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci."},
{"key":"ref_5","doi-asserted-by":"crossref","first-page":"293","DOI":"10.1080\/00221686.2011.578914","article-title":"Scale effects in physical hydraulic engineering models","volume":"49","author":"Heller","year":"2011","journal-title":"J. Hydraul. Res."},
{"key":"ref_6","unstructured":"Nist\u00e9r, D., Naroditsky, O., and Bergen, J. (July, January 27). Visual odometry. Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, Washington, DC, USA."},
{"key":"ref_7","doi-asserted-by":"crossref","first-page":"80","DOI":"10.1109\/MRA.2011.943233","article-title":"Visual odometry [tutorial]","volume":"18","author":"Scaramuzza","year":"2011","journal-title":"IEEE Robot. Autom. Mag."},
{"key":"ref_8","doi-asserted-by":"crossref","first-page":"1052","DOI":"10.1109\/TPAMI.2007.1049","article-title":"MonoSLAM: Real-Time Single Camera SLAM","volume":"29","author":"Davison","year":"2007","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},
{"key":"ref_9","doi-asserted-by":"crossref","first-page":"558","DOI":"10.1016\/j.isprsjprs.2010.06.003","article-title":"Close range photogrammetry for industrial applications","volume":"65","author":"Luhmann","year":"2010","journal-title":"ISPRS J. Photogramm. Remote Sens."},
{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Hartley, R., and Zisserman, A. (2004). Multiple View Geometry in Computer Vision, Cambridge University Press (CUP).","DOI":"10.1017\/CBO9780511811685"},
{"key":"ref_11","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1561\/0600000001","article-title":"Monocular model-based 3D tracking of rigid objects: A survey","volume":"1","author":"Lepetit","year":"2005","journal-title":"Found. Trends\u00ae Comput. Graph. Vis."},
{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Pollefeys, M., Sinha, S.N., Guan, L., and Franco, J.S. (2009). Multi-view calibration, synchronization, and dynamic scene reconstruction. Multi-Camera Networks, Elsevier.","DOI":"10.1016\/B978-0-12-374633-7.00004-5"},
{"key":"ref_13","doi-asserted-by":"crossref","first-page":"11271","DOI":"10.3390\/s120811271","article-title":"A Semi-automatic image-based close range 3D modeling pipeline using a multi-camera configuration","volume":"12","author":"Rau","year":"2012","journal-title":"Sensors"},
{"key":"ref_14","doi-asserted-by":"crossref","first-page":"4","DOI":"10.1016\/j.cviu.2006.10.016","article-title":"Vision-based human motion analysis: An overview","volume":"108","author":"Poppe","year":"2007","journal-title":"Comput. Vis. Image Underst."},
{"key":"ref_15","first-page":"1","article-title":"Comparison between Single and Multi-Camera View Videogrammetry for Estimating 6DOF of a Rigid Body","volume":"9528","author":"Nocerino","year":"2015","journal-title":"SPIE Opt. Metrol."},
{"key":"ref_16","first-page":"87910J","article-title":"High accuracy low-cost videogrammetric system: An application to 6DOF estimation of ship models","volume":"8791","author":"Nocerino","year":"2013","journal-title":"SPIE Opt. Metrol."},
{"key":"ref_17","doi-asserted-by":"crossref","first-page":"73","DOI":"10.1111\/j.1477-9730.1986.tb00539.x","article-title":"On the calibration of underwater cameras","volume":"12","author":"Fryer","year":"1986","journal-title":"Photogramm. Rec."},
{"key":"ref_18","doi-asserted-by":"crossref","first-page":"433","DOI":"10.1016\/j.isprsjprs.2010.05.004","article-title":"Photogrammetric modeling of underwater environments","volume":"65","author":"Telem","year":"2010","journal-title":"ISPRS J. Photogramm. Remote Sens."},
{"key":"ref_19","doi-asserted-by":"crossref","first-page":"94","DOI":"10.1016\/S0924-2716(00)00010-1","article-title":"Design and implementation of a computational processing system for off-line digital close-range photogrammetry","volume":"55","author":"Fraser","year":"2000","journal-title":"ISPRS J. Photogramm. Remote Sens."},
{"key":"ref_20","doi-asserted-by":"crossref","first-page":"149","DOI":"10.1016\/S0924-2716(97)00005-1","article-title":"Digital camera self-calibration","volume":"52","author":"Fraser","year":"1997","journal-title":"ISPRS J. Photogramm. Remote Sens."}],
"container-title":["Remote Sensing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2072-4292\/12\/16\/2600\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T09:59:49Z","timestamp":1760176789000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2072-4292\/12\/16\/2600"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,8,12]]},"references-count":20,"journal-issue":{"issue":"16","published-online":{"date-parts":[[2020,8]]}},"alternative-id":["rs12162600"],"URL":"https:\/\/doi.org\/10.3390\/rs12162600","relation":{},"ISSN":["2072-4292"],"issn-type":[{"type":"electronic","value":"2072-4292"}],"subject":[],"published":{"date-parts":[[2020,8,12]]}}}