{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,12]],"date-time":"2026-03-12T08:22:35Z","timestamp":1773303755266,"version":"3.50.1"},"reference-count":30,"publisher":"Springer Science and Business Media LLC","issue":"8","license":[{"start":{"date-parts":[[2022,5,27]],"date-time":"2022-05-27T00:00:00Z","timestamp":1653609600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,5,27]],"date-time":"2022-05-27T00:00:00Z","timestamp":1653609600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"Hidss4Health"},{"DOI":"10.13039\/501100006360","name":"Bundesministerium f\u00fcr Wirtschaft und Energie","doi-asserted-by":"publisher","id":[{"id":"10.13039\/501100006360","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Int J CARS"],"published-print":{"date-parts":[[2022,8]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:sec>\n                <jats:title>\n                           <jats:bold>Purpose<\/jats:bold>\n                        <\/jats:title>\n                <jats:p>As human failure has been shown to be one primary cause for post-operative death, surgical training is of the utmost socioeconomic importance. In this context, the concept of surgical telestration has been introduced to enable experienced surgeons to efficiently and effectively mentor trainees in an intuitive way. While previous approaches to telestration have concentrated on overlaying drawings on surgical videos, we explore the augmented reality (AR) visualization of surgical hands to imitate the direct interaction with the situs.<\/jats:p>\n              <\/jats:sec><jats:sec>\n                <jats:title>\n                           <jats:bold>Methods<\/jats:bold>\n                        <\/jats:title>\n                <jats:p>We present a real-time hand tracking pipeline specifically designed for the application of surgical telestration. It comprises three modules, dedicated to (1) the coarse localization of the expert\u2019s hand and the subsequent (2) segmentation of the hand for AR visualization in the field of view of the trainee and (3) regression of keypoints making up the hand\u2019s skeleton. The semantic representation is obtained to offer the ability for structured reporting of the motions performed as part of the teaching.<\/jats:p>\n              <\/jats:sec><jats:sec>\n                <jats:title>\n                           <jats:bold>Results<\/jats:bold>\n                        <\/jats:title>\n                <jats:p>According to a comprehensive validation based on a large data set comprising more than 14,000 annotated images with varying application-relevant conditions, our algorithm enables real-time hand tracking and is sufficiently accurate for the task of surgical telestration. In a retrospective validation study, a mean detection accuracy of 98%, a mean keypoint regression accuracy of 10.0 px and a mean Dice Similarity Coefficient of 0.95 were achieved. 
In a prospective validation study, it showed uncompromised performance when the sensor, operator or gesture varied.<\/jats:p>\n              <\/jats:sec><jats:sec>\n                <jats:title>\n                           <jats:bold>Conclusion<\/jats:bold>\n                        <\/jats:title>\n                <jats:p>Due to its high accuracy and fast inference time, our neural network-based approach to hand tracking is well suited for an AR approach to surgical telestration. Future work should be directed to evaluating the clinical value of the approach.\n<\/jats:p>\n              <\/jats:sec>","DOI":"10.1007\/s11548-022-02637-9","type":"journal-article","created":{"date-parts":[[2022,5,27]],"date-time":"2022-05-27T16:03:00Z","timestamp":1653667380000},"page":"1477-1486","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":19,"title":["Robust hand tracking for surgical telestration"],"prefix":"10.1007","volume":"17","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-5312-9551","authenticated-orcid":false,"given":"Lucas-Raphael","family":"M\u00fcller","sequence":"first","affiliation":[]},{"given":"Jens","family":"Petersen","sequence":"additional","affiliation":[]},{"given":"Amine","family":"Yamlahi","sequence":"additional","affiliation":[]},{"given":"Philipp","family":"Wise","sequence":"additional","affiliation":[]},{"given":"Tim J.","family":"Adler","sequence":"additional","affiliation":[]},{"given":"Alexander","family":"Seitel","sequence":"additional","affiliation":[]},{"given":"Karl-Friedrich","family":"Kowalewski","sequence":"additional","affiliation":[]},{"given":"Beat","family":"M\u00fcller","sequence":"additional","affiliation":[]},{"given":"Hannes","family":"Kenngott","sequence":"additional","affiliation":[]},{"given":"Felix","family":"Nickel","sequence":"additional","affiliation":[]},{"given":"Lena","family":"Maier-Hein","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,5,27]]},"reference":[{"key":"2637_CR1","doi-asserted-by":"publisher","unstructured":"Nepogodiev D, Martin J, Biccard B, Makupe A, Bhangu A, Ademuyiwa A, Adisa AO, Aguilera ML, Chakrabortee S, Fitzgerald JE, Ghosh D, Glasbey JC, Harrison EM, Ingabire JCA, Salem H, Lapitan MC, Lawani I, Lissauer D, Magill L, Moore R, Osei-Bordom DC, Pinkney TD, Qureshi AU, Ramos-De la Medina A, Rayne S, Sundar S, Tabiri S, Verjee A, Yepez R, Garden OJ, Lilford R, Brocklehurst P, Morton DG, Bhangu A (2019) Global burden of postoperative death. Lancet. https:\/\/doi.org\/10.1016\/S0140-6736(18)33139-8","DOI":"10.1016\/S0140-6736(18)33139-8"},{"key":"2637_CR2","doi-asserted-by":"publisher","DOI":"10.1001\/jamasurg.2021.3604","author":"F Nickel","year":"2021","unstructured":"Nickel F, Cizmic A, Chand M (2021) Telestration and augmented reality in minimally invasive surgery: an invaluable tool in the age of COVID-19 for remote proctoring and telementoring. JAMA Surg. https:\/\/doi.org\/10.1001\/jamasurg.2021.3604","journal-title":"JAMA Surg"},{"key":"2637_CR3","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijsu.2017.08.029","author":"J Luck","year":"2017","unstructured":"Luck J, Hachach-Haram N, Greenfield M, Smith O, Billingsley M, Heyes R, Mosahebi A, Greenfield MJ (2017) Augmented reality in undergraduate surgical training: the PROXIMIE pilot. Int J Surg. 
https:\/\/doi.org\/10.1016\/j.ijsu.2017.08.029","journal-title":"Int J Surg"},{"key":"2637_CR4","doi-asserted-by":"publisher","DOI":"10.1007\/s00345-016-1944-x","author":"AM Jarc","year":"2017","unstructured":"Jarc AM, Stanley AA, Clifford T, Gill IS, Hung AJ (2017) Proctors exploit three-dimensional ghost tools during clinical-like training scenarios: a preliminary study. World J Urol. https:\/\/doi.org\/10.1007\/s00345-016-1944-x","journal-title":"World J Urol"},{"key":"2637_CR5","doi-asserted-by":"publisher","DOI":"10.1177\/1553350618813250","author":"S Erridge","year":"2019","unstructured":"Erridge S, Yeung DKT, Patel HRH, Purkayastha S (2019) Telementoring of surgeons: a systematic review. Surg Innov. https:\/\/doi.org\/10.1177\/1553350618813250","journal-title":"Surg Innov"},{"key":"2637_CR6","unstructured":"Nickel F, Petersen J, Onogur S, Schmidt M, Kowalewski K-F, Eisenmann M, Thiel C, Trent S, Weber C (2021) System and method for teaching minimally invasive interventions. https:\/\/patentscope.wipo.int\/search\/en\/detail.jsf?docId=WO2021176091&tab=PCTBIBLIO"},{"key":"2637_CR7","unstructured":"Zhang M, Cheng X, Copeland D, Desai A, Guan MY, Brat GA, Yeung S (2021) Using computer vision to automate hand detection and tracking of surgeon movements in videos of open surgery. AMIA Annual symposium proceedings 2020"},{"key":"2637_CR8","doi-asserted-by":"publisher","DOI":"10.1007\/s11548-021-02369-2","author":"J Hein","year":"2021","unstructured":"Hein J, Seibold M, Bogo F, Farshad M, Pollefeys M, F\u00fcrnstahl P, Navab N (2021) Towards markerless surgical tool and hand pose estimation. Int J Comput Assisted Radiol Surg. https:\/\/doi.org\/10.1007\/s11548-021-02369-2","journal-title":"Int J Comput Assisted Radiol Surg"},{"key":"2637_CR9","doi-asserted-by":"crossref","unstructured":"Louis N, Zhou L, Yule SJ, Dias RD, Manojlovich M, Pagani FD, Likosky DS, Corso JJ (2021) Temporally guided articulated hand pose tracking in surgical videos. arXiv:2101.04281v2 [cs]","DOI":"10.2139\/ssrn.4019293"},{"key":"2637_CR10","unstructured":"Zhang F, Bazarevsky V, Vakunov A, Tkachenka A, Sung G, Chang C-L, Grundmann M (2020) MediaPipe Hands: on-device real-time hand tracking. arXiv preprint arXiv:2006.10214v1"},{"key":"2637_CR11","unstructured":"Jocher GR (2022) ultralytics\/yolov5. GitHub. https:\/\/github.com\/ultralytics\/yolov5. Accessed 2022-01-17"},{"key":"2637_CR12","doi-asserted-by":"publisher","unstructured":"Deepa R, Tamilselvan E, Abrar ES, Sampath S (2019) Comparison of YOLO, SSD, Faster RCNN for real time tennis ball tracking for action decision networks. In: 2019 International conference on advances in computing and communication engineering (ICACCE), pp. 1\u20134. https:\/\/doi.org\/10.1109\/ICACCE46606.2019.9079965","DOI":"10.1109\/ICACCE46606.2019.9079965"},{"key":"2637_CR13","unstructured":"Tan M, Le Q (2019) EfficientNet: Rethinking model scaling for convolutional neural networks. In: Chaudhuri, K., Salakhutdinov, R. (eds.) Proceedings of the 36th international conference on machine learning. Proceedings of machine learning research, 97, 6105\u20136114. PMLR. https:\/\/proceedings.mlr.press\/v97\/tan19a.html"},{"key":"2637_CR14","doi-asserted-by":"publisher","unstructured":"Wightman R (2019) PyTorch image models. GitHub. https:\/\/doi.org\/10.5281\/zenodo.4414861","DOI":"10.5281\/zenodo.4414861"},{"key":"2637_CR15","doi-asserted-by":"publisher","unstructured":"Xie Q, Luong M-T, Hovy E, Le QV (2020) Self-training with noisy student improves ImageNet classification. 
In: 2020 IEEE\/CVF conference on computer vision and pattern recognition (CVPR), 10684\u201310695. https:\/\/doi.org\/10.1109\/CVPR42600.2020.01070","DOI":"10.1109\/CVPR42600.2020.01070"},{"key":"2637_CR16","doi-asserted-by":"crossref","unstructured":"Smith LN (2017) Cyclical learning rates for training neural networks. arXiv:1506.01186v6 [cs]","DOI":"10.1109\/WACV.2017.58"},{"key":"2637_CR17","doi-asserted-by":"publisher","DOI":"10.3390\/info11020125","author":"A Buslaev","year":"2020","unstructured":"Buslaev A, Iglovikov VI, Khvedchenya E, Parinov A, Druzhinin M, Kalinin AA (2020) Albumentations: fast and flexible image augmentations. Inform Int Interdiscip J. https:\/\/doi.org\/10.3390\/info11020125","journal-title":"Inform Int Interdiscip J"},{"key":"2637_CR18","doi-asserted-by":"publisher","unstructured":"Lin T-Y, Doll\u00e1r P, Girshick R, He K, Hariharan B, Belongie S (2017) Feature pyramid networks for object detection. In: 2017 IEEE Conference on computer vision and pattern recognition (CVPR), pp. 936\u2013944. https:\/\/doi.org\/10.1109\/CVPR.2017.106","DOI":"10.1109\/CVPR.2017.106"},{"key":"2637_CR19","doi-asserted-by":"publisher","DOI":"10.1016\/j.media.2020.101920","author":"T Ro\u00df","year":"2021","unstructured":"Ro\u00df T, Reinke A, Full PM, Wagner M, Kenngott H, Apitz M, Hempe H, Mindroc-Filimon D, Scholz P, Tran TN, Bruno P, Arbel\u00e1ez P, Bian G-B, Bodenstedt S, Bolmgren JL, Bravo-S\u00e1nchez L, Chen H-B, Gonz\u00e1lez C, Guo D, Halvorsen P, Heng P-A, Hosgor E, Hou Z-G, Isensee F, Jha D, Jiang T, Jin Y, Kirtac K, Kletz S, Leger S, Li Z, Maier-Hein KH, Ni Z-L, Riegler MA, Schoeffmann K, Shi R, Speidel S, Stenzel M, Twick I, Wang G, Wang J, Wang L, Wang L, Zhang Y, Zhou Y-J, Zhu L, Wiesenfarth M, Kopp-Schneider A, M\u00fcller-Stich BP, Maier-Hein L (2021) Comparative validation of multi-instance instrument segmentation in endoscopy: results of the ROBUST-MIS 2019 challenge. Med Image Anal. https:\/\/doi.org\/10.1016\/j.media.2020.101920","journal-title":"Med Image Anal"},{"key":"2637_CR20","unstructured":"Ro\u00df T, Bruno P, Reinke A, Wiesenfarth M, Koeppel L, Full PM, Pekdemir B, Godau P, Trofimova D, Isensee F, Moccia S, Calimeri F, M\u00fcller-Stich BP, Kopp-Schneider A, Maier-Hein L (2021) How can we learn (more) from challenges? A statistical approach to driving future algorithm development. arXiv:2106.09302v1 [cs]"},{"key":"2637_CR21","unstructured":"Reinke A, Eisenmann M, Tizabi MD, Sudre CH, R\u00e4dsch T, Antonelli M, Arbel T, Bakas S, Cardoso MJ, Cheplygina V, Farahani K, Glocker B, Heckmann-N\u00f6tzel D, Isensee F, Jannin P, Kahn CE, Kleesiek J, Kurc T, Kozubek M, Landman BA, Litjens G, Maier-Hein K, Menze B, M\u00fcller H, Petersen J, Reyes M, Rieke N, Stieltjes B, Summers RM, Tsaftaris SA, van Ginneken B, Kopp-Schneider A, J\u00e4ger P, Maier-Hein L (2021) Common limitations of image processing metrics: a picture story. arXiv:2104.05642v2 [cs, eess]"},{"key":"2637_CR22","doi-asserted-by":"crossref","unstructured":"Wang J, Mueller F, Bernard F, Sorli S, Sotnychenko O, Qian N, Otaduy MA, Casas D, Theobalt C (2020) RGB2Hands: real-time tracking of 3D hand interactions from monocular RGB video. ACM Trans Graphics (TOG) 39(6)","DOI":"10.1145\/3414685.3417852"},{"key":"2637_CR23","doi-asserted-by":"crossref","unstructured":"Sridhar S, Mueller F, Oulasvirta A, Theobalt C (2015) Fast and robust hand tracking using detection-guided optimization. In: Proceedings of computer vision and pattern recognition (CVPR). 
http:\/\/handtracker.mpi-inf.mpg.de\/projects\/FastHandTracker\/","DOI":"10.1109\/CVPR.2015.7298941"},{"key":"2637_CR24","doi-asserted-by":"crossref","unstructured":"Sridhar S, Mueller F, Zollhoefer M, Casas D, Oulasvirta A, Theobalt C (2016) Real-time joint tracking of a hand manipulating an object from RGB-D input. In: Proceedings of European conference on computer vision (ECCV). http:\/\/handtracker.mpi-inf.mpg.de\/projects\/RealtimeHO\/","DOI":"10.1007\/978-3-319-46475-6_19"},{"key":"2637_CR25","doi-asserted-by":"crossref","unstructured":"Mueller F, Mehta D, Sotnychenko O, Sridhar S, Casas D, Theobalt C (2017) Real-time hand tracking under occlusion from an egocentric RGB-D sensor. In: Proceedings of international conference on computer vision (ICCV). https:\/\/handtracker.mpi-inf.mpg.de\/projects\/OccludedHands\/","DOI":"10.1109\/ICCV.2017.131"},{"key":"2637_CR26","doi-asserted-by":"publisher","unstructured":"Caeiro-Rodr\u00edguez M, Otero-Gonz\u00e1lez I, Mikic-Fonte FA, Llamas-Nistal M (2021) A systematic review of commercial smart gloves: current status and applications. Sensors 21(8). https:\/\/doi.org\/10.3390\/s21082667","DOI":"10.3390\/s21082667"},{"issue":"6","key":"2637_CR27","doi-asserted-by":"publisher","first-page":"1137","DOI":"10.1109\/TPAMI.2016.2577031","volume":"39","author":"S Ren","year":"2017","unstructured":"Ren S, He K, Girshick R, Sun J (2017) Faster R-CNN: towards real-time object detection with region proposal networks. IEEE Trans Pattern Anal Mach Intel 39(6):1137\u20131149. https:\/\/doi.org\/10.1109\/TPAMI.2016.2577031","journal-title":"IEEE Trans Pattern Anal Mach Intel"},{"key":"2637_CR28","doi-asserted-by":"crossref","unstructured":"He K, Zhang X, Ren S, Sun J (2015) Deep residual learning for image recognition. arXiv:1512.03385v1 [cs]","DOI":"10.1109\/CVPR.2016.90"},{"key":"2637_CR29","unstructured":"Yakubovskiy P (2020) Segmentation models PyTorch. GitHub"},{"key":"2637_CR30","doi-asserted-by":"publisher","unstructured":"Ronneberger O, Fischer P, Brox T (2015) U-Net: Convolutional networks for biomedical image segmentation. In: International conference on medical image computing and computer-assisted intervention, pp. 234\u2013241. https:\/\/doi.org\/10.1007\/978-3-319-24574-4_28. 
Springer","DOI":"10.1007\/978-3-319-24574-4_28"}],"updated-by":[{"DOI":"10.1007\/s11548-022-02702-3","type":"correction","label":"Correction","source":"publisher","updated":{"date-parts":[[2022,7,8]],"date-time":"2022-07-08T00:00:00Z","timestamp":1657238400000}}],"container-title":["International Journal of Computer Assisted Radiology and Surgery"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11548-022-02637-9.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s11548-022-02637-9\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11548-022-02637-9.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,7,23]],"date-time":"2022-07-23T03:41:47Z","timestamp":1658547707000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s11548-022-02637-9"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,5,27]]},"references-count":30,"journal-issue":{"issue":"8","published-print":{"date-parts":[[2022,8]]}},"alternative-id":["2637"],"URL":"https:\/\/doi.org\/10.1007\/s11548-022-02637-9","relation":{"correction":[{"id-type":"doi","id":"10.1007\/s11548-022-02702-3","asserted-by":"object"}]},"ISSN":["1861-6429"],"issn-type":[{"value":"1861-6429","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,5,27]]},"assertion":[{"value":"1 February 2022","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"6 April 2022","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"27 May 2022","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"8 July 2022","order":4,"name":"change_date","label":"Change Date","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"Correction","order":5,"name":"change_type","label":"Change Type","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"A Correction to this paper has been published:","order":6,"name":"change_details","label":"Change Details","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"https:\/\/doi.org\/10.1007\/s11548-022-02702-3","URL":"https:\/\/doi.org\/10.1007\/s11548-022-02702-3","order":7,"name":"change_details","label":"Change Details","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"LRM, JP, AY, PW, TJA, AS, KFK, BM, HK, FN, LMH do not declare conflicts of interest.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflicts of interest"}},{"value":"The individuals shown in the figures of this manuscript have been informed and agreed that the respective images can be published.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Informed Consent"}}]}}