{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,8,1]],"date-time":"2025-08-01T04:08:19Z","timestamp":1754021299558,"version":"3.37.3"},"reference-count":35,"publisher":"Springer Science and Business Media LLC","issue":"12","license":[{"start":{"date-parts":[[2024,8,12]],"date-time":"2024-08-12T00:00:00Z","timestamp":1723420800000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2024,8,12]],"date-time":"2024-08-12T00:00:00Z","timestamp":1723420800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100006690","name":"Politecnico di Milano","doi-asserted-by":"crossref","id":[{"id":"10.13039\/501100006690","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Int J CARS"],"abstract":"<jats:title>Abstract<\/jats:title><jats:sec>\n                <jats:title>\n                           <jats:bold>Purpose:<\/jats:bold>\n                        <\/jats:title>\n                <jats:p>The integration of a surgical robotic instrument tracking module within optical microscopes holds the potential to advance microsurgery practices, as it facilitates automated camera movements, thereby augmenting the surgeon\u2019s capability in executing surgical procedures.<\/jats:p>\n              <\/jats:sec><jats:sec>\n                <jats:title>\n                           <jats:bold>Methods:<\/jats:bold>\n                        <\/jats:title>\n                <jats:p>In the present work, an innovative detection backbone based on spatial attention module is implemented to enhance the detection accuracy of small objects within the image. Additionally, we have introduced a robust data association technique, capable to re-track surgical instrument, mainly based on the knowledge of the dual-instrument robotics system, Intersection over Union metric and Kalman filter.<\/jats:p>\n              <\/jats:sec><jats:sec>\n                <jats:title>\n                           <jats:bold>Results:<\/jats:bold>\n                        <\/jats:title>\n                <jats:p>The effectiveness of this pipeline was evaluated through testing on a dataset comprising ten manually annotated videos of anastomosis procedures involving either animal or phantom vessels, exploiting the Symani\u00aeSurgical System\u2014a dedicated robotic platform designed for microsurgery. The multiple object tracking precision (MOTP) and the multiple object tracking accuracy (MOTA) are used to evaluate the performance of the proposed approach, and a new metric is computed to demonstrate the efficacy in stabilizing the tracking result along the video frames. An average MOTP of 74\u00b10.06% and a MOTA of 99\u00b10.03% over the test videos were found.<\/jats:p>\n              <\/jats:sec><jats:sec>\n                <jats:title>\n                           <jats:bold>Conclusion:<\/jats:bold>\n                        <\/jats:title>\n                <jats:p>These results confirm the potential of the proposed approach in enhancing precision and reliability in microsurgical instrument tracking. 
Thus, the integration of attention mechanisms and a tailored data association module could be a solid base for automatizing the motion of optical microscopes.<\/jats:p>\n              <\/jats:sec>","DOI":"10.1007\/s11548-024-03246-4","type":"journal-article","created":{"date-parts":[[2024,8,12]],"date-time":"2024-08-12T13:08:25Z","timestamp":1723468105000},"page":"2351-2362","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["A dual-instrument Kalman-based tracker to enhance robustness of microsurgical tools tracking"],"prefix":"10.1007","volume":"19","author":[{"ORCID":"https:\/\/orcid.org\/0000-0003-4385-9416","authenticated-orcid":false,"given":"Mattia","family":"Magro","sequence":"first","affiliation":[]},{"given":"Nicola","family":"Covallero","sequence":"additional","affiliation":[]},{"given":"Elena","family":"Gambaro","sequence":"additional","affiliation":[]},{"given":"Emanuele","family":"Ruffaldi","sequence":"additional","affiliation":[]},{"given":"Elena","family":"De Momi","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2024,8,12]]},"reference":[{"issue":"1","key":"3246_CR1","doi-asserted-by":"publisher","first-page":"7","DOI":"10.20517\/2347-9264.2022.101","volume":"10","author":"E Gousopoulos","year":"2023","unstructured":"Gousopoulos E, Gr\u00fcnherz L, Giovanoli P, Lindenblatt N (2023) Robotic-assisted microsurgery for lymphedema treatment. Plast Aesth Res 10(1):7","journal-title":"Plast Aesth Res"},{"issue":"5","key":"3246_CR2","doi-asserted-by":"publisher","first-page":"607","DOI":"10.1177\/15533506231191211","volume":"30","author":"HS Ghandourah","year":"2023","unstructured":"Ghandourah HS, Schols RM, Wolfs JA, Altaweel F, Mulken TJ (2023) Robotic microsurgery in plastic and reconstructive surgery: a literature review. Surg Innov 30(5):607\u2013614","journal-title":"Surg Innov"},{"key":"3246_CR3","doi-asserted-by":"publisher","first-page":"22","DOI":"10.3389\/fsurg.2018.00022","volume":"5","author":"YP Tan","year":"2018","unstructured":"Tan YP, Liverneaux P, Wong JK (2018) Current limitations of surgical robotics in reconstructive plastic microsurgery. Front Surg 5:22","journal-title":"Front Surg"},{"key":"3246_CR4","volume-title":"Robotics in plastic surgery","author":"L Gruenherz","year":"2023","unstructured":"Gruenherz L, Gousopoulos E, Barbon C, Uyulmaz S, Giovanoli P, Lindenblatt N (2023) Robotics in plastic surgery. Chirurgie, Heidelberg"},{"key":"3246_CR5","doi-asserted-by":"publisher","DOI":"10.1109\/TMRB.2023.3258524","author":"E Iovene","year":"2023","unstructured":"Iovene E, Casella A, Iordache AV, Fu J, Pessina F, Riva M, Ferrigno G, Momi ED (2023) Towards exoscope automation in neurosurgery: a markerless visual-servoing approach. IEEE Trans Med Robot Bion. https:\/\/doi.org\/10.1109\/TMRB.2023.3258524","journal-title":"IEEE Trans Med Robot Bion"},{"issue":"3","key":"3246_CR6","doi-asserted-by":"publisher","first-page":"358","DOI":"10.1016\/j.jhsg.2023.01.011","volume":"5","author":"DF Villavisanis","year":"2023","unstructured":"Villavisanis DF, Zhang D, Shay PL, Taub PJ, Venkatramani H, Melamed E (2023) Assisting in microsurgery: operative and technical considerations. J Hand Surg Global Online 5(3):358\u2013362","journal-title":"J Hand Surg Global Online"},{"key":"3246_CR7","doi-asserted-by":"crossref","unstructured":"Moln\u00e1r C, Nagy TD, Elek RN, Haidegger T (2020) Visual servoing-based camera control for the da vinci surgical system. 
In: 2020 IEEE 18th international symposium on intelligent systems and informatics (SISY), pp 107\u2013112. IEEE","DOI":"10.1109\/SISY50555.2020.9217086"},{"issue":"5","key":"3246_CR8","doi-asserted-by":"publisher","first-page":"1253","DOI":"10.1016\/j.surg.2020.10.039","volume":"169","author":"TM Ward","year":"2021","unstructured":"Ward TM, Mascagni P, Ban Y, Rosman G, Padoy N, Meireles O, Hashimoto DA (2021) Computer vision in surgery. Surgery 169(5):1253\u20131256","journal-title":"Surgery"},{"issue":"11","key":"3246_CR9","doi-asserted-by":"publisher","first-page":"3212","DOI":"10.1109\/TNNLS.2018.2876865","volume":"30","author":"Z-Q Zhao","year":"2019","unstructured":"Zhao Z-Q, Zheng P, Xu S, Wu X (2019) Object detection with deep learning: A review. IEEE Trans Neural Netw Learn Syst 30(11):3212\u20133232","journal-title":"IEEE Trans Neural Netw Learn Syst"},{"key":"3246_CR10","doi-asserted-by":"crossref","unstructured":"Zhang Y, Kim M, Jin S (2023) Real-time detection and tracking of surgical instrument based on yolov5 and deepsort. In: 2023 32nd IEEE international conference on robot and human interactive communication (RO-MAN), pp 1758\u20131763. IEEE","DOI":"10.1109\/RO-MAN57019.2023.10309495"},{"key":"3246_CR11","doi-asserted-by":"publisher","first-page":"17","DOI":"10.1016\/j.neucom.2018.01.092","volume":"300","author":"A Brunetti","year":"2018","unstructured":"Brunetti A, Buongiorno D, Trotta GF, Bevilacqua V (2018) Computer vision and deep learning techniques for pedestrian detection and tracking: a survey. Neurocomputing 300:17\u201333","journal-title":"Neurocomputing"},{"key":"3246_CR12","doi-asserted-by":"publisher","first-page":"1959","DOI":"10.1007\/s11548-018-1860-1","volume":"13","author":"Z Wang","year":"2018","unstructured":"Wang Z, Majewicz Fey A (2018) Deep learning with convolutional neural network for objective skill evaluation in robot-assisted surgery. Int J Comput Assist Radiol Surg 13:1959\u20131970","journal-title":"Int J Comput Assist Radiol Surg"},{"key":"3246_CR13","doi-asserted-by":"publisher","DOI":"10.1016\/j.robot.2021.103945","volume":"149","author":"Y Wang","year":"2022","unstructured":"Wang Y, Sun Q, Liu Z, Gu L (2022) Visual detection and tracking algorithms for minimally invasive surgical instruments: a comprehensive review of the state-of-the-art. Robot Auton Syst 149:103945","journal-title":"Robot Auton Syst"},{"issue":"4","key":"3246_CR14","doi-asserted-by":"publisher","first-page":"13","DOI":"10.1145\/1177352.1177355","volume":"38","author":"A Yilmaz","year":"2006","unstructured":"Yilmaz A, Javed O, Shah M (2006) Object tracking: a survey. Acm Comput Surv 38(4):13","journal-title":"Acm Comput Surv"},{"key":"3246_CR15","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1186\/s40537-021-00444-8","volume":"8","author":"L Alzubaidi","year":"2021","unstructured":"Alzubaidi L, Zhang J, Humaidi AJ, Al-Dujaili A, Duan Y, Al-Shamma O, Santamar\u00eda J, Fadhel MA, Al-Amidie M, Farhan L (2021) Review of deep learning: concepts, CNN architectures, challenges, applications, future directions. J Big Data 8:1\u201374","journal-title":"J Big Data"},{"issue":"7","key":"3246_CR16","doi-asserted-by":"publisher","first-page":"1542","DOI":"10.1109\/TMI.2017.2665671","volume":"36","author":"D Sarikaya","year":"2017","unstructured":"Sarikaya D, Corso JJ, Guru KA (2017) Detection and localization of robotic tools in robot-assisted surgery videos using deep neural networks for region proposal and detection. 
IEEE Trans Med Imaging 36(7):1542\u20131549","journal-title":"IEEE Trans Med Imaging"},{"key":"3246_CR17","doi-asserted-by":"publisher","first-page":"23748","DOI":"10.1109\/ACCESS.2020.2969885","volume":"8","author":"B Zhang","year":"2020","unstructured":"Zhang B, Wang S, Dong L, Chen P (2020) Surgical tools detection based on modulated anchoring network in laparoscopic videos. IEEE Access 8:23748\u201323758","journal-title":"IEEE Access"},{"issue":"6","key":"3246_CR18","doi-asserted-by":"publisher","first-page":"275","DOI":"10.1049\/htl.2019.0064","volume":"6","author":"Z Zhao","year":"2019","unstructured":"Zhao Z, Cai T, Chang F, Cheng X (2019) Real-time surgical instrument detection in robot-assisted surgery using a convolutional neural network cascade. Healthc Technol Lett 6(6):275\u2013279","journal-title":"Healthc Technol Lett"},{"key":"3246_CR19","doi-asserted-by":"crossref","unstructured":"Choi B, Jo K, Choi S, Choi J (2017) Surgical-tools detection based on convolutional neural network in laparoscopic robot-assisted surgery. In: 2017 39th annual international conference of the IEEE engineering in medicine and biology society (EMBC), pp 1756\u20131759. IEEE","DOI":"10.1109\/EMBC.2017.8037183"},{"issue":"3","key":"3246_CR20","doi-asserted-by":"publisher","first-page":"2714","DOI":"10.1109\/LRA.2019.2917163","volume":"4","author":"E Colleoni","year":"2019","unstructured":"Colleoni E, Moccia S, Du X, De Momi E, Stoyanov D (2019) Deep learning based robotic tool detection and articulation estimation with spatio-temporal layers. IEEE Robot Autom Lett 4(3):2714\u20132721","journal-title":"IEEE Robot Autom Lett"},{"key":"3246_CR21","doi-asserted-by":"publisher","first-page":"181723","DOI":"10.1109\/ACCESS.2020.3028910","volume":"8","author":"G Wang","year":"2020","unstructured":"Wang G, Wang S (2020) Surgical tools detection based on training sample adaptation in laparoscopic videos. IEEE Access 8:181723\u2013181732","journal-title":"IEEE Access"},{"key":"3246_CR22","doi-asserted-by":"publisher","first-page":"228853","DOI":"10.1109\/ACCESS.2020.3046258","volume":"8","author":"P Shi","year":"2020","unstructured":"Shi P, Zhao Z, Hu S, Chang F (2020) Real-time surgical tool detection in minimally invasive surgery based on attention-guided convolutional neural network. IEEE Access 8:228853\u2013228862","journal-title":"IEEE Access"},{"key":"3246_CR23","unstructured":"Beal J, Kim E, Tzeng E, Park DH, Zhai A, Kislyuk D (2020) Toward transformer-based object detection. arXiv:2012.09958"},{"key":"3246_CR24","doi-asserted-by":"publisher","first-page":"1123","DOI":"10.1093\/jcde\/qwac049","volume":"9","author":"K Liu","year":"2022","unstructured":"Liu K, Zhao Z, Shi P, Li F, Song H (2022) Real-time surgical tool detection in computer-aided surgery based on enhanced feature-fusion convolutional neural network. J Comput Des Eng 9:1123\u20131134. https:\/\/doi.org\/10.1093\/jcde\/qwac049","journal-title":"J Comput Des Eng"},{"key":"3246_CR25","doi-asserted-by":"crossref","unstructured":"Redmon J, Divvala S, Girshick R, Farhadi A (2016) You only look once: Unified, real-time object detection. 
In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 779\u2013788","DOI":"10.1109\/CVPR.2016.91"},{"issue":"4","key":"3246_CR26","doi-asserted-by":"publisher","first-page":"1680","DOI":"10.3390\/make5040083","volume":"5","author":"J Terven","year":"2023","unstructured":"Terven J, C\u00f3rdova-Esparza D-M, Romero-Gonz\u00e1lez J-A (2023) A comprehensive review of YOLO architectures in computer vision: from YOLOv1 to YOLOv8 and YOLO-NAS. Mach Learn Knowl Extr 5(4):1680\u20131716","journal-title":"Mach Learn Knowl Extr"},{"key":"3246_CR27","unstructured":"Team R YOLO-NAS by deci achieves state-of-the-art performance on object detection using neural architecture search. https:\/\/deci.ai\/blog\/yolo-nas-object-detection-foundation-model\/ Accessed 20 Dec 2023"},{"key":"3246_CR28","doi-asserted-by":"crossref","unstructured":"Bewley A, Ge Z, Ott L, Ramos F, Upcroft B (2016) Simple online and realtime tracking. In: 2016 IEEE international conference on image processing (ICIP), pp. 3464\u20133468. IEEE","DOI":"10.1109\/ICIP.2016.7533003"},{"key":"3246_CR29","doi-asserted-by":"crossref","unstructured":"Wojke N, Bewley A, Paulus D (2017) Simple online and realtime tracking with a deep association metric. In: 2017 IEEE international conference on image processing (ICIP), pp 3645\u20133649. IEEE","DOI":"10.1109\/ICIP.2017.8296962"},{"issue":"6","key":"3246_CR30","doi-asserted-by":"publisher","first-page":"159","DOI":"10.1049\/htl.2019.0068","volume":"6","author":"L Qiu","year":"2019","unstructured":"Qiu L, Li C, Ren H (2019) Real-time surgical instrument tracking in robot-assisted surgery using multi-domain convolutional neural network. Healthc Technol Lett 6(6):159\u2013164","journal-title":"Healthc Technol Lett"},{"key":"3246_CR31","doi-asserted-by":"crossref","unstructured":"Chen Z, Zhao Z, Cheng X (2017) Surgical instruments tracking based on deep learning with lines detection and spatio-temporal context. In: 2017 Chinese Automation Congress (CAC), pp 2711\u20132714. IEEE","DOI":"10.1109\/CAC.2017.8243236"},{"key":"3246_CR32","doi-asserted-by":"crossref","unstructured":"Girshick R (2015) Fast r-cnn. In: Proceedings of the IEEE international conference on computer vision, pp. 1440\u20131448","DOI":"10.1109\/ICCV.2015.169"},{"key":"3246_CR33","doi-asserted-by":"publisher","unstructured":"Jocher, G., Stoken, A., Borovec, J., NanoCode012, ChristopherSTAN, Changyu, L., Laughing, tkianai, Hogan, A., lorenzomammana, yxNONG, AlexWang1900, Diaconu, L., Marc, wanghaoyang0106, ml5ah, Doug, Ingham, F., Frederik, Guilhen, Hatovix, Poznanski, J., Fang, J., Yu, L., changyu98, Wang, M., Gupta, N., Akhtar, O., PetrDvoracek, Rai, P.: ultralytics\/yolov5: V3.1 - Bug Fixes and Performance Improvements. https:\/\/doi.org\/10.5281\/zenodo.4154370","DOI":"10.5281\/zenodo.4154370"},{"key":"3246_CR34","unstructured":"Team R Mufasa: Server for Massively Parallel Computation. https:\/\/biohpc.deib.polimi.it\/index.php?title=System Accessed 20 Dec 2023"},{"key":"3246_CR35","doi-asserted-by":"crossref","unstructured":"Bernardin K, Stiefelhagen R (2008) Evaluating multiple object tracking performance: the clear mot metrics. 
EURASIP J Image Video Process 2008:1\u201310","DOI":"10.1155\/2008\/246309"}],"container-title":["International Journal of Computer Assisted Radiology and Surgery"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11548-024-03246-4.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s11548-024-03246-4\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11548-024-03246-4.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,11,29]],"date-time":"2024-11-29T15:14:49Z","timestamp":1732893289000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s11548-024-03246-4"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,8,12]]},"references-count":35,"journal-issue":{"issue":"12","published-online":{"date-parts":[[2024,12]]}},"alternative-id":["3246"],"URL":"https:\/\/doi.org\/10.1007\/s11548-024-03246-4","relation":{},"ISSN":["1861-6429"],"issn-type":[{"type":"electronic","value":"1861-6429"}],"subject":[],"published":{"date-parts":[[2024,8,12]]},"assertion":[{"value":"10 January 2024","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"26 July 2024","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"12 August 2024","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare that they have no conflict of interest.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}},{"value":"In the creation of the dataset used in this study, all applicable international, national, and\/or institutional guidelines for the care and use of animals were followed. The dataset has not been collected specifically for this study.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethical approval"}},{"value":"This article does not contain or use patient data.","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Informed consent"}}]}}
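Illustrative note (not part of the Crossref record above): the deposited abstract reports tracking performance with the CLEAR MOT metrics, MOTP and MOTA (reference 3246_CR35, Bernardin and Stiefelhagen 2008), using Intersection over Union as the overlap measure. The Python sketch below shows, under those assumptions, how the two scores are typically computed once per-frame matching results are available; it is not the authors' implementation, and the names FrameStats and clear_mot are hypothetical.

# Hedged sketch of the CLEAR MOT metrics (MOTP, MOTA) with IoU as the
# per-match overlap measure; per-frame matching is assumed to be given.
from dataclasses import dataclass
from typing import List

def iou(a, b):
    """Intersection over Union of two boxes given as (x1, y1, x2, y2);
    this is how the per-match overlaps below would be obtained."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter > 0 else 0.0

@dataclass
class FrameStats:
    match_overlaps: List[float]  # IoU of each matched ground-truth/hypothesis pair
    false_positives: int         # hypotheses with no ground-truth match
    misses: int                  # ground-truth objects left unmatched
    id_switches: int             # matches whose track identity changed
    num_ground_truth: int        # ground-truth objects present in the frame

def clear_mot(frames: List[FrameStats]):
    total_matches = sum(len(f.match_overlaps) for f in frames)
    total_gt = sum(f.num_ground_truth for f in frames)
    total_errors = sum(f.false_positives + f.misses + f.id_switches for f in frames)
    # MOTP: mean overlap over all matched pairs (higher is better when IoU is the measure).
    motp = sum(sum(f.match_overlaps) for f in frames) / total_matches if total_matches else 0.0
    # MOTA: 1 minus the ratio of all error types to the total number of ground-truth objects.
    mota = 1.0 - total_errors / total_gt if total_gt else 0.0
    return motp, mota

Read this way, MOTP is the mean overlap of matched detections and MOTA aggregates misses, false positives, and identity switches relative to the ground-truth count, which is consistent with how the 74% MOTP and 99% MOTA figures in the abstract would be interpreted.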