{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,11]],"date-time":"2026-03-11T16:19:47Z","timestamp":1773245987745,"version":"3.50.1"},"reference-count":33,"publisher":"MDPI AG","issue":"20","license":[{"start":{"date-parts":[[2023,10,22]],"date-time":"2023-10-22T00:00:00Z","timestamp":1697932800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"the National Key Research and Development Program of China","award":["2022YFB4702800"],"award-info":[{"award-number":["2022YFB4702800"]}]},{"name":"the National Key Research and Development Program of China","award":["22KPXMRC00040"],"award-info":[{"award-number":["22KPXMRC00040"]}]},{"name":"the National Key Research and Development Program of China","award":["20YFZCSY00830"],"award-info":[{"award-number":["20YFZCSY00830"]}]},{"name":"the National Key Research and Development Program of China","award":["18ZXZNGX00340"],"award-info":[{"award-number":["18ZXZNGX00340"]}]},{"name":"the National Key Research and Development Program of China","award":["KQTD20210811090143060"],"award-info":[{"award-number":["KQTD20210811090143060"]}]},{"name":"the Science and Technology Program of Tianjin","award":["2022YFB4702800"],"award-info":[{"award-number":["2022YFB4702800"]}]},{"name":"the Science and Technology Program of Tianjin","award":["22KPXMRC00040"],"award-info":[{"award-number":["22KPXMRC00040"]}]},{"name":"the Science and Technology Program of Tianjin","award":["20YFZCSY00830"],"award-info":[{"award-number":["20YFZCSY00830"]}]},{"name":"the Science and Technology Program of Tianjin","award":["18ZXZNGX00340"],"award-info":[{"award-number":["18ZXZNGX00340"]}]},{"name":"the Science and Technology Program of Tianjin","award":["KQTD20210811090143060"],"award-info":[{"award-number":["KQTD20210811090143060"]}]},{"name":"the Technology Research and Development Program of 
Tianjin","award":["2022YFB4702800"],"award-info":[{"award-number":["2022YFB4702800"]}]},{"name":"the Technology Research and Development Program of Tianjin","award":["22KPXMRC00040"],"award-info":[{"award-number":["22KPXMRC00040"]}]},{"name":"the Technology Research and Development Program of Tianjin","award":["20YFZCSY00830"],"award-info":[{"award-number":["20YFZCSY00830"]}]},{"name":"the Technology Research and Development Program of Tianjin","award":["18ZXZNGX00340"],"award-info":[{"award-number":["18ZXZNGX00340"]}]},{"name":"the Technology Research and Development Program of Tianjin","award":["KQTD20210811090143060"],"award-info":[{"award-number":["KQTD20210811090143060"]}]},{"name":"Shenzhen Science and Technology Program","award":["2022YFB4702800"],"award-info":[{"award-number":["2022YFB4702800"]}]},{"name":"Shenzhen Science and Technology Program","award":["22KPXMRC00040"],"award-info":[{"award-number":["22KPXMRC00040"]}]},{"name":"Shenzhen Science and Technology Program","award":["20YFZCSY00830"],"award-info":[{"award-number":["20YFZCSY00830"]}]},{"name":"Shenzhen Science and Technology Program","award":["18ZXZNGX00340"],"award-info":[{"award-number":["18ZXZNGX00340"]}]},{"name":"Shenzhen Science and Technology Program","award":["KQTD20210811090143060"],"award-info":[{"award-number":["KQTD20210811090143060"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>The robotic surgery environment represents a typical scenario of human\u2013robot cooperation. In such a scenario, individuals, robots, and medical devices move relative to each other, leading to unforeseen mutual occlusion. Traditional methods use a binocular optical tracking system (OTS) to focus on the local surgical site, without considering the integrity of the scene, and the workspace is also restricted.
To address this challenge, we propose the concept of a full-perception robotic surgery environment and build a global\u2013local joint positioning framework. Furthermore, based on data characteristics, an improved Kalman filter method is proposed to enhance positioning accuracy. Finally, drawing from the view margin model, we design a method to evaluate positioning accuracy in a dynamic occlusion environment. The experimental results demonstrate that our method yields better positioning results than classical filtering methods.<\/jats:p>","DOI":"10.3390\/s23208637","type":"journal-article","created":{"date-parts":[[2023,10,22]],"date-time":"2023-10-22T07:02:53Z","timestamp":1697958173000},"page":"8637","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":2,"title":["Full-Perception Robotic Surgery Environment with Anti-Occlusion Global\u2013Local Joint Positioning"],"prefix":"10.3390","volume":"23","author":[{"given":"Hongpeng","family":"Wang","sequence":"first","affiliation":[{"name":"College of Artificial Intelligence, Nankai University, Tianjin 300353, China"},{"name":"Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China"},{"name":"Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, Nankai University, Tianjin 300350, China"}]},{"given":"Tianzuo","family":"Liu","sequence":"additional","affiliation":[{"name":"College of Artificial Intelligence, Nankai University, Tianjin 300353, China"},{"name":"Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China"}]},{"given":"Jianren","family":"Chen","sequence":"additional","affiliation":[{"name":"College of Artificial Intelligence, Nankai University, Tianjin 300353, China"},{"name":"Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5171-102X","authenticated-orcid":false,"given":"Chongshan","family":"Fan","sequence":"additional","affiliation":[{"name":"College of Artificial Intelligence, Nankai University, Tianjin 300353, China"},{"name":"Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5162-1665","authenticated-orcid":false,"given":"Yanding","family":"Qin","sequence":"additional","affiliation":[{"name":"College of Artificial Intelligence, Nankai University, Tianjin 300353, China"},{"name":"Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China"},{"name":"Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, Nankai University, Tianjin 300350, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-9664-4534","authenticated-orcid":false,"given":"Jianda","family":"Han","sequence":"additional","affiliation":[{"name":"College of Artificial Intelligence, Nankai University, Tianjin 300353, China"},{"name":"Institute of Intelligence Technology and Robotic Systems, Shenzhen Research Institute of Nankai University, Shenzhen 518083, China"},{"name":"Engineering Research Center of Trusted Behavior Intelligence, Ministry of Education, Nankai University, Tianjin 300350, China"}]}],"member":"1968","published-online":{"date-parts":[[2023,10,22]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"108","DOI":"10.1159\/000098982","article-title":"A universal system for interactive image-directed neurosurgery","volume":"58","author":"Maciunas","year":"1992","journal-title":"Ster. Funct. Neurosurg."},{"key":"ref_2","first-page":"1","article-title":"An Efficient Magnetic Tracking Method Using Uniaxial Sensing Coil","volume":"50","author":"Song","year":"2014","journal-title":"IEEE Trans.
Magn."},{"key":"ref_3","unstructured":"Morris, A., Dickinson, M., and Zalzala, A. (1991, January 25\u201328). An enhanced ultrasonic system for robot end effector tracking. Proceedings of the International Conference on Control 1991, Control\u201991, Edinburgh, UK."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"435","DOI":"10.1049\/iet-cvi.2011.0059","article-title":"Marker-based quadri-ocular tracking system for surgery","volume":"6","author":"He","year":"2012","journal-title":"Proc. IET Comput. Vis."},{"key":"ref_5","first-page":"921","article-title":"Marker-based surgical instrument tracking using dual kinect sensors","volume":"11","author":"Ren","year":"2013","journal-title":"IEEE Trans. Autom. Sci. Eng."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Wang, J., Ren, H., and Meng, M.Q.H. (2014, January 4\u20137). A preliminary study on surgical instrument tracking based on multiple modules of monocular pose estimation. Proceedings of the 4th Annual IEEE International Conference on Cyber Technology in Automation, Control and Intelligent, Hong Kong, China.","DOI":"10.1109\/CYBER.2014.6917451"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Wiles, A.D., Thompson, D.G., and Frantz, D.D. (2004, January 14\u201319). Accuracy assessment and interpretation for optical tracking systems. Proceedings of the Medical Imaging 2004: Visualization, Image-Guided Procedures, and Display, San Diego, CA, USA.","DOI":"10.1117\/12.536128"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"261","DOI":"10.2106\/JBJS.F.00601","article-title":"Navigated total knee replacement: A meta-analysis","volume":"89","author":"Bauwens","year":"2007","journal-title":"JBJS"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"45","DOI":"10.1007\/s11548-008-0268-8","article-title":"Localization and registration accuracy in image guided neurosurgery: A clinical study","volume":"4","author":"Shamir","year":"2009","journal-title":"Int. J. Comput. 
Assist. Radiol. Surg."},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Garc\u00eda-V\u00e1zquez, V., Marinetto, E., Santos-Miranda, J., Calvo, F., Desco, M., and Pascau, J. (2013). Feasibility of integrating a multi-camera optical tracking system in intra-operative electron radiation therapy scenarios. Phys. Med. Biol., 58.","DOI":"10.1088\/0031-9155\/58\/24\/8769"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Tran, D.T., Sakurai, R., Yamazoe, H., and Lee, J.H. (July, January 28). PCA-based surgical phases estimation with a multi-camera system. Proceedings of the 2017 14th International Conference on Ubiquitous Robots and Ambient Intelligence (URAI), Jeju, Republic of Korea.","DOI":"10.1109\/URAI.2017.7992903"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"e1889","DOI":"10.1002\/rcs.1889","article-title":"A novel rotational matrix and translation vector algorithm: Geometric accuracy for augmented reality in oral and maxillofacial surgeries","volume":"14","author":"Murugesan","year":"2018","journal-title":"Int. J. Med. Robot. Comput. Assist. Surg."},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Pfeiffer, J.H., Borb\u00e1th, \u00c1., Dietz, C., and Lueth, T.C. (2016, January 13\u201315). A new module combining two tracking cameras to expand the workspace of surgical navigation systems. Proceedings of the 2016 IEEE\/SICE International Symposium on System Integration (SII), Sapporo, Japan.","DOI":"10.1109\/SII.2016.7844044"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Wang, J., Qi, L., and Meng, M.Q.H. (2015, January 8\u201310). Robot-assisted occlusion avoidance for surgical instrument optical tracking system. Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China.","DOI":"10.1109\/ICInfA.2015.7279316"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Tobergte, A., Pomarlan, M., and Hirzinger, G. (2009, January 10\u201315). 
Robust multi sensor pose estimation for medical applications. Proceedings of the 2009 IEEE\/RSJ International Conference on Intelligent Robots and Systems, St. Louis, MO, USA.","DOI":"10.1109\/IROS.2009.5354696"},{"key":"ref_16","first-page":"2","article-title":"Computer vision in the operating room: Opportunities and caveats","volume":"3","author":"Mascagni","year":"2020","journal-title":"IEEE Trans. Med. Rob. Bionics"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1053\/j.semtcvs.2019.10.011","article-title":"Cognitive engineering to improve patient safety and outcomes in cardiothoracic surgery","volume":"32","author":"Zenati","year":"2020","journal-title":"Semin. Thorac. Cardiovasc. Surg."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"901","DOI":"10.1109\/TMRB.2022.3196321","article-title":"Fluoroscopy-Guided Robotic System for Transforaminal Lumbar Epidural Injections","volume":"4","author":"Gao","year":"2022","journal-title":"IEEE Trans. Med. Robot. Bionics"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Wang, H., and Wang, P. (2022, January 11\u201315). Clinical Study of Cervical Spine Motion Trajectory Based on Computer Motion Capture Technology. Proceedings of the 2022 World Automation Congress (WAC), San Antonio, TX, USA.","DOI":"10.23919\/WAC55640.2022.9934162"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"347","DOI":"10.1016\/j.physio.2013.03.001","article-title":"Affordable clinical gait analysis: An assessment of the marker tracking accuracy of a new low-cost optical 3D motion analysis system","volume":"99","author":"Carse","year":"2013","journal-title":"Physiotherapy"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"2085","DOI":"10.1016\/j.jbiomech.2016.05.007","article-title":"Analysis of accuracy in optical motion capture\u2013A protocol for laboratory setup evaluation","volume":"49","author":"Eichelberger","year":"2016","journal-title":"J. 
Biomech."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"64359","DOI":"10.1109\/ACCESS.2018.2878323","article-title":"Multicamera optical tracker assessment for computer aided surgery applications","volume":"6","author":"Marinetto","year":"2018","journal-title":"IEEE Access"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Min, Z., Zhu, D., and Meng, M.Q.H. (2016, January 1\u20133). Accuracy assessment of an N-ocular motion capture system for surgical tool tip tracking using pivot calibration. Proceedings of the 2016 IEEE International Conference on Information and Automation (ICIA), Ningbo, China.","DOI":"10.1109\/ICInfA.2016.7832079"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"237","DOI":"10.1016\/j.jbiomech.2017.05.006","article-title":"Accuracy map of an optical motion capture system with 42 or 21 cameras in a large measurement volume","volume":"58","author":"Aurand","year":"2017","journal-title":"J. Biomech."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Meng, Y., You, Y., Geng, P., Song, Z., Wang, H., and Qin, Y. (2021, January 27\u201331). Development of an intra-operative active navigation system for robot-assisted surgery. Proceedings of the 2021 IEEE International Conference on Robotics and Biomimetics (ROBIO), Sanya, China.","DOI":"10.1109\/ROBIO54168.2021.9739506"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Skurowski, P., and Pawlyta, M. (2019). On the noise complexity in an optical motion capture facility. 
Sensors, 19.","DOI":"10.20944\/preprints201909.0178.v1"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"927","DOI":"10.1016\/S0031-3203(01)00076-0","article-title":"Optimal camera placement for accurate reconstruction","volume":"35","author":"Olague","year":"2002","journal-title":"Pattern Recognit."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"1209","DOI":"10.1109\/TVCG.2016.2637334","article-title":"Optimal camera placement for motion capture systems","volume":"23","author":"Rahimian","year":"2016","journal-title":"IEEE Trans. Vis. Comput. Graph."},{"key":"ref_29","unstructured":"Tan, J., Li, D., Zhang, J.Q., Hu, B., and Lu, Q. (December, January 28). Biased Kalman filter. Proceedings of the 2011 Fifth International Conference on Sensing Technology, Palmerston North, New Zealand."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"280","DOI":"10.1117\/1.601615","article-title":"Analysis of uncertainty bounds due to quantization for three-dimensional position estimation using multiple cameras","volume":"37","author":"Wu","year":"1998","journal-title":"Opt. Eng."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"217","DOI":"10.1007\/s00138-007-0094-y","article-title":"An occlusion metric for selecting robust camera configurations","volume":"19","author":"Chen","year":"2008","journal-title":"Mach. Vis. Appl."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"110040","DOI":"10.1016\/j.asoc.2023.110040","article-title":"EGNN: Graph structure learning based on evolutionary computation helps more in graph neural networks","volume":"135","author":"Liu","year":"2023","journal-title":"Appl. Soft Comput."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"1108","DOI":"10.1007\/s12555-021-0882-6","article-title":"Output-feedback Robust Tracking Control of Uncertain Systems via Adaptive Learning","volume":"21","author":"Jun","year":"2023","journal-title":"Int. J. Control Autom. 
Syst."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/20\/8637\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T21:09:57Z","timestamp":1760130597000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/20\/8637"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,10,22]]},"references-count":33,"journal-issue":{"issue":"20","published-online":{"date-parts":[[2023,10]]}},"alternative-id":["s23208637"],"URL":"https:\/\/doi.org\/10.3390\/s23208637","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,10,22]]}}}