{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,13]],"date-time":"2026-03-13T22:50:49Z","timestamp":1773442249076,"version":"3.50.1"},"reference-count":36,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2022,3,25]],"date-time":"2022-03-25T00:00:00Z","timestamp":1648166400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"Michigan Translational Research and Commercialization (MTRAC) and","award":["380137\/23R343 and 192748\/117E37"],"award-info":[{"award-number":["380137\/23R343 and 192748\/117E37"]}]},{"name":"US Department of Veterans Affairs National Center for Patient Safety","award":["VA701-15-Q-O179\/2VHF"],"award-info":[{"award-number":["VA701-15-Q-O179\/2VHF"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Robotics"],"abstract":"<jats:p>Positioning a camera during laparoscopic and robotic procedures is challenging and essential for successful operations. During surgery, if the camera view is not optimal, surgery becomes more complex and potentially error-prone. To address this need, we have developed a voice interface to an autonomous camera system that can trigger behavioral changes and be more of a partner to the surgeon. Similarly to a human operator, the camera can take cues from the surgeon to help create optimized surgical camera views. It has the advantage of nominal behavior that is helpful in most general cases and has a natural language interface that makes it dynamically customizable and on-demand. It permits the control of a camera with a higher level of abstraction. This paper shows the implementation details and usability of a voice-activated autonomous camera system. A voice activation test on a limited set of practiced key phrases was performed using both online and offline voice recognition systems. The results show an average recognition accuracy greater than 94% for the online system and 86% for the offline system. However, the response time of the online system was greater than 1.5 s, whereas that of the offline system was 0.6 s. This work is a step towards cooperative surgical robots that will effectively partner with human operators to enable more robust surgeries. A video link of the system in operation is provided in this paper.<\/jats:p>","DOI":"10.3390\/robotics11020040","type":"journal-article","created":{"date-parts":[[2022,3,27]],"date-time":"2022-03-27T21:31:25Z","timestamp":1648416685000},"page":"40","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":18,"title":["A Natural Language Interface for an Autonomous Camera Control System on the da Vinci Surgical Robot"],"prefix":"10.3390","volume":"11","author":[{"given":"Maysara","family":"Elazzazi","sequence":"first","affiliation":[{"name":"Department of Electrical and Computer Engineering, Wayne State University, Detroit, MI 48202, USA"}]},{"given":"Luay","family":"Jawad","sequence":"additional","affiliation":[{"name":"Department of Computer Science, Wayne State University, Detroit, MI 48202, USA"}]},{"given":"Mohammed","family":"Hilfi","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, Wayne State University, Detroit, MI 48202, USA"}]},{"given":"Abhilash","family":"Pandya","sequence":"additional","affiliation":[{"name":"Department of Electrical and Computer Engineering, Wayne State University, Detroit, MI 48202, USA"}]}],"member":"1968","published-online":{"date-parts":[[2022,3,25]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"56","DOI":"10.1109\/MRA.2021.3101646","article-title":"Accelerating Surgical Robotics Research: A Review of 10 Years with the da Vinci Research Kit","volume":"28","author":"Mariani","year":"2021","journal-title":"IEEE Robot. Autom. Mag."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"310","DOI":"10.3390\/robotics3030310","article-title":"A Review of Camera Viewpoint Automation in Robotic and Laparoscopic Surgery","volume":"3","author":"Pandya","year":"2014","journal-title":"Robotics"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"576","DOI":"10.1002\/rcs.1716","article-title":"Task analysis of laparoscopic camera control schemes","volume":"12","author":"Ellis","year":"2016","journal-title":"Int. J. Med. Robot. Comput. Assist. Surg."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Staub, C., Can, S., Knoll, A., Nitsch, V., Karl, I., and F\u00e4rber, B. (2011, January 14\u201317). Implementation and evaluation of a gesture-based input method in robotic surgery. Proceedings of the 2011 IEEE International Workshop on Haptic Audio Visual Environments and Games, Qinhuangdao, China.","DOI":"10.1109\/HAVE.2011.6088384"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1415","DOI":"10.1007\/s004649900871","article-title":"Laparoscopic visual field: Voice vs foot pedal interfaces for control of the AESOP robot","volume":"12","author":"Allaf","year":"1998","journal-title":"Surg. Endosc."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"369","DOI":"10.1080\/13645700500381685","article-title":"Voice recognition interfaces (VRI) optimize the utilization of theatre staff and time during laparoscopic cholecystectomy","volume":"14","author":"Mohammed","year":"2005","journal-title":"Minim. Invasive Ther. Allied Technol."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"1216","DOI":"10.1007\/s00464-003-9200-z","article-title":"The AESOP robot system in laparoscopic surgery: Increased risk or advantage for surgeon and patient?","volume":"18","author":"Kraft","year":"2004","journal-title":"Surg. Endosc."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"900","DOI":"10.1097\/SLA.0000000000004419","article-title":"Natural Language Processing in Surgery: A Systematic Review and Meta-analysis","volume":"273","author":"Mellia","year":"2021","journal-title":"Ann. Surg."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"2748","DOI":"10.1093\/humrep\/13.10.2748","article-title":"One year of experience working with the aid of a robotic assistant (the voice-controlled optic holder AESOP) in gynaecological endoscopic surgery","volume":"13","author":"Mettler","year":"1998","journal-title":"Hum. Reprod."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"291","DOI":"10.1007\/s11548-016-1480-6","article-title":"Touchless interaction with software in interventional radiology and surgery: A systematic literature review","volume":"12","author":"Mewes","year":"2016","journal-title":"Int. J. Comput. Assist. Radiol. Surg."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"123","DOI":"10.1055\/s-2006-939679","article-title":"The voice-controlled robotic assist scope holder AESOP for the endoscopic approach to the sella","volume":"16","author":"Nathan","year":"2006","journal-title":"Skull Base"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"575","DOI":"10.1007\/s00464-012-2488-9","article-title":"Integrated operation systems and voice recognition in minimally invasive surgery: Comparison of two systems","volume":"27","author":"Perrakis","year":"2013","journal-title":"Surg. Endosc."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"1131","DOI":"10.1007\/BF00705739","article-title":"AESOP robotic arm","volume":"8","author":"Unger","year":"1994","journal-title":"Surg. Endosc."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"263","DOI":"10.1002\/rcs.1531","article-title":"Visual servoing in medical robotics: A survey. Part I: Endoscopic and direct vision imaging\u2014Techniques and applications","volume":"10","author":"Azizian","year":"2014","journal-title":"Int. J. Med. Robot. Comput. Assist. Surg."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"40","DOI":"10.1109\/51.566151","article-title":"Real-time visual servoing for laparoscopic surgery. Controlling robot motion with color image segmentation","volume":"16","author":"Wei","year":"1997","journal-title":"Eng. Med. Biol. Mag. IEEE"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Bihlmaier, A., and Worn, H. (2015, January 17\u201318). Learning surgical know-how: Dexterity for a cognitive endoscope robot. Proceedings of the Cybernetics and Intelligent Systems (CIS) and IEEE Conference on Robotics, Automation and Mechatronics (RAM), 2015 IEEE 7th International Conference on Engineering Education (ICEED), Kanazawa, Japan.","DOI":"10.1109\/ICCIS.2015.7274610"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Da Col, T., Mariani, A., Deguet, A., Menciassi, A., Kazanzides, P., and De Momi, E. (2020, January 25\u201329). Scan: System for camera autonomous navigation in robotic-assisted surgery. Proceedings of the 2020 IEEE\/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA.","DOI":"10.1109\/IROS45743.2020.9341548"},{"key":"ref_18","unstructured":"Eslamian, S., Reisner, L.A., King, B.W., and Pandya, A.K. (2016). Towards the implementation of an autonomous camera algorithm on the da vinci platform. Medicine Meets Virtual Reality 22, IOS Press."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"e2036","DOI":"10.1002\/rcs.2036","article-title":"Development and evaluation of an autonomous camera control algorithm on the da Vinci Surgical System","volume":"16","author":"Eslamian","year":"2020","journal-title":"Int. J. Med. Robot. Comput. Assist. Surg."},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Weede, O., Bihlmaier, A., Hutzl, J., M\u00fcller-Stich, B.P., and W\u00f6rn, H. (2013, January 4\u20136). Towards cognitive medical robotics in minimal invasive surgery. Proceedings of the Conference on Advances in Robotics, Pune, India.","DOI":"10.1145\/2506095.2506137"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Weede, O., Monnich, H., Muller, B., and Worn, H. (2011, January 9\u201313). An intelligent and autonomous endoscopic guidance system for minimally invasive surgery. Proceedings of the IEEE International Conference on Robotics and Automation, Shanghai, China. Available online: https:\/\/go.exlibris.link\/j6RcCL1h.","DOI":"10.1109\/ICRA.2011.5980216"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Composto, A.M., Reisner, L.A., Pandya, A.K., Edelman, D.A., Jacobs, K.L., and Bagian, T.M. (2017). Methods to Characterize Operating Room Variables in Robotic Surgery to Enhance Patient Safety. Advances in Human Factors and Ergonomics in Healthcare, Springer.","DOI":"10.1007\/978-3-319-41652-6_21"},{"key":"ref_23","unstructured":"(2022, January 15). Vosk Offline Speech Recognition API. Available online: https:\/\/alphacephei.com\/vosk\/."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Chen, Z., Deguet, A., Taylor, R., DiMaio, S., Fischer, G., and Kazanzides, P. (2013, January 22\u201326). An Open-Source Hardware and Software Platform for Telesurgical Robotics Research. Proceedings of the MICCAI Workshop on Systems and Architecture for Computer Assisted Interventions, Nagoya, Japan.","DOI":"10.54294\/2dcog6"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Quigley, M., Conley, K., Gerkey, B., Faust, J., Foote, T., Leibs, J., Wheeler, R., and Ng, A.Y. (2009, January 12\u201317). ROS: An open-source Robot Operating System. Proceedings of the ICRA Workshop on Open Source Software, Kobe, Japan.","DOI":"10.1109\/MRA.2010.936956"},{"key":"ref_26","unstructured":"Open Source Robotics Foundation (2016, March 01). RViz. Available online: http:\/\/wiki.ros.org\/rviz."},{"key":"ref_27","unstructured":"(2022, January 24). Alexa Skills Builder. Available online: https:\/\/developer.amazon.com\/en-US\/alexa."},{"key":"ref_28","unstructured":"(2022, March 02). Ngrok. Available online: https:\/\/ngrok.com\/."},{"key":"ref_29","unstructured":"Povey, D., Ghoshal, A., Boulianne, G., Burget, L., Glembek, O., Goel, N., Hannemann, M., Motlicek, P., Qian, Y., and Schwarz, P. (2011, January 11\u201315). The Kaldi speech recognition toolkit. Proceedings of the IEEE 2011 Workshop on Automatic Speech Recognition and Understanding, Waikoloa, HI, USA."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1002\/rcs.2166","article-title":"An entropy-based approach to detect and localize intraoperative bleeding during minimally invasive surgery","volume":"16","author":"Rahbar","year":"2020","journal-title":"Int. J. Med. Robot. Comput. Assist. Surg."},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Daneshgar Rahbar, M., Ying, H., and Pandya, A. (2021). Visual Intelligence: Prediction of Unintentional Surgical-Tool-Induced Bleeding during Robotic and Laparoscopic Surgery. Robotics, 10.","DOI":"10.3390\/robotics10010037"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Pandya, A., Eslamian, S., Ying, H., Nokleby, M., and Reisner, L.A. (2019). A Robotic Recording and Playback Platform for Training Surgeons and Learning Autonomous Behaviors Using the da Vinci Surgical System. Robotics, 8.","DOI":"10.3390\/robotics8010009"},{"key":"ref_33","unstructured":"(2022, February 15). Available online: https:\/\/github.com\/careslab\/dvrk_voice."},{"key":"ref_34","unstructured":"(2022, February 15). Available online: https:\/\/github.com\/careslab\/dvrk_autocamera."},{"key":"ref_35","unstructured":"(2022, February 15). Available online: https:\/\/github.com\/careslab\/dvrk_assistant_bridge."},{"key":"ref_36","unstructured":"(2022, February 15). Available online: https:\/\/youtu.be\/UZa7xCtOYT0."}],"container-title":["Robotics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2218-6581\/11\/2\/40\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T22:43:31Z","timestamp":1760136211000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2218-6581\/11\/2\/40"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,3,25]]},"references-count":36,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2022,4]]}},"alternative-id":["robotics11020040"],"URL":"https:\/\/doi.org\/10.3390\/robotics11020040","relation":{},"ISSN":["2218-6581"],"issn-type":[{"value":"2218-6581","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,3,25]]}}}