{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,15]],"date-time":"2026-04-15T01:24:58Z","timestamp":1776216298983,"version":"3.50.1"},"reference-count":44,"publisher":"MDPI AG","issue":"9","license":[{"start":{"date-parts":[[2023,4,23]],"date-time":"2023-04-23T00:00:00Z","timestamp":1682208000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001823","name":"the Operational Program Research, Development and Education","doi-asserted-by":"publisher","award":["CZ.02.1.01\/0.0\/0.0\/17_049\/0008425"],"award-info":[{"award-number":["CZ.02.1.01\/0.0\/0.0\/17_049\/0008425"]}],"id":[{"id":"10.13039\/501100001823","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001823","name":"the Operational Program Research, Development and Education","doi-asserted-by":"publisher","award":["SP2023\/060"],"award-info":[{"award-number":["SP2023\/060"]}],"id":[{"id":"10.13039\/501100001823","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001823","name":"the state budget of the Czech Republic","doi-asserted-by":"publisher","award":["CZ.02.1.01\/0.0\/0.0\/17_049\/0008425"],"award-info":[{"award-number":["CZ.02.1.01\/0.0\/0.0\/17_049\/0008425"]}],"id":[{"id":"10.13039\/501100001823","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001823","name":"the state budget of the Czech Republic","doi-asserted-by":"publisher","award":["SP2023\/060"],"award-info":[{"award-number":["SP2023\/060"]}],"id":[{"id":"10.13039\/501100001823","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>The article explores the possibilities of using hand gestures as a control interface for robotic systems in a collaborative workspace. 
The development of hand gesture control interfaces has become increasingly important in everyday life as well as in professional contexts such as manufacturing. We present a system designed to facilitate collaboration between humans and robots in manufacturing processes that require frequent revisions of the robot path; it allows direct definition of the waypoints, which differentiates it from existing systems. We introduce a novel and intuitive approach to human\u2013robot cooperation through the use of simple gestures. The proposed interface was developed and implemented as part of a robotic workspace, utilising three RGB-D sensors to monitor the operator\u2019s hand movements within the workspace. The system employs distributed data processing through multiple Jetson Nano units, with each unit processing data from a single camera. The MediaPipe solution is utilised to localise the hand landmarks in the RGB image, enabling gesture recognition. We compare conventional methods of defining robot trajectories with the developed gesture-based system in an experiment with 20 volunteers. The experiment involved verification of the system under realistic conditions in a real workspace closely resembling the intended industrial application. Data collected during the experiment included both objective and subjective parameters. The results indicate that the gesture-based interface enables users to define a given path objectively faster than conventional methods. We critically analyse the features and limitations of the developed system and suggest directions for future research. 
Overall, the experimental results indicate the usefulness of the developed system as it can speed up the definition of the robot\u2019s path.<\/jats:p>","DOI":"10.3390\/s23094219","type":"journal-article","created":{"date-parts":[[2023,4,24]],"date-time":"2023-04-24T03:04:08Z","timestamp":1682305448000},"page":"4219","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":17,"title":["Hand Gesture Interface for Robot Path Definition in Collaborative Applications: Implementation and Comparative Study"],"prefix":"10.3390","volume":"23","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-6942-4280","authenticated-orcid":false,"given":"Ale\u0161","family":"Vysock\u00fd","sequence":"first","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB\u2014Technical University of Ostrava, 17. Listopadu 2172\/15, 708 00 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0009-0009-9453-9958","authenticated-orcid":false,"given":"Tom\u00e1\u0161","family":"Po\u0161tulka","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB\u2014Technical University of Ostrava, 17. Listopadu 2172\/15, 708 00 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-4946-8638","authenticated-orcid":false,"given":"Jakub","family":"Chlebek","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB\u2014Technical University of Ostrava, 17. Listopadu 2172\/15, 708 00 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2357-806X","authenticated-orcid":false,"given":"Tom\u00e1\u0161","family":"Kot","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB\u2014Technical University of Ostrava, 17. 
Listopadu 2172\/15, 708 00 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3099-1963","authenticated-orcid":false,"given":"Jan","family":"Maslowski","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB\u2014Technical University of Ostrava, 17. Listopadu 2172\/15, 708 00 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8984-153X","authenticated-orcid":false,"given":"Stefan","family":"Grushko","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB\u2014Technical University of Ostrava, 17. Listopadu 2172\/15, 708 00 Ostrava, Czech Republic"}]}],"member":"1968","published-online":{"date-parts":[[2023,4,23]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"903","DOI":"10.17973\/MMSJ.2016_06_201611","article-title":"Human-Robot Collaboration in Industry","volume":"2016","author":"Vysocky","year":"2016","journal-title":"MM Sci. J."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"416","DOI":"10.1108\/IR-03-2015-0059","article-title":"The Integration of Contactless Static Pose Recognition and Dynamic Hand Motion Tracking Control System for Industrial Human and Robot Collaboration","volume":"42","author":"Tang","year":"2015","journal-title":"Ind. Robot Int. J."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"5497","DOI":"10.1007\/s00170-022-09107-1","article-title":"Precise Positioning of Collaborative Robotic Manipulators Using Hand-Guiding","volume":"120","author":"Safeea","year":"2022","journal-title":"Int. J. Adv. Manuf. Technol."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"121","DOI":"10.1016\/j.techfore.2021.121284","article-title":"Human-Machine Interface in Smart Factory: A Systematic Literature Review","volume":"174","author":"Kumar","year":"2022","journal-title":"Technol. Forecast. Soc. 
Chang."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"123","DOI":"10.1016\/j.procir.2020.05.213","article-title":"Programming Cobots by Voice: A Human-Centered, Web-Based Approach","volume":"97","author":"Ionescu","year":"2021","journal-title":"Procedia CIRP"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1017\/S0263574714000010","article-title":"Command-Based Voice Teleoperation of a Mobile Robot via a Human-Robot Interface","volume":"33","author":"Poncela","year":"2015","journal-title":"Robotica"},{"key":"ref_7","first-page":"505","article-title":"Robot-by-voice: Experiments on Commanding an Industrial Robot Using the Human Voice. Industrial Robot","volume":"32","year":"2005","journal-title":"Int. J."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Scalera, L., Seriani, S., Gallina, P., Lentini, M., and Gasparetto, A. (2021). Human\u2013Robot Interaction through Eye Tracking for Artistic Drawing. Robotics, 10.","DOI":"10.3390\/robotics10020054"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"4154","DOI":"10.17973\/MMSJ.2020_11_2020064","article-title":"Tuning Perception and Motion Planning Parameters for Moveit! Framework","volume":"2020","author":"Grushko","year":"2020","journal-title":"MM Sci. J."},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Grushko, S., Vysock\u00fd, A., Heczko, D., and Bobovsk\u00fd, Z. (2021). Intuitive Spatial Tactile Feedback for Better Awareness about Robot Trajectory during Human\u2013Robot Collaboration. Sensors, 21.","DOI":"10.3390\/s21175748"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Liu, H., Xi, Y., Song, W., Um, K., and Cho, K. (2013, January 21\u201322). Gesture-Based NUI Application for Real-Time Path Modification. 
Proceedings of the 2013 IEEE 11th International Conference on Dependable, Autonomic and Secure Computing, Chengdu, China.","DOI":"10.1109\/DASC.2013.104"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"032042","DOI":"10.1088\/1742-6596\/1187\/3\/032042","article-title":"Natural Gesture Control of a Delta Robot Using Leap Motion","volume":"1187","author":"Zhang","year":"2019","journal-title":"J. Phys. Conf. Ser."},{"key":"ref_13","unstructured":"Takahashi, S. (2023, March 27). Hand-Gesture-Recognition-Using-Mediapipe. Available online: https:\/\/github.com\/Kazuhito00\/hand-gesture-recognition-using-mediapipe."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Kamath, V., and Bhat, S. (2014, January 21\u201322). Kinect Sensor Based Real-Time Robot Path Planning Using Hand Gesture and Clap Sound. Proceedings of the International Conference on Circuits, Communication, Control and Computing, Bangalore, India.","DOI":"10.1109\/CIMCA.2014.7057774"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Quintero, C.P., Fomena, R.T., Shademan, A., Wolleb, N., Dick, T., and Jagersand, M. (2013, January 6\u201310). SEPO: Selecting by Pointing as an Intuitive Human-Robot Command Interface. Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany.","DOI":"10.1109\/ICRA.2013.6630719"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Vysock\u00fd, A., Grushko, S., O\u0161\u010d\u00e1dal, P., Kot, T., Babjak, J., J\u00e1no\u0161, R., Sukop, M., and Bobovsk\u00fd, Z. (2020). Analysis of Precision and Stability of Hand Tracking with Leap Motion Sensor. 
Sensors, 20.","DOI":"10.3390\/s20154088"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"3702","DOI":"10.3390\/s140203702","article-title":"An Analysis of the Precision and Reliability of the Leap Motion Sensor and Its Suitability for Static and Dynamic Tracking","volume":"14","author":"Guna","year":"2014","journal-title":"Sensors"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"4194","DOI":"10.17973\/MMSJ.2020_12_2020057","article-title":"A Depth Image Quality Benchmark of Three Popular Low-Cost Depth Cameras","volume":"2020","author":"Jha","year":"2020","journal-title":"MM Sci. J."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"99734","DOI":"10.1109\/ACCESS.2022.3206948","article-title":"Generating Synthetic Depth Image Dataset for Industrial Applications of Hand Localization","volume":"10","author":"Vysocky","year":"2022","journal-title":"IEEE Access"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"M\u00fcezzino\u011flu, T., and Karak\u00f6se, M. (2021). An Intelligent Human\u2013Unmanned Aerial Vehicle Interaction Approach in Real Time Based on Machine Learning Using Wearable Gloves. Sensors, 21.","DOI":"10.3390\/s21051766"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Carneiro, M.R., Rosa, L.P., de Almeida, A.T., and Tavakoli, M. (2022, January 4\u20138). Tailor-Made Smart Glove for Robot Teleoperation, Using Printed Stretchable Sensors. Proceedings of the 2022 IEEE 5th International Conference on Soft Robotics (RoboSoft), Edinburgh, UK.","DOI":"10.1109\/RoboSoft54090.2022.9762214"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Grushko, S., Vysock\u00fd, A., O\u0161\u010d\u00e1dal, P., Vocetka, M., Nov\u00e1k, P., and Bobovsk\u00fd, Z. (2021). Improved Mutual Understanding for Human-Robot Collaboration: Combining Human-Aware Motion Planning with Haptic Feedback Devices for Communicating Planned Trajectory. 
Sensors, 21.","DOI":"10.3390\/s21113673"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"159","DOI":"10.3233\/ICA-200637","article-title":"Human-Robot Interaction in Industry 4.0 Based on an Internet of Things Real-Time Gesture Control System","volume":"28","author":"Olivares","year":"2021","journal-title":"Integr. Comput.-Aided Eng."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"036025","DOI":"10.1088\/1741-2552\/ab8682","article-title":"Efficient Correction of Armband Rotation for Myoelectric-Based Gesture Control Interface","volume":"17","author":"He","year":"2020","journal-title":"J. Neural Eng."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Caeiro-Rodr\u00edguez, M., Otero-Gonz\u00e1lez, I., Mikic-Fonte, F.A., and Llamas-Nistal, M. (2021). A Systematic Review of Commercial Smart Gloves: Current Status and Applications. Sensors, 21.","DOI":"10.3390\/s21082667"},{"key":"ref_26","first-page":"1","article-title":"AuraRing: Precise Electromagnetic Finger Tracking. Proc","volume":"3","author":"Parizi","year":"2020","journal-title":"ACM Interact. Mob. Wearable Ubiquitous Technol."},{"key":"ref_27","first-page":"100175","article-title":"Augmented Reality Smart Glasses in Industrial Assembly: Current Status and Future Challenges","volume":"20","author":"Danielsson","year":"2020","journal-title":"J. Ind. Inf. Integr."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Oudah, M., Al-Naji, A., and Chahl, J. (2020). Hand Gesture Recognition Based on Computer Vision: A Review of Techniques. J. Imaging, 6.","DOI":"10.3390\/jimaging6080073"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"M\u00fcezzino\u011flu, T., and Karak\u00f6se, M. (November, January 12). Wearable Glove Based Approach for Human-UAV Interaction. 
Proceedings of the 2020 IEEE International Symposium on Systems Engineering (ISSE), Vienna, Austria.","DOI":"10.1109\/ISSE49799.2020.9272208"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"DelPreto, J., and Rus, D. (2020, January 9). Plug-and-Play Gesture Control Using Muscle and Motion Sensors. Proceedings of the 2020 ACM\/IEEE International Conference on Human-Robot Interaction, Association for Computing Machinery, New York, NY, USA.","DOI":"10.1145\/3319502.3374823"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Pomykalski, P., Wo\u017aniak, M.P., Wo\u017aniak, P.W., Grudzie\u0144, K., Zhao, S., and Romanowski, A. (2020, January 25\u201330). Considering Wake Gestures for Smart Assistant Use. Proceedings of the CHI \u201920: CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA.","DOI":"10.1145\/3334480.3383089"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"45","DOI":"10.1016\/j.smhl.2017.12.003","article-title":"Ultigesture: A Wristband-Based Platform for Continuous Gesture Control in Healthcare","volume":"11","author":"Zhao","year":"2019","journal-title":"Smart Health"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"O\u0161\u010d\u00e1dal, P., Heczko, D., Vysock\u00fd, A., Mlotek, J., Nov\u00e1k, P., Virgala, I., Sukop, M., and Bobovsk\u00fd, Z. (2020). Improved Pose Estimation of Aruco Tags Using a Novel 3D Placement Strategy. Sensors, 20.","DOI":"10.3390\/s20174825"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"O\u0161\u010d\u00e1dal, P., Spurn\u00fd, T., Kot, T., Grushko, S., Suder, J., Heczko, D., Nov\u00e1k, P., and Bobovsk\u00fd, Z. (2022). Distributed Camera Subsystem for Obstacle Detection. Sensors, 22.","DOI":"10.3390\/s22124588"},{"key":"ref_35","unstructured":"Grushko, S. (2023, March 27). Fork of Google\u2019s MediaPipe (v0.8.9) for Jetson Nano (JetPack 4.6) CUDA (10.2). 
Available online: https:\/\/github.com\/anion0278\/mediapipe-jetson."},{"key":"ref_36","unstructured":"(2023, March 27). Hand Landmarks Detection Guide. Available online: https:\/\/google.github.io\/mediapipe\/solutions\/hands.html."},{"key":"ref_37","unstructured":"(2023, March 01). How to Calculate Z-Score and Its Meaning. Available online: https:\/\/www.investopedia.com\/terms\/z\/zscore.asp."},{"key":"ref_38","unstructured":"(2023, February 23). APA Dictionary of Psychology. Available online: https:\/\/dictionary.apa.org\/."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"904","DOI":"10.1177\/154193120605000909","article-title":"NASA-Task Load Index (NASA-TLX); 20 Years Later","volume":"50","author":"Hart","year":"2006","journal-title":"Proc. Hum. Factors Ergon. Soc. Annu. Meet."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"1522","DOI":"10.1177\/154193120805201946","article-title":"Measurement Invariance of the NASA TLX","volume":"52","author":"Bustamante","year":"2008","journal-title":"Proc. Hum. Factors Ergon. Soc. Annu. Meet."},{"key":"ref_41","unstructured":"Brooke, J. (1995). SUS: A Quick and Dirty Usability Scale. Usability Evaluation In Industry, CRC Press. [1st ed.]."},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Bolano, G., Roennau, A., and Dillmann, R. (2018, January 27\u201331). Transparent Robot Behavior by Adding Intuitive Visual and Acoustic Feedback to Motion Replanning. Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China.","DOI":"10.1109\/ROMAN.2018.8525671"},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"101891","DOI":"10.1016\/j.rcim.2019.101891","article-title":"AR-Based Interaction for Human-Robot Collaborative Manufacturing","volume":"63","author":"Hietanen","year":"2020","journal-title":"Robot. Comput.-Integr. 
Manuf."},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"1729881418755780","DOI":"10.1177\/1729881418755780","article-title":"Kinect v2 Infrared Images Correction","volume":"15","author":"Krys","year":"2018","journal-title":"Int. J. Adv. Robot. Syst."}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/9\/4219\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T19:21:53Z","timestamp":1760124113000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/23\/9\/4219"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,4,23]]},"references-count":44,"journal-issue":{"issue":"9","published-online":{"date-parts":[[2023,5]]}},"alternative-id":["s23094219"],"URL":"https:\/\/doi.org\/10.3390\/s23094219","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,4,23]]}}}