{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T23:36:48Z","timestamp":1776123408245,"version":"3.50.1"},"reference-count":39,"publisher":"MDPI AG","issue":"11","license":[{"start":{"date-parts":[[2021,5,25]],"date-time":"2021-05-25T00:00:00Z","timestamp":1621900800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"Research Platform focused on Industry 4.0 and Robotics in Ostrava Agglomeration project","award":["CZ.02.1.01\/0.0\/0.0\/17_049\/0008425"],"award-info":[{"award-number":["CZ.02.1.01\/0.0\/0.0\/17_049\/0008425"]}]},{"name":"Specific research project financed by the state budget of the Czech Republic","award":["SP2021\/47"],"award-info":[{"award-number":["SP2021\/47"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>In a collaborative scenario, the communication between humans and robots is a fundamental aspect to achieve good efficiency and ergonomics in the task execution. A lot of research has been made related to enabling a robot system to understand and predict human behaviour, allowing the robot to adapt its motion to avoid collisions with human workers. Assuming the production task has a high degree of variability, the robot\u2019s movements can be difficult to predict, leading to a feeling of anxiety in the worker when the robot changes its trajectory and approaches since the worker has no information about the planned movement of the robot. Additionally, without information about the robot\u2019s movement, the human worker cannot effectively plan own activity without forcing the robot to constantly replan its movement. We propose a novel approach to communicating the robot\u2019s intentions to a human worker. 
The improvement to the collaboration is presented by introducing haptic feedback devices, whose task is to notify the human worker about the currently planned robot\u2019s trajectory and changes in its status. In order to verify the effectiveness of the developed human-machine interface in the conditions of a shared collaborative workspace, a user study was designed and conducted among 16 participants, whose objective was to accurately recognise the goal position of the robot during its movement. Data collected during the experiment included both objective and subjective parameters. Statistically significant results of the experiment indicated that all the participants could improve their task completion time by over 45% and generally were more subjectively satisfied when completing the task with equipped haptic feedback devices. The results also suggest the usefulness of the developed notification system since it improved users\u2019 awareness about the motion plan of the robot.<\/jats:p>","DOI":"10.3390\/s21113673","type":"journal-article","created":{"date-parts":[[2021,5,25]],"date-time":"2021-05-25T22:02:23Z","timestamp":1621980143000},"page":"3673","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":48,"title":["Improved Mutual Understanding for Human-Robot Collaboration: Combining Human-Aware Motion Planning with Haptic Feedback Devices for Communicating Planned Trajectory"],"prefix":"10.3390","volume":"21","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-8984-153X","authenticated-orcid":false,"given":"Stefan","family":"Grushko","sequence":"first","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB-TU Ostrava, 70800 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6942-4280","authenticated-orcid":false,"given":"Ale\u0161","family":"Vysock\u00fd","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical 
Engineering, VSB-TU Ostrava, 70800 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-6973-1501","authenticated-orcid":false,"given":"Petr","family":"O\u0161\u010d\u00e1dal","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB-TU Ostrava, 70800 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6620-010X","authenticated-orcid":false,"given":"Michal","family":"Vocetka","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB-TU Ostrava, 70800 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2103-7294","authenticated-orcid":false,"given":"Petr","family":"Nov\u00e1k","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB-TU Ostrava, 70800 Ostrava, Czech Republic"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4134-5251","authenticated-orcid":false,"given":"Zdenko","family":"Bobovsk\u00fd","sequence":"additional","affiliation":[{"name":"Department of Robotics, Faculty of Mechanical Engineering, VSB-TU Ostrava, 70800 Ostrava, Czech Republic"}]}],"member":"1968","published-online":{"date-parts":[[2021,5,25]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"903","DOI":"10.17973\/MMSJ.2016_06_201611","article-title":"Human-robot collaboration in industry","volume":"2016","author":"Vysocky","year":"2016","journal-title":"MM Sci. 
J."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"701","DOI":"10.1016\/j.cirp.2019.05.002","article-title":"Symbiotic Human-robot collaborative assembly","volume":"68","author":"Wang","year":"2019","journal-title":"CIRP Ann."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"248","DOI":"10.1016\/j.mechatronics.2018.02.009","article-title":"Survey on human\u2013robot collaboration in industrial settings: Safety, intuitive interfaces and applications","volume":"55","author":"Villani","year":"2018","journal-title":"Mechatronics"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Mohammadi Amin, F., Rezayati, M., van de Venn, H.W., and Karimpour, H. (2020). A mixed-perception approach for safe human\u2013robot collaboration in industrial automation. Sensors, 20.","DOI":"10.20944\/preprints202009.0119.v1"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"4289","DOI":"10.1109\/LRA.2018.2865034","article-title":"Operator awareness in human\u2013robot collaboration through wearable vibrotactile feedback","volume":"3","author":"Casalino","year":"2018","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Bonci, A., Cen Cheng, P.D., Indri, M., Nabissi, G., and Sibona, F. (2021). Human-robot perception in industrial environments: A survey. Sensors, 21.","DOI":"10.3390\/s21051571"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Tang, K.-H., Ho, C.-F., Mehlich, J., and Chen, S.-T. (2020). Assessment of handover prediction models in estimation of cycle times for manual assembly tasks in a human\u2013robot collaborative environment. Appl. Sci., 10.","DOI":"10.3390\/app10020556"},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Mainprice, J., and Berenson, D. (2013, January 3\u20137). Human-robot collaborative manipulation planning using early prediction of human motion. 
Proceedings of the 2013 IEEE\/RSJ International Conference on Intelligent Robots and Systems, Tokyo, Japan.","DOI":"10.1109\/IROS.2013.6696368"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Hermann, A., Mauch, F., Fischnaller, K., Klemm, S., Roennau, A., and Dillmann, R. (2015, January 2\u20134). Anticipate your surroundings: Predictive collision detection between dynamic obstacles and planned robot trajectories on the GPU. Proceedings of the 2015 European Conference on Mobile Robots (ECMR), Lincoln, UK.","DOI":"10.1109\/ECMR.2015.7324047"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Li, G., Liu, Z., Cai, L., and Yan, J. (2020). Standing-posture recognition in human\u2013robot collaboration based on deep learning and the dempster\u2013shafer evidence theory. Sensors, 20.","DOI":"10.3390\/s20041158"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Feleke, A.G., Bi, L., and Fei, W. (2021). EMG-based 3D hand motor intention prediction for information transfer from human to robot. Sensors, 21.","DOI":"10.3390\/s21041316"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Scimmi, L.S., Melchiorre, M., Troise, M., Mauro, S., and Pastorelli, S. (2021). A practical and effective layout for a safe human-robot collaborative assembly task. Appl. Sci., 11.","DOI":"10.3390\/app11041763"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Mi\u0161eikis, J., Glette, K., Elle, O.J., and Torresen, J. (2016, January 6\u20139). Multi 3D camera mapping for predictive and reflexive robot manipulator trajectory estimation. Proceedings of the 2016 IEEE Symposium Series on Computational Intelligence (SSCI), Athens, Greece.","DOI":"10.1109\/SSCI.2016.7850237"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Bolano, G., Roennau, A., and Dillmann, R. (2018, January 27\u201331). Transparent robot behavior by adding intuitive visual and acoustic feedback to motion replanning. 
Proceedings of the 2018 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), Nanjing, China.","DOI":"10.1109\/ROMAN.2018.8525671"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"916","DOI":"10.1080\/0951192X.2015.1130251","article-title":"Human\u2013Robot Interaction review and challenges on task planning and programming","volume":"29","author":"Tsarouchi","year":"2016","journal-title":"Int. J. Comput. Integr. Manuf."},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Lee, W., Park, C.H., Jang, S., and Cho, H.-K. (2020). Design of effective robotic gaze-based social cueing for users in task-oriented situations: How to overcome in-attentional blindness?. Appl. Sci., 10.","DOI":"10.3390\/app10165413"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"59","DOI":"10.1109\/MRA.2018.2815655","article-title":"Better teaming through visual cues: how projecting imagery in a workspace can improve human-robot collaboration","volume":"25","author":"Rathore","year":"2018","journal-title":"IEEE Robot. Autom. Mag."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Andersen, R.S., Madsen, O., Moeslund, T.B., and Amor, H.B. (2016, January 26\u201331). Projecting robot intentions into human environments. Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA.","DOI":"10.1109\/ROMAN.2016.7745145"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Bambu\u015dek, D., Materna, Z., Kapinus, M., Beran, V., and Smr\u017e, P. (2019, January 14\u201318). Combining interactive spatial augmented reality with head-mounted display for end-user collaborative robot programming. 
Proceedings of the 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), New Delhi, India.","DOI":"10.1109\/RO-MAN46459.2019.8956315"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"33","DOI":"10.1007\/s12008-013-0191-2","article-title":"A novel augmented reality-based interface for robot path planning","volume":"8","author":"Fang","year":"2014","journal-title":"Int. J. Interact. Des. Manuf."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"101891","DOI":"10.1016\/j.rcim.2019.101891","article-title":"AR-based interaction for human-robot collaborative manufacturing","volume":"63","author":"Hietanen","year":"2020","journal-title":"Robot. Comput. Integr. Manuf."},{"key":"ref_22","unstructured":"St. Clair, A., and Matari\u0107, M. (2015, January 2\u20135). How robot verbal feedback can improve team performance in human-robot task collaborations. Proceedings of the 2015 10th ACM\/IEEE International Conference on Human-Robot Interaction (HRI), Portland, OR, USA."},{"key":"ref_23","unstructured":"de Barros, P.G., Lindeman, R.W., and Ward, M.O. (2011, January 19\u201320). Enhancing robot teleoperator situation awareness and performance using vibro-tactile and graphical feedback. Proceedings of the 2011 IEEE Symposium on 3D User Interfaces (3DUI), Singapore."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Li, H., Sarter, N.B., Sebok, A., and Wickens, C.D. (2012). The design and evaluation of visual and tactile warnings in support of space teleoperation. Proc. Hum. Factors Ergon. Soc. Annu. Meet.","DOI":"10.1037\/e572172013-277"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Sziebig, G., and Korondi, P. (2017, January 19\u201321). Remote operation and assistance in human robot interactions with vibrotactile feedback. 
Proceedings of the 2017 IEEE 26th International Symposium on Industrial Electronics (ISIE), Edinburgh, UK.","DOI":"10.1109\/ISIE.2017.8001513"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"21","DOI":"10.1177\/0018720814565188","article-title":"Analyzing the effects of human-aware motion planning on close-proximity human-robot collaboration","volume":"57","author":"Lasota","year":"2015","journal-title":"Hum. Factors J. Hum. Factors Ergon. Soc."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"2394","DOI":"10.1109\/LRA.2018.2812906","article-title":"Human-aware robotic assistant for collaborative assembly: Integrating human motion prediction with planning in time","volume":"3","author":"Unhelkar","year":"2018","journal-title":"IEEE Robot. Autom. Lett."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"105057","DOI":"10.1016\/j.compag.2019.105057","article-title":"Banana detection based on color and texture features in the natural environment","volume":"167","author":"Fu","year":"2019","journal-title":"Comput. Electron. Agric."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"76","DOI":"10.1016\/j.ifacol.2019.12.500","article-title":"Kiwifruit detection in field images using faster R-CNN with VGG16","volume":"52","author":"Song","year":"2019","journal-title":"IFAC Pap. OnLine"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Lim, G.M., Jatesiktat, P., Keong Kuah, C.W., and Tech Ang, W. (2019, January 23\u201327). Hand and object segmentation from depth image using fully convolutional network. Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany.","DOI":"10.1109\/EMBC.2019.8857700"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Tang, Y., Chen, M., Wang, C., Luo, L., Li, J., Lian, G., and Zou, X. (2020). Recognition and localization methods for vision-based fruit picking robots: A review. Front. 
Plant Sci., 11.","DOI":"10.3389\/fpls.2020.00510"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"170","DOI":"10.1016\/j.optlaseng.2019.06.011","article-title":"High-accuracy multi-camera reconstruction enhanced by adaptive point cloud correction algorithm","volume":"122","author":"Chen","year":"2019","journal-title":"Opt. Lasers Eng."},{"key":"ref_33","unstructured":"Ioan Sucan, S.C. (2021, March 04). MoveIt!. Available online: http:\/\/moveit.ros.org."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Pan, J., Chitta, S., and Manocha, D. (2012, January 14\u201318). FCL: A general purpose library for collision and proximity queries. Proceedings of the 2012 IEEE International Conference on Robotics and Automation, Saint Paul, MN, USA.","DOI":"10.1109\/ICRA.2012.6225337"},{"key":"ref_35","unstructured":"Ganesan, R.K. (2021, March 02). Mediating Human-Robot Collaboration Through Mixed Reality Cues. Available online: https:\/\/www.semanticscholar.org\/paper\/Mediating-Human-Robot-Collaboration-through-Mixed-Ganesan\/de797205f4359044639071fa8935cd23aa3fa5c9."},{"key":"ref_36","unstructured":"(2021, March 04). Practice Effect\u2014APA Dictionary of Psychology. Available online: https:\/\/dictionary.apa.org\/practice-effect."},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Aggravi, M., Salvietti, G., and Prattichizzo, D. (2016, January 26\u201331). Haptic wrist guidance using vibrations for human-robot teams. Proceedings of the 2016 25th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN), New York, NY, USA.","DOI":"10.1109\/ROMAN.2016.7745098"},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Scheggi, S., Chinello, F., and Prattichizzo, D. Vibrotactile Haptic Feedback for Human-Robot. Interaction in Leader-Follower Tasks. 
ACM PETRA \u201912: Proceedings of the 5th International Conference on PErvasive Technologies Related to Assistive Environments, Heraklion, Crete, Greece, 6\u20138 June 2012.","DOI":"10.1145\/2413097.2413161"},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Scheggi, S., Aggravi, M., Morbidi, F., and Prattichizzo, D. (June, January 31). Cooperative Human-Robot Haptic Navigation. Proceedings of the 2014 IEEE International Conference on Robotics and Automation (ICRA), Hong Kong, China.","DOI":"10.1109\/ICRA.2014.6907245"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/21\/11\/3673\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T06:07:37Z","timestamp":1760162857000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/21\/11\/3673"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,5,25]]},"references-count":39,"journal-issue":{"issue":"11","published-online":{"date-parts":[[2021,6]]}},"alternative-id":["s21113673"],"URL":"https:\/\/doi.org\/10.3390\/s21113673","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,5,25]]}}}
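As an aside, a Crossref work record like the one above can be consumed programmatically; such records are served by the Crossref REST API at `https://api.crossref.org/works/<DOI>` (here, `10.3390/s21113673`). The minimal sketch below skips the network call and instead parses a trimmed excerpt of the fields shown above, to illustrate the record's nesting conventions (the payload lives under `"message"`, and dates are encoded as `"date-parts"` lists of `[year, month, day]`). The excerpt omits the authors, funders, and 39 references carried by the full record.

```python
import json

# Trimmed excerpt of the Crossref work record above; values copied from the source.
raw = """
{
  "status": "ok",
  "message-type": "work",
  "message": {
    "DOI": "10.3390/s21113673",
    "type": "journal-article",
    "container-title": ["Sensors"],
    "volume": "21",
    "issue": "11",
    "page": "3673",
    "references-count": 39,
    "issued": {"date-parts": [[2021, 5, 25]]}
  }
}
"""

# The actual payload sits under the "message" key.
work = json.loads(raw)["message"]

# Crossref encodes dates as nested lists: [[year, month, day]].
year, month, day = work["issued"]["date-parts"][0]

print(work["DOI"])  # 10.3390/s21113673
print(f"{work['container-title'][0]} {work['volume']}({work['issue']}), {year}")  # Sensors 21(11), 2021
```

Note that multi-valued fields such as `container-title` are always lists even when they hold a single entry, so the first element must be indexed explicitly.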