{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,20]],"date-time":"2026-02-20T22:24:03Z","timestamp":1771626243864,"version":"3.50.1"},"reference-count":40,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2021,3,26]],"date-time":"2021-03-26T00:00:00Z","timestamp":1616716800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Robotics"],"abstract":"<jats:p>In this paper, the authors present a novel architecture for controlling an industrial robot via an eye tracking interface for artistic purposes. Humans and robots interact through an acquisition system based on an eye tracker device that allows the user to control the motion of a robotic manipulator with their gaze. The feasibility of the robotic system is evaluated with experimental tests in which the robot is teleoperated to draw artistic images. 
The tool can be used by artists to investigate novel forms of art and, as an assistive technology for artistic drawing and painting, by amputees and people with movement disorders or muscular paralysis, since in these cases eye motion is usually preserved.<\/jats:p>","DOI":"10.3390\/robotics10020054","type":"journal-article","created":{"date-parts":[[2021,3,26]],"date-time":"2021-03-26T06:59:42Z","timestamp":1616741982000},"page":"54","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":39,"title":["Human\u2013Robot Interaction through Eye Tracking for Artistic Drawing"],"prefix":"10.3390","volume":"10","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-0770-0275","authenticated-orcid":false,"given":"Lorenzo","family":"Scalera","sequence":"first","affiliation":[{"name":"Polytechnic Department of Engineering and Architecture, University of Udine, 33100 Udine, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5563-9595","authenticated-orcid":false,"given":"Stefano","family":"Seriani","sequence":"additional","affiliation":[{"name":"Department of Engineering and Architecture, University of Trieste, 34127 Trieste, Italy"}]},{"given":"Paolo","family":"Gallina","sequence":"additional","affiliation":[{"name":"Department of Engineering and Architecture, University of Trieste, 34127 Trieste, Italy"}]},{"given":"Mattia","family":"Lentini","sequence":"additional","affiliation":[{"name":"Department of Engineering and Architecture, University of Trieste, 34127 Trieste, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9902-9783","authenticated-orcid":false,"given":"Alessandro","family":"Gasparetto","sequence":"additional","affiliation":[{"name":"Polytechnic Department of Engineering and Architecture, University of Udine, 33100 Udine, Italy"}]}],"member":"1968","published-online":{"date-parts":[[2021,3,26]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Majaranta, P., and Bulling, A. (2014). 
Eye tracking and eye-based human\u2013computer interaction. Advances in Physiological Computing, Springer.","DOI":"10.1007\/978-1-4471-6392-3_3"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"950","DOI":"10.1177\/1545968315575611","article-title":"Usability and workload of access technology for people with severe motor impairment: A comparison of brain-computer interfacing and eye tracking","volume":"29","author":"Pasqualotto","year":"2015","journal-title":"Neurorehabilit. Neural Repair"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1645","DOI":"10.3758\/s13428-017-0998-z","article-title":"Threats to the validity of eye-movement research in psychology","volume":"50","author":"Orquin","year":"2018","journal-title":"Behav. Res. Methods"},{"key":"ref_4","first-page":"32","article-title":"Eye tracking in neuromarketing: A research agenda for marketing studies","volume":"7","author":"Rocha","year":"2015","journal-title":"Int. J. Psychol. Stud."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"347","DOI":"10.1109\/TCE.2012.6227433","article-title":"Real-time eye gaze tracking for gaming design and consumer electronics systems","volume":"58","author":"Corcoran","year":"2012","journal-title":"IEEE Trans. Consum. Electron."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Maimon-Mor, R.O., Fernandez-Quesada, J., Zito, G.A., Konnaris, C., Dziemian, S., and Faisal, A.A. (2017, January 17\u201320). Towards free 3D end-point control for robotic-assisted human reaching using binocular eye tracking. Proceedings of the 2017 International Conference on Rehabilitation Robotics (ICORR), London, UK.","DOI":"10.1109\/ICORR.2017.8009388"},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Schiatti, L., Tessadori, J., Barresi, G., Mattos, L.S., and Ajoudani, A. (2017, January 17\u201320). Soft brain-machine interfaces for assistive robotics: A novel control approach. 
Proceedings of the 2017 International Conference on Rehabilitation Robotics (ICORR), London, UK.","DOI":"10.1109\/ICORR.2017.8009357"},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Wang, Y., Zeng, H., Song, A., Xu, B., Li, H., Zhu, L., Wen, P., and Liu, J. (2017, January 25\u201328). Robotic arm control using hybrid brain-machine interface and augmented reality feedback. Proceedings of the 2017 8th International IEEE\/EMBS Conference on Neural Engineering (NER), Shanghai, China.","DOI":"10.1109\/NER.2017.8008377"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"W\u00f6hle, L., and Gebhard, M. (2021). Towards Robust Robot Control in Cartesian Space Using an Infrastructureless Head-and Eye-Gaze Interface. Sensors, 21.","DOI":"10.3390\/s21051798"},{"key":"ref_10","unstructured":"Gips, J., and Olivieri, P. (1996, January 4\u20136). EagleEyes: An eye control system for persons with disabilities. Proceedings of the Eleventh International Conference on Technology and Persons with Disabilities, Los Angeles, CA, USA."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Van der Kamp, J., and Sundstedt, V. (2011, January 26\u201327). Gaze and voice controlled drawing. Proceedings of the 1st Conference on Novel Gaze-Controlled Applications, Karlskrona, Sweden.","DOI":"10.1145\/1983302.1983311"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Heikkil\u00e4, H. (2013). Tools for a Gaze-Controlled Drawing Application\u2013Comparing Gaze Gestures against Dwell Buttons. IFIP Conference on Human-Computer Interaction, Springer.","DOI":"10.1007\/978-3-642-40480-1_12"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Santella, A., and DeCarlo, D. (2002, January 3\u20135). Abstracted painterly renderings using eye-tracking data. 
Proceedings of the 2nd International Symposium on Non-Photorealistic Animation and Rendering, Annecy, France.","DOI":"10.1145\/508530.508544"},{"key":"ref_14","unstructured":"Graham Fink (2021, January 26). Eye Drawings. Available online: https:\/\/grahamfink.com\/eye-drawings."},{"key":"ref_15","unstructured":"Bradley, J.P. (2018). The Delirious Abstract Machines of Jean Tinguely. Ecosophical Aesthetics: Art, Ethics and Ecology with Guattari, Bloomsbury Publishing."},{"key":"ref_16","first-page":"141","article-title":"The further exploits of AARON, painter","volume":"4","author":"Cohen","year":"1995","journal-title":"Stanf. Humanit. Rev."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"348","DOI":"10.1016\/j.cag.2013.01.012","article-title":"Portrait drawing by Paul the robot","volume":"37","author":"Tresset","year":"2013","journal-title":"Comput. Graph."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"G\u00fclzow, J.M., Paetzold, P., and Deussen, O. (2020). Recent Developments Regarding Painting Robots for Research in Automatic Painting, Artificial Creativity, and Machine Learning. Appl. Sci., 10.","DOI":"10.3390\/app10103396"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"103263","DOI":"10.1016\/j.robot.2019.103263","article-title":"Interactive system for painting artworks by regions using a robot","volume":"121","year":"2019","journal-title":"Robot. Auton. Syst."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"17","DOI":"10.1016\/j.robot.2019.02.009","article-title":"Advanced tone rendition technique for a painting robot","volume":"115","author":"Karimov","year":"2019","journal-title":"Robot. Auton. Syst."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"871","DOI":"10.1007\/s10846-018-0937-y","article-title":"Watercolour robotic painting: A novel automatic system for artistic rendering","volume":"95","author":"Scalera","year":"2019","journal-title":"J. Intell. Robot. 
Syst."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Scalera, L., Seriani, S., Gasparetto, A., and Gallina, P. (2018). Busker Robot: A robotic painting system for rendering images into watercolour artworks. IFToMM Symposium on Mechanism Design for Robotics, Springer.","DOI":"10.1007\/978-3-030-00365-4_1"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Beltramello, A., Scalera, L., Seriani, S., and Gallina, P. (2020). Artistic Robotic Painting Using the Palette Knife Technique. Robotics, 9.","DOI":"10.3390\/robotics9010015"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Guo, C., Bai, T., Lu, Y., Lin, Y., Xiong, G., Wang, X., and Wang, F.Y. (2020, January 20\u201321). Skywork-daVinci: A novel CPSS-based painting support system. Proceedings of the 16th International Conference on Automation Science and Engineering (CASE), Hong Kong, China.","DOI":"10.1109\/CASE48305.2020.9216814"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Bidgoli, A., De Guevara, M.L., Hsiung, C., Oh, J., and Kang, E. (September, January 31). Artistic Style in Robotic Painting; a Machine Learning Approach to Learning Brushstroke from Human Artists. Proceedings of the 2020 29th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN), Naples, Italy.","DOI":"10.1109\/RO-MAN47096.2020.9223533"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"143","DOI":"10.3389\/frobt.2020.580415","article-title":"Interactive Multi-Robot Painting Through Colored Motion Trails","volume":"7","author":"Santos","year":"2020","journal-title":"Front. Robot. AI"},{"key":"ref_27","unstructured":"Gatys, L.A., Ecker, A.S., and Bethge, M. (July, January 26). Image style transfer using convolutional neural networks. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Scalera, L., Seriani, S., Gasparetto, A., and Gallina, P. 
(2019). Non-photorealistic rendering techniques for artistic robotic painting. Robotics, 8.","DOI":"10.3390\/robotics8010010"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Karimov, A., Kopets, E., Kolev, G., Leonov, S., Scalera, L., and Butusov, D. (2021). Image Preprocessing for Artistic Robotic Painting. Inventions, 6.","DOI":"10.3390\/inventions6010019"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Quintero, C.P., Dehghan, M., Ramirez, O., Ang, M.H., and Jagersand, M. (June, January 29). Flexible virtual fixture interface for path specification in tele-manipulation. Proceedings of the 2017 IEEE International Conference on Robotics and Automation (ICRA), Singapore.","DOI":"10.1109\/ICRA.2017.7989631"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"81","DOI":"10.3389\/fnbot.2018.00081","article-title":"Acceptability Study of A3-K3 Robotic Architecture for a Neurorobotics Painting","volume":"12","author":"Tramonte","year":"2019","journal-title":"Front. Neurorobot."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"191","DOI":"10.1007\/978-3-030-55807-9_22","article-title":"A Novel Robotic System for Painting with Eyes","volume":"91","author":"Scalera","year":"2021","journal-title":"Mech. Mach. Sci."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Yarbus, A.L. (1967). Eye movements during perception of complex objects. Eye Movements and Vision, Springer.","DOI":"10.1007\/978-1-4899-5379-7"},{"key":"ref_34","unstructured":"Carpenter, R.H. (1988). Movements of the Eyes, Pion Limited. [2nd ed.]."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Land, M., and Tatler, B. (2009). 
Looking and Acting: Vision and Eye Movements in Natural Behaviour, Oxford University Press.","DOI":"10.1093\/acprof:oso\/9780198570943.001.0001"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"1457","DOI":"10.1080\/17470210902816461","article-title":"The 35th Sir Frederick Bartlett Lecture: Eye movements and attention in reading, scene perception, and visual search","volume":"62","author":"Rayner","year":"2009","journal-title":"Q. J. Exp. Psychol."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"397","DOI":"10.3758\/BF03201553","article-title":"Survey of eye movement recording methods","volume":"7","author":"Young","year":"1975","journal-title":"Behav. Res. Methods Instrum."},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"923","DOI":"10.3758\/s13428-016-0762-9","article-title":"Evaluation of the Tobii EyeX Eye tracking controller and Matlab toolkit for research","volume":"49","author":"Gibaldi","year":"2017","journal-title":"Behav. Res. Methods"},{"key":"ref_39","unstructured":"Biagiotti, L., and Melchiorri, C. (2008). Trajectory Planning for Automatic Machines and Robots, Springer."},{"key":"ref_40","unstructured":"(2021, January 26). This Person Does Not Exist. 
Available online: https:\/\/www.thispersondoesnotexist.com\/."}],"container-title":["Robotics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2218-6581\/10\/2\/54\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T05:41:20Z","timestamp":1760161280000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2218-6581\/10\/2\/54"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,3,26]]},"references-count":40,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2021,6]]}},"alternative-id":["robotics10020054"],"URL":"https:\/\/doi.org\/10.3390\/robotics10020054","relation":{},"ISSN":["2218-6581"],"issn-type":[{"value":"2218-6581","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,3,26]]}}}