{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T18:42:46Z","timestamp":1776105766892,"version":"3.50.1"},"reference-count":54,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2024,12,5]],"date-time":"2024-12-05T00:00:00Z","timestamp":1733356800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Robot. AI"],"abstract":"<jats:p>Musical performance relies on nonverbal cues for conveying information among musicians. Human musicians use bodily gestures to communicate their interpretation and intentions to their collaborators, from mood and expression to anticipatory cues regarding structure and tempo. Robotic musicians can use their physical bodies in a similar way when interacting with fellow musicians. This paper presents a new theoretical framework for classifying musical gestures and a study evaluating the effect of robotic gestures on synchronization between human musicians and Shimon, a robotic marimba player developed at Georgia Tech. Shimon utilizes head and arm movements to signify musical information such as expected notes, tempo, and beat. The study, in which piano players were asked to play along with Shimon, assessed the effect of these gestures on human-robot synchronization. Subjects were evaluated for their ability to synchronize with unknown tempo changes as communicated by Shimon\u2019s ancillary and social gestures. The results demonstrate the significant contribution of non-instrumental gestures to human-robot synchronization, highlighting the importance of non-music-making gestures for anticipation and coordination in human-robot musical collaboration. Subjects also reported more positive feelings when interacting with the robot\u2019s ancillary and social gestures, underscoring the role of these gestures in supporting engaging and enjoyable musical experiences.<\/jats:p>","DOI":"10.3389\/frobt.2024.1461615","type":"journal-article","created":{"date-parts":[[2024,12,5]],"date-time":"2024-12-05T05:10:44Z","timestamp":1733375444000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":3,"title":["Music, body, and machine: gesture-based synchronization in human-robot musical interaction"],"prefix":"10.3389","volume":"11","author":[{"given":"Xuedan","family":"Gao","sequence":"first","affiliation":[]},{"given":"Amit","family":"Rogel","sequence":"additional","affiliation":[]},{"given":"Raghavasimhan","family":"Sankaranarayanan","sequence":"additional","affiliation":[]},{"given":"Brody","family":"Dowling","sequence":"additional","affiliation":[]},{"given":"Gil","family":"Weinberg","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2024,12,5]]},"reference":[{"key":"B1","first-page":"1","article-title":"Nonverbal immediacy in interpersonal communication","volume-title":"Multichannel integrations of nonverbal behavior","author":"Andersen","year":"2014"},{"key":"B2","doi-asserted-by":"publisher","first-page":"98","DOI":"10.1016\/j.neuropsychologia.2013.11.012","article-title":"Sensorimotor communication in professional quartets","volume":"55","author":"Badino","year":"2014","journal-title":"Neuropsychologia"},{"key":"B3","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/2133366.2133368","article-title":"Emotional body language displayed by artificial agents","volume":"2","author":"Beck","year":"2012","journal-title":"ACM Trans. Interact. Intelligent Syst. (TiiS)"},{"key":"B4","doi-asserted-by":"publisher","first-page":"1177","DOI":"10.1007\/s00426-017-0893-3","article-title":"Communication for coordination: gesture kinematics and conventionality affect synchronization success in piano duos","volume":"82","author":"Bishop","year":"2018","journal-title":"Psychol. Res."},{"key":"B5","doi-asserted-by":"crossref","first-page":"708","DOI":"10.1109\/IROS.2005.1545011","article-title":"Effects of nonverbal communication on efficiency and robustness in human-robot teamwork","volume-title":"2005 IEEE\/RSJ international conference on intelligent robots and systems","author":"Breazeal","year":"2005"},{"key":"B6","volume-title":"Towards an embodied musical mind: generative algorithms for robotic musicians","author":"Bretan","year":"2017"},{"key":"B7","doi-asserted-by":"publisher","first-page":"100","DOI":"10.1145\/2818994","article-title":"A survey of robotic musicianship","volume":"59","author":"Bretan","year":"2016","journal-title":"Commun. ACM"},{"key":"B8","article-title":"A unit selection methodology for music generation using deep neural networks","volume-title":"International conference on innovative computing and cloud computing","author":"Bretan","year":"2016"},{"key":"B9","doi-asserted-by":"publisher","first-page":"109","DOI":"10.1007\/s12193-009-0022-8","article-title":"Communication of musical expression by means of mobile robot gestures","volume":"3","author":"Burger","year":"2010","journal-title":"J. Multimodal User Interfaces"},{"key":"B10","first-page":"1","article-title":"Instrumental gesture and musical composition","volume-title":"ICMC 1988-international computer music conference","author":"Cadoz","year":"1988"},{"key":"B11","article-title":"Gesture-music","volume-title":"Trends in gestural control of music","author":"Cadoz","year":"2000"},{"key":"B12","doi-asserted-by":"publisher","first-page":"103166","DOI":"10.1016\/j.actpsy.2020.103166","article-title":"The influence of performing gesture type on interpersonal musical timing, and the role of visual contact and tempo","volume":"210","author":"Coorevits","year":"2020","journal-title":"Acta Psychol."},{"key":"B13","doi-asserted-by":"publisher","first-page":"781","DOI":"10.1080\/01691864.2014.889577","article-title":"Natural human\u2013robot musical interaction: understanding the music conductor gestures by using the wb-4 inertial measurement system","volume":"28","author":"Cosentino","year":"2014","journal-title":"Adv. Robot."},{"key":"B14","doi-asserted-by":"publisher","first-page":"28","DOI":"10.1111\/j.1749-6632.2001.tb05723.x","article-title":"Music, cognition, culture, and evolution","volume":"930","author":"Cross","year":"2001","journal-title":"Ann. N. Y. Acad. Sci."},{"key":"B15","first-page":"85","volume-title":"La gestique de gould: \u00e9l\u00e9ments pour une s\u00e9miologie du geste musical","author":"Delalande","year":"1988"},{"key":"B16","doi-asserted-by":"publisher","first-page":"59","DOI":"10.1109\/mra.2018.2815655","article-title":"Better teaming through visual cues: how projecting imagery in a workspace can improve human-robot collaboration","volume":"25","author":"Ganesan","year":"2018","journal-title":"IEEE Robotics and Automation Mag."},{"key":"B17","doi-asserted-by":"publisher","first-page":"89","DOI":"10.5898\/jhri.3.1.hoffman","article-title":"Designing robots with movement in mind","volume":"3","author":"Hoffman","year":"2014","journal-title":"J. Human-Robot Interact."},{"key":"B18","first-page":"582","article-title":"Gesture-based human-robot jazz improvisation","volume-title":"2010 IEEE international conference on robotics and automation","author":"Hoffman","year":"2010"},{"key":"B19","doi-asserted-by":"publisher","first-page":"3097","DOI":"10.1145\/1753846.1753925","article-title":"Shimon: an interactive improvisational robotic marimba player","volume":"10","author":"Hoffman","year":"2010","journal-title":"CHI \u201910 Ext. Abstr. Hum. Factors Comput. Syst."},{"key":"B20","first-page":"718","article-title":"Synchronization in human-robot musicianship","volume-title":"19th international Symposium in Robot and human interactive communication","author":"Hoffman","year":"2010"},{"key":"B21","doi-asserted-by":"publisher","first-page":"133","DOI":"10.1007\/s10514-011-9237-0","article-title":"Interactive improvisation with a robotic marimba player","volume":"31","author":"Hoffman","year":"2011","journal-title":"Aut. Robots"},{"key":"B22","doi-asserted-by":"publisher","first-page":"21","DOI":"10.1016\/j.tics.2004.11.003","article-title":"The motor theory of social cognition: a critique","volume":"9","author":"Jacob","year":"2005","journal-title":"Trends cognitive Sci."},{"key":"B23","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/s0166-4328(02)00384-4","article-title":"The mechanism of self-recognition in humans","volume":"142","author":"Jeannerod","year":"2003","journal-title":"Behav. brain Res."},{"key":"B24","first-page":"24","article-title":"Musical gestures: concepts and methods in research","volume-title":"Musical gestures","author":"Jensenius","year":"2010"},{"key":"B25","doi-asserted-by":"publisher","first-page":"143","DOI":"10.1016\/0167-8493(87)90002-7","article-title":"The robot musician \u2018wabot-2\u2019 (waseda robot-2)","volume":"3","author":"Kato","year":"1987","journal-title":"Robotics"},{"key":"B27","doi-asserted-by":"publisher","first-page":"2220","DOI":"10.1080\/17470218.2010.497843","article-title":"Follow you, follow me: continuous mutual prediction and adaptation in joint tapping","volume":"63","author":"Konvalinka","year":"2010","journal-title":"Q. J. Exp. Psychol."},{"key":"B28","article-title":"The expressive moment: How interaction (with music) shapes human empowerment (MIT press)","author":"Leman","year":"2016"},{"key":"B29","doi-asserted-by":"crossref","first-page":"1964","DOI":"10.1109\/IROS.2010.5650427","article-title":"Robot musical accompaniment: integrating audio and visual cues for real-time synchronization with a human flutist","volume-title":"2010 IEEE\/RSJ international Conference on intelligent Robots and systems","author":"Lim","year":"2010"},{"key":"B30","doi-asserted-by":"publisher","first-page":"363","DOI":"10.1163\/156855311x614626","article-title":"A musical robot that synchronizes with a coplayer using non-verbal cues","volume":"26","author":"Lim","year":"2012","journal-title":"Adv. Robot."},{"key":"B31","doi-asserted-by":"publisher","first-page":"55","DOI":"10.20965\/jrm.2022.p0055","article-title":"Analysis of timing and effect of visual cue on turn-taking in human-robot interaction","volume":"34","author":"Obo","year":"2022","journal-title":"J. Robotics Mechatronics"},{"key":"B32","first-page":"84","article-title":"Intuitive analysis, creation and manipulation of midi data with pretty_midi","volume-title":"15th international society for music information retrieval conference late breaking and demo papers","author":"Raffel","year":"2014"},{"key":"B33","doi-asserted-by":"crossref","first-page":"165","DOI":"10.1145\/3472307.3484175","article-title":"Say what? collaborative pop lyric generation using multitask transfer learning","volume-title":"Proceedings of the 9th international conference on human-agent interaction","author":"Ram","year":"2021"},{"key":"B34","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1145\/3537972.3537985","article-title":"Robogroove: creating fluid motion for dancing robotic arms","volume-title":"Proceedings of the 8th international conference on movement and computing","author":"Rogel","year":"2022"},{"key":"B35","doi-asserted-by":"crossref","DOI":"10.21428\/92fbeb44.0ad83109","article-title":"Design of hathaani - a robotic violinist for carnatic music","volume-title":"Nime 2021","author":"Sankaranarayanan","year":"2021"},{"key":"B36","first-page":"292","article-title":"The reciprocity between ancillary gesture and music structure performed by expert musicians","volume-title":"Nime","author":"Santos","year":"2019"},{"key":"B37","first-page":"285","article-title":"Robotic dancing, emotional gestures and prosody: a framework for gestures of three robotic platforms","volume-title":"Sound and robotics","author":"Savery","year":"2024"},{"key":"B38","first-page":"1","article-title":"Establishing human-robot trust through music-driven robotic emotion prosody and gesture","volume-title":"2019 28th IEEE international conference on robot and human interactive communication","author":"Savery","year":"2019"},{"key":"B39","doi-asserted-by":"crossref","first-page":"52","DOI":"10.4324\/9780429356797-7","article-title":"Robotics: fast and curious: a cnn for ethical deep learning musical generation","volume-title":"Artificial intelligence and music ecosystem","author":"Savery","year":"2022"},{"key":"B40","doi-asserted-by":"publisher","first-page":"823","DOI":"10.1007\/978-3-030-72116-9_29","article-title":"Shimon sings-robotic musicianship finds its voice","author":"Savery","year":"2021","journal-title":"Handb. Artif. Intell. Music Found. Adv. Approaches, Dev. Creativity"},{"key":"B41","article-title":"Shimon the rapper: a real-time system for human-robot interactive rap battles","volume-title":"Proceedings of the 11th international conference on computational creativity (ICCC\u201920)","author":"Savery","year":"2020"},{"key":"B42","first-page":"188","article-title":"Lemur guitarbot: midi robotic string instrument","volume-title":"Proceedings of the conference on new interfaces for musical expression (NIME)","author":"Singer","year":"2003"},{"key":"B43","doi-asserted-by":"publisher","first-page":"12","DOI":"10.1162\/comj.2006.30.4.12","article-title":"The waseda flutist robot wf-4rii in comparison with a professional flutist","volume":"30","author":"Solis","year":"2006","journal-title":"Comput. Music J."},{"key":"B45","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1109\/SII55687.2023.10039316","article-title":"Development of an anthropomorphic saxophonist robot using a human-like holding method","volume-title":"2023 IEEE\/SICE international symposium on system integration (SII)","author":"Uchiyama","year":"2023"},{"key":"B46","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/3570169","article-title":"Nonverbal cues in human\u2013robot interaction: a communication studies perspective","volume":"12","author":"Urakami","year":"2023","journal-title":"ACM Trans. Human-Robot Interact."},{"key":"B47","doi-asserted-by":"publisher","first-page":"95","DOI":"10.1111\/tops.12306","article-title":"Creating time: social collaboration in music improvisation","volume":"10","author":"Walton","year":"2018","journal-title":"Top. cognitive Sci."},{"key":"B48","first-page":"37","article-title":"Non-obvious performer gestures in instrumental music","volume-title":"International gesture workshop","author":"Wanderley","year":"1999"},{"key":"B49","doi-asserted-by":"publisher","first-page":"632","DOI":"10.1109\/jproc.2004.825882","article-title":"Gestural control of sound synthesis","volume":"92","author":"Wanderley","year":"2004","journal-title":"Proc. IEEE"},{"key":"B50","doi-asserted-by":"publisher","first-page":"97","DOI":"10.1080\/09298210500124208","article-title":"The musical significance of clarinetists\u2019 ancillary gestures: an exploration of the field","volume":"34","author":"Wanderley","year":"2005","journal-title":"J. New Music Res."},{"key":"B51","doi-asserted-by":"publisher","first-page":"11952","DOI":"10.48550\/arXiv.2409.11952","article-title":"Human-robot cooperative piano playing with learning-based real-time music accompaniment","author":"Wang","year":"2024","journal-title":"arXiv Prepr. arXiv:2409"},{"key":"B52","doi-asserted-by":"crossref","DOI":"10.1007\/978-3-030-38930-7","article-title":"Robotic musicianship: embodied artificial creativity and mechatronic musical expression","author":"Weinberg","year":"2020"},{"key":"B53","doi-asserted-by":"publisher","first-page":"1229","DOI":"10.1145\/1124772.1124957","article-title":"Robot-human interaction with an anthropomorphic percussionist","author":"Weinberg","year":"2006","journal-title":"Proc. SIGCHI Conf. Hum. Factors Comput. Syst."},{"key":"B54","first-page":"464","article-title":"Jam\u2019aa - a middle eastern percussion ensemble for human and robotic players","volume-title":"Icmc","author":"Weinberg","year":"2006"},{"key":"B55","article-title":"Mechatronics-driven musical expressivity for robotic percussionists","author":"Yang","year":"2020"},{"key":"B56","first-page":"978","article-title":"Robot gesture sonification to enhance awareness of robot status and enjoyment of interaction","volume-title":"2020 29th IEEE international conference on robot and human interactive communication","author":"Zahray","year":"2020"}],"container-title":["Frontiers in Robotics and AI"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frobt.2024.1461615\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,12,5]],"date-time":"2024-12-05T05:10:58Z","timestamp":1733375458000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frobt.2024.1461615\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,12,5]]},"references-count":54,"alternative-id":["10.3389\/frobt.2024.1461615"],"URL":"https:\/\/doi.org\/10.3389\/frobt.2024.1461615","relation":{},"ISSN":["2296-9144"],"issn-type":[{"value":"2296-9144","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,12,5]]},"article-number":"1461615"}}