{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,11]],"date-time":"2026-03-11T01:46:13Z","timestamp":1773193573544,"version":"3.50.1"},"reference-count":65,"publisher":"Association for Computing Machinery (ACM)","issue":"4","license":[{"start":{"date-parts":[[2023,12,13]],"date-time":"2023-12-13T00:00:00Z","timestamp":1702425600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"funder":[{"DOI":"10.13039\/501100004359","name":"Swedish Research Council","doi-asserted-by":"crossref","award":["2017-05189"],"award-info":[{"award-number":["2017-05189"]}],"id":[{"id":"10.13039\/501100004359","id-type":"DOI","asserted-by":"crossref"}]},{"name":"Swedish Foundation for Strategic Research","award":["SSF FFL18-0199"],"award-info":[{"award-number":["SSF FFL18-0199"]}]},{"name":"NordForsk, the Digital Futures research Center, the Vinnova Competence Center for Trustworthy Edge Computing Systems and Applications at KTH"},{"name":"Wallenberg AI, Autonomous Systems and Software Program (WASP) funded by the Knut and Alice Wallenberg Foundation"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["J. Hum.-Robot Interact."],"published-print":{"date-parts":[[2023,12,31]]},"abstract":"<jats:p>Non-verbal communication is important in HRI, particularly when humans and robots do not need to actively engage in a task together, but rather they co-exist in a shared space. Robots might still need to communicate states such as urgency or availability, and where they intend to go, to avoid collisions and disruptions. Sounds could be used to communicate such states and intentions in an intuitive and non-disruptive way. Here, we propose a multi-layer classification system for displaying various robot information simultaneously via sound. 
We first conceptualise which robot features could be displayed (robot size, speed, availability for interaction, urgency, and directionality); we then map them to a set of audio parameters. The designed sounds were then evaluated in five\u00a0online studies, where people listened to the sounds and were asked to identify the associated robot features. The sounds were generally understood as intended by participants, especially when they were evaluated one feature at a time, and partially when they were evaluated two features simultaneously. The results of these evaluations suggest that sounds can be successfully used to communicate robot states and intended actions implicitly and intuitively.<\/jats:p>","DOI":"10.1145\/3611655","type":"journal-article","created":{"date-parts":[[2023,8,18]],"date-time":"2023-08-18T03:53:39Z","timestamp":1692330819000},"page":"1-26","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":7,"title":["Sounding Robots: Design and Evaluation of Auditory Displays for Unintentional Human-robot Interaction"],"prefix":"10.1145","volume":"12","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-8542-255X","authenticated-orcid":false,"given":"Bastian","family":"Orthmann","sequence":"first","affiliation":[{"name":"KTH Royal Institute of Technology, Sweden"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2212-4325","authenticated-orcid":false,"given":"Iolanda","family":"Leite","sequence":"additional","affiliation":[{"name":"KTH Royal Institute of Technology, Sweden"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3086-0322","authenticated-orcid":false,"given":"Roberto","family":"Bresin","sequence":"additional","affiliation":[{"name":"KTH Royal Institute of Technology, Sweden"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8601-1370","authenticated-orcid":false,"given":"Ilaria","family":"Torre","sequence":"additional","affiliation":[{"name":"Chalmers University of Technology, Sweden and KTH Royal Institute of 
Technology, Sweden"}]}],"member":"320","published-online":{"date-parts":[[2023,12,13]]},"reference":[{"key":"e_1_3_2_2_2","doi-asserted-by":"publisher","DOI":"10.1523\/JNEUROSCI.5433-06.2007"},{"key":"e_1_3_2_3_2","doi-asserted-by":"publisher","DOI":"10.1177\/1084713810393751"},{"key":"e_1_3_2_4_2","doi-asserted-by":"publisher","DOI":"10.1145\/3342775.3342806"},{"key":"e_1_3_2_5_2","doi-asserted-by":"publisher","DOI":"10.3758\/s13415-014-0309-4"},{"key":"e_1_3_2_6_2","doi-asserted-by":"publisher","DOI":"10.1080\/00220973.1995.9943797"},{"key":"e_1_3_2_7_2","doi-asserted-by":"publisher","DOI":"10.3389\/frobt.2021.719154"},{"key":"e_1_3_2_8_2","first-page":"1075","volume-title":"27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201918)","author":"Bolano Gabriele","year":"2018","unstructured":"Gabriele Bolano, Arne Roennau, and Ruediger Dillmann. 2018. Transparent robot behavior by adding intuitive visual and acoustic feedback to motion replanning. In 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201918). IEEE, 1075\u20131080."},{"key":"e_1_3_2_9_2","volume-title":"The Physiology of Hearing","author":"Britannica Editors of Encyclopaedia","year":"2020","unstructured":"Editors of Encyclopaedia Britannica. 2020. The Physiology of Hearing. Retrieved from https:\/\/www.britannica.com\/science\/ear\/The-physiology-of-hearing"},{"key":"e_1_3_2_10_2","first-page":"1","volume-title":"CHI Conference on Human Factors in Computing Systems","author":"Cambre Julia","year":"2020","unstructured":"Julia Cambre, Jessica Colnago, Jim Maddock, Janice Tsai, and Jofish Kaye. 2020. Choice of voices: A large-scale evaluation of text-to-speech voice quality for long-form content. In CHI Conference on Human Factors in Computing Systems. 
1\u201313."},{"key":"e_1_3_2_11_2","doi-asserted-by":"publisher","DOI":"10.1145\/3359325"},{"key":"e_1_3_2_12_2","doi-asserted-by":"publisher","DOI":"10.1145\/3171221.3171285"},{"key":"e_1_3_2_13_2","doi-asserted-by":"publisher","DOI":"10.1109\/IROS.2016.7759744"},{"key":"e_1_3_2_14_2","doi-asserted-by":"publisher","DOI":"10.4324\/9781315657813"},{"key":"e_1_3_2_15_2","doi-asserted-by":"publisher","DOI":"10.1145\/3077981.3078047"},{"key":"e_1_3_2_16_2","volume-title":"Musical Illusions and Paradoxes","author":"Deutsch Diana","year":"1995","unstructured":"Diana Deutsch. 1995. Musical Illusions and Paradoxes. Philomel."},{"key":"e_1_3_2_17_2","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0082491"},{"key":"e_1_3_2_18_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.actpsy.2020.103141"},{"key":"e_1_3_2_19_2","doi-asserted-by":"publisher","DOI":"10.2345\/0899-8205-45.4.290"},{"key":"e_1_3_2_20_2","doi-asserted-by":"publisher","DOI":"10.1136\/amiajnl-2012-001061"},{"key":"e_1_3_2_21_2","doi-asserted-by":"publisher","DOI":"10.1525\/mp.2012.30.1.49"},{"key":"e_1_3_2_22_2","doi-asserted-by":"publisher","DOI":"10.3389\/fpsyg.2013.00487"},{"key":"e_1_3_2_23_2","doi-asserted-by":"publisher","DOI":"10.3758\/APP.71.6.1360"},{"key":"e_1_3_2_24_2","doi-asserted-by":"publisher","DOI":"10.1007\/s12369-021-00788-4"},{"key":"e_1_3_2_25_2","unstructured":"Emma Frid, Roberto Bresin, and Simon Alexanderson. 2018. Perception of mechanical sounds inherent to expressive gestures of a NAO robot - implications for movement sonification of humanoids. In Proceedings of the Sound and Music Computing Conference (SMC\u201918). Sound and Music Computing Network, Limassol, Cyprus, 43\u201351."},{"key":"e_1_3_2_26_2","volume-title":"The Sonification Handbook","author":"Hermann Thomas","year":"2011","unstructured":"Thomas Hermann, Andy Hunt, and John G. Neuhoff. 2011. The Sonification Handbook. 
Logos Verlag, Berlin."},{"key":"e_1_3_2_27_2","doi-asserted-by":"publisher","DOI":"10.1109\/TRO.2007.907483"},{"key":"e_1_3_2_28_2","doi-asserted-by":"publisher","DOI":"10.1007\/s11370-010-0070-7"},{"key":"e_1_3_2_29_2","doi-asserted-by":"publisher","DOI":"10.3758\/s13423-020-01756-1"},{"key":"e_1_3_2_30_2","doi-asserted-by":"publisher","DOI":"10.1023\/A:1013953213049"},{"key":"e_1_3_2_31_2","volume-title":"Mastering Audio: The Art and the Science","author":"Katz R. A.","year":"2015","unstructured":"R. A. Katz. 2015. Mastering Audio: The Art and the Science. Focal Press. 2014024250 Retrieved from https:\/\/books.google.se\/books?id=P8QwMQEACAAJ"},{"key":"e_1_3_2_32_2","doi-asserted-by":"publisher","DOI":"10.3389\/fnbot.2020.593732"},{"key":"e_1_3_2_33_2","first-page":"1","volume-title":"Sound and Music Computing Conference","author":"Latupeirissa Adrian Benigno","year":"2019","unstructured":"Adrian Benigno Latupeirissa, Emma Frid, and Roberto Bresin. 2019. Sonic characteristics of robots in films. In Sound and Music Computing Conference. 1\u20136."},{"key":"e_1_3_2_34_2","doi-asserted-by":"publisher","DOI":"10.1145\/3469595.3469614"},{"key":"e_1_3_2_35_2","doi-asserted-by":"publisher","DOI":"10.1121\/1.401778"},{"key":"e_1_3_2_36_2","first-page":"5286","volume-title":"CHI Conference on Human Factors in Computing Systems","author":"Luger Ewa","year":"2016","unstructured":"Ewa Luger and Abigail Sellen. 2016. \u201cLike having a really bad PA\u201d: The gulf between user expectation and experience of conversational agents. In CHI Conference on Human Factors in Computing Systems. 
5286\u20135297."},{"key":"e_1_3_2_37_2","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0090779"},{"key":"e_1_3_2_38_2","doi-asserted-by":"publisher","DOI":"10.1109\/HRI.2019.8673305"},{"key":"e_1_3_2_39_2","doi-asserted-by":"publisher","DOI":"10.1145\/2909824.3020238"},{"key":"e_1_3_2_40_2","volume-title":"1st International Workshop on Vocal Interactivity In-and-between Humans, Animals and Robots","author":"Moore Roger K.","year":"2017","unstructured":"Roger K. Moore. 2017. Appropriate voices for artefacts: Some key insights. In 1st International Workshop on Vocal Interactivity In-and-between Humans, Animals and Robots."},{"key":"e_1_3_2_41_2","doi-asserted-by":"publisher","DOI":"10.1007\/978-981-10-2585-3_22"},{"key":"e_1_3_2_42_2","doi-asserted-by":"publisher","DOI":"10.1145\/2513383.2513455"},{"key":"e_1_3_2_43_2","doi-asserted-by":"publisher","DOI":"10.1159\/000261678"},{"key":"e_1_3_2_44_2","first-page":"105","volume-title":"ACM\/IEEE International Conference on Human-robot Interaction","author":"Phillips Elizabeth","year":"2018","unstructured":"Elizabeth Phillips, Xuan Zhao, Daniel Ullman, and Bertram F. Malle. 2018. What is human-like? Decomposing robots\u2019 human-like appearance using the Anthropomorphic roBOT (ABOT) database. In ACM\/IEEE International Conference on Human-robot Interaction. 105\u2013113."},{"key":"e_1_3_2_45_2","volume-title":"Tutorial for the Handbook for Acoustic Ecology","author":"Publishing Cambridge Street","year":"2020","unstructured":"Cambridge Street Publishing. 2020. Tutorial for the Handbook for Acoustic Ecology. 
Retrieved from http:\/\/www.sfu.ca\/sonic-studio-webdav\/cmns\/Handbook%20Tutorial\/Filters.html"},{"key":"e_1_3_2_46_2","doi-asserted-by":"publisher","DOI":"10.1145\/3371382.3377431"},{"key":"e_1_3_2_47_2","doi-asserted-by":"publisher","DOI":"10.1145\/3434073.3444658"},{"key":"e_1_3_2_48_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.cognition.2005.01.004"},{"key":"e_1_3_2_49_2","first-page":"161","volume-title":"23rd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201914)","author":"Schwenk Markus","year":"2014","unstructured":"Markus Schwenk and Kai O. Arras. 2014. R2-D2 reloaded: A flexible sound synthesis system for sonic human-robot interaction design. In 23rd IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201914). IEEE, 161\u2013167."},{"key":"e_1_3_2_50_2","doi-asserted-by":"publisher","DOI":"10.1109\/AIM.2016.7577007"},{"key":"e_1_3_2_51_2","first-page":"1418","volume-title":"Human Factors and Ergonomics Society Annual Meeting","author":"Sims Valerie K.","year":"2009","unstructured":"Valerie K. Sims, Matthew G. Chin, Heather C. Lum, Linda Upham-Ellis, Tatiana Ballion, and Nicholas C. Lagattuta. 2009. Robots\u2019 auditory cues are subject to anthropomorphism. In Human Factors and Ergonomics Society Annual Meeting, Vol. 53. SAGE Publications Sage CA, Los Angeles, CA, 1418\u20131421."},{"key":"e_1_3_2_52_2","doi-asserted-by":"publisher","DOI":"10.1145\/3171221.3171249"},{"key":"e_1_3_2_53_2","doi-asserted-by":"publisher","DOI":"10.1145\/2702123.2702374"},{"key":"e_1_3_2_54_2","doi-asserted-by":"publisher","DOI":"10.1007\/s12369-011-0100-4"},{"key":"e_1_3_2_55_2","first-page":"928","volume-title":"26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201917)","author":"Tennent Hamish","year":"2017","unstructured":"Hamish Tennent, Dylan Moore, Malte Jung, and Wendy Ju. 2017. Good vibrations: How consequential sounds affect perception of robotic arms. 
In 26th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201917). IEEE, 928\u2013935."},{"key":"e_1_3_2_56_2","doi-asserted-by":"publisher","DOI":"10.1196\/annals.1440.012"},{"key":"e_1_3_2_57_2","doi-asserted-by":"publisher","DOI":"10.1145\/3183654.3183691"},{"key":"e_1_3_2_58_2","doi-asserted-by":"publisher","DOI":"10.1109\/RO-MAN47096.2020.9223449"},{"key":"e_1_3_2_59_2","doi-asserted-by":"publisher","DOI":"10.1109\/RO-MAN47096.2020.9223599"},{"key":"e_1_3_2_60_2","first-page":"713","volume-title":"27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201918)","author":"Trovato Gabriele","year":"2018","unstructured":"Gabriele Trovato, Renato Paredes, Javier Balvin, Francisco Cuellar, Nicolai B\u00e6k Thomsen, Soren Bech, and Zheng-Hua Tan. 2018. The sound or silence: Investigating the influence of robot noise on proxemics. In 27th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201918). IEEE, 713\u2013718."},{"key":"e_1_3_2_61_2","first-page":"9","article-title":"Theory of sonification","volume":"1","author":"Walker Bruce N.","year":"2011","unstructured":"Bruce N. Walker and Michael A. Nees. 2011. Theory of sonification. Sonif. Handb. 1 (2011), 9\u201339.","journal-title":"Sonif. Handb."},{"key":"e_1_3_2_62_2","first-page":"707","volume-title":"17th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201908)","author":"Walters Michael L.","year":"2008","unstructured":"Michael L. Walters, Dag Sverre Syrdal, Kheng Lee Koay, Kerstin Dautenhahn, and Ren\u00e9 Te Boekhorst. 2008. Human approach distances to a mechanical-looking robot with different robot voice styles. In 17th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN\u201908). 
IEEE, 707\u2013712."},{"key":"e_1_3_2_63_2","first-page":"40","volume-title":"1st International Workshop on Vocal Interactivity in-and-between Humans, Animals and Robots (VIHAR\u201917)","author":"Wilson Sarah","year":"2017","unstructured":"Sarah Wilson and Roger K. Moore. 2017. Robot, alien and cartoon voices: Implications for speech-enabled systems. In 1st International Workshop on Vocal Interactivity in-and-between Humans, Animals and Robots (VIHAR\u201917). 40\u201344."},{"key":"e_1_3_2_64_2","doi-asserted-by":"publisher","DOI":"10.1080\/10447318.2015.1093856"},{"key":"e_1_3_2_65_2","doi-asserted-by":"publisher","DOI":"10.1109\/RO-MAN47096.2020.9223452"},{"key":"e_1_3_2_66_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICRA48506.2021.9562082"}],"container-title":["ACM Transactions on Human-Robot Interaction"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3611655","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3611655","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T16:36:12Z","timestamp":1750178172000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3611655"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,12,13]]},"references-count":65,"journal-issue":{"issue":"4","published-print":{"date-parts":[[2023,12,31]]}},"alternative-id":["10.1145\/3611655"],"URL":"https:\/\/doi.org\/10.1145\/3611655","relation":{},"ISSN":["2573-9522"],"issn-type":[{"value":"2573-9522","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,12,13]]},"assertion":[{"value":"2022-05-25","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication 
History"}},{"value":"2023-06-22","order":1,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2023-12-13","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}