{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,17]],"date-time":"2026-03-17T04:57:26Z","timestamp":1773723446305,"version":"3.50.1"},"reference-count":59,"publisher":"Springer Science and Business Media LLC","issue":"2","license":[{"start":{"date-parts":[[2023,12,16]],"date-time":"2023-12-16T00:00:00Z","timestamp":1702684800000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2023,12,16]],"date-time":"2023-12-16T00:00:00Z","timestamp":1702684800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"BMBF"}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Int J of Soc Robotics"],"published-print":{"date-parts":[[2024,2]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Delivery robots and personal cargo robots are increasingly sharing space with incidentally co-present persons (InCoPs) on pedestrian ways facing the challenge of socially adequate and safe navigation. Humans are able to effortlessly negotiate this shared space by signalling their skirting intentions via non-verbal gaze cues. In two online-experiments we investigated whether this phenomenon of gaze cuing can be transferred to human\u2013robot interaction. In the first study, participants (<jats:italic>n<\/jats:italic> = 92) watched short videos in which either a human, a humanoid robot or a non-humanoid delivery robot moved towards the camera. In each video, the counterpart looked either straight towards the camera or did an eye movement to the right or left. The results showed that when the counterpart gaze cued to their left, also participants skirted more often to the left from their perspective, thereby walking past each other and avoiding collision. 
Since the participants were recruited in a right-hand driving country, we replicated the study in left-hand driving countries (<jats:italic>n<\/jats:italic> = 176). Results showed that participants skirted more often to the right when the counterpart gaze cued to the right, and to the left in the case of eye movements to the left, extending our previous result. In both studies, skirting behavior did not differ regarding the type of counterpart. Hence, gaze cues increase the chance to trigger complementary skirting behavior in InCoPs independently of robot morphology. Equipping robots with eyes can help to indicate movement direction via gaze cues and thereby improve interactions between humans and robots on pedestrian ways.<\/jats:p>","DOI":"10.1007\/s12369-023-01064-3","type":"journal-article","created":{"date-parts":[[2023,12,16]],"date-time":"2023-12-16T03:01:42Z","timestamp":1702695702000},"page":"311-325","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":6,"title":["Gaze-Cues of Humans and Robots on Pedestrian Ways"],"prefix":"10.1007","volume":"16","author":[{"ORCID":"https:\/\/orcid.org\/0009-0007-8663-1856","authenticated-orcid":false,"given":"Carla S.","family":"Jakobowsky","sequence":"first","affiliation":[]},{"given":"Anna M. H.","family":"Abrams","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2497-143X","authenticated-orcid":false,"given":"Astrid M.","family":"Rosenthal-von der P\u00fctten","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2023,12,16]]},"reference":[{"key":"1064_CR1","unstructured":"Technologies S. Starship. https:\/\/www.starship.xyz\/. Accessed 12 May 2023"},{"key":"1064_CR2","unstructured":"Lennartz T. UrbANT. https:\/\/urbant.de\/de\/index.htm. Accessed 12 May 2023"},{"key":"1064_CR3","unstructured":"Presse-Agentur D. Starship-Lieferroboter Werden in Europa Getestet. 
https:\/\/www.zeit.de\/news\/2016-07\/06\/computer-starship-lieferroboter-werden-in-europa-getestet-06164019. Accessed 12 May 2023"},{"key":"1064_CR4","unstructured":"Marr B. The future of delivery robots. https:\/\/www.forbes.com\/sites\/bernardmarr\/2021\/11\/05\/the-future-of-delivery-robots\/. Accessed 12 May 2023"},{"issue":"4","key":"1064_CR5","doi-asserted-by":"publisher","first-page":"833","DOI":"10.1007\/s12369-020-00666-5","volume":"13","author":"L Onnasch","year":"2021","unstructured":"Onnasch L, Roesler E (2021) A taxonomy to structure and analyze human\u2013robot interaction. Int J Soc Robot 13(4):833\u2013849. https:\/\/doi.org\/10.1007\/s12369-020-00666-5","journal-title":"Int J Soc Robot"},{"key":"1064_CR6","doi-asserted-by":"publisher","unstructured":"Rosenthal-von\u00a0der P\u00fctten A, Sirkin D, Abrams A, Platte L (2020) The forgotten in hri: incidental encounters with robots in public spaces. In: Companion of the 2020 ACM\/IEEE international conference on human\u2013robot interaction. HRI \u201920. Association for Computing Machinery, New York, NY, USA, pp 656\u2013657. https:\/\/doi.org\/10.1145\/3371382.3374852","DOI":"10.1145\/3371382.3374852"},{"issue":"7","key":"1064_CR7","doi-asserted-by":"publisher","first-page":"1625","DOI":"10.1007\/s12369-022-00894-x","volume":"14","author":"F Babel","year":"2022","unstructured":"Babel F, Kraus J, Baumann M (2022) Findings from a qualitative field study with an autonomous robot in public: exploration of user reactions and conflicts. Int J Soc Robot 14(7):1625\u20131655. https:\/\/doi.org\/10.1007\/s12369-022-00894-x","journal-title":"Int J Soc Robot"},{"key":"1064_CR8","doi-asserted-by":"publisher","DOI":"10.1145\/3550280","author":"S Nielsen","year":"2022","unstructured":"Nielsen S, Skov MB, Hansen KD, Kaszowska A (2022) Using user-generated Youtube videos to understand unguided interactions with robots in public places. J Hum Robot Interact. 
https:\/\/doi.org\/10.1145\/3550280","journal-title":"J Hum Robot Interact"},{"key":"1064_CR9","doi-asserted-by":"publisher","unstructured":"Abrams AMH, Dautzenberg PSC, Jakobowsky C, Ladwig S, Rosenthal-von\u00a0der P\u00fctten AM (2021) A theoretical and empirical reflection on technology acceptance models for autonomous delivery robots. In: Proceedings of the 2021 ACM\/IEEE international conference on human\u2013robot interaction. HRI \u201921. Association for Computing Machinery, New York, NY, USA, pp 272\u2013280. https:\/\/doi.org\/10.1145\/3434073.3444662","DOI":"10.1145\/3434073.3444662"},{"key":"1064_CR10","unstructured":"Abrams AMH, Platte L, Rosenthal-von\u00a0der P\u00fctten A (2020) Field observation: interactions between pedestrians and a delivery robot. In: IEEE international conference on robot & human interactive communication ROMAN-2020. Crowdbot workshop: robots from pathways to crowds, ethical, legal and safety concerns of robot navigating human environments. http:\/\/crowdbot.eu\/wp-content\/uploads\/2020\/09\/Short-Talk-1-Workshop_Abstract_Field-Observation_final.pdf"},{"key":"1064_CR11","unstructured":"van Mierlo S (2021) Field observations of reactions of incidentally copresent pedestrians to a seemingly autonomous sidewalk delivery vehicle: an exploratory study. Master\u2019s thesis, Universiteit Utrecht. http:\/\/mwlc.global\/wp-content\/uploads\/2021\/08\/Thesis_Shianne_van_Mierlo_6206557.pdf"},{"key":"1064_CR12","doi-asserted-by":"publisher","unstructured":"Mahadevan K, Somanath S, Sharlin E (2018) Communicating awareness and intent in autonomous vehicle-pedestrian interaction. In: Proceedings of the 2018 CHI conference on human factors in computing systems. CHI \u201918. Association for Computing Machinery, New York, NY, USA, pp 1\u201312. 
https:\/\/doi.org\/10.1145\/3173574.3174003","DOI":"10.1145\/3173574.3174003"},{"issue":"22","key":"1064_CR13","doi-asserted-by":"publisher","first-page":"58","DOI":"10.1177\/0361198118777091","volume":"2672","author":"SC Stanciu","year":"2018","unstructured":"Stanciu SC, Eby DW, Molnar LJ, Louis RMS, Zanier N, Kostyniuk LP (2018) Pedestrians\/bicyclists and autonomous vehicles: how will they communicate? Transp Res Rec 2672(22):58\u201366. https:\/\/doi.org\/10.1177\/0361198118777091","journal-title":"Transp Res Rec"},{"key":"1064_CR14","doi-asserted-by":"publisher","first-page":"320","DOI":"10.1007\/978-3-030-62056-1_27","volume-title":"Social robotics","author":"J Hart","year":"2020","unstructured":"Hart J, Mirsky R, Xiao X, Tejeda S, Mahajan B, Goo J, Baldauf K, Owen S, Stone P (2020) Using human-inspired signals to disambiguate navigational intentions. In: Wagner AR, Feil-Seifer D, Haring KS, Rossi S, Williams T, He H, Sam Ge S (eds) Social robotics. Springer, Cham, pp 320\u2013331"},{"key":"1064_CR15","doi-asserted-by":"publisher","unstructured":"Chang C-M, Toda K, Gui X, Seo SH, Igarashi T (2022) Can eyes on a car reduce traffic accidents? In: Proceedings of the 14th international conference on automotive user interfaces and interactive vehicular applications. AutomotiveUI \u201922. Association for Computing Machinery, New York, NY, USA, pp 349\u2013359. https:\/\/doi.org\/10.1145\/3543174.3546841","DOI":"10.1145\/3543174.3546841"},{"key":"1064_CR16","doi-asserted-by":"publisher","unstructured":"Li Y, Dikmen M, Hussein TG, Wang Y, Burns C (2018) To cross or not to cross: urgency-based external warning displays on autonomous vehicles to improve pedestrian crossing safety. In: Proceedings of the 10th international conference on automotive user interfaces and interactive vehicular applications. AutomotiveUI \u201918. Association for Computing Machinery, New York, NY, USA, pp 188\u2013197. 
https:\/\/doi.org\/10.1145\/3239060.3239082","DOI":"10.1145\/3239060.3239082"},{"issue":"2","key":"1064_CR17","doi-asserted-by":"publisher","first-page":"176","DOI":"10.1109\/THMS.2019.2960517","volume":"50","author":"S Deb","year":"2020","unstructured":"Deb S, Carruth DW, Hudson CR (2020) How communicating features can help pedestrian safety in the presence of self-driving vehicles: virtual reality experiment. IEEE Trans Hum Mach Syst 50(2):176\u2013186. https:\/\/doi.org\/10.1109\/THMS.2019.2960517","journal-title":"IEEE Trans Hum Mach Syst"},{"key":"1064_CR18","unstructured":"Reeves B, Nass C (1996) The media equation: how people treat computers, television, and new media like real people. Cambridge, UK, vol 10, p 236605"},{"issue":"12","key":"1064_CR19","doi-asserted-by":"publisher","first-page":"1454","DOI":"10.1111\/j.1467-9280.2009.02464.x","volume":"20","author":"L Nummenmaa","year":"2009","unstructured":"Nummenmaa L, Hy\u00f6n\u00e4 J, Hietanen JK (2009) I\u2019ll walk this way: eyes reveal the direction of locomotion and make passersby look and go the other way. Psychol Sci 20(12):1454\u20131458. https:\/\/doi.org\/10.1111\/j.1467-9280.2009.02464.x","journal-title":"Psychol Sci"},{"issue":"2","key":"1064_CR20","doi-asserted-by":"publisher","first-page":"221","DOI":"10.1007\/s00221-001-0983-7","volume":"143","author":"MA Hollands","year":"2002","unstructured":"Hollands MA, Patla AE, Vickers JN (2002) \u201clook where you\u2019re going!\u2019\u2019: gaze behaviour associated with maintaining and changing the direction of locomotion. Exp Brain Res 143(2):221\u2013230. 
https:\/\/doi.org\/10.1007\/s00221-001-0983-7","journal-title":"Exp Brain Res"},{"key":"1064_CR21","doi-asserted-by":"publisher","first-page":"95","DOI":"10.1007\/978-3-642-04504-2_7","volume-title":"Pedestrian and evacuation dynamics 2008","author":"K Kitazawa","year":"2010","unstructured":"Kitazawa K, Fujiyama T (2010) Pedestrian vision and collision avoidance behavior: investigation of the information process space of pedestrians using an eye tracker. In: Klingsch WWF, Rogsch C, Schadschneider A, Schreckenberg M (eds) Pedestrian and evacuation dynamics 2008. Springer, Berlin, pp 95\u2013108"},{"issue":"1","key":"1064_CR22","doi-asserted-by":"publisher","first-page":"25","DOI":"10.5898\/JHRI.6.1.Admoni","volume":"6","author":"H Admoni","year":"2017","unstructured":"Admoni H, Scassellati B (2017) Social eye gaze in human\u2013robot interaction: a review. J Hum Robot Interact 6(1):25\u201363. https:\/\/doi.org\/10.5898\/JHRI.6.1.Admoni","journal-title":"J Hum Robot Interact"},{"key":"1064_CR23","doi-asserted-by":"crossref","unstructured":"Moon AJ, Troniak DM, Gleeson B, Pan MKXJ, Zheng M, Blumer BA, MacLean K, Croft EA (2014) Meet me where i\u2019m gazing: how shared attention gaze affects human-robot handover timing. In: 2014 9th ACM\/IEEE international conference on human\u2013robot interaction (HRI), pp 334\u2013341","DOI":"10.1145\/2559636.2559656"},{"key":"1064_CR24","doi-asserted-by":"publisher","unstructured":"Mutlu B, Forlizzi J, Hodgins J (2006) A storytelling robot: modeling and evaluation of human-like gaze behavior. In: 2006 6th IEEE-RAS international conference on humanoid robots, pp 518\u2013523. 
https:\/\/doi.org\/10.1109\/ICHR.2006.321322","DOI":"10.1109\/ICHR.2006.321322"},{"issue":"16","key":"1064_CR25","doi-asserted-by":"publisher","first-page":"5413","DOI":"10.3390\/app10165413","volume":"10","author":"W Lee","year":"2020","unstructured":"Lee W, Park CH, Jang S, Cho H-K (2020) Design of effective robotic gaze-based social cueing for users in task-oriented situations: how to overcome in-attentional blindness? Appl Sci 10(16):5413. https:\/\/doi.org\/10.3390\/app10165413","journal-title":"Appl Sci"},{"key":"1064_CR26","doi-asserted-by":"publisher","unstructured":"Andrist S, Tan XZ, Gleicher M, Mutlu B (2014) Conversational gaze aversion for humanlike robots. In: Proceedings of the 2014 ACM\/IEEE international conference on human\u2013robot interaction. HRI \u201914. Association for Computing Machinery, New York, NY, USA, pp 25\u201332. https:\/\/doi.org\/10.1145\/2559636.2559666","DOI":"10.1145\/2559636.2559666"},{"key":"1064_CR27","doi-asserted-by":"publisher","unstructured":"Yamashita S, Kurihara T, Ikeda T, Shinozawa K, Iwaki S (2020) Evaluation of robots that signals a pedestrian using face orientation based on analysis of velocity vector fluctuation in moving trajectories, vol 34. Taylor & Francis, pp 1309\u20131323. https:\/\/doi.org\/10.1080\/01691864.2020.1811763","DOI":"10.1080\/01691864.2020.1811763"},{"issue":"4","key":"1064_CR28","doi-asserted-by":"publisher","first-page":"520","DOI":"10.1098\/rsbl.2012.0160","volume":"8","author":"AC Gallup","year":"2012","unstructured":"Gallup AC, Chong A, Couzin ID (2012) The directional flow of visual information transfer between pedestrians. Biol Let 8(4):520\u2013522. 
https:\/\/doi.org\/10.1098\/rsbl.2012.0160","journal-title":"Biol Let"},{"issue":"10","key":"1064_CR29","doi-asserted-by":"publisher","first-page":"2633","DOI":"10.1007\/s00221-022-06427-2","volume":"240","author":"TM Bhojwani","year":"2022","unstructured":"Bhojwani TM, Lynch SD, B\u00fchler MA, Lamontagne A (2022) Impact of dual tasking on gaze behaviour and locomotor strategies adopted while circumventing virtual pedestrians during a collision avoidance task. Exp Brain Res 240(10):2633\u20132645. https:\/\/doi.org\/10.1007\/s00221-022-06427-2","journal-title":"Exp Brain Res"},{"issue":"10","key":"1064_CR30","doi-asserted-by":"publisher","first-page":"5","DOI":"10.1167\/jov.20.10.5","volume":"20","author":"RS Hessels","year":"2020","unstructured":"Hessels RS, Benjamins JS, van Doorn AJ, Koenderink JJ, Holleman GA, Hooge ITC (2020) Looking behavior and potential human interactions during locomotion. J Vis 20(10):5\u20135. https:\/\/doi.org\/10.1167\/jov.20.10.5","journal-title":"J Vis"},{"key":"1064_CR31","doi-asserted-by":"crossref","unstructured":"Ruhland K, Peters CE, Andrist S, Badler JB, Badler NI, Gleicher M, Mutlu B, McDonnell R (2015) A review of eye gaze in virtual agents, social robotics and hci: behaviour generation, user interaction and perception. In: Computer graphics forum, vol 34. Wiley Online Library, pp 299\u2013326","DOI":"10.1111\/cgf.12603"},{"issue":"5","key":"1064_CR32","doi-asserted-by":"publisher","first-page":"783","DOI":"10.1007\/s12369-015-0305-z","volume":"7","author":"M Zheng","year":"2015","unstructured":"Zheng M, Moon A, Croft EA, Meng MQ-H (2015) Impacts of robot head gaze on robot-to-human handovers. Int J Soc Robot 7(5):783\u2013798. https:\/\/doi.org\/10.1007\/s12369-015-0305-z","journal-title":"Int J Soc Robot"},{"key":"1064_CR33","doi-asserted-by":"publisher","unstructured":"Terzio\u011flu Y, Mutlu B, \u015eahin E (2020) Designing social cues for collaborative robots: The role of gaze and breathing in human-robot collaboration. 
In: Proceedings of the 2020 ACM\/IEEE international conference on human\u2013robot interaction. HRI \u201920. Association for Computing Machinery, New York, NY, USA, pp 343\u2013357. https:\/\/doi.org\/10.1145\/3319502.3374829","DOI":"10.1145\/3319502.3374829"},{"key":"1064_CR34","doi-asserted-by":"publisher","unstructured":"Mutlu B, Shiwa T, Kanda T, Ishiguro H, Hagita N (2009) Footing in human\u2013robot conversations: how robots might shape participant roles using gaze cues. In: Proceedings of the 4th ACM\/IEEE international conference on human robot interaction. HRI \u201909. Association for Computing Machinery, New York, NY, USA, pp 61\u201368. https:\/\/doi.org\/10.1145\/1514095.1514109","DOI":"10.1145\/1514095.1514109"},{"key":"1064_CR35","doi-asserted-by":"crossref","unstructured":"Takayama L, Pantofaru C (2009) Influences on proxemic behaviors in human\u2013robot interaction. In: 2009 IEEE\/RSJ international conference on intelligent robots and systems. IEEE, pp 5495\u20135502","DOI":"10.1109\/IROS.2009.5354145"},{"key":"1064_CR36","doi-asserted-by":"publisher","unstructured":"Angelopoulos G, Rossi A, Napoli CD, Rossi S (2022) You are in my way: non-verbal social cues for legible robot navigation behaviors. In: 2022 IEEE\/RSJ international conference on intelligent robots and systems (IROS), pp 657\u2013662. https:\/\/doi.org\/10.1109\/IROS47612.2022.9981754","DOI":"10.1109\/IROS47612.2022.9981754"},{"issue":"3","key":"1064_CR37","doi-asserted-by":"publisher","first-page":"692","DOI":"10.1109\/TRO.2020.2964824","volume":"36","author":"Y Che","year":"2020","unstructured":"Che Y, Okamura AM, Sadigh D (2020) Efficient and trustworthy social navigation via explicit and implicit robot-human communication. IEEE Trans Robot 36(3):692\u2013707. 
https:\/\/doi.org\/10.1109\/TRO.2020.2964824","journal-title":"IEEE Trans Robot"},{"key":"1064_CR38","doi-asserted-by":"crossref","unstructured":"Gehrke SR, Russo BJ, Phair CD, Smaglik EJ (2022) Evaluation of sidewalk autonomous delivery robot interactions with pedestrians and bicyclists. Technical report","DOI":"10.1016\/j.trip.2023.100789"},{"key":"1064_CR39","unstructured":"van Mierlo S (2021) Field observations of reactions of incidentally copresent pedestrians to a seemingly autonomous sidewalk delivery vehicle: an exploratory study. Master\u2019s thesis"},{"key":"1064_CR40","unstructured":"Hardeman K (2021) Encounters with a seemingly autonomous sidewalk delivery vehicle: interviews with incidentally copresent pedestrians. Master\u2019s thesis"},{"key":"1064_CR41","doi-asserted-by":"crossref","unstructured":"Yu X, Hoggenmueller M, Tomitsch M (2023) Your way or my way: improving human-robot co-navigation through robot intent and pedestrian prediction visualisations. In: Proceedings of the 2023 ACM\/IEEE international conference on human\u2013robot interaction, pp 211\u2013221","DOI":"10.1145\/3568162.3576992"},{"issue":"3","key":"1064_CR42","doi-asserted-by":"publisher","first-page":"5010","DOI":"10.1109\/LRA.2021.3068708","volume":"6","author":"NJ Hetherington","year":"2021","unstructured":"Hetherington NJ, Croft EA, Van der Loos HFM (2021) Hey robot, which way are you going? nonverbal motion legibility cues for human\u2013robot spatial interaction. IEEE Robot Autom Lett 6(3):5010\u20135015. https:\/\/doi.org\/10.1109\/LRA.2021.3068708","journal-title":"IEEE Robot Autom Lett"},{"key":"1064_CR43","doi-asserted-by":"publisher","unstructured":"Kannan SS, Lee A, Min B-C (2021) External human\u2013machine interface on delivery robots: expression of navigation intent of the robot. In: 2021 30th IEEE international conference on robot & human interactive communication (RO-MAN), pp 1305\u20131312. 
https:\/\/doi.org\/10.1109\/RO-MAN50785.2021.9515408","DOI":"10.1109\/RO-MAN50785.2021.9515408"},{"key":"1064_CR44","unstructured":"Robotics S. For better business just add pepper. https:\/\/us.softbankrobotics.com\/pepper. Accessed 23 May 2023"},{"issue":"1","key":"1064_CR45","doi-asserted-by":"publisher","first-page":"119","DOI":"10.5898\/JHRI.1.1.Riek","volume":"1","author":"LD Riek","year":"2012","unstructured":"Riek LD (2012) Wizard of oz studies in hri: a systematic review and new reporting guidelines. J Hum Robot Interact 1(1):119\u2013136. https:\/\/doi.org\/10.5898\/JHRI.1.1.Riek","journal-title":"J Hum Robot Interact"},{"key":"1064_CR46","doi-asserted-by":"crossref","unstructured":"Carpinella CM, Wyman AB, Perez MA, Stroessner SJ (2017) The robotic social attributes scale (rosas): development and validation. In: 2017 12th ACM\/IEEE international conference on human\u2013robot interaction (HRI), pp 254\u2013262","DOI":"10.1145\/2909824.3020208"},{"key":"1064_CR47","unstructured":"Syrdal DS, Dautenhahn K, Koay KL, Walters ML (2009) The negative attitudes towards robots scale and reactions to robot behaviour in a live human\u2013robot interaction study. Adaptive and emergent behaviour and complex systems"},{"key":"1064_CR48","doi-asserted-by":"publisher","unstructured":"Horstmann AC, Kr\u00e4mer NC (2020) When a robot violates expectations: the influence of reward valence and expectancy violation on people\u2019s evaluation of a social robot. In: Companion of the 2020 ACM\/IEEE international conference on human\u2013robot interaction. HRI \u201920. Association for Computing Machinery, New York, NY, USA, pp 254\u2013256. 
https:\/\/doi.org\/10.1145\/3371382.3378292","DOI":"10.1145\/3371382.3378292"},{"issue":"2","key":"1064_CR49","doi-asserted-by":"publisher","first-page":"232","DOI":"10.1111\/j.1468-2958.1990.tb00232.x","volume":"17","author":"JK Burgoon","year":"2006","unstructured":"Burgoon JK, Walther JB (2006) Nonverbal expectancies and the evaluative consequences of violations. Hum Commun Res 17(2):232\u2013265. https:\/\/doi.org\/10.1111\/j.1468-2958.1990.tb00232.x","journal-title":"Hum Commun Res"},{"issue":"4","key":"1064_CR50","doi-asserted-by":"publisher","first-page":"485","DOI":"10.1007\/s00146-008-0181-2","volume":"23","author":"KF MacDorman","year":"2009","unstructured":"MacDorman KF, Vasudevan SK, Ho C-C (2009) Does Japan really have robot mania? Comparing attitudes by implicit and explicit measures. AI Soc 23(4):485\u2013510. https:\/\/doi.org\/10.1007\/s00146-008-0181-2","journal-title":"AI Soc"},{"key":"1064_CR51","unstructured":"Leiner Dominik J (2019) SoSci Survey (version 3.3.17). https:\/\/www.soscisurvey.de"},{"key":"1064_CR52","doi-asserted-by":"publisher","first-page":"939","DOI":"10.3389\/fpsyg.2019.00939","volume":"10","author":"AC Horstmann","year":"2019","unstructured":"Horstmann AC, Kr\u00e4mer NC (2019) Great expectations? Relation of previous experiences with social robots in real life or in the media and expectancies based on qualitative and quantitative assessment. Front Psychol 10:939. https:\/\/doi.org\/10.3389\/fpsyg.2019.00939","journal-title":"Front Psychol"},{"key":"1064_CR53","unstructured":"Core Team R (2020) R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. https:\/\/www.R-project.org\/"},{"key":"1064_CR54","doi-asserted-by":"publisher","DOI":"10.18637\/jss.v067.i01","author":"D Bates","year":"2015","unstructured":"Bates D, M\u00e4chler M, Bolker B, Walker S (2015) Fitting linear mixed-effects models using lme4. J Stat Soft. 
https:\/\/doi.org\/10.18637\/jss.v067.i01","journal-title":"J Stat Soft"},{"issue":"3","key":"1064_CR55","doi-asserted-by":"publisher","first-page":"255","DOI":"10.1016\/j.jml.2012.11.001","volume":"68","author":"DJ Barr","year":"2013","unstructured":"Barr DJ, Levy R, Scheepers C, Tily HJ (2013) Random effects structure for confirmatory hypothesis testing: keep it maximal. J Mem Lang 68(3):255\u2013278. https:\/\/doi.org\/10.1016\/j.jml.2012.11.001","journal-title":"J Mem Lang"},{"key":"1064_CR56","doi-asserted-by":"publisher","unstructured":"Jakobowsky CS, Abrams AMH, Rosenthal-von\u00a0der P\u00fctten AM. Gaze-cues of humans and robots on pedestrian ways\u2014supplementary material. https:\/\/doi.org\/10.17605\/OSF.IO\/NCWT5","DOI":"10.17605\/OSF.IO\/NCWT5"},{"issue":"5","key":"1064_CR57","doi-asserted-by":"publisher","first-page":"799","DOI":"10.1007\/s12369-015-0321-z","volume":"7","author":"AM Rosenthal-von der P\u00fctten","year":"2015","unstructured":"Rosenthal-von der P\u00fctten AM, Kr\u00e4mer NC (2015) Individuals\u2019 evaluations of and attitudes towards potentially uncanny robots. Int J Soc Robot 7(5):799\u2013824. https:\/\/doi.org\/10.1007\/s12369-015-0321-z","journal-title":"Int J Soc Robot"},{"key":"1064_CR58","unstructured":"Amazon Mechanical\u00a0Turk Ioia Amazon Mechanical Turk. https:\/\/www.mturk.com\/. Accessed 23 May 2023"},{"key":"1064_CR59","doi-asserted-by":"publisher","unstructured":"Mutlu B, Forlizzi J (2008) Robots in organizations: the role of workflow, social, and environmental factors in human\u2013robot interaction. In: 2008 3rd ACM\/IEEE international conference on human-robot interaction (HRI), pp 287\u2013294. 
https:\/\/doi.org\/10.1145\/1349822.1349860","DOI":"10.1145\/1349822.1349860"}],"container-title":["International Journal of Social Robotics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12369-023-01064-3.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s12369-023-01064-3\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12369-023-01064-3.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,2,8]],"date-time":"2024-02-08T13:17:18Z","timestamp":1707398238000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s12369-023-01064-3"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,12,16]]},"references-count":59,"journal-issue":{"issue":"2","published-print":{"date-parts":[[2024,2]]}},"alternative-id":["1064"],"URL":"https:\/\/doi.org\/10.1007\/s12369-023-01064-3","relation":{},"ISSN":["1875-4791","1875-4805"],"issn-type":[{"value":"1875-4791","type":"print"},{"value":"1875-4805","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,12,16]]},"assertion":[{"value":"14 September 2023","order":1,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"16 December 2023","order":2,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare that they have no conflict of interest.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}},{"value":"The studies were conducted guaranteeing respect for the participating 
volunteers and human dignity. We followed the national, EU and international ethical guidelines and conventions as laid down in the following documents: The Ethical guidelines of the German Psychological Association; The Charter of Fundamental Rights of the EU; Helsinki Declaration in its latest version. Specifically, participants gave informed consent prior to the study and were properly debriefed after completion of the study.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethical standard"}}]}}