{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,23]],"date-time":"2026-04-23T06:08:06Z","timestamp":1776924486388,"version":"3.51.2"},"reference-count":66,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2024,1,30]],"date-time":"2024-01-30T00:00:00Z","timestamp":1706572800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/100010661","name":"Horizon 2020 Framework Programme","doi-asserted-by":"publisher","id":[{"id":"10.13039\/100010661","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Robot. AI"],"abstract":"<jats:p><jats:bold>Introduction:<\/jats:bold> Communication from automated vehicles (AVs) to pedestrians using augmented reality (AR) could positively contribute to traffic safety. However, previous AR research for pedestrians was mainly conducted through online questionnaires or experiments in virtual environments instead of real ones.<\/jats:p><jats:p><jats:bold>Methods:<\/jats:bold> In this study, 28 participants conducted trials outdoors with an approaching AV and were supported by four different AR interfaces. The AR experience was created by having participants wear a Varjo XR-3 headset with see-through functionality, with the AV and AR elements virtually overlaid onto the real environment. The AR interfaces were vehicle-locked (<jats:italic>Planes on vehicle<\/jats:italic>), world-locked (<jats:italic>Fixed pedestrian lights<\/jats:italic>, <jats:italic>Virtual fence<\/jats:italic>), or head-locked (<jats:italic>Pedestrian lights HUD<\/jats:italic>). 
Participants had to hold down a button when they felt it was safe to cross, and their opinions were obtained through rating scales, interviews, and a questionnaire.<\/jats:p><jats:p><jats:bold>Results:<\/jats:bold> The results showed that participants had a subjective preference for AR interfaces over no AR interface. Furthermore, the <jats:italic>Pedestrian lights HUD<\/jats:italic> was more effective than no AR interface in a statistically significant manner, as it led to participants more frequently keeping the button pressed. The <jats:italic>Fixed pedestrian lights<\/jats:italic> scored lower than the other interfaces, presumably due to low saliency and the fact that participants had to visually identify both this AR interface and the AV.<\/jats:p><jats:p><jats:bold>Discussion:<\/jats:bold> In conclusion, while users favour AR in AV-pedestrian interactions over no AR, its effectiveness depends on design factors like location, visibility, and visual attention demands. Overall, this work provides important insights into the use of AR outdoors. The findings illustrate that, in these circumstances, a clear and easily interpretable AR interface is of key importance.<\/jats:p>","DOI":"10.3389\/frobt.2024.1324060","type":"journal-article","created":{"date-parts":[[2024,1,30]],"date-time":"2024-01-30T04:33:03Z","timestamp":1706589183000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":15,"title":["Augmented reality for supporting the interaction between pedestrians and automated vehicles: an experimental outdoor study"],"prefix":"10.3389","volume":"11","author":[{"given":"Thomas K.","family":"Aleva","sequence":"first","affiliation":[]},{"given":"Wilbert","family":"Tabone","sequence":"additional","affiliation":[]},{"given":"Dimitra","family":"Dodou","sequence":"additional","affiliation":[]},{"given":"Joost C. 
F.","family":"de Winter","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2024,1,30]]},"reference":[{"key":"B1","first-page":"3721","article-title":"External Human-Machine Interfaces: which of 729 colors is best for signaling \u2018Please (do not) cross","author":"Bazilinskyy","year":"2020"},{"key":"B2","doi-asserted-by":"publisher","first-page":"103450","DOI":"10.1016\/j.apergo.2021.103450","article-title":"How should external Human-Machine Interfaces behave? Examining the effects of colour, position, message, activation distance, vehicle yielding, and visual distraction among 1,434 participants","volume":"95","author":"Bazilinskyy","year":"2021","journal-title":"Appl. Ergon."},{"key":"B3","doi-asserted-by":"publisher","first-page":"33","DOI":"10.1016\/j.trf.2021.11.013","article-title":"Do cyclists need HMIs in future automated traffic? An interview study","volume":"84","author":"Berge","year":"2022","journal-title":"Transp. Res. Part F Traffic Psychol. Behav."},{"key":"B4","doi-asserted-by":"publisher","first-page":"012077","DOI":"10.1088\/1742-6596\/2161\/1\/012077","article-title":"A review on classifications of tracking systems in augmented reality","volume":"2161","author":"Bhakar","year":"2022","journal-title":"J. Phys. Conf. Ser."},{"key":"B5","doi-asserted-by":"publisher","first-page":"136","DOI":"10.1016\/j.trf.2022.08.016","article-title":"Two-step communication for the interaction between automated vehicles and pedestrians","volume":"90","author":"Bindsch\u00e4del","year":"2022","journal-title":"Transp. Res. Part F Traffic Psychol. 
Behav."},{"key":"B6","first-page":"612","article-title":"Validation of the vehicle in the loop (vil); a milestone for the simulation of driver assistance systems","author":"Bokc","year":"2007"},{"key":"B7","doi-asserted-by":"publisher","first-page":"23","DOI":"10.3233\/ves-150541","article-title":"Less sickness with more motion and\/or mental distraction","volume":"25","author":"Bos","year":"2015","journal-title":"J. Vestib. Res. Equilib. Orientat."},{"key":"B8","doi-asserted-by":"publisher","first-page":"52","DOI":"10.1007\/s38311-017-0082-4","article-title":"Vehicle-in-the-loop real-world vehicle tests combined with virtual scenarios","volume":"119","author":"Butenuth","year":"2017","journal-title":"ATZ Worldw."},{"key":"B9","first-page":"625","article-title":"Peripheral vision: a new killer app for smart glasses","author":"Chaturvedi","year":"2019"},{"key":"B10","doi-asserted-by":"crossref","DOI":"10.1145\/3334480.3382865","article-title":"Unveiling the lack of scalability in research on external communication of autonomous vehicles","author":"Colley","year":"2020"},{"key":"B11","doi-asserted-by":"publisher","first-page":"1353","DOI":"10.1177\/0018720819836343","article-title":"External human-machine interfaces on automated vehicles: effects on pedestrian crossing decisions","volume":"61","author":"De Clercq","year":"2019","journal-title":"Hum. 
Factors"},{"key":"B12","doi-asserted-by":"crossref","first-page":"533","DOI":"10.1007\/978-3-030-77726-5_20","article-title":"Interactions of automated vehicles with road users","volume-title":"User experience design in the era of automated driving","author":"Dey","year":"2022"},{"key":"B13","first-page":"192","article-title":"Distance-dependent eHMIs for the interaction between automated vehicles and pedestrians","author":"Dey","year":"2020"},{"key":"B14","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1155\/2021\/5573560","article-title":"I see your gesture: a VR-based study of bidirectional communication between pedestrians and automated vehicles","volume":"2021","author":"Epke","year":"2021","journal-title":"J. Adv. Transp."},{"key":"B15","first-page":"1549","article-title":"An augmented reality environment for connected and automated vehicle testing and evaluation","author":"Feng","year":"2018"},{"key":"B16","unstructured":"Sedan car - 01 [3D land vehicle model] (2021)"},{"key":"B17","doi-asserted-by":"publisher","first-page":"456","DOI":"10.1080\/10447318.2018.1456150","article-title":"A personal resource for technology interaction: development and validation of the affinity for technology interaction (ATI) scale","volume":"35","author":"Franke","year":"2019","journal-title":"Int. J. 
Human\u2013Computer Interact."},{"key":"B18","first-page":"649","article-title":"Comparing world and screen coordinate systems in optical see-through head-mounted displays for text readability while walking","author":"Fukushima","year":"2020"},{"key":"B19","volume-title":"Mobile safety application for pedestrians","author":"Gelbal","year":"2023"},{"key":"B20","unstructured":"A new sense of direction with Live View (2020)"},{"key":"B21","unstructured":"Top-down view of part of the TU Delft campus (2023)"},{"key":"B22","doi-asserted-by":"publisher","first-page":"187","DOI":"10.1016\/j.future.2022.03.036","article-title":"Pedestrian safety using the Internet of Things and sensors: issues, challenges, and open problems","volume":"134","author":"Hasan","year":"2022","journal-title":"Future Gener. Comput. Syst."},{"key":"B23","first-page":"261","article-title":"Don\u2019t panic! Guiding pedestrians in autonomous traffic with augmented reality","author":"Hesenius","year":"2018"},{"key":"B24","first-page":"211","article-title":"Overtrust in external cues of automated vehicles: an experimental investigation","author":"Holl\u00e4nder","year":"2019"},{"key":"B25","unstructured":"SteamVR base station 2.0 (2019)"},{"key":"B26","doi-asserted-by":"publisher","first-page":"608","DOI":"10.1109\/TITS.2012.2226239","article-title":"Augmented reality experiment: drivers' behavior at an unsignalized intersection","volume":"14","author":"Hussain","year":"2013","journal-title":"IEEE Trans. Intelligent Transp. Syst."},{"key":"B27","volume-title":"Tests for colour-blindness","author":"Ishihara","year":"1917"},{"key":"B28","doi-asserted-by":"crossref","first-page":"117","DOI":"10.1007\/978-3-658-27990-5_11","article-title":"Artificial intelligence for automated driving \u2013 quo vadis?","volume-title":"Automatisiertes fahren 2019: von der Fahrerassistenz zum autonomen fahren 5. 
Internationale ATZ-fachtagung","author":"Jungmann","year":"2020"},{"key":"B29","doi-asserted-by":"publisher","first-page":"1070","DOI":"10.1177\/0018720820970751","article-title":"External Human-Machine Interfaces can be misleading: an examination of trust development and misuse in a CAVE-based pedestrian simulation environment","volume":"64","author":"Kaleefathullah","year":"2022","journal-title":"Hum. Factors"},{"key":"B30","first-page":"363","article-title":"Mixed reality agent-based framework for pedestrian-cyclist interaction","author":"Kamalasanan","year":"2022"},{"key":"B31","first-page":"191","article-title":"Designing virtual agent human\u2013machine interfaces depending on the communication and anthropomorphism levels in augmented reality","author":"Kang","year":"2023"},{"key":"B32","doi-asserted-by":"publisher","first-page":"102283","DOI":"10.1016\/j.displa.2022.102283","article-title":"Optical see-through augmented reality can induce severe motion sickness","volume":"74","author":"Kaufeld","year":"2022","journal-title":"Displays"},{"key":"B33","doi-asserted-by":"crossref","DOI":"10.1145\/3544549.3585655","article-title":"Wearing awareness: designing pedestrian-wearables for interactions with autonomous vehicles","author":"Lakhdhir","year":"2023"},{"key":"B34","first-page":"320","article-title":"Securing augmented reality output","author":"Lebeck","year":"2017"},{"key":"B35","article-title":"eHMI on the vehicle or on the infrastructure? A driving simulator study","author":"Lingam","year":"2023","journal-title":"ResearchGate"},{"key":"B36","doi-asserted-by":"publisher","first-page":"187208231151280","DOI":"10.1177\/00187208231151280","article-title":"Do simulated augmented reality overlays influence street-crossing decisions for non-mobility-impaired older and younger adult pedestrians?","author":"Malik","year":"2023","journal-title":"Hum. 
Factors"},{"key":"B37","article-title":"Analyzing pedestrian behavior in augmented reality \u2014 proof of concept","author":"Maruhn","year":"2020"},{"key":"B38","doi-asserted-by":"publisher","first-page":"61","DOI":"10.20982\/tqmp.04.2.p061","article-title":"Confidence intervals from normalized data: a correction to Cousineau (2005)","volume":"4","author":"Morey","year":"2008","journal-title":"Tutorials Quantitative Methods Psychol."},{"key":"B39","doi-asserted-by":"publisher","first-page":"2143","DOI":"10.3390\/s21062143","article-title":"Enabling technologies for urban smart mobility: recent trends, opportunities and challenges","volume":"21","author":"Paiva","year":"2021","journal-title":"Sensors"},{"key":"B40","article-title":"Head-locked, world-locked, or conformal diminished-reality? An examination of different AR solutions for pedestrian safety in occluded scenarios","author":"Peereboom","year":"2023","journal-title":"ResearchGate"},{"key":"B41","doi-asserted-by":"crossref","DOI":"10.1145\/3565970.3567701","article-title":"Push the red button: comparing notification placement with augmented and non-augmented tasks in AR","author":"Plabst","year":"2022"},{"key":"B42","doi-asserted-by":"publisher","first-page":"1157","DOI":"10.1109\/TVT.2021.3054312","article-title":"Comparing state-of-the-art and emerging augmented reality interfaces for autonomous vehicle-to-pedestrian communication","volume":"70","author":"Prattic\u00f2","year":"2021","journal-title":"IEEE Trans. Veh. Technol."},{"key":"B43","unstructured":"Qualtrics XM (2023)"},{"key":"B44","doi-asserted-by":"publisher","first-page":"107289","DOI":"10.1016\/j.chb.2022.107289","article-title":"What is XR? Towards a framework for augmented and virtual reality","volume":"133","author":"Rauschnabel","year":"2022","journal-title":"Comput. Hum. 
Behav."},{"key":"B45","doi-asserted-by":"publisher","first-page":"1005","DOI":"10.1016\/j.trf.2018.07.020","article-title":"Interaction between pedestrians and automated vehicles: a Wizard of Oz experiment","volume":"58","author":"Rodr\u00edguez Palmeiro","year":"2018","journal-title":"Transp. Res. Part F Traffic Psychol. Behav."},{"key":"B46","doi-asserted-by":"publisher","first-page":"75","DOI":"10.1162\/PRES_e_00247","article-title":"Recollections on presence beginnings, and some challenges for augmented and virtual reality","volume":"25","author":"Sheridan","year":"2016","journal-title":"Presence Teleoperators Virtual Environ."},{"key":"B47","doi-asserted-by":"publisher","first-page":"1416","DOI":"10.1080\/00140139.2021.1925353","article-title":"Automated vehicles that communicate implicitly: examining the use of lateral position within the lane","volume":"64","author":"Sripada","year":"2021","journal-title":"Ergonomics"},{"key":"B48","doi-asserted-by":"publisher","first-page":"231053","DOI":"10.1098\/rsos.231053","article-title":"Using ChatGPT for human\u2013computer interaction research: a primer","volume":"10","author":"Tabone","year":"2023","journal-title":"R. Soc. Open Sci."},{"key":"B49","doi-asserted-by":"publisher","first-page":"100293","DOI":"10.1016\/j.trip.2020.100293","article-title":"Vulnerable road users and the coming wave of automated vehicles: expert perspectives","volume":"9","author":"Tabone","year":"","journal-title":"Transp. Res. Interdiscip. Perspect."},{"key":"B50","doi-asserted-by":"publisher","first-page":"170","DOI":"10.1016\/j.trf.2023.02.005","article-title":"Augmented reality interfaces for pedestrian-vehicle interactions: an online study","volume":"94","author":"Tabone","year":"","journal-title":"Transp. Res. Part F Traffic Psychol. 
Behav."},{"key":"B51","article-title":"Immersive insights: evaluating augmented reality interfaces for pedestrians in a CAVE-based experiment","author":"Tabone","year":"","journal-title":"ResearchGate"},{"key":"B52","first-page":"209","article-title":"Towards future pedestrian-vehicle interactions: introducing theoretically-supported AR prototypes","author":"Tabone","year":""},{"key":"B53","doi-asserted-by":"publisher","first-page":"7197","DOI":"10.3390\/app11167197","article-title":"An augmented warning system for pedestrians: user interface design and algorithm development","volume":"11","author":"Tong","year":"2021","journal-title":"Appl. Sci."},{"key":"B54","first-page":"167","article-title":"Scoping out the scalability issues of autonomous vehicle-pedestrian interaction","author":"Tran","year":"2023"},{"key":"B55","doi-asserted-by":"publisher","DOI":"10.3389\/fcomp.2022.866516","article-title":"Designing wearable augmented reality concepts to support scalability in autonomous vehicle-pedestrian interaction","volume":"4","author":"Tran","year":"2022","journal-title":"Front. Comput. Sci."},{"key":"B56","unstructured":"Automotive head-up display (HUD) market (2023)"},{"key":"B57","doi-asserted-by":"crossref","DOI":"10.1145\/3366551.3370340","article-title":"eHMI positioning for autonomous vehicle\/pedestrians interaction","author":"Troel-Madec","year":"2019"},{"key":"B58","unstructured":"United Nations A\/RES\/74\/299. Resolution adopted by the General Assembly (2020)"},{"key":"B59","unstructured":"Unity 2021.3.13 (2022)"},{"key":"B60","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/s0968-090x(96)00025-3","article-title":"A simple procedure for the assessment of acceptance of advanced transport telematics","volume":"5","author":"Van der Laan","year":"1997","journal-title":"Transp. Res. Part C Emerg. 
Technol."},{"key":"B61","unstructured":"Varjo XR-3 - the industry\u2019s highest resolution mixed reality headset (2023)"},{"key":"B62","unstructured":"Getting started with Varjo XR Plugin for Unity (2023)"},{"key":"B63","unstructured":"Mixed reality \u2013 Varjo.com (2023)"},{"key":"B64","article-title":"AR designs for eHMI\u2013Communication between automated vehicles and pedestrians using augmented reality","author":"Wilbrink","year":"2023"},{"key":"B65","unstructured":"Global status report on road safety 2018: summary (2018)"},{"key":"B66","first-page":"178","article-title":"ARcoustic: a mobile augmented reality system for seeing out-of-view traffic","author":"Zhang","year":"2023"}],"container-title":["Frontiers in Robotics and AI"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frobt.2024.1324060\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,1,30]],"date-time":"2024-01-30T04:33:11Z","timestamp":1706589191000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frobt.2024.1324060\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,1,30]]},"references-count":66,"alternative-id":["10.3389\/frobt.2024.1324060"],"URL":"https:\/\/doi.org\/10.3389\/frobt.2024.1324060","relation":{},"ISSN":["2296-9144"],"issn-type":[{"value":"2296-9144","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,1,30]]},"article-number":"1324060"}}