{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T21:17:03Z","timestamp":1776115023588,"version":"3.50.1"},"reference-count":51,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2025,4,16]],"date-time":"2025-04-16T00:00:00Z","timestamp":1744761600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Virtual Real."],"abstract":"<jats:sec><jats:title>Introduction<\/jats:title><jats:p>Extended reality (XR) technologies, particularly gaze\u2010based interaction methods, have evolved significantly in recent years to improve accessibility and reach broader user communities. While previous research has improved the simplicity and inclusivity of gaze-based choice, the adaptability of such systems \u2010 particularly in terms of user comfort and fault tolerance \u2010 has not yet been fully explored.<\/jats:p><\/jats:sec><jats:sec><jats:title>Methods<\/jats:title><jats:p>In this study, four gaze\u2010based interaction techniques were examined in a visual search game in virtual reality (VR). A total of 52 participants were involved. The techniques tested included selection by dwell time, confirmation by head orientation, nodding and smooth pursuit eye movements. Both subjective and objective performance measures were assessed, using the NASA\u2010TLX for perceived task load and time to complete the task and score for objective evaluation.<\/jats:p><\/jats:sec><jats:sec><jats:title>Results<\/jats:title><jats:p>Significant differences were found between the interaction techniques in terms of NASA\u2010TLX dimensions, target search time and overall performance. The results indicate different levels of efficiency and intuitiveness of each method. Gender differences in interaction preferences and cognitive load were also found.<\/jats:p><\/jats:sec><jats:sec><jats:title>Discussion<\/jats:title><jats:p>These findings highlight the importance of personalizing gaze\u2010based VR interfaces to the individual user to improve accessibility, reduce cognitive load and enhance the user experience. Personalizing gaze interaction strategies can support more inclusive and effective VR systems that benefit both general and accessibility\u2010focused populations.<\/jats:p><\/jats:sec>","DOI":"10.3389\/frvir.2025.1576962","type":"journal-article","created":{"date-parts":[[2025,4,16]],"date-time":"2025-04-16T05:25:38Z","timestamp":1744781138000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":9,"title":["The interplay of user preference and precision in different gaze-based interaction methods in virtual environments"],"prefix":"10.3389","volume":"6","author":[{"given":"Bj\u00f6rn R.","family":"Severitt","sequence":"first","affiliation":[]},{"given":"Yannick","family":"Sauer","sequence":"additional","affiliation":[]},{"given":"Alexander","family":"Neugebauer","sequence":"additional","affiliation":[]},{"given":"Rajat","family":"Agarwala","sequence":"additional","affiliation":[]},{"given":"Nora","family":"Castner","sequence":"additional","affiliation":[]},{"given":"Siegfried","family":"Wahl","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2025,4,16]]},"reference":[{"key":"B1","doi-asserted-by":"crossref","first-page":"69","DOI":"10.1145\/2857491.2857527","article-title":"A rotary dial for gaze-based pin entry","volume-title":"Proceedings of the ninth biennial ACM symposium on eye tracking research and applications","author":"Best","year":"2016"},{"key":"B2","doi-asserted-by":"crossref","DOI":"10.1145\/3206343.3206349","article-title":"Advantages of eye-gaze over head-gaze-based selection in virtual and augmented reality under varying field of views","volume-title":"Proceedings of the workshop on communication by gaze interaction","author":"Blattgerste","year":"2018"},{"key":"B3","doi-asserted-by":"crossref","first-page":"536","DOI":"10.1145\/3377325.3377497","article-title":"Detecting errors in pick and place procedures: detecting errors in multi-stage and sequence-constrained manual retrieve-assembly procedures","volume-title":"Proceedings of the 25th international conference on intelligent user interfaces","author":"Bovo","year":"2020"},{"key":"B4","doi-asserted-by":"crossref","first-page":"19","DOI":"10.1109\/AIVR52153.2021.00013","article-title":"Predicting future position from natural walking and eye movements with machine learning","volume-title":"2021 IEEE international conference on artificial intelligence and virtual reality (AIVR)","author":"Bremer","year":"2021"},{"key":"B5","doi-asserted-by":"publisher","first-page":"06399","DOI":"10.48550\/arXiv.1704.06399","article-title":"Improving gaze-based selection using variable dwell time","author":"Chen","year":"2017","journal-title":"ArXiv abs\/1704"},{"key":"B6","doi-asserted-by":"publisher","first-page":"71","DOI":"10.5121\/ijsea.2019.10505","article-title":"Best practices for improving user interface design","volume":"10","author":"Dey","year":"2019","journal-title":"Int. J. Softw. Eng. and Appl."},{"key":"B7","doi-asserted-by":"publisher","first-page":"59","DOI":"10.1016\/j.cag.2018.04.002","article-title":"Gaze-based interaction: a 30 year retrospective","volume":"73","author":"Duchowski","year":"2018","journal-title":"Comput. and Graph."},{"key":"B8","first-page":"457","article-title":"Orbits: gaze interaction for smart watches using smooth pursuit eye movements","volume-title":"Orbits: gaze interaction for smart watches using smooth pursuit eye movements","author":"Esteves","year":"2015"},{"key":"B9","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1080\/10447318.2025.2453966","article-title":"Gaze inputs for targeting: the eyes have it, not with a cursor","author":"Fernandes","year":"2025","journal-title":"Int. J. Human\u2013Computer Interact."},{"key":"B10","article-title":"Critic: large language models can self-correct with tool-interactive critiquing","author":"Gou","year":"2023"},{"key":"B11","doi-asserted-by":"publisher","first-page":"84","DOI":"10.3390\/mti4040084","article-title":"A human\u2013computer interface replacing mouse and keyboard for individuals with limited upper limb mobility","volume":"4","author":"G\u00fcr","year":"2020","journal-title":"Multimodal Technol. Interact."},{"key":"B12","article-title":"A history of the unity game engine","author":"Haas","year":"2014"},{"key":"B13","doi-asserted-by":"crossref","first-page":"131","DOI":"10.1145\/968363.968389","article-title":"Gaze typing compared with input by head and hand","volume-title":"Proceedings of the 2004 symposium on eye tracking research and applications","author":"Hansen","year":"2004"},{"key":"B14","first-page":"139","article-title":"Development of nasa-tlx (task load index): results of empirical and theoretical research","volume-title":"North-holland, vol. 52 of","author":"Hart","year":"1988"},{"key":"B15","doi-asserted-by":"crossref","first-page":"35","DOI":"10.1145\/1117309.1117319","article-title":"Secure graphical password system for high traffic public areas","volume-title":"Proceedings of the 2006 symposium on eye tracking research and applications","author":"Hoanca","year":"2006"},{"key":"B16","doi-asserted-by":"publisher","first-page":"2458","DOI":"10.3390\/s24082458","article-title":"Visionaryvr: an optical simulation tool for evaluating and optimizing vision correction solutions in virtual reality","volume":"24","author":"Hosp","year":"2024","journal-title":"Sensors"},{"key":"B17","doi-asserted-by":"crossref","DOI":"10.1145\/3588015.3589203","article-title":"Zero: a generic open-source extended reality eye-tracking controller interface for scientists","volume-title":"Proceedings of the 2023 symposium on eye tracking research and applications","author":"Hosp","year":"2023"},{"key":"B18","doi-asserted-by":"crossref","DOI":"10.1145\/1272582.1272618","article-title":"Gazing with peye: new concepts in eye typing","volume-title":"Proceedings of the 4th symposium on applied perception in graphics and visualization","author":"Huckauf","year":"2007"},{"key":"B19","doi-asserted-by":"crossref","first-page":"51","DOI":"10.1145\/1344471.1344483","article-title":"Gazing with peyes: towards a universal input for various applications","volume-title":"Proceedings of the 2008 symposium on eye tracking research and applications","author":"Huckauf","year":"2008"},{"key":"B20","doi-asserted-by":"crossref","first-page":"229","DOI":"10.1145\/2168556.2168602","article-title":"Gaze gestures or dwell-based interaction?","volume-title":"Proceedings of the symposium on eye tracking research and applications","author":"Hyrskykari","year":"2012"},{"key":"B21","first-page":"11","article-title":"What you look at is what you get: eye movement-based interaction techniques","volume-title":"Proceedings of the SIGCHI conference on human factors in computing systems","author":"Jacob","year":"1990"},{"key":"B22","doi-asserted-by":"publisher","first-page":"3981","DOI":"10.1007\/s11042-020-09749-x","article-title":"Real time object detection and tracking system for video surveillance system","volume":"80","author":"Jha","year":"2021","journal-title":"Multimedia Tools Appl."},{"key":"B23","doi-asserted-by":"publisher","first-page":"583","DOI":"10.1080\/01621459.1952.10483441","article-title":"Use of ranks in one-criterion variance analysis","volume":"47","author":"Kruskal","year":"1952","journal-title":"J. Am. Stat. Assoc."},{"key":"B24","first-page":"1","article-title":"The eye tracking and gaze estimation system by low cost wearable devices","volume-title":"2020 IEEE international conference on consumer electronics - taiwan (ICCE-Taiwan)","author":"Lee","year":"2020"},{"key":"B25","article-title":"Generating with confidence: uncertainty quantification for black-box large language models","author":"Lin","year":"2023","journal-title":"arXiv Prepr. arXiv:2305"},{"key":"B26","doi-asserted-by":"publisher","first-page":"357","DOI":"10.1093\/iwc\/iwv003","article-title":"What is intuitive interaction? Balancing users\u2019 performance and satisfaction with natural user interfaces","volume":"27","author":"Macaranas","year":"2015","journal-title":"Interact. Comput."},{"key":"B27","doi-asserted-by":"publisher","first-page":"239","DOI":"10.1007\/s10209-009-0150-7","article-title":"Special issue: communication by gaze interaction","volume":"8","author":"Majaranta","year":"2009","journal-title":"Univers. Access Inf. Soc."},{"key":"B28","doi-asserted-by":"crossref","first-page":"15","DOI":"10.1145\/507072.507076","article-title":"Twenty years of eye typing: systems and design issues","volume-title":"Proceedings of the 2002 symposium on eye tracking research and applications","author":"Majaranta","year":"2002"},{"key":"B29","doi-asserted-by":"publisher","first-page":"50","DOI":"10.1214\/aoms\/1177730491","article-title":"On a test of whether one of two random variables is stochastically larger than the other","volume":"18","author":"Mann","year":"1947","journal-title":"Ann. Math. Statistics"},{"key":"B30","doi-asserted-by":"crossref","first-page":"506","DOI":"10.1109\/HSI.2018.8431368","article-title":"A low-profile digital eye-tracking oculometer for smart eyeglasses","volume-title":"2018 11th international conference on human system interaction (HSI)","author":"Mastrangelo","year":"2018"},{"key":"B31","first-page":"1","volume-title":"Kruskal-wallis test","author":"McKight","year":"2010"},{"key":"B32","first-page":"1","volume-title":"Mann-whitney U test","author":"McKnight","year":"2010"},{"key":"B33","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1145\/3544548.3580871","article-title":"Comparing dwell time, pursuits and gaze gestures for gaze interaction on handheld mobile devices","volume-title":"Proceedings of the 2023 CHI conference on human factors in computing systems","author":"Namnakani","year":"2023"},{"key":"B34","doi-asserted-by":"publisher","DOI":"10.16910\/jemr.2.4.6","article-title":"Eye typing in application: a comparison of two systems with als patients","volume":"2","author":"Pannasch","year":"2008","journal-title":"J. Eye Mov. Res."},{"key":"B35","doi-asserted-by":"publisher","first-page":"3587","DOI":"10.1016\/S0042-6989(01)00245-0","article-title":"Oculomotor behavior and perceptual strategies in complex tasks","volume":"41","author":"Pelz","year":"2001","journal-title":"Vis. Res."},{"key":"B36","first-page":"99","article-title":"Gaze + pinch interaction in virtual reality","volume-title":"Proceedings of the 5th symposium on spatial user interaction","author":"Pfeuffer","year":""},{"key":"B37","first-page":"99","article-title":"Gaze + pinch interaction in virtual reality","volume-title":"Proceedings of the 5th symposium on spatial user interaction","author":"Pfeuffer","year":""},{"key":"B38","doi-asserted-by":"publisher","first-page":"36","DOI":"10.1109\/3DUI.2017.7893315","article-title":"Exploring natural eye-gaze-based interaction for immersive virtual reality","author":"Piumsomboon","year":"2017","journal-title":"2017 IEEE Symposium 3D User Interfaces"},{"key":"B39","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/3491207","article-title":"The eye in extended reality: a survey on gaze interaction and eye tracking in head-worn extended reality","volume":"55","author":"Plopski","year":"2022","journal-title":"ACM Comput. Surv."},{"key":"B40","doi-asserted-by":"crossref","first-page":"91","DOI":"10.1145\/3131277.3132182","article-title":"The eyes don\u2019t have it: an empirical comparison of head-based and eye-based selection in virtual reality","volume-title":"Proceedings of the 5th symposium on spatial user interaction","author":"Qian","year":"2017"},{"key":"B41","first-page":"05480","article-title":"A review of the low-cost eye-tracking systems for 2010-2020","author":"Rakhmatulin","year":"2020","journal-title":"Corr. abs\/2010"},{"key":"B42","doi-asserted-by":"crossref","first-page":"1161","DOI":"10.1145\/3332165.3347921","article-title":"Eye&head: synergetic eye and head movement for gaze pointing and selection","volume-title":"Proceedings of the 32nd annual ACM symposium on user interface software and technology","author":"Sidenmark","year":"2019"},{"key":"B43","doi-asserted-by":"publisher","first-page":"180","DOI":"10.3390\/healthcare9020180","article-title":"Eye-tracking for clinical ophthalmology with virtual reality (vr): a case study of the htc vive pro eye\u2019s usability","volume":"9","author":"Sipatchin","year":"2021","journal-title":"Healthcare"},{"key":"B44","doi-asserted-by":"crossref","DOI":"10.1145\/1178823.1178847","article-title":"Use of eye movements for video game control","volume-title":"Proceedings of the 2006 ACM SIGCHI international conference on Advances in computer entertainment technology","author":"Smith","year":"2006"},{"key":"B45","doi-asserted-by":"crossref","first-page":"705","DOI":"10.1145\/2370216.2370369","article-title":"Enhanced gaze interaction using simple head gestures","volume-title":"Proceedings of the 2012 ACM conference on ubiquitous computing","author":"\u0160pakov","year":"2012"},{"key":"B46","first-page":"3","article-title":"A gaze-responsive self-disclosing display","volume-title":"Proceedings of the SIGCHI conference on human factors in computing systems","author":"Starker","year":"1990"},{"key":"B47","doi-asserted-by":"crossref","first-page":"5967","DOI":"10.1109\/EMBC.2019.8856608","article-title":"A. eye drive: gaze-based semi-autonomous wheelchair interface","volume-title":"2019 41st annual international conference of the IEEE engineering in medicine and biology society (EMBC)","author":"Subramanian","year":"2019"},{"key":"B48","doi-asserted-by":"crossref","first-page":"439","DOI":"10.1145\/2493432.2493477","article-title":"Pursuits: spontaneous interaction with displays based on smooth pursuit eye movement and moving targets","volume-title":"Proceedings of the 2013 ACM international joint conference on pervasive and ubiquitous computing","author":"Vidal","year":"2013"},{"key":"B49","doi-asserted-by":"publisher","first-page":"838","DOI":"10.1038\/418838a","article-title":"Fast hands-free writing by gaze direction","volume":"418","author":"Ward","year":"2002","journal-title":"Nature"},{"key":"B50","doi-asserted-by":"crossref","DOI":"10.1145\/3544548.3581042","article-title":"Predicting gaze-based target selection in augmented reality headsets based on eye and head endpoint distributions","volume-title":"Proceedings of the 2023 CHI conference on human factors in computing systems","author":"Wei","year":"2023"},{"key":"B51","doi-asserted-by":"crossref","first-page":"11","DOI":"10.1145\/1344471.1344475","article-title":"Longitudinal evaluation of discrete consecutive gaze gestures for text entry","volume-title":"Proceedings of the 2008 symposium on eye tracking research and applications","author":"Wobbrock","year":"2008"}],"container-title":["Frontiers in Virtual Reality"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frvir.2025.1576962\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,4,16]],"date-time":"2025-04-16T05:25:50Z","timestamp":1744781150000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/frvir.2025.1576962\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,4,16]]},"references-count":51,"alternative-id":["10.3389\/frvir.2025.1576962"],"URL":"https:\/\/doi.org\/10.3389\/frvir.2025.1576962","relation":{},"ISSN":["2673-4192"],"issn-type":[{"value":"2673-4192","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,4,16]]},"article-number":"1576962"}}