{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,4]],"date-time":"2026-03-04T17:24:37Z","timestamp":1772645077731,"version":"3.50.1"},"reference-count":31,"publisher":"Association for Computing Machinery (ACM)","issue":"ETRA","license":[{"start":{"date-parts":[[2023,5,17]],"date-time":"2023-05-17T00:00:00Z","timestamp":1684281600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"funder":[{"DOI":"10.13039\/100019827","name":"Meta","doi-asserted-by":"publisher","id":[{"id":"10.13039\/100019827","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/100000001","name":"National Science Foundation","doi-asserted-by":"publisher","award":["DGE-1840989, DGE-1144466"],"award-info":[{"award-number":["DGE-1840989, DGE-1144466"]}],"id":[{"id":"10.13039\/100000001","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["Proc. ACM Hum.-Comput. Interact."],"published-print":{"date-parts":[[2023,5,17]]},"abstract":"<jats:p>This paper proposes a novel evaluation framework, termed \"critical evaluation periods,\" for evaluating continuous gaze prediction models. This framework emphasizes prediction performance when it is most critical for gaze prediction to be accurate relative to user perception. Based on perceptual characteristics of the human visual system such as saccadic suppression, this framework provides a more practical assessment of gaze prediction performance for gaze-contingent rendering compared to the dominant sample-by-sample evaluation strategy employed in literature, which overemphasizes performance during easy-to-predict periods of fixation. 
Using a case study with a lightweight deep learning gaze prediction model, we observe a significant discrepancy in the reported prediction accuracy between the proposed critical evaluation periods and the dominant evaluation strategy employed in literature. Based on our findings, we suggest that the proposed framework is more suitable for evaluating the performance of continuous gaze prediction models intended for gaze-contingent rendering applications.<\/jats:p>","DOI":"10.1145\/3591134","type":"journal-article","created":{"date-parts":[[2023,5,18]],"date-time":"2023-05-18T20:21:03Z","timestamp":1684441263000},"page":"1-17","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":3,"title":["Practical Perception-Based Evaluation of Gaze Prediction for Gaze Contingent Rendering"],"prefix":"10.1145","volume":"7","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-7656-2662","authenticated-orcid":false,"given":"Samantha","family":"Aziz","sequence":"first","affiliation":[{"name":"Texas State University, San Marcos, TX, USA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8088-9270","authenticated-orcid":false,"given":"Dillon J.","family":"Lohr","sequence":"additional","affiliation":[{"name":"Texas State University, San Marcos, TX, USA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4591-6891","authenticated-orcid":false,"given":"Razvan","family":"Stefanescu","sequence":"additional","affiliation":[{"name":"Meta, Seattle, WA, USA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-7890-8842","authenticated-orcid":false,"given":"Oleg","family":"Komogortsev","sequence":"additional","affiliation":[{"name":"Texas State University, San Marcos, TX, USA"}]}],"member":"320","published-online":{"date-parts":[[2023,5,18]]},"reference":[{"key":"e_1_2_2_1_1","doi-asserted-by":"publisher","DOI":"10.1145\/3127589"},{"key":"e_1_2_2_2_1","unstructured":"James Anliker. 1976. 
Eye movements - On-line measurement analysis and control."},{"key":"e_1_2_2_3_1","doi-asserted-by":"publisher","DOI":"10.1145\/3072959.3073642"},{"key":"e_1_2_2_4_1","doi-asserted-by":"publisher","DOI":"10.1016\/0042--6989(78)90219--5"},{"key":"e_1_2_2_5_1","unstructured":"William Falcon et al. 2019. PyTorch Lightning. https:\/\/github.com\/Lightning-AI\/lightning"},{"key":"e_1_2_2_6_1","doi-asserted-by":"publisher","DOI":"10.3758\/s13428-018--1050--7"},{"key":"e_1_2_2_7_1","doi-asserted-by":"publisher","DOI":"10.1038\/s41597-021-00959-y"},{"key":"e_1_2_2_8_1","doi-asserted-by":"publisher","DOI":"10.1167\/13.8.27"},{"key":"e_1_2_2_9_1","unstructured":"Kenneth Holmqvist Marcus Nystr\u00f6m Richard Andersson Richard Dewhurst Halszka Jarodzka and Joost Van de Weijer. 2011. Eye tracking: A comprehensive guide to methods and measures. OUP Oxford."},{"key":"e_1_2_2_10_1","doi-asserted-by":"publisher","DOI":"10.1109\/TVCG.2021.3067779"},{"key":"e_1_2_2_11_1","doi-asserted-by":"publisher","DOI":"10.1109\/TVCG.2019.2899187"},{"key":"e_1_2_2_12_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.conb.2011.05.012"},{"key":"e_1_2_2_13_1","doi-asserted-by":"publisher","DOI":"10.1145\/3534086.3534331"},{"key":"e_1_2_2_14_1","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2020.2975032"},{"key":"e_1_2_2_15_1","doi-asserted-by":"publisher","DOI":"10.48550\/ARXIV.1412.6980"},{"key":"e_1_2_2_16_1","doi-asserted-by":"publisher","DOI":"10.3758\/s13428-012-0234--9"},{"key":"e_1_2_2_17_1","doi-asserted-by":"publisher","DOI":"10.1145\/1344471.1344525"},{"key":"e_1_2_2_18_1","volume-title":"Zee","author":"John Leigh R.","year":"2006","unstructured":"R. John Leigh and David S. Zee. 2006. The Neurology of Eye Movements. 
Oxford University Press."},{"key":"e_1_2_2_19_1","doi-asserted-by":"publisher","DOI":"10.1145\/1314303.1314310"},{"key":"e_1_2_2_20_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.jbusres.2017.09.028"},{"key":"e_1_2_2_21_1","doi-asserted-by":"publisher","DOI":"10.1109\/ACCESS.2021.3070511"},{"key":"e_1_2_2_22_1","volume-title":"PyTorch: An Imperative Style","author":"Paszke Adam","unstructured":"Adam Paszke, Sam Gross, Francisco Massa, Adam Lerer, James Bradbury, Gregory Chanan, Trevor Killeen, Zeming Lin, Natalia Gimelshein, Luca Antiga, Alban Desmaison, Andreas Kopf, Edward Yang, Zachary DeVito, Martin Raison, Alykhan Tejani, Sasank Chilamkurthy, Benoit Steiner, Lu Fang, Junjie Bai, and Soumith Chintala. 2019. PyTorch: An Imperative Style, High-Performance Deep Learning Library. In Advances in Neural Information Processing Systems 32, H. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alch\u00e9-Buc, E. Fox, and R. Garnett (Eds.). Curran Associates, Inc., 8024--8035. http:\/\/papers.neurips.cc\/paper\/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf"},{"key":"e_1_2_2_23_1","doi-asserted-by":"publisher","DOI":"10.4018\/978--1--5225--7168--1.ch002"},{"key":"e_1_2_2_24_1","volume-title":"Kelleher","author":"Salton Giancarlo D.","year":"2019","unstructured":"Giancarlo D. Salton and John D. Kelleher. 2019. Persistence pays off: Paying Attention to What the LSTM Gating Mechanism Persists. In RANLP."},{"key":"e_1_2_2_25_1","doi-asserted-by":"publisher","DOI":"10.3758\/s13428-013-0375--5"},{"key":"e_1_2_2_26_1","doi-asserted-by":"publisher","DOI":"10.1145\/2931002.2931011"},{"key":"e_1_2_2_27_1","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-030-58604-1_40"},{"key":"e_1_2_2_28_1","volume-title":"Foveated Rendering: Benefits and Costs. https:\/\/vr.tobii.com\/sdk\/learn\/foveation\/rendering\/benefits-costs\/","year":"2022","unstructured":"Tobii. 2022. Foveated Rendering: Benefits and Costs. 
https:\/\/vr.tobii.com\/sdk\/learn\/foveation\/rendering\/benefits-costs\/"},{"key":"e_1_2_2_29_1","doi-asserted-by":"publisher","DOI":"10.1016\/0042--6989(86)90164--1"},{"key":"e_1_2_2_30_1","doi-asserted-by":"publisher","DOI":"10.1167\/17.14.3"},{"key":"e_1_2_2_31_1","doi-asserted-by":"publisher","DOI":"10.1109\/ISMAR55827.2022.00063"}],"container-title":["Proceedings of the ACM on Human-Computer Interaction"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3591134","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3591134","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3591134","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T16:37:31Z","timestamp":1750178251000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3591134"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,5,17]]},"references-count":31,"journal-issue":{"issue":"ETRA","published-print":{"date-parts":[[2023,5,17]]}},"alternative-id":["10.1145\/3591134"],"URL":"https:\/\/doi.org\/10.1145\/3591134","relation":{},"ISSN":["2573-0142"],"issn-type":[{"value":"2573-0142","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,5,17]]},"assertion":[{"value":"2023-05-18","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}