{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T20:50:06Z","timestamp":1776113406951,"version":"3.50.1"},"reference-count":47,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2024,8,14]],"date-time":"2024-08-14T00:00:00Z","timestamp":1723593600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2024,8,14]],"date-time":"2024-08-14T00:00:00Z","timestamp":1723593600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["J Vis"],"published-print":{"date-parts":[[2025,2]]},"abstract":"<jats:title>Abstract<\/jats:title>\n          <jats:p>Eye movements have a spatial (where people look), but also a temporal (when people look) component. Various types of visualizations have been proposed that take this spatio-temporal nature of the data into account, but it is unclear how well each one can be interpreted and whether such interpretation depends on the question asked about the data or the nature of the dataset that is being visualised. In this study, four spatio-temporal visualization techniques for eye movements (chord diagram, scan path, scarf plot, space-time cube) were compared in a user study. 
Participants <jats:inline-formula>\n              <jats:alternatives>\n                <jats:tex-math>$$(N = 25)$$<\/jats:tex-math>\n                <mml:math xmlns:mml=\"http:\/\/www.w3.org\/1998\/Math\/MathML\">\n                  <mml:mrow>\n                    <mml:mo>(<\/mml:mo>\n                    <mml:mi>N<\/mml:mi>\n                    <mml:mo>=<\/mml:mo>\n                    <mml:mn>25<\/mml:mn>\n                    <mml:mo>)<\/mml:mo>\n                  <\/mml:mrow>\n                <\/mml:math>\n              <\/jats:alternatives>\n            <\/jats:inline-formula> answered three questions (what region first, what region most, which regions most between) about each visualization, which was based on two types of datasets (eye movements towards adverts, eye movements towards pairs of gambles). Accuracy of the answers depended on a combination of the dataset, the question that needed to be answered, and the type of visualization. For most questions, the scan path, which did not use area of interest (AOI) information, resulted in lower accuracy than the other graphs. This suggests that AOIs improve the information conveyed by graphs. No effects of experience with reading graphs (for work or not for work) or education on accuracy of the answer were found. The results therefore suggest that there is no single best visualisation of the spatio-temporal aspects of eye movements. 
When visualising eye movement data, a user study may therefore be beneficial to determine the optimal visualization of the dataset and research question at hand.<\/jats:p>\n          <jats:p>\n            <jats:bold>Graphical abstract<\/jats:bold>\n          <\/jats:p>","DOI":"10.1007\/s12650-024-01023-8","type":"journal-article","created":{"date-parts":[[2024,8,14]],"date-time":"2024-08-14T12:02:22Z","timestamp":1723636942000},"page":"153-169","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":3,"title":["A user study of visualisations of spatio-temporal eye tracking data"],"prefix":"10.1007","volume":"28","author":[{"given":"Marcel","family":"Claus","sequence":"first","affiliation":[]},{"given":"Frouke","family":"Hermens","sequence":"additional","affiliation":[]},{"given":"Stefano","family":"Bromuri","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2024,8,14]]},"reference":[{"key":"1023_CR1","doi-asserted-by":"crossref","unstructured":"Bakardzhiev H et al (2021) A web-based eye tracking data visualization tool. In: International conference on pattern recognition, pp 405\u2013419","DOI":"10.1007\/978-3-030-68796-0_29"},{"issue":"7","key":"1023_CR2","doi-asserted-by":"publisher","first-page":"986","DOI":"10.1080\/17470210701410375","volume":"61","author":"E Birmingham","year":"2008","unstructured":"Birmingham E, Bischof WF, Kingstone A (2008) Social attention and real-world scenes: the roles of action, competition and social content. Q J Exp Psychol 61(7):986\u2013998","journal-title":"Q J Exp Psychol"},{"key":"1023_CR3","doi-asserted-by":"crossref","unstructured":"Blascheck T, Kurzhals K, Raschke M, Burch M, Weiskopf D, Ertl T (2014) State-of-the-art of visualization for eye tracking data. 
Eurovis (stars)","DOI":"10.1007\/978-1-4614-7485-2_15"},{"key":"1023_CR5","doi-asserted-by":"crossref","unstructured":"Blascheck T, Raschke M, Ertl T (2013) Circular heat map transition diagram. In: Proceedings of the 2013 conference on eye tracking South Africa, pp 58\u201361","DOI":"10.1145\/2509315.2509326"},{"key":"1023_CR4","doi-asserted-by":"crossref","unstructured":"Blascheck T, Kurzhals K, Raschke M, Burch M, Weiskopf D, Ertl T (2017) Visualization of eye tracking data: a taxonomy and survey. In: Computer graphics forum, vol. 36, pp 260\u2013284","DOI":"10.1111\/cgf.13079"},{"key":"1023_CR6","doi-asserted-by":"crossref","unstructured":"Bojko AA (2009) Informative or misleading? Heatmaps deconstructed. In: International conference on human\u2013computer interaction, pp 30\u201339","DOI":"10.1007\/978-3-642-02574-7_4"},{"key":"1023_CR7","doi-asserted-by":"crossref","unstructured":"Burch M, Kull A, Weiskopf D (2013) Aoi rivers for visualizing dynamic eye gaze frequencies. In: Computer graphics forum, vol 32, pp 281\u2013290","DOI":"10.1111\/cgf.12115"},{"key":"1023_CR8","doi-asserted-by":"crossref","unstructured":"Burch M et al (2021) The power of linked eye movement data visualizations. In: ACM symposium on eye tracking research and applications, pp 1\u201311","DOI":"10.1145\/3448017.3457377"},{"key":"1023_CR9","doi-asserted-by":"crossref","unstructured":"Chen M, Alves N, Sol R (2013) Combining spatial and temporal information of eye movements in goal-oriented tasks. In: International conference on human factors in computing and informatics, pp 827\u2013830","DOI":"10.1007\/978-3-642-39062-3_62"},{"issue":"3","key":"1023_CR10","doi-asserted-by":"publisher","first-page":"599","DOI":"10.1177\/1747021818769203","volume":"72","author":"F Crosby","year":"2019","unstructured":"Crosby F, Hermens F (2019) Does it look safe? An eye tracking study into the visual aspects of fear of crime. 
Q J Exp Psychol 72(3):599\u2013615","journal-title":"Q J Exp Psychol"},{"key":"1023_CR11","doi-asserted-by":"crossref","unstructured":"Elmqvist N (2005) Balloonprobe: reducing occlusion in 3d using interactive space distortion. In: Proceedings of the ACM symposium on virtual reality software and technology, pp 134\u2013137","DOI":"10.1145\/1101616.1101643"},{"issue":"5","key":"1023_CR12","doi-asserted-by":"publisher","first-page":"1095","DOI":"10.1109\/TVCG.2008.59","volume":"14","author":"N Elmqvist","year":"2008","unstructured":"Elmqvist N, Tsigas P (2008) A taxonomy of 3d occlusion management for visualization. IEEE Trans Vis Comput Gr 14(5):1095\u20131109","journal-title":"IEEE Trans Vis Comput Gr"},{"issue":"1","key":"1023_CR13","first-page":"1","volume":"9","author":"S Eraslan","year":"2016","unstructured":"Eraslan S, Yesilada Y, Harper S (2016) Eye tracking scanpath analysis techniques on web pages: a survey, evaluation and comparison. J Eye Mov Res 9(1):1\u201319","journal-title":"J Eye Mov Res"},{"issue":"1","key":"1023_CR14","doi-asserted-by":"publisher","first-page":"8","DOI":"10.1037\/cep0000004","volume":"68","author":"T Foulsham","year":"2014","unstructured":"Foulsham T, Chapman C, Nasiopoulos E, Kingstone A (2014) Top-down and bottom-up aspects of active search in a real-world environment. Can J Exp Psychol 68(1):8","journal-title":"Can J Exp Psychol"},{"issue":"7","key":"1023_CR15","doi-asserted-by":"publisher","first-page":"854","DOI":"10.1177\/0272989X16655334","volume":"36","author":"R Garcia-Retamero","year":"2016","unstructured":"Garcia-Retamero R, Cokely ET, Ghazal S, Joeris A (2016) Measuring graph literacy without a test: a brief subjective assessment. 
Med Decis Making 36(7):854\u2013867","journal-title":"Med Decis Making"},{"key":"1023_CR16","doi-asserted-by":"publisher","DOI":"10.1016\/j.foodres.2021.110309","volume":"143","author":"A Gere","year":"2021","unstructured":"Gere A, H\u00e9berger K, Kov\u00e1cs S (2021) How to predict choice using eye-movements data? Food Res Int 143:110309","journal-title":"Food Res Int"},{"issue":"3","key":"1023_CR17","doi-asserted-by":"publisher","first-page":"182","DOI":"10.1177\/1473871611406623","volume":"10","author":"JH Goldberg","year":"2011","unstructured":"Goldberg JH, Helfman J (2011) Eye tracking for visualization evaluation: reading values on linear versus radial graphs. Inf Vis 10(3):182\u2013195","journal-title":"Inf Vis"},{"issue":"6","key":"1023_CR18","doi-asserted-by":"publisher","first-page":"631","DOI":"10.1016\/S0169-8141(98)00068-7","volume":"24","author":"JH Goldberg","year":"1999","unstructured":"Goldberg JH, Kotval XP (1999) Computer interface evaluation using eye movements: methods and constructs. Int J Ind Ergon 24(6):631\u2013645","journal-title":"Int J Ind Ergon"},{"key":"1023_CR19","doi-asserted-by":"crossref","unstructured":"Hembrooke H, Feusner M, Gay G (2006) Averaging scan patterns and what they can tell us. In: Proceedings of the 2006 symposium on eye tracking research & applications, p 41","DOI":"10.1145\/1117309.1117325"},{"issue":"8","key":"1023_CR20","doi-asserted-by":"publisher","first-page":"1141","DOI":"10.1109\/TVCG.2013.246","volume":"20","author":"C Hurter","year":"2013","unstructured":"Hurter C, Ersoy O, Fabrikant SI, Klein TR, Telea AC (2013) Bundled visualization of dynamic graph and trail data. 
IEEE Trans Vis Comput Gr 20(8):1141\u20131157","journal-title":"IEEE Trans Vis Comput Gr"},{"key":"1023_CR21","doi-asserted-by":"crossref","unstructured":"Krejtz I, Szarkowska A, Krejtz K (2013) The effects of shot changes on eye movements in subtitling","DOI":"10.16910\/jemr.6.5.3"},{"issue":"9","key":"1023_CR22","doi-asserted-by":"publisher","first-page":"1155","DOI":"10.1068\/p3409bn1","volume":"34","author":"G Kuhn","year":"2005","unstructured":"Kuhn G, Tatler BW (2005) Magic and fixation: now you don\u2019t see it, now you do. Perception 34(9):1155\u20131161","journal-title":"Perception"},{"issue":"12","key":"1023_CR25","doi-asserted-by":"publisher","first-page":"2129","DOI":"10.1109\/TVCG.2013.194","volume":"19","author":"K Kurzhals","year":"2013","unstructured":"Kurzhals K, Weiskopf D (2013) Space-time visual analytics of eye-tracking data for dynamic stimuli. IEEE Trans Vis Comput Gr 19(12):2129\u20132138","journal-title":"IEEE Trans Vis Comput Gr"},{"key":"1023_CR23","doi-asserted-by":"crossref","unstructured":"Kurzhals K, Burch M, Blascheck T, Andrienko G, Andrienko N, Weiskopf D (2015a) A task-based view on the visual analysis of eye-tracking data. In: Workshop on eye tracking and visualization, pp 3\u201322","DOI":"10.1007\/978-3-319-47024-5_1"},{"issue":"1","key":"1023_CR24","doi-asserted-by":"publisher","first-page":"1005","DOI":"10.1109\/TVCG.2015.2468091","volume":"22","author":"K Kurzhals","year":"2015","unstructured":"Kurzhals K, Hlawatsch M, Heimerl F, Burch M, Ertl T, Weiskopf D (2015b) Gaze stripes: image-based visualization of eye tracking data. IEEE Trans Vis Comput Gr 22(1):1005\u20131014","journal-title":"IEEE Trans Vis Comput Gr"},{"issue":"25\u201326","key":"1023_CR26","doi-asserted-by":"publisher","first-page":"3559","DOI":"10.1016\/S0042-6989(01)00102-X","volume":"41","author":"MF Land","year":"2001","unstructured":"Land MF, Hayhoe M (2001) In what ways do eye movements contribute to everyday activities? 
Vis Res 41(25\u201326):3559\u20133565","journal-title":"Vis Res"},{"key":"1023_CR27","first-page":"153","volume":"5","author":"MF Land","year":"1996","unstructured":"Land MF, Horwood J (1996) The relations between head and eye movements during driving. Vis Veh 5:153\u2013160","journal-title":"Vis Veh"},{"key":"1023_CR28","doi-asserted-by":"crossref","unstructured":"Li A, Zhang Y, Chen Z (2017) Scanpath mining of eye movement trajectories for visual attention analysis. In: 2017 IEEE international conference on multimedia and expo (ICME), pp 535\u2013540","DOI":"10.1109\/ICME.2017.8019507"},{"key":"1023_CR29","first-page":"34892","volume-title":"Advances in neural information processing systems","author":"H Liu","year":"2023","unstructured":"Liu H, Li C, Wu Q, Lee YJ (2023) Visual instruction tuning. In: Oh A, Naumann T, Globerson A, Saenko K, Hardt M, Levine S (eds) Advances in neural information processing systems, vol 36. Curran Associates Inc, New York, pp 34892\u201334916"},{"key":"1023_CR30","doi-asserted-by":"crossref","unstructured":"Menges R, Kramer S, Hill S, Nisslmueller M, Kumar C, Staab S (2020) A visualization tool for eye tracking data analysis in the web. In: ACM symposium on eye tracking research and applications, pp 1\u20135","DOI":"10.1145\/3379156.3391831"},{"issue":"1","key":"1023_CR31","doi-asserted-by":"publisher","first-page":"4","DOI":"10.1016\/j.cviu.2004.07.010","volume":"98","author":"CH Morimoto","year":"2005","unstructured":"Morimoto CH, Mimica MR (2005) Eye gaze tracking techniques for interactive applications. Comput Vis Image Underst 98(1):4\u201324","journal-title":"Comput Vis Image Underst"},{"issue":"3968","key":"1023_CR32","doi-asserted-by":"publisher","first-page":"308","DOI":"10.1126\/science.171.3968.308","volume":"171","author":"D Noton","year":"1971","unstructured":"Noton D, Stark L (1971a) Scanpaths in eye movements during pattern perception. 
Science 171(3968):308\u2013311","journal-title":"Science"},{"issue":"9","key":"1023_CR33","doi-asserted-by":"publisher","first-page":"929-IN8","DOI":"10.1016\/0042-6989(71)90213-6","volume":"11","author":"D Noton","year":"1971","unstructured":"Noton D, Stark L (1971b) Scanpaths in saccadic eye movements while viewing and recognizing patterns. Vis Res 11(9):929-IN8","journal-title":"Vis Res"},{"issue":"3","key":"1023_CR34","doi-asserted-by":"publisher","first-page":"183","DOI":"10.1177\/0272989X19829728","volume":"39","author":"Y Okan","year":"2019","unstructured":"Okan Y, Janssen E, Galesic M, Waters EA (2019) Using the short graph literacy scale to predict precursors of health behavior change. Med Decis Making 39(3):183\u2013195","journal-title":"Med Decis Making"},{"key":"1023_CR35","doi-asserted-by":"publisher","DOI":"10.16910\/jemr.10.5.9","author":"V Peysakhovich","year":"2017","unstructured":"Peysakhovich V, Hurter C (2017) Scanpath visualization and comparison using visual aggregation techniques. J Eye Mov Res. https:\/\/doi.org\/10.16910\/jemr.10.5.9","journal-title":"J Eye Mov Res"},{"key":"1023_CR36","doi-asserted-by":"crossref","unstructured":"R\u00e4ih\u00e4 K-J, Aula A, Majaranta P, Rantala H, Koivunen K (2005) Static visualization of temporal eye-tracking data. In: IFIP conference on human\u2013computer interaction, pp 946\u2013949","DOI":"10.1007\/11555261_76"},{"issue":"3","key":"1023_CR37","doi-asserted-by":"publisher","first-page":"618","DOI":"10.1037\/0033-2909.85.3.618","volume":"85","author":"K Rayner","year":"1978","unstructured":"Rayner K (1978) Eye movements in reading and information processing. Psychol Bull 85(3):618","journal-title":"Psychol Bull"},{"issue":"3","key":"1023_CR38","doi-asserted-by":"publisher","first-page":"372","DOI":"10.1037\/0033-2909.124.3.372","volume":"124","author":"K Rayner","year":"1998","unstructured":"Rayner K (1998) Eye movements in reading and information processing: 20 years of research. 
Psychol Bull 124(3):372","journal-title":"Psychol Bull"},{"key":"1023_CR39","doi-asserted-by":"crossref","unstructured":"Rayner K, Juhasz BJ, Pollatsek A (2005) Eye movements during reading","DOI":"10.4135\/9781848608177.n12"},{"key":"1023_CR40","doi-asserted-by":"crossref","unstructured":"Rees D, Laramee RS, Brookes P, D\u2019Cruze T (2020) Interaction techniques for chord diagrams. In: 2020 24th international conference information visualisation (IV), pp 28\u201337","DOI":"10.1109\/IV51561.2020.00015"},{"issue":"6","key":"1023_CR41","doi-asserted-by":"publisher","first-page":"1045","DOI":"10.1207\/s15516709cog0000_29","volume":"29","author":"DC Richardson","year":"2005","unstructured":"Richardson DC, Dale R (2005) Looking to understand: the coupling between speakers\u2019 and listeners\u2019 eye movements and its relationship to discourse comprehension. Cogn Sci 29(6):1045\u20131060","journal-title":"Cogn Sci"},{"key":"1023_CR42","doi-asserted-by":"crossref","unstructured":"Riehmann P, Hanfler M, Froehlich B (2005) Interactive Sankey diagrams. In: IEEE symposium on information visualization, 2005. INFOVIS, pp 233\u2013240","DOI":"10.1109\/INFVIS.2005.1532152"},{"key":"1023_CR43","doi-asserted-by":"crossref","unstructured":"Rodrigues N, Netzel R, Spalink J, Weiskopf D (2018) Multiscale scanpath visualization and filtering. In: Proceedings of the 3rd workshop on eye tracking and visualization, pp 1\u20135","DOI":"10.1145\/3205929.3205931"},{"issue":"1","key":"1023_CR44","doi-asserted-by":"publisher","first-page":"38","DOI":"10.1016\/j.visinf.2019.03.005","volume":"3","author":"Y Ueno","year":"2019","unstructured":"Ueno Y, Natsukawa H, Aoyama N, Koyamada K (2019) Exploration behavior of group-in-a-box layouts. Vis Inform 3(1):38\u201347","journal-title":"Vis Inform"},{"key":"1023_CR45","doi-asserted-by":"crossref","unstructured":"Unger M, Wedel M, Tuzhilin A (2023) Predicting consumer choice from raw eye-movement data using the retina deep learning architecture. 
Available at SSRN 4341410","DOI":"10.2139\/ssrn.4341410"},{"key":"1023_CR46","doi-asserted-by":"publisher","first-page":"197","DOI":"10.1007\/978-3-030-82635-2_8","volume-title":"Eye-tracking with Python and Pylink","author":"Z Wang","year":"2021","unstructured":"Wang Z (2021) Eye movement data analysis and visualization. In: Wang Z (ed) Eye-tracking with Python and Pylink. Springer, Berlin, pp 197\u2013224"},{"key":"1023_CR47","doi-asserted-by":"crossref","unstructured":"Yang C-K, Wacharamanotham C (2018) Alpscarf: augmenting scarf plots for exploring temporal gaze patterns. In: Extended abstracts of the 2018 chi conference on human factors in computing systems, pp 1\u20136","DOI":"10.1145\/3170427.3188490"}],"container-title":["Journal of Visualization"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12650-024-01023-8.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s12650-024-01023-8\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12650-024-01023-8.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,1,29]],"date-time":"2025-01-29T06:55:50Z","timestamp":1738133750000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s12650-024-01023-8"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,8,14]]},"references-count":47,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2025,2]]}},"alternative-id":["1023"],"URL":"https:\/\/doi.org\/10.1007\/s12650-024-01023-8","relation":{},"ISSN":["1343-8875","1875-8975"],"issn-type":[{"value":"1343-8875","type":"print"},{"value":"1875-8975","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,8,14]]
},"assertion":[{"value":"5 February 2023","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"8 May 2024","order":2,"name":"revised","label":"Revised","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"30 July 2024","order":3,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"14 August 2024","order":4,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare that they have no conflict of interest. The anonymised data from the survey and the original eye tracking study can be downloaded from: .","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}}]}}