{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,23]],"date-time":"2026-04-23T10:33:39Z","timestamp":1776940419172,"version":"3.51.4"},"reference-count":27,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2020,7,6]],"date-time":"2020-07-06T00:00:00Z","timestamp":1593993600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2020,7,6]],"date-time":"2020-07-06T00:00:00Z","timestamp":1593993600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"Ministry of Energy, Science, Technology, Environment and Climate Change","award":["ICF0001-2018"],"award-info":[{"award-number":["ICF0001-2018"]}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["J Big Data"],"published-print":{"date-parts":[[2020,12]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:sec>\n<jats:title>Background<\/jats:title>\n<jats:p>Emotion classification remains a challenging problem in affective computing. The large majority of emotion classification studies rely on electroencephalography (EEG) and\/or electrocardiography (ECG) signals and only classifies the emotions into two or three classes. Moreover, the stimuli used in most emotion classification studies utilize either music or visual stimuli that are presented through conventional displays such as computer display screens or television screens. This study reports on a novel approach to recognizing emotions using pupillometry alone in the form of pupil diameter data to classify emotions into four distinct classes according to Russell\u2019s Circumplex Model of Emotions, utilizing emotional stimuli that are presented in a virtual reality (VR) environment. 
The stimuli used in this experiment are 360\u00b0 videos presented using a VR headset. Using an eye-tracker, pupil diameter is acquired as the sole classification feature. Three classifiers were used for the emotion classification: Support Vector Machine (SVM), k-Nearest Neighbor (KNN), and Random Forest (RF).<\/jats:p>\n<\/jats:sec><jats:sec>\n<jats:title>Findings<\/jats:title>\n<jats:p>SVM achieved the best performance for the four-class intra-subject classification task at an average of 57.05% accuracy, which is more than twice the accuracy of a random classifier. Although the accuracy can still be significantly improved, this is the first systematic study on the use of eye-tracking data alone, without any other supplementary sensor modalities, to perform human emotion classification, and it demonstrates that even with the single feature of pupil diameter, emotions could be classified into four distinct classes to a certain level of accuracy. Moreover, the best performance for recognizing a particular class was 70.83%, which was achieved by the KNN classifier for Quadrant 3 emotions.<\/jats:p>\n<\/jats:sec><jats:sec>\n<jats:title>Conclusion<\/jats:title>\n<jats:p>This study presents the first systematic investigation on the use of pupillometry as the sole feature to classify emotions into four distinct classes using VR stimuli. 
The ability to conduct emotion classification using pupil data alone represents a promising new approach to affective computing, as new applications could be developed using the readily-available webcams on laptops and other camera-equipped mobile devices, without the need for specialized and costly equipment such as EEG and\/or ECG as the sensor modality.<\/jats:p>\n<\/jats:sec>","DOI":"10.1186\/s40537-020-00322-9","type":"journal-article","created":{"date-parts":[[2020,7,6]],"date-time":"2020-07-06T10:02:58Z","timestamp":1594029778000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":39,"title":["Four-class emotion classification in virtual reality using pupillometry"],"prefix":"10.1186","volume":"7","author":[{"given":"Lim Jia","family":"Zheng","sequence":"first","affiliation":[]},{"given":"James","family":"Mountstephens","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2415-5915","authenticated-orcid":false,"given":"Jason","family":"Teo","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2020,7,6]]},"reference":[{"key":"322_CR1","doi-asserted-by":"publisher","unstructured":"Alhargan A, Cooke N, Binjammaz T. Multimodal affect recognition in an interactive gaming environment using eye tracking and speech signals. In: ICMI 2017\u2014proceedings of the 19th ACM international conference on multimodal interaction; 2017. p. 479\u201386. https:\/\/doi.org\/10.1145\/3136755.3137016.","DOI":"10.1145\/3136755.3137016"},{"key":"322_CR2","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/j.entcom.2015.12.001","volume":"14","author":"S Almeida","year":"2016","unstructured":"Almeida S, Mealha \u00d3, Veloso A. Video game scenery analysis with eye tracking. Entertain Comput. 2016;14:1\u201313. 
https:\/\/doi.org\/10.1016\/j.entcom.2015.12.001.","journal-title":"Entertain Comput"},{"issue":"18","key":"322_CR3","first-page":"10987","volume":"11","author":"MH Alsibai","year":"2016","unstructured":"Alsibai MH, Manap SA. A study on driver fatigue notification systems. ARPN J Eng Appl Sci. 2016;11(18):10987\u201392.","journal-title":"ARPN J Eng Appl Sci"},{"key":"322_CR4","doi-asserted-by":"publisher","unstructured":"Aracena C, Basterrech S, Snasel V, Velasquez J. Neural networks for emotion recognition based on eye tracking data.In: Proceedings\u20142015 IEEE international conference on systems, man, and cybernetics, SMC 2015; 2016. p. 2632\u20137. https:\/\/doi.org\/10.1109\/SMC.2015.460.","DOI":"10.1109\/SMC.2015.460"},{"key":"322_CR5","doi-asserted-by":"publisher","unstructured":"Basu S, Chakraborty J, Aftabuddin M. Emotion recognition from speech using convolutional neural network with recurrent neural network architecture. In: Proceedings of the 2nd international conference on communication and electronics systems, ICCES 2017, 2018-Jan (Icces); 2018. p. 333\u2013336. https:\/\/doi.org\/10.1109\/CESYS.2017.8321292.","DOI":"10.1109\/CESYS.2017.8321292"},{"key":"322_CR6","doi-asserted-by":"publisher","unstructured":"Bekele E, Bian D, Zheng Z, Peterman J, Park S, Sarkar N. Responses during facial emotional expression recognition tasks using virtual reality and static IAPS pictures for adults with schizophrenia. In: Lecture Notes in Computer Science (Including Subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), 8526 LNCS (PART 2); 2014. p. 225\u201335. https:\/\/doi.org\/10.1007\/978-3-319-07464-1_21.","DOI":"10.1007\/978-3-319-07464-1_21"},{"key":"322_CR7","doi-asserted-by":"publisher","unstructured":"Busjahn T, Begel A, Orlov P, Sharif B, Hansen M, Bednarik R, Shchekotova G. Eye tracking in computing education categories and subject descriptors. 
In: ACM: proceedings of the tenth annual conference on international computing education research; 2014. p. 3\u201310. https:\/\/doi.org\/10.1145\/2632320.2632344.","DOI":"10.1145\/2632320.2632344"},{"key":"322_CR8","doi-asserted-by":"publisher","unstructured":"Chanthaphan N, Uchimura K, Satonaka T, Makioka T. Facial emotion recognition based on facial motion stream generated by kinect. In: Proceedings\u201411th international conference on signal-image technology and internet-based systems, SITIS 2015; 2016. p. 117\u2013124. https:\/\/doi.org\/10.1109\/SITIS.2015.31.","DOI":"10.1109\/SITIS.2015.31"},{"issue":"2\u20133","key":"322_CR9","doi-asserted-by":"publisher","first-page":"83","DOI":"10.1016\/S0165-0173(97)00064-7","volume":"26","author":"AR Damasio","year":"1998","unstructured":"Damasio AR. Emotion in the perspective of an integrated nervous system. Brain Res Rev. 1998;26(2\u20133):83\u20136. https:\/\/doi.org\/10.1016\/S0165-0173(97)00064-7.","journal-title":"Brain Res Rev"},{"key":"322_CR10","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1007\/978-3-319-28099-8_495-1","volume-title":"Encyclopedia of personality and individual differences","author":"P Ekman","year":"1999","unstructured":"Ekman P. Basic emotions. Encyclopedia of personality and individual differences. Cham: Springer; 1999. p. 1\u20136. https:\/\/doi.org\/10.1007\/978-3-319-28099-8_495-1."},{"issue":"6","key":"322_CR11","doi-asserted-by":"publisher","first-page":"699","DOI":"10.1089\/cpb.2009.0192","volume":"12","author":"A Gorini","year":"2009","unstructured":"Gorini A, Mosso JL, Mosso D, Pineda E, Ru\u00edz NL, Ram\u00edez M, et al. Emotional response to virtual reality exposure across different cultures: the role of the attribution process. CyberPsychol Behav. 2009;12(6):699\u2013705. https:\/\/doi.org\/10.1089\/cpb.2009.0192.","journal-title":"CyberPsychol Behav"},{"key":"322_CR12","doi-asserted-by":"publisher","unstructured":"Guo R, Li S, He L, Gao W, Qi H, Owens G. 
Pervasive and unobtrusive emotion sensing for human mental health. In: Proceedings of the 2013 7th international conference on pervasive computing technologies for healthcare and workshops, PervasiveHealth 2013; 2013. p. 436\u20139. https:\/\/doi.org\/10.4108\/icst.pervasivehealth.2013.252133.","DOI":"10.4108\/icst.pervasivehealth.2013.252133"},{"issue":"1","key":"322_CR13","doi-asserted-by":"publisher","first-page":"51","DOI":"10.1097\/SIH.0000000000000192","volume":"12","author":"EA Henneman","year":"2017","unstructured":"Henneman EA, Marquard JL, Fisher DL, Gawlinski A. Eye tracking: a novel approach for evaluating and improving the safety of healthcare processes in the simulated setting. Simul Healthcare. 2017;12(1):51\u20136. https:\/\/doi.org\/10.1097\/SIH.0000000000000192.","journal-title":"Simul Healthcare"},{"key":"322_CR14","volume-title":"The tell-tale eye: How your eyes reveal hidden thoughts and emotions. In The tell-tale eye: How your eyes reveal hidden thoughts and emotions","author":"EH Hess","year":"1975","unstructured":"Hess EH. The tell-tale eye: How your eyes reveal hidden thoughts and emotions. In The tell-tale eye: How your eyes reveal hidden thoughts and emotions. Oxford: Van Nostrand Reinhold; 1975."},{"key":"322_CR15","doi-asserted-by":"publisher","unstructured":"Hickson S, Kwatra V, Dufour N, Sud A, Essa I. Eyemotion: classifying facial expressions in VR using eye-tracking cameras. In: Proceedings\u20142019 IEEE winter conference on applications of computer vision, WACV 2019; 2019. p. 1626\u20131635. https:\/\/doi.org\/10.1109\/WACV.2019.00178.","DOI":"10.1109\/WACV.2019.00178"},{"key":"322_CR16","doi-asserted-by":"publisher","first-page":"40","DOI":"10.1037\/0882-7974.21.1.40","volume":"21","author":"DM Isaacowitz","year":"2006","unstructured":"Isaacowitz DM, Wadlinger HA, Goren D, Wilson HR. Selective preference in visual fixation away from negative images in old age? An eye-tracking study. Psychol Aging. 2006;21:40\u20138. 
https:\/\/doi.org\/10.1037\/0882-7974.21.1.40.","journal-title":"Psychol Aging"},{"key":"322_CR17","doi-asserted-by":"publisher","DOI":"10.1016\/B978-044451020-4\/50031-1","author":"RJK Jacob","year":"2003","unstructured":"Jacob RJK, Karn KS. Eye tracking in human-computer interaction and usability research: ready to deliver the promises. Mind\u2019s Eye. 2003. https:\/\/doi.org\/10.1016\/B978-044451020-4\/50031-1.","journal-title":"Mind\u2019s Eye"},{"issue":"5","key":"322_CR18","doi-asserted-by":"publisher","first-page":"865","DOI":"10.1007\/s12555-009-0521-0","volume":"7","author":"KE Ko","year":"2009","unstructured":"Ko KE, Yang HC, Sim KB. Emotion recognition using EEG signals with relative power values and Bayesian network. Int J Control Autom Syst. 2009;7(5):865\u201370. https:\/\/doi.org\/10.1007\/s12555-009-0521-0.","journal-title":"Int J Control Autom Syst"},{"issue":"8","key":"322_CR19","doi-asserted-by":"publisher","first-page":"1","DOI":"10.3390\/s20082384","volume":"20","author":"JZ Lim","year":"2020","unstructured":"Lim JZ, Mountstephens J, Teo J. Emotion recognition using eye-tracking: taxonomy, review and current challenges. Sensors (Switzerland). 2020;20(8):1\u201321. https:\/\/doi.org\/10.3390\/s20082384.","journal-title":"Sensors (Switzerland)"},{"issue":"1","key":"322_CR20","doi-asserted-by":"publisher","first-page":"59","DOI":"10.1504\/IJBET.2017.082224","volume":"23","author":"S Paul","year":"2017","unstructured":"Paul S, Banerjee A, Tibarewala DN. Emotional eye movement analysis using electrooculography signal. Int J Biomed Eng Technol. 2017;23(1):59\u201370. https:\/\/doi.org\/10.1504\/IJBET.2017.082224.","journal-title":"Int J Biomed Eng Technol"},{"issue":"3","key":"322_CR21","doi-asserted-by":"publisher","first-page":"393","DOI":"10.1007\/BF00354055","volume":"52","author":"R Plutchik","year":"2001","unstructured":"Plutchik R. The nature of emotions. Philos Stud. 2001;52(3):393\u2013409. 
https:\/\/doi.org\/10.1007\/BF00354055.","journal-title":"Philos Stud"},{"key":"322_CR22","doi-asserted-by":"publisher","unstructured":"Rattanyu K, Ohkura M, Mizukawa M. Emotion monitoring from physiological signals for service robots in the living space. In: ICCAS 2010\u2014international conference on control, automation and systems; 2010. p. 580\u2013583. https:\/\/doi.org\/10.1109\/ICCAS.2010.5669914.","DOI":"10.1109\/ICCAS.2010.5669914"},{"issue":"8","key":"322_CR23","doi-asserted-by":"publisher","first-page":"79","DOI":"10.14569\/ijacsa.2013.040812","volume":"4","author":"V Raudonis","year":"2013","unstructured":"Raudonis V, Dervinis G, Vilkauskas A, Paulauskaite A, Kersulyte G. Evaluation of human emotion from eye motions. Int J Adv Comput Sci Appl. 2013;4(8):79\u201384. https:\/\/doi.org\/10.14569\/ijacsa.2013.040812.","journal-title":"Int J Adv Comput Sci Appl"},{"issue":"8","key":"322_CR24","doi-asserted-by":"publisher","first-page":"1457","DOI":"10.1080\/17470210902816461","volume":"62","author":"K Rayner","year":"2009","unstructured":"Rayner K. Eye movements and attention in reading, scene perception, and visual search. Quart J Exp Psychol. 2009;62(8):1457\u2013506. https:\/\/doi.org\/10.1080\/17470210902816461.","journal-title":"Quart J Exp Psychol"},{"issue":"6","key":"322_CR25","doi-asserted-by":"publisher","first-page":"1161","DOI":"10.1037\/h0077714","volume":"39","author":"JA Russell","year":"1980","unstructured":"Russell JA. A circumplex model of affect. J Pers Soc Psychol. 1980;39(6):1161\u201378. https:\/\/doi.org\/10.1037\/h0077714.","journal-title":"J Pers Soc Psychol"},{"key":"322_CR26","doi-asserted-by":"publisher","unstructured":"Teo J, Suhaimi NS, Mountstephens J. Augmenting EEG with inertial sensing for improved 4-class subject-independent emotion classification in virtual reality; 2019. p. 1\u20138. 
https:\/\/doi.org\/10.4108\/eai.18-7-2019.2287946.","DOI":"10.4108\/eai.18-7-2019.2287946"},{"issue":"9","key":"322_CR27","doi-asserted-by":"publisher","first-page":"2826","DOI":"10.3390\/s18092826","volume":"18","author":"Y Wang","year":"2018","unstructured":"Wang Y, Lv Z, Zheng Y. Automatic emotion perception using eye movement information for E-healthcare systems. Sensors (Switzerland). 2018;18(9):2826. https:\/\/doi.org\/10.3390\/s18092826.","journal-title":"Sensors (Switzerland)"}],"container-title":["Journal of Big Data"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1186\/s40537-020-00322-9.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1186\/s40537-020-00322-9\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1186\/s40537-020-00322-9.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2021,7,5]],"date-time":"2021-07-05T23:39:50Z","timestamp":1625528390000},"score":1,"resource":{"primary":{"URL":"https:\/\/journalofbigdata.springeropen.com\/articles\/10.1186\/s40537-020-00322-9"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,7,6]]},"references-count":27,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2020,12]]}},"alternative-id":["322"],"URL":"https:\/\/doi.org\/10.1186\/s40537-020-00322-9","relation":{},"ISSN":["2196-1115"],"issn-type":[{"value":"2196-1115","type":"electronic"}],"subject":[],"published":{"date-parts":[[2020,7,6]]},"assertion":[{"value":"30 March 2020","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"21 June 
2020","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"6 July 2020","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"The authors declare that they have no competing interests.","order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Competing interests"}}],"article-number":"43"}}