{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,20]],"date-time":"2026-03-20T23:01:03Z","timestamp":1774047663072,"version":"3.50.1"},"reference-count":44,"publisher":"MDPI AG","issue":"1","license":[{"start":{"date-parts":[[2024,1,18]],"date-time":"2024-01-18T00:00:00Z","timestamp":1705536000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"Graz University of Technology"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["MAKE"],"abstract":"<jats:p>Understanding and detecting human emotions is crucial for enhancing mental health, cognitive performance and human\u2013computer interaction. This field of affective computing is relatively unexplored, and gaining knowledge about which external factors impact emotions could enhance communication between users and machines. Furthermore, it could also help us to manage affective disorders or understand affective physiological responses to human spatial and digital environments. The main objective of the current study was to investigate the influence of external stimulation, specifically of different light conditions, on brain activity while observing affect-eliciting pictures, and on its classification. In this context, multichannel electroencephalography (EEG) was recorded from 30 participants as they observed images from the Nencki Affective Picture System (NAPS) database in an art-gallery-style Virtual Reality (VR) environment. The elicited affect states were classified into three affect classes within the two-dimensional valence\u2013arousal plane. Valence (positive\/negative) and arousal (high\/low) values were reported by participants on continuous scales. The experiment was conducted in two experimental conditions: a warm light condition and a cold light condition. 
Thus, three classification tasks arose with regard to the recorded brain data: classification of an affect state within the warm light condition, classification of an affect state within the cold light condition, and warm light vs. cold light classification during the observation of affect-eliciting images. For all classification tasks, Linear Discriminant Analysis, a Spatial Filter Model, a Convolutional Neural Network, the EEGNet, and the SincNet were compared. The EEGNet architecture performed best in all tasks, significantly classifying the three affect states with 43.12% accuracy under warm light. Under cold light, no model achieved significant results. Warm light vs. cold light during the observation of affect-eliciting images could be classified significantly by the EEGNet with 76.65% accuracy, well above any other machine learning or deep learning model. No significant differences were detected between affect recognition in the two light conditions, but the results point towards the advantage of gradient-based learning methods in data-driven experimental designs for the problem of affect decoding from EEG, providing modern tools for affective computing in digital spaces. 
Moreover, the ability to discern externally driven affective states through deep learning not only advances our understanding of the human mind but also opens avenues for developing innovative therapeutic interventions and improving human\u2013computer interaction.<\/jats:p>","DOI":"10.3390\/make6010011","type":"journal-article","created":{"date-parts":[[2024,1,19]],"date-time":"2024-01-19T07:00:32Z","timestamp":1705647632000},"page":"199-214","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":3,"title":["The Impact of Light Conditions on Neural Affect Classification: A Deep Learning Approach"],"prefix":"10.3390","volume":"6","author":[{"given":"Sophie","family":"Zentner","sequence":"first","affiliation":[{"name":"Institute of Neural Engineering, Graz University of Technology, 8010 Graz, Austria"}]},{"given":"Alberto","family":"Barradas Chacon","sequence":"additional","affiliation":[{"name":"Institute of Neural Engineering, Graz University of Technology, 8010 Graz, Austria"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4345-7310","authenticated-orcid":false,"given":"Selina C.","family":"Wriessnegger","sequence":"additional","affiliation":[{"name":"Institute of Neural Engineering, Graz University of Technology, 8010 Graz, Austria"}]}],"member":"1968","published-online":{"date-parts":[[2024,1,18]]},"reference":[{"key":"ref_1","first-page":"1","article-title":"EEG-based emotion recognition. The influence of visual and auditory stimuli","volume":"56","author":"Bos","year":"2006","journal-title":"Comput. Sci."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"84","DOI":"10.1007\/s11920-021-01299-9","article-title":"EEG Neurofeedback for Anxiety Disorders and Post-Traumatic Stress Disorders: A Blueprint for a Promising Brain-Based Therapy","volume":"23","author":"Jeunet","year":"2021","journal-title":"Curr. 
Psychiatry Rep."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"868450","DOI":"10.3389\/fnhum.2022.868450","article-title":"The effect of neurofeedback on the reaction time and cognitive performance of athletes: A systematic review and meta-analysis","volume":"16","author":"Fernandes","year":"2022","journal-title":"Front. Hum. Neurosci."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"97","DOI":"10.1016\/j.pscychresns.2013.10.005","article-title":"Cognitive processes associated with compulsive buying behaviours and related EEG coherence","volume":"221","author":"Lawrence","year":"2014","journal-title":"Psychiatry Res. Neuroimaging"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"6802","DOI":"10.1038\/s41598-022-10906-5","article-title":"Toward passive BCI: Asynchronous decoding of neural responses to direction-and angle-specific perturbations during a simulated cockpit scenario","volume":"12","author":"Jalilpour","year":"2022","journal-title":"Sci. Rep."},{"key":"ref_6","unstructured":"Dalgleish, T., and Power, M.J. (2005). Handbook of Cognition and Emotion, John Wiley & Sons, Ltd."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"1161","DOI":"10.1037\/h0077714","article-title":"A circumplex model of affect","volume":"39","author":"Russell","year":"1980","journal-title":"J. Personal. Soc. Psychol."},{"key":"ref_8","first-page":"4385","article-title":"EEG-based emotion recognition: Review of commercial EEG devices and machine learning techniques","volume":"34","author":"Dadebayev","year":"2022","journal-title":"J. King Saud. Univ.-Comput. Inf. Sci."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"374","DOI":"10.1109\/TAFFC.2017.2714671","article-title":"Emotions Recognition Using EEG Signals: A Survey","volume":"9","author":"Alarcao","year":"2019","journal-title":"IEEE Trans. Affect. 
Comput."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"587","DOI":"10.1080\/00207454.2021.1941947","article-title":"Fused CNN-LSTM deep learning emotion recognition model using electroencephalography signals","volume":"133","author":"Ramzan","year":"2023","journal-title":"Int. J. Neurosci."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"964754","DOI":"10.3389\/frvir.2022.964754","article-title":"Real-time affect detection in virtual reality: A technique based on a three-dimensional model of affect and EEG signals","volume":"3","author":"Pinilla","year":"2023","journal-title":"Front. Virtual Real."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"13361","DOI":"10.3390\/s140813361","article-title":"Emotion recognition from single-trial EEG based on kernel Fisher\u2019s emotion pattern and imbalanced quasiconformal kernel support vector machine","volume":"14","author":"Liu","year":"2014","journal-title":"Sensors"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Gonzalez, H.A., Yoo, J., and Elfadel, I.M. (2019, January 23\u201327). EEG-based emotion detection using unsupervised transfer learning. Proceedings of the 2019 41st Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Berlin, Germany.","DOI":"10.1109\/EMBC.2019.8857248"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Ding, Y., Robinson, N., Zeng, Q., Chen, D., Wai, A.A.P., Lee, T.S., and Guan, C. (2020, January 19\u201324). TSception: A deep learning framework for emotion detection using EEG. Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK.","DOI":"10.1109\/IJCNN48605.2020.9206750"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Yu, M., Xiao, S., Hua, M., Wang, H., Chen, X., Tian, F., and Li, Y. (2022). EEG-based emotion recognition in an immersive virtual reality environment: From local activity to brain network features. Biomed. 
Signal Process. Control, 72.","DOI":"10.1016\/j.bspc.2021.103349"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"13657","DOI":"10.1038\/s41598-018-32063-4","article-title":"Affective computing in virtual reality: Emotion recognition from brain and heartbeat dynamics using wearable sensors","volume":"8","author":"Greco","year":"2018","journal-title":"Sci. Rep."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"1504","DOI":"10.1109\/JIOT.2020.3012452","article-title":"DeepEDN: A deep-learning-based image encryption and decryption network for internet of medical things","volume":"8","author":"Ding","year":"2020","journal-title":"IEEE Internet Things J."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Gupta, K., Lazarevic, J., Pai, Y.S., and Billinghurst, M. (2020, January 1\u20134). AffectivelyVR: Towards VR personalized emotion recognition. Proceedings of the 26th ACM Symposium on Virtual Reality Software and Technology, Virtual Event.","DOI":"10.1145\/3385956.3422122"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"e64812","DOI":"10.7554\/eLife.64812","article-title":"Decoding subjective emotional arousal from EEG during an immersive virtual reality experience","volume":"10","author":"Hofmann","year":"2021","journal-title":"eLife"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Mar\u00edn-Morales, J., Llinares, C., Guixeres, J., and Alca\u00f1iz, M. (2020). Emotion recognition in immersive virtual reality: From statistics to affective computing. Sensors, 20.","DOI":"10.3390\/s20185163"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"1003","DOI":"10.1007\/s00779-017-1072-7","article-title":"Towards emotion recognition for virtual environments: An evaluation of eeg features on benchmark dataset","volume":"21","author":"Menezes","year":"2017","journal-title":"Pers. 
Ubiquitous Comput."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"108085","DOI":"10.1016\/j.chb.2023.108085","article-title":"EEG-based affective computing in virtual reality with a balancing of the computational efficiency and recognition accuracy","volume":"152","author":"Pei","year":"2024","journal-title":"Comput. Hum. Behav."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"8875426","DOI":"10.1155\/2020\/8875426","article-title":"EEG-based emotion recognition: A state-of-the-art review of current trends and opportunities","volume":"2020","author":"Suhaimi","year":"2020","journal-title":"Comput. Intell. Neurosci."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"101083","DOI":"10.1016\/j.jneuroling.2022.101083","article-title":"What\u2019s in a Color? A neuropsycholinguistic study on the effect of colors on EEG brainwaves, immediate emotional responses, and English language vocabulary retention among Iranian young adults","volume":"63","author":"Hosseini","year":"2022","journal-title":"J. Neurolinguist."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Hassib, M., Braun, M., Pfleging, B., and Alt, F. (2019, January 2\u20136). Detecting and influencing driver emotions using psycho-physiological sensors and ambient light. Proceedings of the IFIP Conference on Human-Computer Interaction, Paphos, Cyprus.","DOI":"10.1007\/978-3-030-29381-9_43"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1523\/ENEURO.0104-22.2022","article-title":"Enlarged Interior Built Environment Scale Modulates High-Frequency EEG Oscillations","volume":"9","author":"Bower","year":"2022","journal-title":"Eneuro"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Schilling, T., Sipatchin, A., Chuang, L., and Wahl, S. (2018, January 27\u201329). Tinted lenses affect our physiological responses to affective pictures: An EEG\/ERP study. 
Proceedings of the 2nd International Neuroergonomics Conference: The Brain at Work and in Everyday Life, Philadelphia, PA, USA.","DOI":"10.3389\/conf.fnhum.2018.227.00104"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"227","DOI":"10.1002\/smi.2460060308","article-title":"The stress appraisal measure (SAM): A multidimensional approach to cognitive appraisal","volume":"6","author":"Peacock","year":"1990","journal-title":"Stress Med."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Liu, X. (2015). Methods and Applications of Longitudinal Data Analysis, Elsevier.","DOI":"10.1016\/B978-0-12-801342-7.00002-2"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Horvat, M., Dobrini\u0107, M., Novosel, M., and Jer\u010di\u0107, P. (2018, January 21\u201325). Assessing emotional responses induced in virtual reality using a consumer EEG headset: A preliminary report. Proceedings of the 2018 41st International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO), Opatija, Croatia.","DOI":"10.23919\/MIPRO.2018.8400184"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"329","DOI":"10.1109\/TPAMI.2022.3145392","article-title":"Deep ROC analysis and auc as balanced average accuracy, for improved classifier selection, audit and explanation","volume":"45","author":"Carrington","year":"2023","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"299","DOI":"10.1109\/TKDE.2005.50","article-title":"Using AUC and accuracy in evaluating learning algorithms","volume":"17","author":"Huang","year":"2005","journal-title":"IEEE Trans. Knowl. Data Eng."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"14","DOI":"10.3389\/fninf.2014.00014","article-title":"Machine learning for neuroimaging with scikit-learn","volume":"8","author":"Abraham","year":"2014","journal-title":"Front. 
Neuroinform."},{"key":"ref_34","unstructured":"Szczerbicki, E. (2003). Knowledge and Information Technology Management: Human and Social Perspectives, IGI Global."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Albawi, S., Mohammed, T.A., and Al-Zawi, S. (2017, January 21\u201323). Understanding of a convolutional neural network. Proceedings of the 2017 International Conference on Engineering and Technology (ICET), Antalya, Turkey.","DOI":"10.1109\/ICEngTechnol.2017.8308186"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"056013","DOI":"10.1088\/1741-2552\/aace8c","article-title":"EEGNet: A compact convolutional neural network for EEG-based brain-computer interfaces","volume":"15","author":"Lawhern","year":"2018","journal-title":"J. Neural Eng."},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Yu, Y., Abadi, M., Barham, P., Brevdo, E., Burrows, M., Davis, A., Dean, J., Ghemawat, S., Harley, T., and Hawkins, P. (2018, January 23\u201326). Dynamic control flow in large-scale machine learning. Proceedings of the Thirteenth EuroSys Conference, Porto, Portugal.","DOI":"10.1145\/3190508.3190551"},{"key":"ref_38","unstructured":"Ravanelli, M., and Bengio, Y. (2018). Interpretable convolutional filters with sincnet. arXiv."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"131636","DOI":"10.1109\/ACCESS.2020.3009665","article-title":"S-EEGNet: Electroencephalogram signal classification based on a separable convolution neural network with bilinear interpolation","volume":"8","author":"Huang","year":"2020","journal-title":"IEEE Access"},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Qiao, Y., Alnemari, M., and Bagherzadeh, N. (2022, January 22\u201325). A two-stage efficient 3-D CNN framework for EEG based emotion recognition. 
Proceedings of the 2022 IEEE International Conference on Industrial Technology (ICIT), Shanghai, China.","DOI":"10.1109\/ICIT48603.2022.10002796"},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Zhu, Y., Ozawa, K., and Kong, W. (2021, January 9\u201311). EEGNetT: EEG-based neural network for emotion recognition in real-world applications. Proceedings of the 2021 IEEE 3rd Global Conference on Life Sciences and Technologies (LifeTech), Nara, Japan.","DOI":"10.1109\/LifeTech52111.2021.9391941"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"817","DOI":"10.1177\/0013916500326005","article-title":"Effects of indoor lighting, gender, and age on mood and cognitive performance","volume":"32","author":"Knez","year":"2000","journal-title":"Environ. Behav."},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Ma, C., Wang, H., Wu, J., and Xue, C. (2022). How the Quantity and Hue Contrast of Interface Color Coding Affect Human Perception: Evidence from Two EEG Studies. Res. Sq.","DOI":"10.21203\/rs.3.rs-2265895\/v1"},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Li, Y., Zhang, S., Yin, Y., Xiao, W., and Zhang, J. (2018). Parallel one-class extreme learning machine for imbalance learning based on Bayesian approach. J. Ambient. Intell. Hum. 
Comput.","DOI":"10.1007\/s12652-018-0994-x"}],"container-title":["Machine Learning and Knowledge Extraction"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2504-4990\/6\/1\/11\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T13:49:50Z","timestamp":1760104190000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2504-4990\/6\/1\/11"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,1,18]]},"references-count":44,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2024,3]]}},"alternative-id":["make6010011"],"URL":"https:\/\/doi.org\/10.3390\/make6010011","relation":{},"ISSN":["2504-4990"],"issn-type":[{"value":"2504-4990","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,1,18]]}}}