{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,17]],"date-time":"2026-04-17T16:44:50Z","timestamp":1776444290452,"version":"3.51.2"},"reference-count":98,"publisher":"MDPI AG","issue":"14","license":[{"start":{"date-parts":[[2020,7,21]],"date-time":"2020-07-21T00:00:00Z","timestamp":1595289600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],"abstract":"<jats:p>Emotion recognition has increased the potential of affective computing by getting an instant feedback from users and thereby, have a better understanding of their behavior. Physiological sensors have been used to recognize human emotions in response to audio and video content that engages single (auditory) and multiple (two: auditory and vision) human senses, respectively. In this study, human emotions were recognized using physiological signals observed in response to tactile enhanced multimedia content that engages three (tactile, vision, and auditory) human senses. The aim was to give users an enhanced real-world sensation while engaging with multimedia content. To this end, four videos were selected and synchronized with an electric fan and a heater, based on timestamps within the scenes, to generate tactile enhanced content with cold and hot air effect respectively. Physiological signals, i.e., electroencephalography (EEG), photoplethysmography (PPG), and galvanic skin response (GSR) were recorded using commercially available sensors, while experiencing these tactile enhanced videos. The precision of the acquired physiological signals (including EEG, PPG, and GSR) is enhanced using pre-processing with a Savitzky-Golay smoothing filter. Frequency domain features (rational asymmetry, differential asymmetry, and correlation) from EEG, time domain features (variance, entropy, kurtosis, and skewness) from GSR, heart rate and heart rate variability from PPG data are extracted. The K nearest neighbor classifier is applied to the extracted features to classify four (happy, relaxed, angry, and sad) emotions. Our experimental results show that among individual modalities, PPG-based features gives the highest accuracy of     78.57 %     as compared to EEG- and GSR-based features. 
The fusion of EEG, GSR, and PPG features further improved the classification accuracy to     79.76 %     (for four emotions) when interacting with tactile enhanced multimedia.<\/jats:p>","DOI":"10.3390\/s20144037","type":"journal-article","created":{"date-parts":[[2020,7,21]],"date-time":"2020-07-21T06:38:55Z","timestamp":1595313535000},"page":"4037","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":71,"title":["Physiological Sensors Based Emotion Recognition While Experiencing Tactile Enhanced Multimedia"],"prefix":"10.3390","volume":"20","author":[{"given":"Aasim","family":"Raheel","sequence":"first","affiliation":[{"name":"Department of Computer Engineering, University of Engineering and Technology, Taxila 47050, Pakistan"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3662-2525","authenticated-orcid":false,"given":"Muhammad","family":"Majid","sequence":"additional","affiliation":[{"name":"Department of Computer Engineering, University of Engineering and Technology, Taxila 47050, Pakistan"}]},{"given":"Majdi","family":"Alnowami","sequence":"additional","affiliation":[{"name":"Department of Nuclear Engineering, King Abdulaziz University, Jeddah 21589, Saudi Arabia"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8179-3959","authenticated-orcid":false,"given":"Syed Muhammad","family":"Anwar","sequence":"additional","affiliation":[{"name":"Department of Software Engineering, University of Engineering and Technology, Taxila 47050, Pakistan"}]}],"member":"1968","published-online":{"date-parts":[[2020,7,21]]},"reference":[{"key":"ref_1","first-page":"17","article-title":"Mulsemedia: State of the art, perspectives, and challenges","volume":"11","author":"Ghinea","year":"2014","journal-title":"ACM Trans. Multimed. Comput. Commun. Appl. (TOMM)"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"91","DOI":"10.1145\/3233774","article-title":"Is multimedia multisensorial?\u2014A review of mulsemedia systems","volume":"51","author":"Covaci","year":"2019","journal-title":"ACM Comput. Surv. (CSUR)"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1145\/3319853","article-title":"Mulsemedia DIY: A survey of devices and a tutorial for building your own mulsemedia environment","volume":"52","author":"Saleme","year":"2019","journal-title":"ACM Comput. Surv. (CSUR)"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"421","DOI":"10.1007\/s00530-019-00618-8","article-title":"A mulsemedia framework for delivering sensory effects to heterogeneous systems","volume":"25","author":"Saleme","year":"2019","journal-title":"Multimed. Syst."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Picard, R.W., and Picard, R. (1997). Affective Computer, MIT Press.","DOI":"10.7551\/mitpress\/1140.001.0001"},{"key":"ref_6","unstructured":"Ekman, P., and Friesen, W.V. (2003). Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues, ISHK."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"1161","DOI":"10.1037\/h0077714","article-title":"A circumplex model of affect","volume":"39","author":"Russell","year":"1980","journal-title":"J. Personal. Soc. Psychol."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Gunes, H., Schuller, B., Pantic, M., and Cowie, R. (2011, January 21\u201325). Emotion representation, analysis and synthesis in continuous space: A survey. 
Proceedings of the Face and Gesture 2011, Santa Barbara, CA, USA.","DOI":"10.1109\/FG.2011.5771357"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Bethel, C.L., Salomon, K., Murphy, R.R., and Burke, J.L. (2007, January 26\u201329). Survey of psychophysiology measurements applied to human-robot interaction. Proceedings of the RO-MAN 2007\u2014The 16th IEEE International Symposium on Robot and Human Interactive Communication, Jeju, Korea.","DOI":"10.1109\/ROMAN.2007.4415182"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Dzedzickis, A., Kaklauskas, A., and Bucinskas, V. (2020). Human Emotion Recognition: Review of Sensors and Methods. Sensors, 20.","DOI":"10.3390\/s20030592"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Dellaert, F., Polzin, T., and Waibel, A. (1996, January 3\u20136). Recognizing emotion in speech. Proceedings of the Fourth International Conference on Spoken Language Processing, ICSLP\u201996, Philadelphia, PA, USA.","DOI":"10.21437\/ICSLP.1996-462"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"183","DOI":"10.3390\/s20010183","article-title":"A CNN-Assisted Enhanced Audio Signal Processing for Speech Emotion Recognition","volume":"20","author":"Mustaqeem","year":"2020","journal-title":"Sensors"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"312","DOI":"10.1016\/j.bspc.2018.08.035","article-title":"Speech emotion recognition using deep 1D & 2D CNN LSTM networks","volume":"47","author":"Zhao","year":"2019","journal-title":"Biomed. Signal Process. Control."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"1004","DOI":"10.1049\/iet-ipr.2017.0499","article-title":"Emotion recognition from facial expressions using hybrid feature descriptors","volume":"12","author":"Kalsum","year":"2018","journal-title":"IET Image Process."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Qayyum, H., Majid, M., Anwar, S.M., and Khan, B. (2017). Facial Expression Recognition Using Stationary Wavelet Transform Features. Math. Probl. Eng., 2017.","DOI":"10.1155\/2017\/9854050"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Zhou, B., Ghose, T., and Lukowicz, P. (2020). Expressure: Detect Expressions Related to Emotional and Cognitive Activities Using Forehead Textile Pressure Mechanomyography. Sensors, 20.","DOI":"10.3390\/s20030730"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Busso, C., Deng, Z., Yildirim, S., Bulut, M., Lee, C.M., Kazemzadeh, A., Lee, S., Neumann, U., and Narayanan, S. (2004, January 13\u201315). Analysis of emotion recognition using facial expressions, speech and multimodal information. Proceedings of the 6th International Conference on Multimodal Interfaces, State College, PA, USA.","DOI":"10.1145\/1027933.1027968"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Ranganathan, H., Chakraborty, S., and Panchanathan, S. (2016, January 7\u201310). Multimodal emotion recognition using deep learning architectures. Proceedings of the 2016 IEEE Winter Conference on Applications of Computer Vision (WACV), Lake Placid, NY, USA.","DOI":"10.1109\/WACV.2016.7477679"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Raheel, A., Majid, M., and Anwar, S.M. (2019, January 30\u201331). Facial Expression Recognition based on Electroencephalography. 
Proceedings of the 2019 2nd International Conference on Computing, Mathematics and Engineering Technologies (iCoMET), Sukkur, Pakistan.","DOI":"10.1109\/ICOMET.2019.8673408"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"102672","DOI":"10.1016\/j.jvcir.2019.102672","article-title":"Generation of personalized video summaries by detecting viewer\u2019s emotion using electroencephalography","volume":"65","author":"Qayyum","year":"2019","journal-title":"J. Vis. Commun. Image Represent."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"McCraty, R. (2019). Heart-brain neurodynamics: The making of emotions. Media Models to Foster Collective Human Coherence in the PSYCHecology, IGI Global.","DOI":"10.4018\/978-1-5225-9065-1.ch010"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Shu, L., Xie, J., Yang, M., Li, Z., Li, Z., Liao, D., Xu, X., and Yang, X. (2018). A review of emotion recognition using physiological signals. Sensors, 18.","DOI":"10.3390\/s18072074"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"1152","DOI":"10.1093\/scan\/nsv083","article-title":"The integration of facial and vocal cues during emotional change perception: EEG markers","volume":"11","author":"Chen","year":"2016","journal-title":"Soc. Cogn. Affect. Neurosci."},{"key":"ref_24","unstructured":"Shi, Y., Ruiz, N., Taib, R., Choi, E., and Chen, F. (May, January 28). Galvanic skin response (GSR) as an index of cognitive load. Proceedings of the CHI\u201907 Extended Abstracts on Human Factors in Computing Systems, San Jose, CA, USA."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Lee, C., Yoo, S., Park, Y., Kim, N., Jeong, K., and Lee, B. (2006, January 17\u201318). Using neural network to recognize human emotions from heart rate variability and skin resistance. Proceedings of the IEEE-EMBS 2005, 27th Annual International Conference of the Engineering in Medicine and Biology Society, Shanghai, China.","DOI":"10.1109\/IEMBS.2005.1615734"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"98","DOI":"10.1016\/j.cobeha.2017.12.017","article-title":"How heart rate variability affects emotion regulation brain networks","volume":"19","author":"Mather","year":"2018","journal-title":"Curr. Opin. Behav. Sci."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"2446","DOI":"10.1109\/JBHI.2019.2895589","article-title":"Human emotion characterization by heart rate variability analysis guided by respiration","volume":"23","author":"Yamuza","year":"2019","journal-title":"IEEE J. Biomed. Health Informatics"},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Murray, N., Lee, B., Qiao, Y., and Miro-Muntean, G. (2016, January 6\u20138). The influence of human factors on olfaction based mulsemedia quality of experience. Proceedings of the 2016 Eighth International Conference on Quality of Multimedia Experience (QoMEX), Lisbon, Portugal.","DOI":"10.1109\/QoMEX.2016.7498975"},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"957","DOI":"10.1109\/TMM.2015.2431915","article-title":"Perceived synchronization of mulsemedia services","volume":"17","author":"Yuan","year":"2015","journal-title":"IEEE Trans. Multimed."},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Covaci, A., Trestian, R., Saleme, E.a.B., Comsa, I.S., Assres, G., Santos, C.A.S., and Ghinea, G. (2019, January 21\u201325). 360 Mulsemedia: A Way to Improve Subjective QoE in 360 Videos. 
Proceedings of the 27th ACM International Conference on Multimedia, Association for Computing Machinery, Nice, France.","DOI":"10.1145\/3343031.3350954"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Keighrey, C., Flynn, R., Murray, S., and Murray, N. (June, January 31). A QoE evaluation of immersive augmented and virtual reality speech & language assessment applications. Proceedings of the 2017 Ninth International Conference on Quality of Multimedia Experience (QoMEX), Erfurt, Germany.","DOI":"10.1109\/QoMEX.2017.7965656"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Egan, D., Brennan, S., Barrett, J., Qiao, Y., Timmerer, C., and Murray, N. (2016, January 6\u20138). An evaluation of Heart Rate and ElectroDermal Activity as an objective QoE evaluation method for immersive virtual reality environments. Proceedings of the 2016 Eighth International Conference on Quality of Multimedia Experience (QoMEX), Lisbon, Portugal.","DOI":"10.1109\/QoMEX.2016.7498964"},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"7987","DOI":"10.1007\/s11042-019-08473-5","article-title":"QoE of cross-modally mapped Mulsemedia: An assessment using eye gaze and heart rate","volume":"79","author":"Mesfin","year":"2020","journal-title":"Multimed. Tools Appl."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"1249","DOI":"10.1109\/TMM.2019.2941274","article-title":"How do we experience crossmodal correspondent mulsemedia content?","volume":"22","author":"Covaci","year":"2020","journal-title":"IEEE Trans. Multimed."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"13971","DOI":"10.1007\/s11042-018-6907-3","article-title":"Emotion recognition in response to traditional and tactile enhanced multimedia using electroencephalography","volume":"78","author":"Raheel","year":"2019","journal-title":"Multimed. Tools Appl."},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Raheel, A., Majid, M., Anwar, S.M., and Bagci, U. (2019, January 23\u201327). Emotion Classification in Response to Tactile Enhanced Multimedia using Frequency Domain Features of Brain Signals. Proceedings of the 2019 IEEE 41st Annual International Conference of the Engineering in Medicine and Biology Society (EMBC), Berlin, Germany.","DOI":"10.1109\/EMBC.2019.8857632"},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"267","DOI":"10.1016\/j.chb.2016.08.029","article-title":"Human emotion recognition and analysis in response to audio music using brain signals","volume":"65","author":"Bhatti","year":"2016","journal-title":"Comput. Hum. Behav."},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Kim, M., Cheon, S., and Kang, Y. (2019). Use of Electroencephalography (EEG) for the Analysis of Emotional Perception and Fear to Nightscapes. Sustainability, 11.","DOI":"10.3390\/su11010233"},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Becerra, M., Londo\u00f1o-Delgado, E., Pelaez-Becerra, S., Serna-Guar\u00edn, L., Castro-Ospina, A., Marin-Castrill\u00f3n, D., and Peluffo-Ord\u00f3\u00f1ez, D. (2018, January 26\u201328). Odor Pleasantness Classification from Electroencephalographic Signals and Emotional States. 
Proceedings of the Colombian Conference on Computing, Cartagena, Colombia.","DOI":"10.1007\/978-3-319-98998-3_10"},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"893","DOI":"10.3389\/fnhum.2014.00893","article-title":"The brain\u2019s response to pleasant touch: An EEG investigation of tactile caressing","volume":"8","author":"Singh","year":"2014","journal-title":"Front. Hum. Neurosci."},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Udovi\u010di\u0107, G., \u00d0erek, J., Russo, M., and Sikora, M. (2017, January 23\u201327). Wearable emotion recognition system based on GSR and PPG signals. Proceedings of the 2nd International Workshop on Multimedia for Personal Health and Health Care, Mountain View, CA, USA.","DOI":"10.1145\/3132635.3132641"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"18","DOI":"10.1109\/T-AFFC.2011.15","article-title":"Deap: A database for emotion analysis; using physiological signals","volume":"3","author":"Koelstra","year":"2011","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"211","DOI":"10.1109\/T-AFFC.2011.37","article-title":"Multimodal emotion recognition in response to videos","volume":"3","author":"Soleymani","year":"2012","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"126","DOI":"10.1109\/TAFFC.2014.2327617","article-title":"Emotion recognition based on multi-variant correlation of physiological signals","volume":"5","author":"Wen","year":"2014","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"1985","DOI":"10.1007\/s00521-015-2149-8","article-title":"Wavelet-based emotion recognition system using EEG signal","volume":"28","author":"Mohammadi","year":"2017","journal-title":"Neural Comput. Appl."},{"key":"ref_46","doi-asserted-by":"crossref","first-page":"550","DOI":"10.1109\/TAFFC.2017.2660485","article-title":"Real-time movie-induced discrete emotion recognition from EEG signals","volume":"9","author":"Liu","year":"2018","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"8402","DOI":"10.1109\/JSEN.2018.2867221","article-title":"Toward user-independent emotion recognition using physiological signals","volume":"19","author":"Albraikan","year":"2018","journal-title":"IEEE Sens. J."},{"key":"ref_48","doi-asserted-by":"crossref","first-page":"114","DOI":"10.1016\/j.cviu.2015.09.015","article-title":"Multi-modal emotion analysis from facial expressions and electroencephalogram","volume":"147","author":"Huang","year":"2016","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_49","doi-asserted-by":"crossref","unstructured":"Chai, X., Wang, Q., Zhao, Y., Li, Y., Liu, D., Liu, X., and Bai, O. (2017). A fast, efficient domain adaptation technique for cross-domain electroencephalography (EEG)-based emotion recognition. 
Sensors, 17.","DOI":"10.3390\/s17051014"},{"key":"ref_50","doi-asserted-by":"crossref","first-page":"13361","DOI":"10.3390\/s140813361","article-title":"Emotion recognition from single-trial EEG based on kernel Fisher\u2019s emotion pattern and imbalanced quasiconformal kernel support vector machine","volume":"14","author":"Liu","year":"2014","journal-title":"Sensors"},{"key":"ref_51","doi-asserted-by":"crossref","first-page":"737","DOI":"10.1109\/TITB.2011.2157933","article-title":"A novel emotion elicitation index using frontal brain asymmetry for enhanced EEG-based emotion recognition","volume":"15","author":"Petrantonakis","year":"2011","journal-title":"IEEE Trans. Inf. Technol. Biomed."},{"key":"ref_52","doi-asserted-by":"crossref","first-page":"103","DOI":"10.1016\/j.inffus.2020.01.011","article-title":"Emotion recognition using multi-modal data and machine learning techniques: A tutorial and review","volume":"59","author":"Zhang","year":"2020","journal-title":"Inf. Fusion"},{"key":"ref_53","doi-asserted-by":"crossref","unstructured":"Zhang, X., Xu, C., Xue, W., Hu, J., He, Y., and Gao, M. (2018). Emotion recognition based on multichannel physiological signals with comprehensive nonlinear processing. Sensors, 18.","DOI":"10.3390\/s18113886"},{"key":"ref_54","doi-asserted-by":"crossref","first-page":"15549","DOI":"10.3390\/s131115549","article-title":"A multimodal emotion detection system during human\u2013robot interaction","volume":"13","author":"Malfaz","year":"2013","journal-title":"Sensors"},{"key":"ref_55","doi-asserted-by":"crossref","first-page":"419","DOI":"10.1007\/BF02344719","article-title":"Emotion recognition system using short-term monitoring of physiological signals","volume":"42","author":"Kim","year":"2004","journal-title":"Med Biol. Eng. Comput."},{"key":"ref_56","doi-asserted-by":"crossref","unstructured":"Koelstra, S., Yazdani, A., Soleymani, M., M\u00fchl, C., Lee, J.S., Nijholt, A., Pun, T., Ebrahimi, T., and Patras, I. (2010, January 28\u201330). Single trial classification of EEG and peripheral physiological signals for recognition of emotions induced by music videos. Proceedings of the International Conference on Brain Informatics, Toronto, ON, Canada.","DOI":"10.1007\/978-3-642-15314-3_9"},{"key":"ref_57","doi-asserted-by":"crossref","first-page":"46","DOI":"10.1016\/j.inffus.2018.09.001","article-title":"Deep learning analysis of mobile physiological, environmental and location sensor data for emotion detection","volume":"49","author":"Kanjo","year":"2019","journal-title":"Inf. Fusion"},{"key":"ref_58","doi-asserted-by":"crossref","first-page":"2067","DOI":"10.1109\/TPAMI.2008.26","article-title":"Emotion recognition based on physiological changes in music listening","volume":"30","author":"Kim","year":"2008","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_59","doi-asserted-by":"crossref","first-page":"196","DOI":"10.1109\/TCE.2018.2844736","article-title":"Emotion based music recommendation system using wearable physiological sensors","volume":"64","author":"Ayata","year":"2018","journal-title":"IEEE Trans. Consum. Electron."},{"key":"ref_60","doi-asserted-by":"crossref","unstructured":"Chang, C.Y., Tsai, J.S., Wang, C.J., and Chung, P.C. (April, January 30). Emotion recognition with consideration of facial expression and physiological signals. 
Proceedings of the 2009 IEEE Symposium on Computational Intelligence in Bioinformatics and Computational Biology, Nashville, TN, USA.","DOI":"10.1109\/CIBCB.2009.4925739"},{"key":"ref_61","doi-asserted-by":"crossref","unstructured":"Khalili, Z., and Moradi, M. (2008, January 18\u201320). Emotion detection using brain and peripheral signals. Proceedings of the 2008 Cairo International Biomedical Engineering Conference, Cairo, Egypt.","DOI":"10.1109\/CIBEC.2008.4786096"},{"key":"ref_62","doi-asserted-by":"crossref","first-page":"42","DOI":"10.1109\/T-AFFC.2011.25","article-title":"A multimodal database for affect recognition and implicit tagging","volume":"3","author":"Soleymani","year":"2011","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_63","doi-asserted-by":"crossref","first-page":"209","DOI":"10.1109\/TAFFC.2015.2392932","article-title":"DECAF: MEG-based multimodal database for decoding affective physiological responses","volume":"6","author":"Abadi","year":"2015","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_64","doi-asserted-by":"crossref","first-page":"147","DOI":"10.1109\/TAFFC.2016.2625250","article-title":"ASCERTAIN: Emotion and personality recognition using commercial sensors","volume":"9","author":"Subramanian","year":"2016","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_65","unstructured":"Correa, J.A.M., Abadi, M.K., Sebe, N., and Patras, I. (2018). Amigos: A dataset for affect, personality and mood research on individuals and groups. IEEE Trans. Affect. Comput."},{"key":"ref_66","doi-asserted-by":"crossref","first-page":"12177","DOI":"10.1109\/ACCESS.2019.2891579","article-title":"MPED: A multi-modal physiological emotion database for discrete emotion recognition","volume":"7","author":"Song","year":"2019","journal-title":"IEEE Access"},{"key":"ref_67","first-page":"57","article-title":"Using deep convolutional neural network for emotion detection on a physiological signals dataset (AMIGOS)","volume":"7","author":"Abdulhay","year":"2018","journal-title":"IEEE Access"},{"key":"ref_68","doi-asserted-by":"crossref","unstructured":"Mart\u00ednez-Rodrigo, A., Zangr\u00f3niz, R., Pastor, J.M., Latorre, J.M., and Fern\u00e1ndez-Caballero, A. (2015). Emotion detection in ageing adults from physiological sensors. Ambient Intelligence-Software and Applications, Springer.","DOI":"10.1007\/978-3-319-19695-4_26"},{"key":"ref_69","doi-asserted-by":"crossref","unstructured":"Zhuang, N., Zeng, Y., Yang, K., Zhang, C., Tong, L., and Yan, B. (2018). Investigating patterns for self-induced emotion recognition from EEG signals. Sensors, 18.","DOI":"10.3390\/s18030841"},{"key":"ref_70","doi-asserted-by":"crossref","unstructured":"Dissanayake, T., Rajapaksha, Y., Ragel, R., and Nawinne, I. (2019). An Ensemble Learning Approach for Electrocardiogram Sensor Based Human Emotion Recognition. Sensors, 19.","DOI":"10.3390\/s19204495"},{"key":"ref_71","doi-asserted-by":"crossref","unstructured":"Athavipach, C., Pan-ngum, S., and Israsena, P. (2019). A Wearable In-Ear EEG Device for Emotion Monitoring. Sensors, 19.","DOI":"10.3390\/s19184014"},{"key":"ref_72","doi-asserted-by":"crossref","unstructured":"Alghowinem, S., Goecke, R., Wagner, M., and Alwabil, A. (2019). Evaluating and Validating Emotion Elicitation Using English and Arabic Movie Clips on a Saudi Sample. Sensors, 19.","DOI":"10.3390\/s19102218"},{"key":"ref_73","doi-asserted-by":"crossref","unstructured":"Chen, D.W., Miao, R., Yang, W.Q., Liang, Y., Chen, H.H., Huang, L., Deng, C.J., and Han, N. (2019). 
A feature extraction method based on differential entropy and linear discriminant analysis for emotion recognition. Sensors, 19.","DOI":"10.3390\/s19071631"},{"key":"ref_74","doi-asserted-by":"crossref","unstructured":"Alazrai, R., Homoud, R., Alwanni, H., and Daoud, M.I. (2018). EEG-based emotion recognition using quadratic time-frequency distribution. Sensors, 18.","DOI":"10.3390\/s18082739"},{"key":"ref_75","doi-asserted-by":"crossref","unstructured":"Lee, K.W., Yoon, H.S., Song, J.M., and Park, K.R. (2018). Convolutional neural network-based classification of driver\u2019s emotion during aggressive and smooth driving using multi-modal camera sensors. Sensors, 18.","DOI":"10.3390\/s18040957"},{"key":"ref_76","doi-asserted-by":"crossref","first-page":"119","DOI":"10.1007\/s13246-019-00825-7","article-title":"The potential of photoplethysmogram and galvanic skin response in emotion recognition using nonlinear features","volume":"43","author":"Goshvarpour","year":"2020","journal-title":"Phys. Eng. Sci. Med."},{"key":"ref_77","doi-asserted-by":"crossref","unstructured":"Seo, J., Laine, T.H., and Sohn, K.A. (2019). An Exploration of Machine Learning Methods for Robust Boredom Classification Using EEG and GSR Data. Sensors, 19.","DOI":"10.3390\/s19204561"},{"key":"ref_78","doi-asserted-by":"crossref","unstructured":"Lee, J., and Yoo, S.K. (2018). Design of user-customized negative emotion classifier based on feature selection using physiological signal sensors. Sensors, 18.","DOI":"10.3390\/s18124253"},{"key":"ref_79","doi-asserted-by":"crossref","unstructured":"Zhang, J., Chen, M., Zhao, S., Hu, S., Shi, Z., and Cao, Y. (2016). ReliefF-based EEG sensor selection methods for emotion recognition. Sensors, 16.","DOI":"10.3390\/s16101558"},{"key":"ref_80","doi-asserted-by":"crossref","unstructured":"Shu, L., Yu, Y., Chen, W., Hua, H., Li, Q., Jin, J., and Xu, X. (2020). Wearable Emotion Recognition Using Heart Rate Data from a Smart Bracelet. Sensors, 20.","DOI":"10.3390\/s20030718"},{"key":"ref_81","doi-asserted-by":"crossref","unstructured":"Kwon, Y.H., Shin, S.B., and Kim, S.D. (2018). Electroencephalography based fusion two-dimensional (2D)-convolution neural networks (CNN) model for emotion recognition system. Sensors, 18.","DOI":"10.3390\/s18051383"},{"key":"ref_82","doi-asserted-by":"crossref","first-page":"162","DOI":"10.1109\/TAMD.2015.2431497","article-title":"Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks","volume":"7","author":"Zheng","year":"2015","journal-title":"IEEE Trans. Auton. Ment. Dev."},{"key":"ref_83","doi-asserted-by":"crossref","first-page":"839","DOI":"10.1109\/TCYB.2017.2788081","article-title":"Spatial\u2013temporal recurrent neural network for emotion recognition","volume":"49","author":"Zhang","year":"2019","journal-title":"IEEE Trans. Cybern."},{"key":"ref_84","doi-asserted-by":"crossref","unstructured":"Al Machot, F., Elmachot, A., Ali, M., Al Machot, E., and Kyamakya, K. (2019). A deep-learning model for subject-independent human emotion recognition using electrodermal activity sensors. Sensors, 19.","DOI":"10.3390\/s19071659"},{"key":"ref_85","doi-asserted-by":"crossref","unstructured":"Oh, S., Lee, J.Y., and Kim, D.K. (2020). The Design of CNN Architectures for Optimal Six Basic Emotion Classification Using Multiple Physiological Signals. 
Sensors, 20.","DOI":"10.3390\/s20030866"},{"key":"ref_86","doi-asserted-by":"crossref","unstructured":"Ali, M., Al Machot, F., Haj Mosa, A., Jdeed, M., Al Machot, E., and Kyamakya, K. (2018). A globally generalized emotion recognition system involving different physiological signals. Sensors, 18.","DOI":"10.3390\/s18061905"},{"key":"ref_87","doi-asserted-by":"crossref","unstructured":"Yang, H., Han, J., and Min, K. (2019). A Multi-Column CNN Model for Emotion Recognition from EEG Signals. Sensors, 19.","DOI":"10.3390\/s19214736"},{"key":"ref_88","doi-asserted-by":"crossref","unstructured":"Chao, H., Dong, L., Liu, Y., and Lu, B. (2019). Emotion recognition from multiband EEG signals using CapsNet. Sensors, 19.","DOI":"10.3390\/s19092212"},{"key":"ref_89","doi-asserted-by":"crossref","first-page":"17","DOI":"10.1109\/MIS.2018.2882362","article-title":"Multimodal sentiment analysis: Addressing key issues and setting up the baselines","volume":"33","author":"Poria","year":"2018","journal-title":"IEEE Intell. Syst."},{"key":"ref_90","doi-asserted-by":"crossref","unstructured":"Raheel, A., Majid, M., and Anwar, S.M. (2019). A study on the effects of traditional and olfaction enhanced multimedia on pleasantness classification based on brain activity analysis. Comput. Biol. Med., 103469.","DOI":"10.1016\/j.compbiomed.2019.103469"},{"key":"ref_91","first-page":"34","article-title":"Using Eye Tracking and Heart-Rate Activity to Examine Crossmodal Correspondences QoE in Mulsemedia","volume":"15","author":"Mesfin","year":"2019","journal-title":"ACM Trans. Multimed. Comput. Commun. Appl. (TOMM)"},{"key":"ref_92","doi-asserted-by":"crossref","unstructured":"B\u0103lan, O., Moise, G., Moldoveanu, A., Leordeanu, M., and Moldoveanu, F. (2019). Fear level classification based on emotional dimensions and machine learning techniques. Sensors, 19.","DOI":"10.3390\/s19071738"},{"key":"ref_93","doi-asserted-by":"crossref","first-page":"49","DOI":"10.1016\/0005-7916(94)90063-9","article-title":"Measuring emotion: The self-assessment manikin and the semantic differential","volume":"25","author":"Bradley","year":"1994","journal-title":"J. Behav. Ther. Exp. Psychiatry"},{"key":"ref_94","doi-asserted-by":"crossref","first-page":"25581","DOI":"10.1007\/s11042-016-4232-2","article-title":"A novel framework of EEG-based user identification by analyzing music-listening behavior","volume":"76","author":"Kaur","year":"2017","journal-title":"Multimed. Tools Appl."},{"key":"ref_95","doi-asserted-by":"crossref","first-page":"655","DOI":"10.1111\/1469-8986.00067","article-title":"Affective neuroscience and psychophysiology: Toward a synthesis","volume":"40","author":"Davidson","year":"2003","journal-title":"Psychophysiology"},{"key":"ref_96","doi-asserted-by":"crossref","first-page":"204","DOI":"10.1111\/j.1467-9280.1997.tb00413.x","article-title":"Prefrontal brain asymmetry: A biological substrate of the behavioral approach and inhibition systems","volume":"8","author":"Sutton","year":"1997","journal-title":"Psychol. Sci."},{"key":"ref_97","doi-asserted-by":"crossref","first-page":"374","DOI":"10.1109\/TAFFC.2017.2714671","article-title":"Emotions recognition using EEG signals: A survey","volume":"10","author":"Alarcao","year":"2017","journal-title":"IEEE Trans. Affect. 
Comput."},{"key":"ref_98","doi-asserted-by":"crossref","first-page":"10","DOI":"10.1016\/j.inffus.2018.10.009","article-title":"Human emotion recognition using deep belief network architecture","volume":"51","author":"Hassan","year":"2019","journal-title":"Inf. Fusion"}],"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/20\/14\/4037\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T09:50:09Z","timestamp":1760176209000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/20\/14\/4037"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,7,21]]},"references-count":98,"journal-issue":{"issue":"14","published-online":{"date-parts":[[2020,7]]}},"alternative-id":["s20144037"],"URL":"https:\/\/doi.org\/10.3390\/s20144037","relation":{},"ISSN":["1424-8220"],"issn-type":[{"value":"1424-8220","type":"electronic"}],"subject":[],"published":{"date-parts":[[2020,7,21]]}}}