{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,5]],"date-time":"2026-03-05T15:53:23Z","timestamp":1772726003976,"version":"3.50.1"},"reference-count":75,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2022,6,6]],"date-time":"2022-06-06T00:00:00Z","timestamp":1654473600000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,6,6]],"date-time":"2022-06-06T00:00:00Z","timestamp":1654473600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100007480","name":"Universidad de Castilla la Mancha","doi-asserted-by":"crossref","id":[{"id":"10.13039\/501100007480","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Multimed Tools Appl"],"published-print":{"date-parts":[[2023,1]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Multimodal emotion detection has been one of the main lines of research in the field of Affective Computing (AC) in recent years. Multimodal detectors aggregate information coming from different channels or modalities to determine what emotion users are expressing with a higher degree of accuracy. However, despite the benefits offered by this kind of detectors, their presence in real implementations is still scarce for various reasons. In this paper, we propose a technology-agnostic framework, HERA, to facilitate the creation of multimodal emotion detectors, offering a tool characterized by its modularity and the interface-based programming approach adopted in its development. 
HERA (Heterogeneous Emotional Results Aggregator) offers an architecture to integrate different emotion detection services and aggregate their heterogeneous results to produce a final result using a common format. This proposal constitutes a step forward in the development of multimodal detectors, providing an architecture to manage different detectors and fuse the results produced by them in a sensible way. We assessed the validity of the proposal by testing the system with several developers with no previous knowledge about affective technology and emotion detection. The assessment was performed applying the Computer System Usability Questionnaire and the Twelve Cognitive Dimensions Questionnaire, used by The Visual Studio Usability group at Microsoft, obtaining positive results and important feedback for future versions of the system.<\/jats:p>","DOI":"10.1007\/s11042-022-13254-8","type":"journal-article","created":{"date-parts":[[2022,6,6]],"date-time":"2022-06-06T12:03:56Z","timestamp":1654517036000},"page":"239-269","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":29,"title":["Building a three-level multimodal emotion recognition framework"],"prefix":"10.1007","volume":"82","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-2085-8544","authenticated-orcid":false,"given":"Jose Maria","family":"Garcia-Garcia","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4069-2112","authenticated-orcid":false,"given":"Maria Dolores","family":"Lozano","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1125-9344","authenticated-orcid":false,"given":"Victor M. 
R.","family":"Penichet","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0873-0150","authenticated-orcid":false,"given":"Effie Lai-Chong","family":"Law","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,6,6]]},"reference":[{"issue":"1","key":"13254_CR1","doi-asserted-by":"publisher","first-page":"41","DOI":"10.1007\/s11042-011-0744-y","volume":"59","author":"E Alepis","year":"2012","unstructured":"Alepis E, Virvou M (2012) Multimodal object oriented user interfaces in mobile affective interaction. Multimed Tools Appl 59(1):41\u201363","journal-title":"Multimed Tools Appl"},{"issue":"1","key":"13254_CR2","doi-asserted-by":"publisher","first-page":"17","DOI":"10.3233\/978-1-60750-028-5-17","volume":"200","author":"I Arroyo","year":"2009","unstructured":"Arroyo I, Cooper DG, Burleson W, Woolf BP, Muldner K, Christopherson R (2009) Emotion sensors go to school. Front Artificial Intel App 200(1):17\u201324. https:\/\/doi.org\/10.3233\/978-1-60750-028-5-17","journal-title":"Front Artificial Intel App"},{"key":"13254_CR3","unstructured":"Blackwell AF and Green TRG (2000) \u201cA Cognitive Dimensions Questionnaire Optimised for Users,\u201d Proc. 12th Work. Psychol. Program. Interes. Gr., no. April, pp. 137\u2013154."},{"issue":"1","key":"13254_CR4","doi-asserted-by":"publisher","first-page":"18","DOI":"10.1109\/T-AFFC.2010.1","volume":"1","author":"RA Calvo","year":"2010","unstructured":"Calvo RA, D\u2019Mello S (2010) Affect detection: an interdisciplinary review of models, methods, and their applications. IEEE Trans Affect Comput 1(1):18\u201337","journal-title":"IEEE Trans Affect Comput"},{"issue":"2","key":"13254_CR5","doi-asserted-by":"publisher","first-page":"557","DOI":"10.1007\/s11042-011-0815-0","volume":"59","author":"E Cambria","year":"2012","unstructured":"Cambria E, Grassi M, Hussain A, Havasi C (2012) Sentic Computing for social media marketing. 
Multimed Tools Appl 59(2):557\u2013577","journal-title":"Multimed Tools Appl"},{"key":"13254_CR6","doi-asserted-by":"publisher","unstructured":"Chao X, Zhiyong F (2008) A trusted affective model approach to proactive health monitoring system. Proc - 2008 Intern Sem Fut BioMed Inform Engin, FBIE 2008:429\u2013432. https:\/\/doi.org\/10.1109\/FBIE.2008.52","DOI":"10.1109\/FBIE.2008.52"},{"key":"13254_CR7","doi-asserted-by":"publisher","unstructured":"Chen J, Hu B, Li N, Mao C, and Moore P (2013) \u201cA multimodal emotion-focused e-health monitoring support system,\u201d in Proceedings - 2013 7th International Conference on Complex, Intelligent, and Software Intensive Systems, CISIS 2013, pp. 505\u2013510, https:\/\/doi.org\/10.1109\/CISIS.2013.92.","DOI":"10.1109\/CISIS.2013.92"},{"key":"13254_CR8","unstructured":"Chen LS, Huang TS, Miyasato T, and Nakatsu R (1998) \u201cMultimodal human emotion\/expression recognition,\u201d in Proceedings Third IEEE International Conference on Automatic Face and Gesture Recognition, pp. 366\u2013371."},{"key":"13254_CR9","unstructured":"Clarke S (2020) \u201cMeasuring API Usability\u201d, Dr. Dobb's: The World of Software Development, May 01, 2004. Accessed on: February 12, Available at: https:\/\/www.drdobbs.com\/windows\/measuring-api-usability\/184405654"},{"key":"13254_CR10","unstructured":"Clarke S, Becker C (2003) Using the Cognitive Dimensions Framework to evaluate the usability of a class library. Proc First Jt Conf EASE PPIG, no. April:359\u2013366"},{"key":"13254_CR11","unstructured":"Dai W, Liu Z, Yu T, and Fung P (2020) \u201cModality-transferable emotion embeddings for low-resource multimodal emotion recognition,\u201d."},{"key":"13254_CR12","doi-asserted-by":"publisher","first-page":"3242","DOI":"10.1109\/ICEEOT.2016.7755303","volume":"2016","author":"RV Darekar","year":"2016","unstructured":"Darekar RV, Dhande AP (2016) Enhancing effectiveness of emotion detection by multimodal fusion of speech parameters. 
Intern Conf Electri, Electron Optimi Tech, ICEEOT 2016:3242\u20133246. https:\/\/doi.org\/10.1109\/ICEEOT.2016.7755303","journal-title":"Intern Conf Electri, Electron Optimi Tech, ICEEOT"},{"key":"13254_CR13","doi-asserted-by":"publisher","unstructured":"S. D\u2019Mello, A. Graesser, and R. W. Picard, \u201cToward an affect-sensitive AutoTutor,\u201d IEEE Intell Syst, vol. 22, no. 4, pp. 53\u201361, Jul. 2007, https:\/\/doi.org\/10.1109\/MIS.2007.79.","DOI":"10.1109\/MIS.2007.79"},{"key":"13254_CR14","doi-asserted-by":"crossref","unstructured":"D\u2019Mello SK, Kory J (2015) \u201cA review and meta-analysis of multimodal affect detection systems,\u201d ACM Computing Surveys, vol. 47, no. 3. Association for Computing Machinery, 01-Feb-2015.","DOI":"10.1145\/2682899"},{"key":"13254_CR15","doi-asserted-by":"crossref","first-page":"45","DOI":"10.1002\/0470013494.ch3","volume-title":"Handbook of cognition and emotion","author":"P Ekman","year":"1999","unstructured":"Ekman P (1999) Basic emotions. In: Handbook of cognition and emotion, vol ch. 3. John Wiley & Sons, New York, pp 45\u201360"},{"key":"13254_CR16","unstructured":"Express, \u201cFast, unopinionated, minimalist web framework for Node.js\u201d (2020). Accessed on: April 10th, 2020. Available at: https:\/\/expressjs.com\/"},{"key":"13254_CR17","unstructured":"Fabien M\u00e4el (2019) \u201cMultimodal-Emotion-Recognition\u201d, June 28, 2019. Accessed on: March 31, 2020. Available: https:\/\/github.com\/maelfabien\/Multimodal-Emotion-Recognition"},{"key":"13254_CR18","unstructured":"Garcia-Garcia, Jose Maria, \u201cHERA system: Three-level multimodal emotion recognition framework to detect emotions combining different inputs with different formats\u201d. Accessed on: April 10th 2020. 
Available at: https:\/\/github.com\/josemariagarcia95\/hera-system"},{"key":"13254_CR19","doi-asserted-by":"crossref","unstructured":"Garcia-Garcia JM, Penichet VMR, and Lozano MD (2017) \u201cEmotion detection: a technology review,\u201d in Proceedings of the XVIII International Conference on Human Computer Interaction - Interacci\u00f3n \u201817, pp. 1\u20138.","DOI":"10.1145\/3123818.3123852"},{"issue":"10","key":"13254_CR20","doi-asserted-by":"publisher","first-page":"10","DOI":"10.1155\/2018\/8751426","volume":"2018","author":"JM Garcia-Garcia","year":"2018","unstructured":"Garcia-Garcia JM, Penichet VMR, Lozano MD, Garrido JE, Lai-Chong Law E (2018) Multimodal affective computing to enhance the user experience of educational software applications. Mob Inf Syst 2018(10):10. https:\/\/doi.org\/10.1155\/2018\/8751426","journal-title":"Mob Inf Syst"},{"key":"13254_CR21","doi-asserted-by":"publisher","unstructured":"Garcia-Garcia JM, Caba\u00f1ero M e del M, Penichet VMR, and Lozano MD (2019) \u201cEmoTEA: Teaching Children with Autism Spectrum Disorder to Identify and Express Emotions,\u201d in Proceedings of the XX International Conference on Human Computer Interaction - Interacci\u00f3n \u201819, pp. 1\u20138, https:\/\/doi.org\/10.1145\/3335595.3335639.","DOI":"10.1145\/3335595.3335639"},{"key":"13254_CR22","doi-asserted-by":"crossref","unstructured":"Gilleade KM, Dix A, and Allanson J (2005) \u201cAffective videogames and modes of affective gaming: assist me, challenge me, emote me\u201d; D. L. Hall and J. Llinas (1997) \u201cAn introduction to multisensor data fusion,\u201d Proc IEEE, vol. 85, no. 1, pp. 
6\u201323.","DOI":"10.1109\/5.554205"},{"key":"13254_CR23","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/2396716.2396730","volume":"11","author":"J Gonzalez-Sanchez","year":"2011","unstructured":"Gonzalez-Sanchez J, Chavez-Echeagaray M-E, Atkinson R, Burleson W (2011) Affective computing meets design patterns: A pattern-based model for a multimodal emotion recognition framework. Proc 16th Eur Conf Pattern Lang Programs - Eur 11, no. July:1\u201311. https:\/\/doi.org\/10.1145\/2396716.2396730","journal-title":"Proc 16th Eur Conf Pattern Lang Programs - Eur"},{"key":"13254_CR24","doi-asserted-by":"publisher","unstructured":"J. Gonzalez-Sanchez, M. E. Chavez-Echeagaray, R. Atkinson, and W. Burleson, \u201cABE: An agent-based software architecture for a multimodal emotion recognition framework,\u201d Proc - 9th Work IEEE\/IFIP Conf Softw Archit WICSA 2011, no. May 2014, pp. 187\u2013193, 2011, https:\/\/doi.org\/10.1109\/WICSA.2011.32","DOI":"10.1109\/WICSA.2011.32"},{"key":"13254_CR25","first-page":"443","volume-title":"People and computers V","author":"TRG Green","year":"1989","unstructured":"Green TRG (1989) Cognitive dimensions of notations. In: Sutcliffe A, Macaulay L (eds) People and computers V. Cambridge University Press, Cambridge, UK, pp 443\u2013460"},{"issue":"2","key":"13254_CR26","doi-asserted-by":"publisher","first-page":"131","DOI":"10.1006\/jvlc.1996.0009","volume":"7","author":"TRG Green","year":"1996","unstructured":"Green TRG, Petre M (1996) Usability analysis of visual programming environments: a \u2018cognitive dimensions\u2019 framework. J Vis Lang Comput 7(2):131\u2013174. 
https:\/\/doi.org\/10.1006\/jvlc.1996.0009","journal-title":"J Vis Lang Comput"},{"issue":"18","key":"13254_CR27","doi-asserted-by":"publisher","first-page":"25321","DOI":"10.1007\/s11042-019-7651-z","volume":"78","author":"SK Gupta","year":"2019","unstructured":"Gupta SK, Ashwin TS, Guddeti RMR (2019) Students\u2019 affective content analysis in smart classroom environment using deep learning techniques. Multimed Tools Appl 78(18):25321\u201325348","journal-title":"Multimed Tools Appl"},{"key":"13254_CR28","doi-asserted-by":"crossref","unstructured":"A. G. Hauptmann and P. McAvinney, \u201cGestures with speech for graphic manipulation,\u201d Int J Man Mach Stud, vol. 38, no. 2, pp. 231\u2013249, Feb. 1993.","DOI":"10.1006\/imms.1993.1011"},{"issue":"18","key":"13254_CR29","doi-asserted-by":"publisher","first-page":"18361","DOI":"10.1007\/s11042-016-4101-z","volume":"76","author":"JC-S Hung","year":"2017","unstructured":"Hung JC-S, Chiang K-H, Huang Y-H, Lin K-C (2017) Augmenting teacher-student interaction in digital learning through affective computing. Multimed Tools Appl 76(18):18361\u201318386","journal-title":"Multimed Tools Appl"},{"issue":"11","key":"13254_CR30","doi-asserted-by":"publisher","first-page":"14231","DOI":"10.1007\/s11042-018-6755-1","volume":"78","author":"S Jaiswal","year":"2019","unstructured":"Jaiswal S, Virmani S, Sethi V, De K, Roy PP (2019) An intelligent recommendation system using gaze and emotion detection. 
Multimed Tools Appl 78(11):14231\u201314250","journal-title":"Multimed Tools Appl"},{"key":"13254_CR31","first-page":"29","volume-title":"\u201cPredicting Affect from Gaze Data during Interaction with an Intelligent Tutoring System,\u201d in Intelligent Tutoring Systems","author":"N Jaques","year":"2014","unstructured":"Jaques N, Conati C, Harley JM, Azevedo R (2014) \u201cPredicting Affect from Gaze Data during Interaction with an Intelligent Tutoring System,\u201d in Intelligent Tutoring Systems. Springer, Cham, pp 29\u201338"},{"issue":"1","key":"13254_CR32","doi-asserted-by":"publisher","first-page":"83","DOI":"10.1007\/s11042-020-09451-y","volume":"80","author":"SK Jarraya","year":"2021","unstructured":"Jarraya SK, Masmoudi M, Hammami M (2021) A comparative study of autistic children emotion recognition based on Spatio-temporal and deep analysis of facial expressions features during a meltdown crisis. Multimed Tools Appl 80(1):83\u2013125","journal-title":"Multimed Tools Appl"},{"issue":"6","key":"13254_CR33","doi-asserted-by":"publisher","first-page":"9479","DOI":"10.1007\/s11042-020-10106-1","volume":"80","author":"TLB Khanh","year":"2021","unstructured":"Khanh TLB, Kim S-H, Lee G, Yang H-J, Baek E-T (2021) Korean video dataset for emotion recognition in the wild. Multimed Tools Appl 80(6):9479\u20139492","journal-title":"Multimed Tools Appl"},{"issue":"3","key":"13254_CR34","doi-asserted-by":"publisher","first-page":"263","DOI":"10.1007\/BF00993889","volume":"5","author":"PRJ Kleinginna","year":"1981","unstructured":"Kleinginna PRJ, Kleinginna AM (1981) A categorized list of emotion definitions, with suggestions for a consensual definition. 
Motiv Emot 5(3):263\u2013291","journal-title":"Motiv Emot"},{"key":"13254_CR35","first-page":"55","volume-title":"Information systems development and applications","author":"A Ko\u0142akowska","year":"2015","unstructured":"Ko\u0142akowska A, Landowska A, Szwoch M, Szwoch W, Wr\u00f3bel M (2015) Modeling emotions for affect-aware applications. In: Wrzycza S (ed) Information systems development and applications. Faculty of Management University of Gda\u0144sk, Poland, pp 55\u201369"},{"issue":"1","key":"13254_CR36","doi-asserted-by":"publisher","first-page":"18","DOI":"10.1109\/T-AFFC.2011.15","volume":"3","author":"S Koelstra","year":"2012","unstructured":"Koelstra S, Muhl C, Soleymani M, Jong-Seok Lee A, Yazdani T, Ebrahimi T, Pun A, Nijholt IP (2012) DEAP: a database for emotion analysis; using physiological signals. IEEE Trans Affect Comput 3(1):18\u201331","journal-title":"IEEE Trans Affect Comput"},{"key":"13254_CR37","unstructured":"Kossaifi, Jean, Robert Walecki, Yannis Panagakis, Jie Shen, Maximilian Schmitt, Fabien Ringeval, Jing Han et al (2019) \"SEWA DB: A Rich Database for Audio-Visual Emotion and Sentiment Research in the Wild.\" IEEE Transactions on Pattern Analysis and Machine Intelligence."},{"issue":"17","key":"13254_CR38","doi-asserted-by":"publisher","first-page":"24103","DOI":"10.1007\/s11042-019-7390-1","volume":"78","author":"A Kumar","year":"2019","unstructured":"Kumar A, Garg G (2019) Sentiment analysis of multimodal twitter data. Multimed Tools Appl 78(17):24103\u201324119","journal-title":"Multimed Tools Appl"},{"issue":"12","key":"13254_CR39","doi-asserted-by":"publisher","first-page":"1148","DOI":"10.1080\/10447318.2017.1418805","volume":"34","author":"JR Lewis","year":"2018","unstructured":"Lewis JR (2018) Measuring perceived usability: the CSUQ, SUS, and UMUX. Int J Hum Comput Interact 34(12):1148\u20131156. 
https:\/\/doi.org\/10.1080\/10447318.2017.1418805","journal-title":"Int J Hum Comput Interact"},{"issue":"2","key":"13254_CR40","doi-asserted-by":"publisher","first-page":"274","DOI":"10.3390\/app8020274","volume":"8","author":"A Landowska","year":"2018","unstructured":"Landowska A (2018) Towards new mappings between emotion representa-tion models. Appl Sci 8(2):274","journal-title":"Appl Sci"},{"key":"13254_CR41","doi-asserted-by":"publisher","first-page":"57","DOI":"10.1080\/10447319509526110","volume":"7","author":"JR Lewis","year":"1995","unstructured":"Lewis JR (1995) IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use. Int J Hum Comput Interact 7:57\u201378","journal-title":"Int J Hum Comput Interact"},{"issue":"6","key":"13254_CR42","doi-asserted-by":"publisher","first-page":"6939","DOI":"10.1007\/s11042-018-6445-z","volume":"78","author":"Z Li","year":"2019","unstructured":"Li Z, Fan Y, Jiang B, Lei T, Liu W (2019) A survey on sentiment analysis and opinion mining for social multimedia. Multimed Tools Appl 78(6):6939\u20136967","journal-title":"Multimed Tools Appl"},{"issue":"2","key":"13254_CR43","doi-asserted-by":"publisher","first-page":"277","DOI":"10.1007\/s11042-009-0344-2","volume":"49","author":"M Mansoorizadeh","year":"2010","unstructured":"Mansoorizadeh M, Moghaddam Charkari N (2010) Multimodal information fusion application to human emotion recognition from face and speech. Multimed Tools Appl 49(2):277\u2013297","journal-title":"Multimed Tools Appl"},{"key":"13254_CR44","doi-asserted-by":"publisher","first-page":"251","DOI":"10.1007\/978-3-540-72348-6_13","volume":"4451 LNAI","author":"L Maat","year":"2007","unstructured":"Maat L, Pantic M (2007) Gaze-X: Adaptive, affective, multimodal interface for single-user office scenarios. Lect Notes Comput Sci (including Subser Lect Notes Artif Intell Lect Notes Bioinformatics) 4451 LNAI:251\u2013271. 
https:\/\/doi.org\/10.1007\/978-3-540-72348-6_13","journal-title":"Lect Notes Comput Sci (including Subser Lect Notes Artif Intell Lect Notes Bioinformatics)"},{"key":"13254_CR45","first-page":"204","volume-title":"Universal methods of design: 100 ways to research complex problems, develop innovative ideas, and design effective solutions","author":"B Martin","year":"2012","unstructured":"Martin B, Hanington B (2012) Universal methods of design: 100 ways to research complex problems, develop innovative ideas, and design effective solutions. Rockport Publishers, Beberly (Massachusetts), pp 204\u2013205"},{"key":"13254_CR46","volume-title":"An approach to environmental psychology","author":"A Mehrabian","year":"1974","unstructured":"Mehrabian A, Russell JA (1974) An approach to environmental psychology. The MIT press"},{"key":"13254_CR47","doi-asserted-by":"publisher","unstructured":"Mittal T, Guhan P, Bhattacharya U, Chandra R, Bera A, Manocha D (2020, 2020) EmotiCon: Context-Aware Multimodal Emotion Recognition Using Frege\u2019s Principle. In: IEEE\/CVF conference on computer vision and pattern recognition (CVPR), Seattle, WA, USA, pp 14222\u201314231. https:\/\/doi.org\/10.1109\/CVPR42600.2020.01424","DOI":"10.1109\/CVPR42600.2020.01424"},{"key":"13254_CR48","first-page":"24","volume-title":"A mathematical model of the finding of usability problems. Proceedings of the Interact\u201993 and CHI\u201993 Conference on Human Factors in Computing systems","author":"J Nielsen","year":"1993","unstructured":"Nielsen J, Landauer T (1993) A mathematical model of the finding of usability problems. Proceedings of the Interact\u201993 and CHI\u201993 Conference on Human Factors in Computing systems; 1993 Apr. ACM, Amsterdam, the Netherlands. New York, pp 24\u201329"},{"key":"13254_CR49","unstructured":"Osman H. 
Al and Falk TH (2017) \u201cMultimodal Affect Recognition: Current Approaches and Challenges,\u201d in Emotion and Attention Recognition Based on Biological Signals and Images, InTech."},{"key":"13254_CR50","doi-asserted-by":"crossref","unstructured":"Oviatt S, DeAngeli A, and Kuhn K (1997) \u201cIntegration and synchronization of input modes during multimodal human-computer interaction,\u201d in Proceedings of the SIGCHI conference on Human factors in computing systems - CHI \u201897, pp. 415\u2013422.","DOI":"10.3115\/1621585.1621587"},{"key":"13254_CR51","doi-asserted-by":"crossref","unstructured":"Oehl M, Siebert FW, Tews T-K, H\u00f6ger R, Pfister H-R (2011) Improving human-machine interaction - A non-invasive approach to detect emotions in car drivers. Lect Notes Comput Sci (including Subser Lect Notes Artif Intell Lect Notes Bioinformatics) 6763 LNCS, no. PART 3:577\u2013585","DOI":"10.1007\/978-3-642-21616-9_65"},{"key":"13254_CR52","doi-asserted-by":"crossref","unstructured":"Pantic M, Sebe N, Cohn JF, Huang T (2005) Affective multimodal human-computer interaction. Proc 13th ACM Int Conf Multimedia, MM 2005 , no. January:669\u2013676","DOI":"10.1145\/1101149.1101299"},{"key":"13254_CR53","doi-asserted-by":"publisher","unstructured":"Patwardhan AS (2018) \u201cMultimodal mixed emotion detection,\u201d in Proceedings of the 2nd International Conference on Communication and Electronics Systems, ICCES 2017, 2018, vol., pp. 139\u2013143, https:\/\/doi.org\/10.1109\/CESYS.2017.8321250.","DOI":"10.1109\/CESYS.2017.8321250"},{"key":"13254_CR54","first-page":"1","volume":"321","author":"RW Picard","year":"1995","unstructured":"Picard RW (1995) Affective Computing. MIT Press 321:1\u201316","journal-title":"MIT Press"},{"key":"13254_CR55","doi-asserted-by":"crossref","unstructured":"Poria S, Cambria E, Bajpai R, and Hussain A (2017) \u201cA review of affective computing: from unimodal analysis to multimodal fusion,\u201d Inf. 
Fusion.","DOI":"10.1016\/j.inffus.2017.02.003"},{"key":"13254_CR56","unstructured":"Pyeon Myeongjang (2018) \u201cIEMo: web-based interactive multimodal emotion recognition framework\u201d, April 30, 2018. Accessed on: April 28, 2020. Available at: https:\/\/github.com\/mjpyeon\/IEMo"},{"key":"13254_CR57","doi-asserted-by":"publisher","first-page":"22","DOI":"10.1016\/j.patrec.2014.11.007","volume":"66","author":"F Ringeval","year":"2015","unstructured":"Ringeval F, Eyben F, Kroupi E, Yuce A, Thiran JP, Ebrahimi T, Lalanne D, Schuller B (2015) Prediction of asynchronous dimensional emotion ratings from audiovisual and physiological data. Pattern Recogn Lett 66:22\u201330","journal-title":"Pattern Recogn Lett"},{"issue":"9\u201310","key":"13254_CR58","doi-asserted-by":"publisher","first-page":"6279","DOI":"10.1007\/s11042-019-08291-9","volume":"79","author":"D Rousidis","year":"2020","unstructured":"Rousidis D, Koukaras P, Tjortjis C (2020) Social media prediction: a literature review. Multimed Tools Appl 79(9\u201310):6279\u20136311","journal-title":"Multimed Tools Appl"},{"key":"13254_CR59","doi-asserted-by":"crossref","unstructured":"Sekhavat YA, Sisi MJ, and Roohi S (2020) \u201cAffective interaction: using emotions as a user interface in games\u201d, Multimedia Tools and Applications.","DOI":"10.1007\/s11042-020-10006-4"},{"key":"13254_CR60","unstructured":"Sethu V, Provost EM, Epps J, Busso C, Cummins N, and Narayanan S (2019) \u201cThe ambiguous world of emotion representation,\u201d."},{"key":"13254_CR61","unstructured":"Silva L. C. De, Miyasato T, and Nakatsu R (1997) \u201cFacial emotion recognition using multi-modal information,\u201d Proc. ICICS, 1997 Int. Conf. Information, Commun. Signal Process. Theme Trends Inf. Syst. Eng. Wirel. Multimed. Commun. (Cat. No.97TH8237), vol. 1, no. May 2014, pp. 
397\u2013401."},{"key":"13254_CR62","doi-asserted-by":"crossref","unstructured":"L. C. De Silva, Pei Chi Ng (2000) \u201cBimodal emotion recognition,\u201d in Proceedings Fourth IEEE International Conference on Automatic Face and Gesture Recognition (Cat. No. PR00580), pp. 332\u2013335.","DOI":"10.1109\/AFGR.2000.840655"},{"key":"13254_CR63","doi-asserted-by":"publisher","first-page":"176274","DOI":"10.1109\/ACCESS.2020.3026823","volume":"8","author":"S Siriwardhana","year":"2020","unstructured":"Siriwardhana S, Kaluarachchi T, Billinghurst M, Nanayakkara S (2020) Multimodal emotion recognition with transformer-based self supervised feature fusion. IEEE Access 8:176274\u2013176285. https:\/\/doi.org\/10.1109\/ACCESS.2020.3026823","journal-title":"IEEE Access"},{"key":"13254_CR64","unstructured":"W3C, emotion markup language, (May 22, 2014). Accessed on: February 17th, 2020. Available: https:\/\/www.w3.org\/TR\/emotionml\/"},{"key":"13254_CR65","unstructured":"W3C, multimodal interaction framework, multimodal interaction working group, (May 06, 2003). Accessed on: February 17th, 2020. Available: https:\/\/www.w3.org\/TR\/mmi-framework\/"},{"issue":"47\u201348","key":"13254_CR66","doi-asserted-by":"publisher","first-page":"35553","DOI":"10.1007\/s11042-019-08328-z","volume":"79","author":"Z Wang","year":"2020","unstructured":"Wang Z, Ho S-B, Cambria E (2020) A review of emotion sensing: categorization models and algorithms. Multimed Tools Appl 79(47\u201348):35553\u201335582","journal-title":"Multimed Tools Appl"},{"key":"13254_CR67","doi-asserted-by":"crossref","unstructured":"Wijayarathna C, Arachchilage NAG, Slay J (2017) \u201cUsing Cognitive Dimensions Questionnaire to Evaluate the Usability of Security APIs,\u201d no. 2004.","DOI":"10.1007\/978-3-319-58460-7_11"},{"key":"13254_CR68","unstructured":"Woolf B, Burleson W, Arroyo I (2007) \u201cEmotional Intelligence for Computer Tutors,\u201d Suppl. Proc. 13th Int. Conf. Artif. Intell. Educ. 
(AIED 2007), pp. 6\u201315."},{"key":"13254_CR69","doi-asserted-by":"crossref","unstructured":"Yamauchi T (2013) \u201cMouse Trajectories and State Anxiety: Feature Selection with Random Forest,\u201d in 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction, pp. 399\u2013404.","DOI":"10.1109\/ACII.2013.72"},{"key":"13254_CR70","doi-asserted-by":"publisher","unstructured":"Zhao S et al. (2020) \"Discrete Probability Distribution Prediction of Image Emotions with Shared Sparse Learning,\" in IEEE Transactions on Affective Computing, vol. 11, no. 4, pp. 574\u2013587, 1 Oct.-Dec, https:\/\/doi.org\/10.1109\/TAFFC.2018.2818685.","DOI":"10.1109\/TAFFC.2018.2818685"},{"key":"13254_CR71","doi-asserted-by":"publisher","first-page":"18","DOI":"10.1145\/3233184","volume":"15","author":"S Zhao","year":"2019","unstructured":"Zhao S, Gholaminejad A, Ding G, Gao Y, Han J, Keutzer K (2019) Personalized emotion recognition by personality-aware high-order learning of physiological signals. ACM Trans Multimed Comput Commun Appl 15, 1s, article 14, (February 2019):18. https:\/\/doi.org\/10.1145\/3233184","journal-title":"ACM Trans Multimed Comput Commun Appl"},{"key":"13254_CR72","doi-asserted-by":"crossref","unstructured":"Zhao S, Ding G, Gao Y, Han J (2017) \u201cApproximating discrete probability distribution of image emotions by multi-modal features fusion,\u201d in Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, IJCAI-17, pp. 4669\u20134675","DOI":"10.24963\/ijcai.2017\/651"},{"key":"13254_CR73","doi-asserted-by":"publisher","DOI":"10.1109\/CVPR.2016.374","volume-title":"Multimodal spontaneous emotion Corpus for human behavior analysis","author":"Z Zhang","year":"2016","unstructured":"Z. Zhang, J. M. Girard, Y. Wu, X. Zhang, P. Liu, U. Ciftci, S. Canavan, M. Reale, A. Horowitz, H. Yang, J. F. Cohn, Q. Ji, and L. 
Yin, \u201cMultimodal spontaneous emotion Corpus for human behavior analysis\u201d, 2016."},{"issue":"1","key":"13254_CR74","doi-asserted-by":"publisher","first-page":"109","DOI":"10.1007\/978-3-642-12604-8_6","volume":"2010","author":"S Zhang","year":"2010","unstructured":"Zhang S, Wu Z, Meng HM, Cai L (2010) Facial expression synthesis based on emotion dimensions for affective talking avatar. Smart Innov Syst Technol 2010(1):109\u2013132. https:\/\/doi.org\/10.1007\/978-3-642-12604-8_6","journal-title":"Smart Innov Syst Technol"},{"key":"13254_CR75","doi-asserted-by":"publisher","unstructured":"W. L. Zheng, W. Liu, Y. Lu, B. L. Lu, and A. Cichocki, \u201cEmotionMeter: a multimodal framework for recognizing human emotions,\u201d IEEE Trans Cybern, vol. 49, no. 3, pp. 1110\u20131122, Mar. 2019, https:\/\/doi.org\/10.1109\/TCYB.2018.2797176.","DOI":"10.1109\/TCYB.2018.2797176"}],"container-title":["Multimedia Tools and Applications"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11042-022-13254-8.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s11042-022-13254-8\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s11042-022-13254-8.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,11,22]],"date-time":"2023-11-22T18:36:59Z","timestamp":1700678219000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s11042-022-13254-8"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,6,6]]},"references-count":75,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2023,1]]}},"alternative-id":["13254"],"URL":"https:\/\/doi.org\/10.1007\/s11042-022-13254-8","relation":
{},"ISSN":["1380-7501","1573-7721"],"issn-type":[{"value":"1380-7501","type":"print"},{"value":"1573-7721","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,6,6]]},"assertion":[{"value":"11 December 2020","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"22 February 2022","order":2,"name":"revised","label":"Revised","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"15 May 2022","order":3,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"6 June 2022","order":4,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"This work has been partially supported by the national project granted by the Ministry of Science, Innovation and Universities (Spain) with reference RTI2018\u2013099942-B-I00 and by the regional project (ref: SBPLY\/17\/180501\/000495) granted by the regional government (JCCM) and the European Regional Development Funds (FEDER).","order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"On behalf of all authors, the corresponding author states that there is no conflict of interest.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflicts of interest\/Competing interests"}}]}}