{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,5]],"date-time":"2026-03-05T23:10:27Z","timestamp":1772752227910,"version":"3.50.1"},"reference-count":44,"publisher":"MDPI AG","issue":"11","license":[{"start":{"date-parts":[[2024,11,10]],"date-time":"2024-11-10T00:00:00Z","timestamp":1731196800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Future Internet"],"abstract":"<jats:p>The Future Internet aims to revolutionize digital interaction by integrating advanced technologies like AI and IoT, enabling a dynamic and resilient network. It envisions emotionally intelligent systems that can interpret and respond to human feelings, creating immersive, empathy-driven learning experiences. This evolution aspires to form a responsive digital ecosystem that seamlessly connects technology and human emotion. This paper presents a computational model aimed at enhancing the emotional aspect of learning experiences within museum environments. The model is designed to represent and manage affective and emotional feedback, with a focus on how emotions can significantly impact the learning process in a museum context. The proposed model seeks to identify and quantify emotions during a visitor\u2019s engagement with museum exhibits. To achieve this goal, we primarily explored the following: (i) methods and techniques for assessing and recognizing emotional responses in museum visitors, (ii) feedback management strategies based on the detection of visitors\u2019 emotional states. Then, the methodology was tested on 1000 cases via specific questionnaire forms, along with the presentation of images and short videos, and the results of data analysis are reported. 
The findings contribute toward establishing a comprehensive methodology for the identification and quantification of the emotional state of museum visitors.<\/jats:p>","DOI":"10.3390\/fi16110417","type":"journal-article","created":{"date-parts":[[2024,11,11]],"date-time":"2024-11-11T11:34:11Z","timestamp":1731324851000},"page":"417","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":11,"title":["Approaches to Identifying Emotions and Affections During the Museum Learning Experience in the Context of the Future Internet"],"prefix":"10.3390","volume":"16","author":[{"ORCID":"https:\/\/orcid.org\/0009-0001-9008-7757","authenticated-orcid":false,"given":"Iana","family":"Fominska","sequence":"first","affiliation":[{"name":"Department of Education, Cultural Heritage and Tourism Sciences, University of Macerata, Via Giovanni Mario Crescimbeni, 30, 62100 Macerata, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1796-6744","authenticated-orcid":false,"given":"Stefano","family":"Di Tore","sequence":"additional","affiliation":[{"name":"Department of Human, Philosophical and Educational Sciences, University of Salerno, Via Giovanni Paolo II, 132, 84084 Fisciano, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2517-2867","authenticated-orcid":false,"given":"Michele","family":"Nappi","sequence":"additional","affiliation":[{"name":"Department of Computer Science, University of Salerno, Via Giovanni Paolo II, 132, 84084 Fisciano, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3119-4608","authenticated-orcid":false,"given":"Gerardo","family":"Iovane","sequence":"additional","affiliation":[{"name":"Department of Computer Science, University of Salerno, Via Giovanni Paolo II, 132, 84084 Fisciano, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1322-7776","authenticated-orcid":false,"given":"Maurizio","family":"Sibilio","sequence":"additional","affiliation":[{"name":"Department of Human, Philosophical and 
Educational Sciences, University of Salerno, Via Giovanni Paolo II, 132, 84084 Fisciano, Italy"}]},{"given":"Angela","family":"Gelo","sequence":"additional","affiliation":[{"name":"Department of Human, Philosophical and Educational Sciences, University of Salerno, Via Giovanni Paolo II, 132, 84084 Fisciano, Italy"}]}],"member":"1968","published-online":{"date-parts":[[2024,11,10]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Deng, Y., Zhang, X., Zhang, B., Zhang, B., and Qin, J. (2023). From digital museuming to on-site visiting: The mediation of cultural identity and perceived value. Front. Psychol., 14.","DOI":"10.3389\/fpsyg.2023.1111917"},{"key":"ref_2","unstructured":"Petrakova, A. (2024, September 28). According to a Survey by The Art Newspaper, the Social Network with Pictures Has Become the Most Popular among Museums. Word for Social Media with Video. The Art Newspaper Russia, 30 April 2021. Available online: https:\/\/www.theartnewspaper.ru\/posts\/9036\/."},{"key":"ref_3","first-page":"391","article-title":"Exhibiting Emotion: Capturing Visitors\u2019 Emotional Responses to Museum Artefacts","volume":"Volume 8014","author":"Marcus","year":"2013","journal-title":"Design, User Experience, and Usability. User Experience in Novel Technological Environments"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Benford, S., L\u00f8vlie, A.S., Ryding, K., Rajkowska, P., Bodiaj, E., Darzentas, D.P., Cameron, H., Spence, J., Egede, J., and Spanjevic, B. (May, January 29). Sensitive Pictures: Emotional Interpretation in the Museum. Proceedings of the CHI \u201922: Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, New Orleans, LA, USA.","DOI":"10.1145\/3491102.3502080"},{"key":"ref_5","unstructured":"Antona, M., and Stephanidis, C. (2021, January 24\u201329). Affective Guide for Museum: A System to Suggest Museum Paths Based on Visitors\u2019 Emotions. 
Proceedings of the HCII 2021, Washington, DC, USA."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"6286","DOI":"10.1109\/ACCESS.2020.3047831","article-title":"Emotion Recognition by Textual Tweets Classification Using Voting Classifier (LR-SGD)","volume":"9","author":"Yousaf","year":"2020","journal-title":"IEEE Access"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"116","DOI":"10.1016\/j.cviu.2006.10.019","article-title":"Multimodal Human-Computer Interaction: A Survey","volume":"108","author":"Jaimes","year":"2007","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"26","DOI":"10.1016\/j.neuroimage.2005.03.018","article-title":"The neural bases of amusement and sadness: A comparison of block contrast and subject-specific emotion intensity regression approaches","volume":"27","author":"Goldin","year":"2005","journal-title":"NeuroImage"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"70","DOI":"10.1016\/j.cogsys.2008.03.005","article-title":"EMA: A Model of Emotional Dynamics","volume":"10","author":"Marsella","year":"2009","journal-title":"J. Cogn. Syst. Res."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"23","DOI":"10.1007\/s10458-005-1081-1","article-title":"Evaluating a Computational Model of Emotion","volume":"11","author":"Gratch","year":"2005","journal-title":"Auton. Agents Multi-Agent Syst."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Davidson, R.J., Scherer, K.R., and Goldsmith, H.H. (2003). Handbook of Affective Sciences, Oxford University Press.","DOI":"10.1093\/oso\/9780195126013.001.0001"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"19","DOI":"10.1016\/j.inffus.2022.03.009","article-title":"A systematic review on affective computing: Emotion models, databases, and recent advances","volume":"83\u201384","author":"Wang","year":"2022","journal-title":"Inf. 
Fusion"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Ting, Z., Zipeng, Q., Weiwei, G., Cheng, Z., and Dingli, J. (2023). Research on the measurement and characteristics of museum visitors\u2019 emotions under digital technology environment. Front. Hum. Neurosci., 17.","DOI":"10.3389\/fnhum.2023.1251241"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"532","DOI":"10.1109\/TAFFC.2018.2817622","article-title":"EEG emotion recognition using dynamical graph convolutional neural networks","volume":"11","author":"Song","year":"2020","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"18","DOI":"10.1109\/T-AFFC.2011.15","article-title":"DEAP: A Database for Emotion Analysis Using Physiological Signals","volume":"3","author":"Koelstra","year":"2011","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"692","DOI":"10.1109\/TAFFC.2018.2887385","article-title":"A mathematical description of emotional processes and its potential applications to affective computing","volume":"12","author":"Puviani","year":"2018","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"26","DOI":"10.1109\/MPRV.2017.33","article-title":"Noninvasive Bluetooth monitoring of visitors\u2019 length of stay at the Louvre","volume":"16","author":"Yoshimura","year":"2017","journal-title":"IEEE Perv. Comput."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"169","DOI":"10.1080\/02699939208411068","article-title":"An Argument for Basic Emotions","volume":"6","author":"Ekman","year":"1992","journal-title":"Cogn. Emot."},{"key":"ref_19","unstructured":"Knapp, M.L., and Hall, J.A. (2009). Nonverbal Communication in Human Interaction, Wadsworth Publishing."},{"key":"ref_20","unstructured":"Burgoon, J.K., Buller, D.B., and Woodall, W.G. (1996). 
Nonverbal Communication: The Unspoken Dialogue, McGraw-Hill."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"770","DOI":"10.1037\/0033-2909.129.5.770","article-title":"Communication of Emotion in Vocal Expression and Music Performance: Different Channels, Same Code?","volume":"129","author":"Juslin","year":"2003","journal-title":"Psychol. Bull."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"76","DOI":"10.1177\/0022022101032001009","article-title":"Emotion Inferences from Vocal Expression Correlate Across Languages and Cultures","volume":"32","author":"Scherer","year":"2001","journal-title":"J. Cross-Cult. Psychol."},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"614","DOI":"10.1037\/0022-3514.70.3.614","article-title":"Acoustic Profiles in Vocal Emotion Expression","volume":"70","author":"Banse","year":"1996","journal-title":"J. Pers. Soc. Psychol."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Picard, R.W. (2000). Affective Computing, MIT Press.","DOI":"10.1007\/978-3-540-45012-2_2"},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"10","DOI":"10.1111\/1467-8721.00003","article-title":"The Structure of Current Affect: Controversies and Emerging Consensus","volume":"8","author":"Barrett","year":"1999","journal-title":"Curr. Dir. Psychol. Sci."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"715","DOI":"10.1017\/S0954579405050340","article-title":"The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development, and psychopathology","volume":"17","author":"Posner","year":"2005","journal-title":"Dev. Psychopathol."},{"key":"ref_27","unstructured":"Samsonovich, A.V., and Ascoli, G.A. (2012). Toward a Formal Theory of Meaning, The MIT Press."},{"key":"ref_28","unstructured":"Pennebaker, J.W., Booth, R.J., and Francis, M.E. (2024, September 28). Linguistic Inquiry and Word Count: LIWC [Computer Software]. 2007. 
Available online: http:\/\/www.liwc.net."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Schwartz, H.A., Eichstaedt, J.C., Kern, M.L., Dziurzynski, L., Ramones, S.M., Agrawal, M., Shah, A., Kosinski, M., Stillwell, D., and Seligman, M.E.P. (2013). Personality, Gender, and Age in the Language of Social Media: The Open-Vocabulary Approach. PLoS ONE, 8.","DOI":"10.1371\/journal.pone.0073791"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"24","DOI":"10.1177\/0261927X09351676","article-title":"The psychological meaning of words: LIWC and computerized text analysis methods","volume":"29","author":"Tausczik","year":"2009","journal-title":"J. Lang. Soc. Psychol."},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Dodds, P.S., Harris, K.D., Kloumann, I.M., Bliss, C.A., and Danforth, C.M. (2011). Temporal Patterns of Happiness and Information in a Global Social Network: Hedonometrics and Twitter. PLOS ONE, 6.","DOI":"10.1371\/journal.pone.0026752"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1561\/1500000011","article-title":"Opinion Mining and Sentiment Analysis","volume":"2","author":"Pang","year":"2008","journal-title":"Found. Trends Inf. Retr."},{"key":"ref_33","first-page":"1","article-title":"Sentiment analysis and opinion mining","volume":"5","author":"Liu","year":"2012","journal-title":"Synth. Lect. Hum. Lang. Technol."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"2544","DOI":"10.1002\/asi.21416","article-title":"Sentiment strength detection in short informal text","volume":"61","author":"Thelwall","year":"2010","journal-title":"J. Am. Soc. Inf. Sci. Technol."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"1296","DOI":"10.1037\/0022-3514.77.6.1296","article-title":"Linguistic styles: Language use as an individual difference","volume":"77","author":"Pennebaker","year":"1999","journal-title":"J. Pers. Soc. 
Psychol."},{"key":"ref_36","first-page":"148","article-title":"A Review of Affect Analysis in Text","volume":"8","author":"Poria","year":"2017","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_37","unstructured":"Coan, J.A., and Allen, J.J.B. (2017). Understanding Mixed Emotions: Paradigms and Measures. The Handbook of Emotion Elicitation and Assessment, Oxford University Press."},{"key":"ref_38","first-page":"105","article-title":"A New Paradigm for Intelligent Tutoring Systems: Example-Tracing Tutors","volume":"19","author":"Aleven","year":"2009","journal-title":"Int. J. Artif. Intell. Educ."},{"key":"ref_39","unstructured":"Calvo, R.A., D\u2019Mello, S.K., Gratch, J., and Kappas, A. (2012). Dynamics of Affect. Handbook of Affective Computing, Oxford University Press."},{"key":"ref_40","unstructured":"Blikstein, P. (March, January 27). Using Learning Analytics to Assess Students\u2019 Behavior in Open-Ended Programming Tasks. Proceedings of the LAK \u201911: Proceedings of the 1st International Conference on Learning Analytics and Knowledge, Banff, AB, Canada."},{"key":"ref_41","unstructured":"D\u2019Mello, S., Graesser, A., and Schuller, B. (2017). The Oxford Handbook of Affective Computing, Oxford University Press."},{"key":"ref_42","unstructured":"Novak, J.D., and Ca\u00f1as, A.J. (2006). The Theory Underlying Concept Maps and How to Construct and Use Them, Florida Institute for Human and Machine Cognition."},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"167","DOI":"10.1207\/s15327809jls0402_2","article-title":"Cognitive Tutors: Lessons Learned","volume":"4","author":"Anderson","year":"1995","journal-title":"J. Learn. Sci."},{"key":"ref_44","unstructured":"Arroyo, I., Muldner, K., Burleson, W., Woolf, B., and Cooper, D. (2009, January 6\u201310). Designing Affective Support to Foster Learning, Motivation, and Attribution. 
Proceedings of the 14th International Conference on Artificial Intelligence in Education, AIED Workshop 2009, Brighton, UK."}],"container-title":["Future Internet"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1999-5903\/16\/11\/417\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T16:29:45Z","timestamp":1760113785000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1999-5903\/16\/11\/417"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,11,10]]},"references-count":44,"journal-issue":{"issue":"11","published-online":{"date-parts":[[2024,11]]}},"alternative-id":["fi16110417"],"URL":"https:\/\/doi.org\/10.3390\/fi16110417","relation":{},"ISSN":["1999-5903"],"issn-type":[{"value":"1999-5903","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,11,10]]}}}