{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,12,22]],"date-time":"2025-12-22T23:06:53Z","timestamp":1766444813691,"version":"3.48.0"},"reference-count":70,"publisher":"Springer Science and Business Media LLC","issue":"12","license":[{"start":{"date-parts":[[2025,11,10]],"date-time":"2025-11-10T00:00:00Z","timestamp":1762732800000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2025,11,10]],"date-time":"2025-11-10T00:00:00Z","timestamp":1762732800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100004434","name":"Universit\u00e0 degli Studi di Firenze","doi-asserted-by":"crossref","id":[{"id":"10.13039\/501100004434","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Mach Learn"],"published-print":{"date-parts":[[2025,12]]},"abstract":"<jats:title>Abstract<\/jats:title>\n                  <jats:p>AI-based emotion recognition approaches may benefit from the integration of multimodal data, but their explainability and validation is still a critical challenge. Indeed, the limited neurophysiological understanding of novel multimodal features, e.g. brain-heart interaction, can be insufficient to assess whether the AI-extracted physiological insights (i.e., the model explanations) accurately reflect the real underlying physiological processes. To validate the explanations obtained by an AI-based model in this context, we introduce a novel framework that autonomously identifies the optimal explanations for a black-box model used in emotion recognition. Our approach leverages a convolutional neural network to process BHI features, which are derived from EEG and HRV data and rearranged as images. 
A model-agnostic methodology is employed to extract local explanations, which are then dynamically evaluated to select the most accurate for representing specific emotional states. The effectiveness of the proposed framework is evaluated across multiple classification tasks, including up to 9-level arousal and valence emotion classification, as well as nine discrete emotions classification, using the MAHNOB-HCI and DEAP datasets. The system achieved remarkable accuracy levels, consistently reaching approximately 97\u201398% across all tasks. Furthermore, our dynamic selection framework revealed that Integrated Gradients outperformed other state-of-the-art explainable AI approaches in reliably capturing global explanations.<\/jats:p>","DOI":"10.1007\/s10994-025-06921-y","type":"journal-article","created":{"date-parts":[[2025,11,10]],"date-time":"2025-11-10T23:00:03Z","timestamp":1762815603000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":0,"title":["Model-driven validation of visual explanations for multimodal emotion recognition"],"prefix":"10.1007","volume":"114","author":[{"given":"Guido","family":"Gagliardi","sequence":"first","affiliation":[]},{"given":"Antonio Luca","family":"Alfeo","sequence":"additional","affiliation":[]},{"given":"Vincenzo","family":"Catrambone","sequence":"additional","affiliation":[]},{"given":"Mario G. C. A.","family":"Cimino","sequence":"additional","affiliation":[]},{"given":"Maarten","family":"De Vos","sequence":"additional","affiliation":[]},{"given":"Gaetano","family":"Valenza","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2025,11,10]]},"reference":[{"key":"6921_CR1","doi-asserted-by":"publisher","DOI":"10.1016\/j.iswa.2022.200171","volume":"17","author":"N Ahmed","year":"2023","unstructured":"Ahmed, N., Aghbari, Z. A., & Girija, S. (2023). 
A systematic survey on multimodal emotion recognition using learning algorithms. Intelligent Systems with Applications, 17, Article 200171.","journal-title":"Intelligent Systems with Applications"},{"key":"6921_CR2","doi-asserted-by":"crossref","unstructured":"Alfeo, A. L., Cimino, M. G. C. A. & Gagliardi, G. (2022). Concept-wise granular computing for explainable artificial intelligence. Granular Computing","DOI":"10.1007\/s41066-022-00357-8"},{"key":"6921_CR3","doi-asserted-by":"publisher","first-page":"4835","DOI":"10.1007\/s11042-016-3796-1","volume":"76","author":"S Barra","year":"2017","unstructured":"Barra, S., Casanova, A., Fraschini, M., & Nappi, M. (2017). Fusion of physiological measures for multimodal biometric systems. Multimedia Tools and Applications, 76, 4835\u20134847.","journal-title":"Multimedia Tools and Applications"},{"issue":"10","key":"6921_CR4","doi-asserted-by":"publisher","first-page":"988","DOI":"10.1016\/S0025-6196(12)62272-1","volume":"68","author":"EE Benarroch","year":"2008","unstructured":"Benarroch, E. E. (2008). The central autonomic network: Functional organization, dysfunction, and perspective. Mayo Clinic Proceedings, 68(10), 988\u20131001.","journal-title":"Mayo Clinic Proceedings"},{"key":"6921_CR5","first-page":"55","volume":"17","author":"RA Calvo","year":"2014","unstructured":"Calvo, R. A., & D\u2019Mello, S. K. (2014). Affective computing and the impact of gender and age. Pervasive and Mobile Computing, 17, 55\u201363.","journal-title":"Pervasive and Mobile Computing"},{"key":"6921_CR6","doi-asserted-by":"crossref","unstructured":"Candia-Rivera, D. et\u00a0al. (2022) Cardiac sympathetic-vagal activity initiates a functional brain-body response to emotional arousal. 
Proceedings of the National Academy of Sciences","DOI":"10.1101\/2021.06.05.447188"},{"key":"6921_CR7","doi-asserted-by":"publisher","DOI":"10.1016\/j.jneumeth.2021.109269","volume":"360","author":"D Candia-Rivera","year":"2021","unstructured":"Candia-Rivera, D., Catrambone, V., & Valenza, G. (2021). The role of electroencephalography electrical reference in the assessment of functional brain-heart interplay: From methodology to user guidelines. Journal of Neuroscience Methods, 360, Article 109269.","journal-title":"Journal of Neuroscience Methods"},{"key":"6921_CR8","doi-asserted-by":"crossref","unstructured":"Catrambone, V. & Valenza, G. (2021) Functional Brain-Heart Interplay: From Physiology to Advanced Methodology of Signal Processing and Modeling. Springer Nature.","DOI":"10.1007\/978-3-030-79934-2"},{"key":"6921_CR9","doi-asserted-by":"crossref","unstructured":"Catrambone, V. & Valenza, G. (2023). Nervous\u2013system\u2013wise functional estimation of directed brain\u2013heart interplay through microstate occurrences. IEEE Transactions on Biomedical Engineering","DOI":"10.1109\/TBME.2023.3240593"},{"key":"6921_CR10","unstructured":"Catrambone, V. (2019). https:\/\/it.mathworks.com\/matlabcentral\/fileexchange\/72704-brain-heart-interaction-indexes,"},{"issue":"6","key":"6921_CR11","doi-asserted-by":"publisher","first-page":"1479","DOI":"10.1007\/s10439-019-02251-y","volume":"47","author":"V Catrambone","year":"2019","unstructured":"Catrambone, V., et al. (2019). Time-resolved directional brain-heart interplay measurement through synthetic data generation models. Annals of Biomedical Engineering, 47(6), 1479\u20131489.","journal-title":"Annals of Biomedical Engineering"},{"issue":"1","key":"6921_CR12","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1038\/s41398-021-01336-4","volume":"11","author":"V Catrambone","year":"2021","unstructured":"Catrambone, V., et al. (2021).
Intensification of functional neural control on heartbeat dynamics in subclinical depression. Translational Psychiatry, 11(1), 1\u201310.","journal-title":"Translational Psychiatry"},{"issue":"11","key":"6921_CR13","doi-asserted-by":"publisher","first-page":"3366","DOI":"10.1109\/TBME.2021.3071348","volume":"68","author":"V Catrambone","year":"2021","unstructured":"Catrambone, V., Talebi, A., Barbieri, R., & Valenza, G. (2021). Time-resolved brain-to-heart probabilistic information transfer estimation using inhomogeneous point-process models. IEEE Transactions on Biomedical Engineering, 68(11), 3366\u20133374.","journal-title":"IEEE Transactions on Biomedical Engineering"},{"key":"6921_CR14","doi-asserted-by":"publisher","first-page":"203814","DOI":"10.1109\/ACCESS.2020.3036877","volume":"8","author":"DY Choi","year":"2020","unstructured":"Choi, D. Y., Kim, D.-H., & Song, B. C. (2020). Multimodal attention network for continuous-time emotion recognition using video and EEG signals. IEEE Access, 8, 203814\u2013203826.","journal-title":"IEEE Access"},{"issue":"10","key":"6921_CR15","doi-asserted-by":"publisher","first-page":"2828","DOI":"10.1109\/TBME.2012.2211356","volume":"59","author":"L Citi","year":"2012","unstructured":"Citi, L., Brown, E. N., & Barbieri, R. (2012). A real-time automated point-process method for the detection and correction of erroneous and ectopic heartbeats. IEEE Transactions on Biomedical Engineering, 59(10), 2828\u20132837.","journal-title":"IEEE Transactions on Biomedical Engineering"},{"issue":"3","key":"6921_CR16","doi-asserted-by":"publisher","first-page":"1364","DOI":"10.1007\/s12559-023-10171-2","volume":"16","author":"T Dhara","year":"2024","unstructured":"Dhara, T., Singh, P. K., & Mahmud, M. (2024). A fuzzy ensemble-based deep learning model for EEG-based emotion recognition. 
Cognitive Computation, 16(3), 1364\u20131378.","journal-title":"Cognitive Computation"},{"issue":"3","key":"6921_CR17","doi-asserted-by":"publisher","first-page":"2238","DOI":"10.1109\/TAFFC.2022.3169001","volume":"14","author":"Y Ding","year":"2022","unstructured":"Ding, Y., Neethu Robinson, S., Zhang, Q. Z., & Guan, C. (2022). Tsception: Capturing temporal dynamics and spatial asymmetry from EEG for emotion recognition. IEEE Transactions on Affective Computing, 14(3), 2238\u20132250.","journal-title":"IEEE Transactions on Affective Computing"},{"key":"6921_CR18","doi-asserted-by":"publisher","first-page":"133180","DOI":"10.1109\/ACCESS.2020.3010311","volume":"8","author":"W Di","year":"2020","unstructured":"Di, W., Zhang, J., & Zhao, Q. (2020). Multimodal fused emotion recognition about expression-EEG interaction and collaboration using deep learning. IEEE Access, 8, 133180\u2013133189.","journal-title":"IEEE Access"},{"issue":"7","key":"6921_CR19","doi-asserted-by":"publisher","first-page":"620","DOI":"10.1038\/s42256-021-00343-w","volume":"3","author":"G Erion","year":"2021","unstructured":"Erion, G., Janizek, J. D., Sturmfels, P., Lundberg, S. M., & Lee, S.-I. (2021). Improving performance of deep learning models with axiomatic attribution priors and expected gradients. Nature Machine Intelligence, 3(7), 620\u2013631.","journal-title":"Nature Machine Intelligence"},{"issue":"3","key":"6921_CR20","doi-asserted-by":"publisher","DOI":"10.1103\/PhysRevE.91.032904","volume":"91","author":"L Faes","year":"2015","unstructured":"Faes, L., Kugiumtzis, D., Nollo, G., Jurysta, F., & Marinazzo, D. (2015). Estimating the decomposition of predictive information in multivariate systems. Physical Review E, 91(3), Article 032904.","journal-title":"Physical Review E"},{"key":"6921_CR21","doi-asserted-by":"crossref","unstructured":"Gagliardi, G., Alfeo, A. L., Catrambone, V., Candia-Rivera, D., Cimino, M. G. C. A. & Valenza, G. 
(2023) Improving emotion recognition systems by exploiting the spatial information of EEG sensors. IEEE Access (pp. 1\u20131).","DOI":"10.1109\/ACCESS.2023.3268233"},{"key":"6921_CR22","doi-asserted-by":"crossref","unstructured":"Gagliardi, G., Alfeo, A. L., Catrambone, V., Candia-Rivera, D., Cimino, M. G. C. A., Valenza, G., & De Vos, M. (2023). Fine-grained emotion recognition using brain-heart interplay measurements and explainable convolutional neural networks. In Proceedings of the 11th international IEEE EMBS conference on neural engineering (pp. 1\u20131).","DOI":"10.1109\/NER52421.2023.10123758"},{"key":"6921_CR23","doi-asserted-by":"crossref","unstructured":"Gagliardi, G., Alfeo, A. L., Catrambone, V., Cimino, M. G. C. A., De Vos, M. & Valenza, G. (2023). Using contrastive learning to inject domain-knowledge into neural networks for recognizing emotions. In 2023 IEEE symposium series on computational intelligence (SSCI) (pp. 1587\u20131592). IEEE","DOI":"10.1109\/SSCI52147.2023.10371895"},{"key":"6921_CR24","doi-asserted-by":"crossref","unstructured":"Gu, Y., Zhong, X., Qu, C., Liu, C., & Chen, B. (2023). A domain generative graph network for EEG-based emotion recognition. IEEE Journal of Biomedical and Health Informatics,","DOI":"10.1109\/JBHI.2023.3242090"},{"key":"6921_CR25","doi-asserted-by":"crossref","unstructured":"Guo, H., Jiang, N. & Shao, D. (2020). Research on multi-modal emotion recognition based on speech, eeg and ecg signals. In Robotics and rehabilitation intelligence: First international conference, ICRRI 2020, Fushun, China, September 9\u201311, 2020, Proceedings, Part I 1 (pp. 272\u2013288). Springer.","DOI":"10.1007\/978-981-33-4929-2_19"},{"key":"6921_CR26","doi-asserted-by":"publisher","DOI":"10.1016\/j.cviu.2024.104121","volume":"248","author":"I Hosseini","year":"2024","unstructured":"Hosseini, I., Hossain, M. Z., Zhang, Y., & Rahman, S. (2024). 
Deep learning model for simultaneous recognition of quantitative and qualitative emotion using visual and bio-sensing data. Computer Vision and Image Understanding, 248, Article 104121.","journal-title":"Computer Vision and Image Understanding"},{"issue":"5","key":"6921_CR27","doi-asserted-by":"publisher","first-page":"105","DOI":"10.3390\/fi11050105","volume":"11","author":"Y Huang","year":"2019","unstructured":"Huang, Y., Yang, J., Liu, S., & Pan, J. (2019). Combining facial expressions and electroencephalography to enhance emotion recognition. Future Internet, 11(5), 105.","journal-title":"Future Internet"},{"issue":"12","key":"6921_CR28","doi-asserted-by":"publisher","first-page":"7382","DOI":"10.1109\/TSMC.2020.2969686","volume":"51","author":"S Issa","year":"2020","unstructured":"Issa, S., Peng, Q., & You, X. (2020). Emotion classification using EEG brain signals and the broad learning system. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 51(12), 7382\u20137391.","journal-title":"IEEE Transactions on Systems, Man, and Cybernetics: Systems"},{"issue":"1","key":"6921_CR29","first-page":"96","volume":"13","author":"T-P Jung","year":"2019","unstructured":"Jung, T.-P., Sejnowski, T. J., et al. (2019). Utilizing deep learning towards multi-modal bio-sensing and vision-based affective computing. IEEE Transactions on Affective Computing, 13(1), 96\u2013107.","journal-title":"IEEE Transactions on Affective Computing"},{"key":"6921_CR30","unstructured":"Khosrowabadi, R., Quek, H. C., Ang, K. K., & Tung, S. W. (2010). qeeg-based emotion recognition. Neural information processing (pp. 594\u2013603)."},{"key":"6921_CR31","doi-asserted-by":"crossref","unstructured":"Kim, S.-H. (2025). Mifu-er: Modality quality index-based incremental fusion for emotion recognition. 
IEEE Access","DOI":"10.1109\/ACCESS.2025.3584642"},{"issue":"1","key":"6921_CR32","doi-asserted-by":"publisher","first-page":"18","DOI":"10.1109\/T-AFFC.2011.15","volume":"3","author":"S Koelstra","year":"2011","unstructured":"Koelstra, S., Muhl, C., Soleymani, M., Lee, J.-S., Yazdani, A., Ebrahimi, T., Pun, T., Nijholt, A., & Patras, I. (2011). DEAP: A database for emotion analysis; using physiological signals. IEEE Transactions on Affective Computing, 3(1), 18\u201331.","journal-title":"IEEE Transactions on Affective Computing"},{"key":"6921_CR33","doi-asserted-by":"publisher","DOI":"10.1016\/j.bspc.2024.107039","volume":"100","author":"A Kumar","year":"2025","unstructured":"Kumar, A., & Kumar, A. (2025). Human emotion recognition using machine learning techniques based on the physiological signal. Biomedical Signal Processing and Control, 100, Article 107039.","journal-title":"Biomedical Signal Processing and Control"},{"key":"6921_CR34","doi-asserted-by":"crossref","unstructured":"Lan, Y.-T., Liu, W., & Lu, B.-L. (2020) Multimodal emotion recognition using deep generalized canonical correlation analysis with an attention mechanism. In 2020 international joint conference on neural networks (IJCNN) (pp. 1\u20136). IEEE.","DOI":"10.1109\/IJCNN48605.2020.9207625"},{"issue":"5","key":"6921_CR35","doi-asserted-by":"publisher","first-page":"372","DOI":"10.1037\/0003-066X.50.5.372","volume":"50","author":"PJ Lang","year":"1995","unstructured":"Lang, P. J. (1995). The emotion probe: Studies of motivation and attention. American Psychologist, 50(5), 372.","journal-title":"American Psychologist"},{"issue":"1","key":"6921_CR36","doi-asserted-by":"publisher","first-page":"155","DOI":"10.1146\/annurev.neuro.23.1.155","volume":"23","author":"JE LeDoux","year":"2000","unstructured":"LeDoux, J. E. (2000). Emotion circuits in the brain. 
Annual Review of Neuroscience, 23(1), 155\u2013184.","journal-title":"Annual Review of Neuroscience"},{"key":"6921_CR37","doi-asserted-by":"crossref","unstructured":"Li, W., Fang, C., Zhu, Z., Chen, C., & Song, A. (2023). Fractal spiking neural network scheme for EEG-based emotion recognition. IEEE Journal of Translational Engineering in Health and Medicine,","DOI":"10.1109\/JTEHM.2023.3320132"},{"key":"6921_CR38","doi-asserted-by":"publisher","DOI":"10.1016\/j.bspc.2024.107462","volume":"103","author":"Y Lian","year":"2025","unstructured":"Lian, Y., Zhu, M., Sun, Z., Liu, J., & Hou, Y. (2025). Emotion recognition based on EEG signals and face images. Biomedical Signal Processing and Control, 103, Article 107462.","journal-title":"Biomedical Signal Processing and Control"},{"issue":"4","key":"6921_CR39","doi-asserted-by":"publisher","first-page":"1656","DOI":"10.1109\/TCDS.2023.3270170","volume":"15","author":"W Li","year":"2023","unstructured":"Li, W., Wang, M., Zhu, J., & Song, A. (2023). EEG-based emotion recognition using trainable adjacency relation driven graph convolutional network. IEEE Transactions on Cognitive and Developmental Systems, 15(4), 1656\u20131672.","journal-title":"IEEE Transactions on Cognitive and Developmental Systems"},{"key":"6921_CR40","doi-asserted-by":"publisher","first-page":"225","DOI":"10.1016\/j.neucom.2020.07.072","volume":"415","author":"Y Li","year":"2020","unstructured":"Li, Y., Yang, H., Li, J., Chen, D., & Min, D. (2020). EEG-based intention recognition with deep recurrent-convolution neural network: Performance and channel selection by grad-cam. Neurocomputing, 415, 225\u2013233.","journal-title":"Neurocomputing"},{"key":"6921_CR41","doi-asserted-by":"crossref","unstructured":"Lu, L., Yuan, L., & Chen, L. (2025). Deep learning based emotion recognition for analyzing students\u2019 psychological states during competitions. 
Entertainment Computing, 101005.","DOI":"10.1016\/j.entcom.2025.101005"},{"key":"6921_CR42","doi-asserted-by":"crossref","unstructured":"Oostenveld, R., Fries, P., Maris, E., & Schoffelen, J.-M. (2011). FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data. Computational Intelligence and Neuroscience.","DOI":"10.1155\/2011\/156869"},{"issue":"3","key":"6921_CR43","doi-asserted-by":"publisher","first-page":"663","DOI":"10.1109\/TBME.2011.2171959","volume":"59","author":"M Orini","year":"2011","unstructured":"Orini, M., Bail\u00f3n, R., Mainardi, L. T., Laguna, P., & Flandrin, P. (2011). Characterization of dynamic interactions between cardiovascular signals by time-frequency coherence. IEEE Transactions on Biomedical Engineering, 59(3), 663\u2013673.","journal-title":"IEEE Transactions on Biomedical Engineering"},{"key":"6921_CR44","unstructured":"Petsiuk, V., Das, A., & Saenko, K. (2018). Rise: Randomized input sampling for explanation of black-box models. arXiv preprint arXiv:1806.07421,"},{"issue":"5","key":"6921_CR45","doi-asserted-by":"publisher","first-page":"131","DOI":"10.1007\/s10462-025-11126-9","volume":"58","author":"R Pillalamarri","year":"2025","unstructured":"Pillalamarri, R., & Shanmugam, U. (2025). A review on EEG-based multimodal learning for emotion recognition. Artificial Intelligence Review, 58(5), 131.","journal-title":"Artificial Intelligence Review"},{"issue":"3","key":"6921_CR46","doi-asserted-by":"publisher","first-page":"715","DOI":"10.1017\/S0954579405050340","volume":"17","author":"J Posner","year":"2005","unstructured":"Posner, J., Russell, J. A., & Peterson, B. S. (2005). The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development, and psychopathology. 
Development and Psychopathology, 17(3), 715\u2013734.","journal-title":"Development and Psychopathology"},{"issue":"3","key":"6921_CR47","doi-asserted-by":"publisher","first-page":"1876","DOI":"10.1109\/TAFFC.2022.3176135","volume":"14","author":"S Saganowski","year":"2022","unstructured":"Saganowski, S., Perz, B., Polak, A., & Kazienko, P. (2022). Emotion recognition for everyday life using physiological signals from wearables: A systematic literature review. IEEE Transactions on Affective Computing, 14(3), 1876\u20131897.","journal-title":"IEEE Transactions on Affective Computing"},{"key":"6921_CR48","doi-asserted-by":"publisher","DOI":"10.1016\/j.bspc.2025.108367","volume":"111","author":"Y Said","year":"2026","unstructured":"Said, Y., Saidani, T., Atri, M., Alsheikhy, A. A., & Shawly, T. (2026). Computational intelligence for emotion recognition in autism spectrum disorder: A systematic review of signal-based modeling, simulation, and clinical potential. Biomedical Signal Processing and Control, 111, Article 108367.","journal-title":"Biomedical Signal Processing and Control"},{"issue":"2","key":"6921_CR49","doi-asserted-by":"publisher","DOI":"10.1016\/j.patter.2020.100017","volume":"1","author":"GP Sarma","year":"2020","unstructured":"Sarma, G. P., Reinertsen, E., Aguirre, A., Anderson, C., Batra, P., Choi, S.-H., Achille, P. D., Diamant, N., Ellinor, P., Emdin, C., et al. (2020). Physiology as a lingua franca for clinical machine learning. Patterns, 1(2), Article 100017.","journal-title":"Patterns"},{"key":"6921_CR50","doi-asserted-by":"crossref","unstructured":"Sedehi, J. F., Dabanloo, N. J., Maghooli, K., & Sheikhani, A. (2025). Develop an emotion recognition system using jointly connectivity between electroencephalogram and electrocardiogram signals. Heliyon, 11(2).","DOI":"10.1016\/j.heliyon.2025.e41767"},{"key":"6921_CR51","doi-asserted-by":"crossref","unstructured":"Selvaraju, R. R., Cogswell, M., Das, A., Vedantam, R., Parikh, D. & Batra, D. (2017). 
Grad-cam: Visual explanations from deep networks via gradient-based localization. In Proceedings of the IEEE international conference on computer vision (pp. 618\u2013626).","DOI":"10.1109\/ICCV.2017.74"},{"key":"6921_CR52","doi-asserted-by":"publisher","first-page":"336","DOI":"10.1007\/s11263-019-01228-7","volume":"128","author":"RR Selvaraju","year":"2020","unstructured":"Selvaraju, R. R., Cogswell, M., Das, A., Vedantam, R., Parikh, D., & Batra, D. (2020). Grad-cam: Visual explanations from deep networks via gradient-based localization. International Journal of Computer Vision, 128, 336\u2013359.","journal-title":"International Journal of Computer Vision"},{"key":"6921_CR53","unstructured":"Shrikumar, A., Greenside, P., & Kundaje, A. (2017). Learning important features through propagating activation differences. In International conference on machine learning (pp. 3145\u20133153). PMlR"},{"issue":"1","key":"6921_CR54","doi-asserted-by":"publisher","first-page":"42","DOI":"10.1109\/T-AFFC.2011.25","volume":"3","author":"M Soleymani","year":"2011","unstructured":"Soleymani, M., Lichtenauer, J., Pun, T., & Pantic, M. (2011). A multimodal database for affect recognition and implicit tagging. IEEE Transactions on Affective Computing, 3(1), 42\u201355.","journal-title":"IEEE Transactions on Affective Computing"},{"key":"6921_CR55","doi-asserted-by":"crossref","unstructured":"Suhaimi, N. S., Mountstephens, J., Teo, J. et\u00a0al. (2020). Eeg-based emotion recognition: A state-of-the-art review of current trends and opportunities. Computational Intelligence and Neuroscience.","DOI":"10.1155\/2020\/8875426"},{"key":"6921_CR56","unstructured":"Sundararajan, M., Taly, A. & Yan, Q. (2017). Axiomatic attribution for deep networks. In International conference on machine learning (pp. 3319\u20133328). 
PMLR."},{"issue":"2","key":"6921_CR57","doi-asserted-by":"publisher","first-page":"747","DOI":"10.1016\/j.neubiorev.2011.11.009","volume":"36","author":"JF Thayer","year":"2012","unstructured":"Thayer, J. F., \u00c5hs, F., Fredrikson, M., Sollers, J. J., & Wager, T. D. (2012). A meta-analysis of heart rate variability and neuroimaging studies: Implications for heart rate variability as a marker of stress and health. Neuroscience & Biobehavioral Reviews, 36(2), 747\u2013756.","journal-title":"Neuroscience & Biobehavioral Reviews"},{"key":"6921_CR58","doi-asserted-by":"crossref","unstructured":"Torres-Valencia, C. A., Garcia-Arias, H. F., Lopez, M. A. A. & Orozco-Guti\u00e9rrez, A. A. (2014). Comparative analysis of physiological signals and electroencephalogram (EEG) for multimodal emotion recognition using generative models. In 2014 XIX symposium on image, signal processing and artificial vision (pp. 1\u20135). IEEE.","DOI":"10.1109\/STSIVA.2014.7010181"},{"issue":"1","key":"6921_CR59","doi-asserted-by":"publisher","first-page":"19","DOI":"10.1152\/japplphysiol.00842.2017","volume":"125","author":"G Valenza","year":"2018","unstructured":"Valenza, G., Citi, L., Saul, J. P., & Barbieri, R. (2018). Measures of sympathetic and parasympathetic autonomic outflow from heartbeat dynamics. Journal of Applied Physiology, 125(1), 19\u201339.","journal-title":"Journal of Applied Physiology"},{"issue":"164","key":"6921_CR60","doi-asserted-by":"publisher","first-page":"20190878","DOI":"10.1098\/rsif.2019.0878","volume":"17","author":"G Valenza","year":"2020","unstructured":"Valenza, G., Passamonti, L., Duggento, A., Toschi, N., & Barbieri, R. (2020). Uncovering complex central autonomic networks at rest: A functional magnetic resonance imaging study on complex cardiovascular oscillations. 
Journal of the Royal Society Interface, 17(164), 20190878.","journal-title":"Journal of the Royal Society Interface"},{"key":"6921_CR61","doi-asserted-by":"publisher","first-page":"383","DOI":"10.1016\/j.neuroimage.2019.04.075","volume":"197","author":"G Valenza","year":"2019","unstructured":"Valenza, G., Sclocco, R., Duggento, A., Passamonti, L., Napadow, V., Barbieri, R., & Toschi, N. (2019). The central autonomic network at rest: Uncovering functional MRI correlates of time-varying autonomic outflow. Neuroimage, 197, 383\u2013390.","journal-title":"Neuroimage"},{"issue":"7","key":"6921_CR62","doi-asserted-by":"publisher","first-page":"2533","DOI":"10.1109\/JBHI.2021.3049119","volume":"25","author":"Z Wang","year":"2021","unstructured":"Wang, Z., Tianhao, G., Zhu, Y., Li, D., Yang, H., & Wenli, D. (2021). FLDNet: Frame-level distilling neural network for EEG emotion recognition. IEEE Journal of Biomedical and Health Informatics, 25(7), 2533\u20132544.","journal-title":"IEEE Journal of Biomedical and Health Informatics"},{"key":"6921_CR63","doi-asserted-by":"publisher","first-page":"1512799","DOI":"10.3389\/fnins.2025.1512799","volume":"19","author":"Z Wang","year":"2025","unstructured":"Wang, Z., & Wang, Y. (2025). Emotion recognition based on multimodal physiological electrical signals. Frontiers in Neuroscience, 19, 1512799.","journal-title":"Frontiers in Neuroscience"},{"key":"6921_CR64","doi-asserted-by":"crossref","unstructured":"Wei, Y., Lil, Y., Xu, M., Hua, Y., Gong, Y., Osawa, K. & Tanaka, E. (2023) A real-time and two-dimensional emotion recognition system based on EEG and HRV using machine learning. In 2023 IEEE\/SICE international symposium on system integration (SII) (pp. 1\u20136). IEEE.","DOI":"10.1109\/SII55687.2023.10039222"},{"key":"6921_CR65","unstructured":"Yeh, C.-K., Hsieh, C.-Y., Suggala, A., Inouye, D. I., & Ravikumar, P. K. (2019). On the (in) fidelity and sensitivity of explanations. 
Advances in neural information processing systems, 32"},{"key":"6921_CR66","doi-asserted-by":"publisher","DOI":"10.1016\/j.asoc.2020.106954","volume":"100","author":"Y Yin","year":"2021","unstructured":"Yin, Y., Zheng, X., Bin, H., Zhang, Y., & Cui, X. (2021). EEG emotion recognition using fusion model of graph convolutional neural networks and LSTM. Applied Soft Computing, 100, Article 106954.","journal-title":"Applied Soft Computing"},{"key":"6921_CR67","doi-asserted-by":"publisher","DOI":"10.1016\/j.bspc.2022.103877","volume":"77","author":"Y Zhang","year":"2022","unstructured":"Zhang, Y., Cheng, C., Wang, S., & Xia, T. (2022). Emotion recognition using heterogeneous convolutional neural networks combined with multimodal factorized bilinear pooling. Biomedical Signal Processing and Control, 77, Article 103877.","journal-title":"Biomedical Signal Processing and Control"},{"key":"6921_CR68","unstructured":"Zhang, G., Minjing, Yu., Liu, Y.-J., Zhao, G., & Zhang, D. (2021). and Wenming Zheng. Sparsedgcnn: Recognizing emotion from multichannel EEG signals. IEEE Transactions on Affective Computing."},{"issue":"3","key":"6921_CR69","doi-asserted-by":"publisher","first-page":"417","DOI":"10.1109\/TAFFC.2017.2712143","volume":"10","author":"W-L Zheng","year":"2017","unstructured":"Zheng, W.-L., Zhu, J.-Y., & Bao-Liang, L. (2017). Identifying stable patterns over time for emotion recognition from EEG. IEEE Transactions on Affective Computing, 10(3), 417\u2013429.","journal-title":"IEEE Transactions on Affective Computing"},{"key":"6921_CR70","doi-asserted-by":"crossref","unstructured":"Zhong, X., Wu, F., Yin, Z., & Liu, G. (2024). An attention-enhanced retentive broad learning system for subject-generic emotion recognition from eeg signals. In ICASSP 2024-2024 IEEE international conference on acoustics, speech and signal processing (ICASSP) (pp. 2310\u20132314). 
IEEE.","DOI":"10.1109\/ICASSP48485.2024.10446817"}],"container-title":["Machine Learning"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10994-025-06921-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s10994-025-06921-y","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10994-025-06921-y.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,12,22]],"date-time":"2025-12-22T23:02:22Z","timestamp":1766444542000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s10994-025-06921-y"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,11,10]]},"references-count":70,"journal-issue":{"issue":"12","published-print":{"date-parts":[[2025,12]]}},"alternative-id":["6921"],"URL":"https:\/\/doi.org\/10.1007\/s10994-025-06921-y","relation":{},"ISSN":["0885-6125","1573-0565"],"issn-type":[{"type":"print","value":"0885-6125"},{"type":"electronic","value":"1573-0565"}],"subject":[],"published":{"date-parts":[[2025,11,10]]},"assertion":[{"value":"25 March 2025","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"16 September 2025","order":2,"name":"revised","label":"Revised","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"13 October 2025","order":3,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"10 November 2025","order":4,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare no 
conflict of interest.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}}],"article-number":"273"}}