{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,7,30]],"date-time":"2025-07-30T16:05:14Z","timestamp":1753891514131,"version":"3.41.2"},"reference-count":25,"publisher":"Frontiers Media SA","license":[{"start":{"date-parts":[[2025,6,16]],"date-time":"2025-06-16T00:00:00Z","timestamp":1750032000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":["frontiersin.org"],"crossmark-restriction":true},"short-container-title":["Front. Comput. Neurosci."],"abstract":"<jats:p>EEG emotion recognition has important applications in human-computer interaction and mental health assessment, but existing models have limitations in capturing the complex spatial and temporal features of EEG signals. To overcome this problem, we propose an innovative model that combines CNN-BiLSTM and DC-IGN and fuses their outputs for emotion classification via a fully connected layer. In addition, we use a piecewise exponential decay strategy to optimize the training process. We conducted comprehensive comparative experiments on the SEED and DEAP datasets, covering traditional models, existing advanced models, and different combined models (such as CNN\u202f+\u202fLSTM and CNN\u202f+\u202fLSTM\u202f+\u202fDC-IGN). The results show that our model achieves 94.35% accuracy on the SEED dataset, 89.84% on DEAP-valence, and 90.31% on DEAP-arousal, significantly outperforming the other models. In addition, we further verified the superiority of the model through subject-independent experiments and a comparison of learning-rate scheduling strategies. These results not only improve the performance of EEG emotion recognition, but also provide new ideas and methods for research in related fields, and demonstrate the significant advantages of our model in capturing complex features and improving classification accuracy.<\/jats:p>","DOI":"10.3389\/fncom.2025.1589247","type":"journal-article","created":{"date-parts":[[2025,6,16]],"date-time":"2025-06-16T04:10:32Z","timestamp":1750047032000},"update-policy":"https:\/\/doi.org\/10.3389\/crossmark-policy","source":"Crossref","is-referenced-by-count":0,"title":["CNN-BiLSTM and DC-IGN fusion model and piecewise exponential attenuation optimization: an innovative approach to improve EEG emotion recognition performance"],"prefix":"10.3389","volume":"19","author":[{"given":"Shaohua","family":"Zhang","sequence":"first","affiliation":[]},{"given":"Yan","family":"Feng","sequence":"additional","affiliation":[]},{"given":"Ruzhen","family":"Chen","sequence":"additional","affiliation":[]},{"given":"Song","family":"Huang","sequence":"additional","affiliation":[]},{"given":"Qianchu","family":"Wang","sequence":"additional","affiliation":[]}],"member":"1965","published-online":{"date-parts":[[2025,6,16]]},"reference":[{"key":"ref1","doi-asserted-by":"publisher","first-page":"374","DOI":"10.1109\/TAFFC.2017.2714671","article-title":"Emotions recognition using EEG signals: a survey","volume":"10","author":"Alarcao","year":"2019","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref2","doi-asserted-by":"publisher","first-page":"2937","DOI":"10.1007\/s10115-020-01449-0","article-title":"A survey of state-of-the-art approaches for emotion recognition in text","volume":"62","author":"Alswaidan","year":"2020","journal-title":"Knowl. Inf. Syst."},{"key":"ref3","doi-asserted-by":"publisher","first-page":"453","DOI":"10.1109\/JBHI.2020.2995767","article-title":"Emotion recognition from multi-channel EEG via deep forest","volume":"25","author":"Cheng","year":"2020","journal-title":"IEEE J. Biomed. Health Inform."},{"key":"ref4","doi-asserted-by":"publisher","first-page":"106243","DOI":"10.1016\/j.knosys.2020.106243","article-title":"EEG-based emotion recognition using an end-to-end regional-asymmetric convolutional neural network","volume":"205","author":"Cui","year":"2020","journal-title":"Knowl.-Based Syst."},{"key":"ref5","article-title":"Convolutional neural networks on graphs with fast localized spectral filtering","author":"Defferrard","year":"2016","journal-title":"Adv. Neural Inf. Proces. Syst."},{"volume-title":"Tutorial on variational autoencoders","year":"2016","author":"Doersch","key":"ref6"},{"key":"ref7","doi-asserted-by":"publisher","first-page":"1528","DOI":"10.1109\/TAFFC.2020.3013711","article-title":"An efficient LSTM network for emotion recognition from multichannel EEG signals","volume":"13","author":"Du","year":"2022","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref8","doi-asserted-by":"publisher","first-page":"815","DOI":"10.1007\/s11571-020-09634-1","article-title":"EEG-based emotion recognition using 4D convolutional recurrent neural network","volume":"14","author":"Fangyao","year":"2020","journal-title":"Cogn. Neurodyn."},{"key":"ref9","doi-asserted-by":"publisher","first-page":"11021","DOI":"10.1109\/TNNLS.2022.3168935","article-title":"Frame-level teacher\u2013student learning with data privacy for EEG emotion recognition","volume":"34","author":"Gu","year":"2023","journal-title":"IEEE Trans. Neural Networks Learn. Syst."},{"key":"ref10","doi-asserted-by":"publisher","first-page":"117327","DOI":"10.1109\/ACCESS.2019.2936124","article-title":"Speech emotion recognition using deep learning techniques: a review","volume":"7","author":"Khalil","year":"2019","journal-title":"IEEE Access"},{"key":"ref11","doi-asserted-by":"crossref","first-page":"18","DOI":"10.1109\/T-AFFC.2011.15","article-title":"DEAP: a database for emotion analysis; using physiological signals","volume":"3","author":"Koelstra","year":"2011","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref12","first-page":"2539","article-title":"Deep convolutional inverse graphics network","volume-title":"Advances in neural information processing systems","author":"Kulkarni","year":"2015"},{"key":"ref13","doi-asserted-by":"publisher","first-page":"2278","DOI":"10.1109\/5.726791","article-title":"Gradient-based learning applied to document recognition","volume":"86","author":"Lecun","year":"1998","journal-title":"Proc. IEEE"},{"key":"ref14","doi-asserted-by":"publisher","first-page":"2306","DOI":"10.1109\/TAFFC.2020.2981446","article-title":"Deep facial expression recognition: a survey","volume":"25","author":"Li","year":"2022","journal-title":"IEEE Trans. Affect. Comput."},{"year":"2016","author":"Li","key":"ref15"},{"key":"ref16","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/3524499","article-title":"EEG based emotion recognition: a tutorial and review","volume":"55","author":"Li","year":"2023","journal-title":"ACM Comp. Surv."},{"key":"ref17","doi-asserted-by":"publisher","first-page":"494","DOI":"10.1109\/TAFFC.2018.2885474","article-title":"A bi-hemisphere domain adversarial neural network model for EEG emotion recognition","volume":"12","author":"Li","year":"2021","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref18","doi-asserted-by":"publisher","DOI":"10.1109\/TAFFC.2018.2817622","article-title":"EEG emotion recognition using dynamical graph convolutional neural networks","author":"Song","year":"2020","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref19","doi-asserted-by":"publisher","DOI":"10.1109\/TAFFC.2020.3025777","article-title":"EEG-based emotion recognition via channel-wise attention and self attention","volume":"14","author":"Tao","year":"2023","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref20","doi-asserted-by":"publisher","DOI":"10.3390\/app14062636","article-title":"Exploring EEG emotion recognition through complex networks: insights from the visibility graph of ordinal patterns","volume":"14","author":"Yao","year":"2024","journal-title":"Appl. Sci."},{"key":"ref21","doi-asserted-by":"publisher","first-page":"106954","DOI":"10.1016\/j.asoc.2020.106954","article-title":"EEG emotion recognition using fusion model of graph convolutional neural networks and LSTM","volume":"100","author":"Yin","year":"2020","journal-title":"Appl. Soft Comput."},{"key":"ref22","doi-asserted-by":"publisher","first-page":"23","DOI":"10.3390\/brainsci14030271","article-title":"Subject-independent emotion recognition based on EEG frequency band features and self-adaptive graph construction","volume":"14","author":"Zhang","year":"2024","journal-title":"Brain Sci."},{"key":"ref23","doi-asserted-by":"publisher","first-page":"162","DOI":"10.1109\/TAMD.2015.2431497","article-title":"Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks","volume":"7","author":"Zheng","year":"2015","journal-title":"IEEE Trans. Auton. Ment. Dev."},{"key":"ref24","doi-asserted-by":"publisher","first-page":"1290","DOI":"10.1109\/TAFFC.2020.2994159","article-title":"EEG-based emotion recognition using regularized graph neural networks","volume":"13","author":"Zhong","year":"2020","journal-title":"IEEE Trans. Affect. Comput."},{"year":"2016","author":"Zhou","key":"ref25"}],"container-title":["Frontiers in Computational Neuroscience"],"original-title":[],"link":[{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/fncom.2025.1589247\/full","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,16]],"date-time":"2025-06-16T04:10:33Z","timestamp":1750047033000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.frontiersin.org\/articles\/10.3389\/fncom.2025.1589247\/full"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,6,16]]},"references-count":25,"alternative-id":["10.3389\/fncom.2025.1589247"],"URL":"https:\/\/doi.org\/10.3389\/fncom.2025.1589247","relation":{},"ISSN":["1662-5188"],"issn-type":[{"type":"electronic","value":"1662-5188"}],"subject":[],"published":{"date-parts":[[2025,6,16]]},"article-number":"1589247"}}