{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T22:10:30Z","timestamp":1776118230417,"version":"3.50.1"},"reference-count":38,"publisher":"Springer Science and Business Media LLC","issue":"4","license":[{"start":{"date-parts":[[2021,4,3]],"date-time":"2021-04-03T00:00:00Z","timestamp":1617408000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2021,4,3]],"date-time":"2021-04-03T00:00:00Z","timestamp":1617408000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"crossref","award":["81771926"],"award-info":[{"award-number":["81771926"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"crossref"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["61763022"],"award-info":[{"award-number":["61763022"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Complex Intell. Syst."],"published-print":{"date-parts":[[2022,8]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>EEG-based emotion recognition has attracted substantial attention from researchers due to its extensive application prospects, and substantial progress has been made in feature extraction and classification modelling from EEG data. However, insufficient high-quality training data are available for building EEG-based emotion recognition models via machine learning or deep learning methods. The artificial generation of high-quality data is an effective approach for overcoming this problem. 
In this paper, a multi-generator conditional Wasserstein GAN method is proposed for the generation of high-quality artificial data that cover a more comprehensive distribution of real data through the use of various generators. Experimental results demonstrate that the artificial data that are generated by the proposed model can effectively improve the performance of emotion classification models that are based on EEG.<\/jats:p>","DOI":"10.1007\/s40747-021-00336-7","type":"journal-article","created":{"date-parts":[[2021,4,3]],"date-time":"2021-04-03T05:02:46Z","timestamp":1617426166000},"page":"3059-3071","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":50,"title":["EEG data augmentation for emotion recognition with a multiple generator conditional Wasserstein GAN"],"prefix":"10.1007","volume":"8","author":[{"given":"Aiming","family":"Zhang","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-0210-6506","authenticated-orcid":false,"given":"Lei","family":"Su","sequence":"additional","affiliation":[]},{"given":"Yin","family":"Zhang","sequence":"additional","affiliation":[]},{"given":"Yunfa","family":"Fu","sequence":"additional","affiliation":[]},{"given":"Liping","family":"Wu","sequence":"additional","affiliation":[]},{"given":"Shengjin","family":"Liang","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2021,4,3]]},"reference":[{"key":"336_CR1","doi-asserted-by":"publisher","first-page":"374","DOI":"10.1109\/TAFFC.2017.2714671","volume":"10","author":"SM Alarcao","year":"2017","unstructured":"Alarcao SM, Fonseca MJ (2017) Emotions recognition using EEG signals: a survey. 
IEEE Trans Affect Comput 10:374\u2013393","journal-title":"IEEE Trans Affect Comput"},{"key":"336_CR2","doi-asserted-by":"publisher","first-page":"1239","DOI":"10.3390\/app7121239","volume":"7","author":"A Al-Nafjan","year":"2017","unstructured":"Al-Nafjan A, Hosny M, Al-Ohali Y, Al-Wabil A (2017) Review and classification of emotion recognition based on EEG brain\u2013computer interface system research: a systematic review. Appl Sci 7:1239","journal-title":"Appl Sci"},{"key":"336_CR3","unstructured":"Arjovsky M, Chintala S, Bottou L (2017) Wasserstein generative adversarial networks. In: International conference on machine learning, 2017. PMLR, pp 214\u2013223"},{"key":"336_CR4","doi-asserted-by":"publisher","first-page":"1650035","DOI":"10.1142\/S0219519416500354","volume":"16","author":"GM Bairy","year":"2016","unstructured":"Bairy GM, Niranjan U, Puthankattil SD (2016) Automated classification of depression EEG signals using wavelet entropies and energies. J Mech Med Biol 16:1650035","journal-title":"J Mech Med Biol"},{"key":"336_CR5","doi-asserted-by":"publisher","first-page":"464","DOI":"10.1016\/j.eswa.2017.09.030","volume":"91","author":"G Douzas","year":"2018","unstructured":"Douzas G, Bacao F (2018) Effective data generation for imbalanced learning using conditional generative adversarial networks. Expert Syst Appl 91:464\u2013471","journal-title":"Expert Syst Appl"},{"key":"336_CR6","unstructured":"Goodfellow I (2016) Nips 2016 tutorial: generative adversarial networks. arXiv preprint arXiv: 170100160"},{"key":"336_CR7","unstructured":"Goodfellow I, Pouget-Abadie J, Mirza M, Xu B, Warde-Farley D, Ozair S, Courville A, Bengio Y (2014) Generative adversarial nets. In: Advances in neural information processing systems, vol 27. pp 2672\u20132680"},{"key":"336_CR8","first-page":"513","volume":"19","author":"A Gretton","year":"2006","unstructured":"Gretton A, Borgwardt K, Rasch M, Sch\u00f6lkopf B, Smola A (2006) A kernel method for the two-sample-problem. 
Adv Neural Inf Process Syst 19:513\u2013520","journal-title":"Adv Neural Inf Process Syst"},{"key":"336_CR9","unstructured":"Gulrajani I, Ahmed F, Arjovsky M, Dumoulin V, Courville A (2017) Improved training of wasserstein gans. arXiv preprint arXiv: 170400028"},{"key":"336_CR10","unstructured":"Hartmann KG, Schirrmeister RT, Ball T (2018) EEG-GAN: generative adversarial networks for electroencephalograhic (EEG) brain signals. arXiv preprint arXiv:  180601875"},{"key":"336_CR11","unstructured":"Hoang Q, Nguyen TD, Le T, Phung D (2018) MGAN: training generative adversarial nets with multiple generators. In: International conference on learning representations, 2018"},{"key":"336_CR12","doi-asserted-by":"publisher","first-page":"98","DOI":"10.1109\/JBHI.2017.2688239","volume":"22","author":"S Katsigiannis","year":"2017","unstructured":"Katsigiannis S, Ramzan N (2017) DREAMER: a database for emotion recognition through EEG and ECG signals from wireless low-cost off-the-shelf devices. IEEE J Biomed Health Inform 22:98\u2013107","journal-title":"IEEE J Biomed Health Inform"},{"key":"336_CR13","doi-asserted-by":"publisher","first-page":"18","DOI":"10.1109\/T-AFFC.2011.15","volume":"3","author":"S Koelstra","year":"2011","unstructured":"Koelstra S, Muhl C, Soleymani M, Lee J-S, Yazdani A, Ebrahimi T, Pun T, Nijholt A, Patras I (2011) Deap: a database for emotion analysis; using physiological signals. IEEE Trans Affect Comput 3:18\u201331","journal-title":"IEEE Trans Affect Comput"},{"key":"336_CR14","first-page":"1097","volume":"25","author":"A Krizhevsky","year":"2012","unstructured":"Krizhevsky A, Sutskever I, Hinton GE (2012) Imagenet classification with deep convolutional neural networks. 
Adv Neural Inf Process Syst 25:1097\u20131105","journal-title":"Adv Neural Inf Process Syst"},{"key":"336_CR15","first-page":"39","volume":"1","author":"PJ Lang","year":"1997","unstructured":"Lang PJ, Bradley MM, Cuthbert BN (1997) International affective picture system (IAPS): technical manual and affective ratings. NIMH Center Study Emot Atten 1:39\u201358","journal-title":"NIMH Center Study Emot Atten"},{"key":"336_CR16","doi-asserted-by":"crossref","unstructured":"Luo Y, Lu B-L (2018) EEG data augmentation for emotion recognition using a conditional Wasserstein GAN. In: 2018 40th annual international conference of the IEEE engineering in medicine and biology society (EMBC), 2018. IEEE, pp 2535\u20132538","DOI":"10.1109\/EMBC.2018.8512865"},{"key":"336_CR17","doi-asserted-by":"publisher","first-page":"056021","DOI":"10.1088\/1741-2552\/abb580","volume":"17","author":"Y Luo","year":"2020","unstructured":"Luo Y, Zhu L-Z, Wan Z-Y, Lu B-L (2020) Data augmentation for enhancing EEG-based emotion recognition with deep generative models. J Neural Eng 17:056021","journal-title":"J Neural Eng"},{"key":"336_CR18","doi-asserted-by":"publisher","first-page":"500","DOI":"10.3414\/ME15-01-0005","volume":"54","author":"A Maglione","year":"2015","unstructured":"Maglione A, Scorpecci A, Malerba P, Marsella P, Giannantonio S, Colosimo A, Babiloni F, Vecchiato G (2015) Alpha EEG frontal asymmetries during audiovisual perception in cochlear implant users. Methods Inf Med 54:500\u2013504","journal-title":"Methods Inf Med"},{"key":"336_CR19","doi-asserted-by":"publisher","first-page":"847","DOI":"10.1097\/00006324-199511000-00013","volume":"72","author":"E Marg","year":"1995","unstructured":"Marg E (1995) DESCARTES\u2019ERROR: emotion, reason, and the human brain. Optom Vis Sci 72:847\u2013848","journal-title":"Optom Vis Sci"},{"key":"336_CR20","doi-asserted-by":"crossref","unstructured":"Martin O, Kotsia I, Macq B, Pitas I (2006) The eNTERFACE'05 audio-visual emotion database. 
In: 22nd international conference on data engineering workshops (ICDEW'06), 2006. IEEE, pp 8\u20138","DOI":"10.1109\/ICDEW.2006.145"},{"key":"336_CR21","unstructured":"Mescheder L, Geiger A, Nowozin S (2018) Which training methods for GANs do actually converge? In: International conference on machine learning, 2018. PMLR, pp 3481\u20133490"},{"key":"336_CR22","unstructured":"Mirza M, Osindero S (2014) Conditional generative adversarial nets. arXiv preprint arXiv:  14111784"},{"key":"336_CR23","doi-asserted-by":"publisher","first-page":"66","DOI":"10.1080\/2326263X.2014.912881","volume":"1","author":"C M\u00fchl","year":"2014","unstructured":"M\u00fchl C, Allison B, Nijholt A, Chanel G (2014) A survey of affective brain computer interfaces: principles, state-of-the-art, and challenges. Brain Comput Interfaces 1:66\u201384","journal-title":"Brain Comput Interfaces"},{"key":"336_CR24","doi-asserted-by":"crossref","unstructured":"Palazzo S, Spampinato C, Kavasidis I, Giordano D, Shah M (2017) Generative adversarial networks conditioned by brain signals. In: Proceedings of the IEEE international conference on computer vision, 2017, pp 3410\u20133418","DOI":"10.1109\/ICCV.2017.369"},{"key":"336_CR25","first-page":"2825","volume":"12","author":"F Pedregosa","year":"2011","unstructured":"Pedregosa F, Varoquaux G, Gramfort A, Michel V, Thirion B, Grisel O, Blondel M, Prettenhofer P, Weiss R, Dubourg V (2011) Scikit-learn: machine learning in Python. J Mach Learn Res 12:2825\u20132830","journal-title":"J Mach Learn Res"},{"key":"336_CR26","unstructured":"Smith D, Burke B (2019) Hype cycle for emerging technologies. Gartner, Inc. Retrieved 12:2019"},{"key":"336_CR27","doi-asserted-by":"publisher","first-page":"42","DOI":"10.1109\/T-AFFC.2011.25","volume":"3","author":"M Soleymani","year":"2011","unstructured":"Soleymani M, Lichtenauer J, Pun T, Pantic M (2011) A multimodal database for affect recognition and implicit tagging. 
IEEE Trans Affect Comput 3:42\u201355","journal-title":"IEEE Trans Affect Comput"},{"key":"336_CR28","unstructured":"Somers M (2019) Emotion AI, explained. Available at: https:\/\/mitsloanmit.edu\/ideas-made-to-matter\/emotion-ai-explained. Accessed 20"},{"key":"336_CR29","unstructured":"Van der Maaten L, Hinton G (2008) Visualizing data using t-SNE. J Mach Learn Res 9"},{"key":"336_CR30","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1198\/10618600152418584","volume":"10","author":"DA Van Dyk","year":"2001","unstructured":"Van Dyk DA, Meng X-L (2001) The art of data augmentation. J Comput Graph Stat 10:1\u201350","journal-title":"J Comput Graph Stat"},{"key":"336_CR31","doi-asserted-by":"crossref","unstructured":"Volpi R, Morerio P, Savarese S, Murino V (2018) Adversarial feature augmentation for unsupervised domain adaptation. In: Proceedings of the IEEE conference on computer vision and pattern recognition, 2018. pp 5495\u20135504","DOI":"10.1109\/CVPR.2018.00576"},{"key":"336_CR32","doi-asserted-by":"publisher","first-page":"208","DOI":"10.1016\/j.inffus.2020.10.004","volume":"67","author":"S-H Wang","year":"2021","unstructured":"Wang S-H, Govindaraj VV, G\u00f3rriz JM, Zhang X, Zhang Y-D (2021) Covid-19 classification by FGCNet with deep feature fusion from graph convolutional network and convolutional neural network. Inf Fusion 67:208\u2013229","journal-title":"Inf Fusion"},{"key":"336_CR33","doi-asserted-by":"publisher","first-page":"131","DOI":"10.1016\/j.inffus.2020.11.005","volume":"68","author":"S-H Wang","year":"2021","unstructured":"Wang S-H, Nayak DR, Guttery DS, Zhang X, Zhang Y-D (2021) COVID-19 classification by CCSHNet with deep fusion using transfer learning and discriminant correlation analysis. 
Inf Fusion 68:131\u2013148","journal-title":"Inf Fusion"},{"key":"336_CR34","doi-asserted-by":"publisher","first-page":"101551","DOI":"10.1016\/j.bspc.2019.04.028","volume":"53","author":"Z Wei","year":"2019","unstructured":"Wei Z, Zou J, Zhang J, Xu J (2019) Automatic epileptic EEG detection using convolutional neural network with improvements in time-domain. Biomed Signal Process Control 53:101551","journal-title":"Biomed Signal Process Control"},{"key":"336_CR35","doi-asserted-by":"publisher","first-page":"016003","DOI":"10.1088\/1741-2560\/9\/1\/016003","volume":"9","author":"TO Zander","year":"2011","unstructured":"Zander TO, Jatzev S (2011) Context-aware brain\u2013computer interfaces: exploring the information space of user, technical system and environment. J Neural Eng 9:016003","journal-title":"J Neural Eng"},{"key":"336_CR36","doi-asserted-by":"publisher","first-page":"3613","DOI":"10.1007\/s11042-017-5243-3","volume":"78","author":"Y-D Zhang","year":"2019","unstructured":"Zhang Y-D, Dong Z, Chen X, Jia W, Du S, Muhammad K, Wang S-H (2019) Image-based fruit category classification by 13-layer deep convolutional neural network and data augmentation. Multimed Tools Appl 78:3613\u20133632","journal-title":"Multimed Tools Appl"},{"key":"336_CR37","doi-asserted-by":"publisher","first-page":"162","DOI":"10.1109\/TAMD.2015.2431497","volume":"7","author":"W-L Zheng","year":"2015","unstructured":"Zheng W-L, Lu B-L (2015) Investigating critical frequency bands and channels for EEG-based emotion recognition with deep neural networks. IEEE Trans Auton Ment Dev 7:162\u2013175","journal-title":"IEEE Trans Auton Ment Dev"},{"key":"336_CR38","doi-asserted-by":"crossref","unstructured":"Zheng Z, Zheng L, Yang Y (2017) Unlabeled samples generated by gan improve the person re-identification baseline in vitro. In: Proceedings of the IEEE international conference on computer vision, 2017. 
pp 3754\u20133762","DOI":"10.1109\/ICCV.2017.405"}],"container-title":["Complex &amp; Intelligent Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s40747-021-00336-7.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s40747-021-00336-7\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s40747-021-00336-7.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,8,3]],"date-time":"2022-08-03T10:19:59Z","timestamp":1659521999000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s40747-021-00336-7"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,4,3]]},"references-count":38,"journal-issue":{"issue":"4","published-print":{"date-parts":[[2022,8]]}},"alternative-id":["336"],"URL":"https:\/\/doi.org\/10.1007\/s40747-021-00336-7","relation":{},"ISSN":["2199-4536","2198-6053"],"issn-type":[{"value":"2199-4536","type":"print"},{"value":"2198-6053","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,4,3]]},"assertion":[{"value":"28 December 2020","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"11 March 2021","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"3 April 2021","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}}]}}