{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T01:33:26Z","timestamp":1760060006678,"version":"build-2065373602"},"reference-count":34,"publisher":"MDPI AG","issue":"8","license":[{"start":{"date-parts":[[2025,7,24]],"date-time":"2025-07-24T00:00:00Z","timestamp":1753315200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100003069","name":"Instituto Polit\u00e9cnico Nacional (IPN)","doi-asserted-by":"publisher","award":["SIP-2259","SIP-20250071","SIP-20251352","SIP-20251124","SIP-20253439"],"award-info":[{"award-number":["SIP-2259","SIP-20250071","SIP-20251352","SIP-20251124","SIP-20253439"]}],"id":[{"id":"10.13039\/501100003069","id-type":"DOI","asserted-by":"publisher"}]},{"name":"Comisi\u00f3n de Operaci\u00f3n y Fomento de Actividades Acad\u00e9micas del IPN (IPN-COFAA)","award":["SIP-2259","SIP-20250071","SIP-20251352","SIP-20251124","SIP-20253439"],"award-info":[{"award-number":["SIP-2259","SIP-20250071","SIP-20251352","SIP-20251124","SIP-20253439"]}]},{"name":"Programa de Est\u00edmulos al Desempe\u00f1o de los Investigadores (IPN-EDI)","award":["SIP-2259","SIP-20250071","SIP-20251352","SIP-20251124","SIP-20253439"],"award-info":[{"award-number":["SIP-2259","SIP-20250071","SIP-20251352","SIP-20251124","SIP-20253439"]}]},{"name":"Secretar\u00eda de Ciencia, Humanidades, Tecnolog\u00eda e Innovaci\u00f3n, Sistema Nacional de Investigadores (SECIHTI-SNII)","award":["SIP-2259","SIP-20250071","SIP-20251352","SIP-20251124","SIP-20253439"],"award-info":[{"award-number":["SIP-2259","SIP-20250071","SIP-20251352","SIP-20251124","SIP-20253439"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["BDCC"],"abstract":"<jats:p>Emotion detection using computer vision has advanced significantly in recent years, achieving remarkable performance that, in some cases, surpasses that of humans. Convolutional neural networks (CNNs) excel in this task by capturing facial features that allow for effective emotion classification. However, most research focuses on basic emotions, such as happiness, anger, or sadness, neglecting more complex emotions, like frustration. People set expectations or goals to meet; if they do not happen, frustration arises, generating reactions such as annoyance, anger, and disappointment, which can harm confidence and motivation. These aspects make it especially relevant in mental health and educational contexts, where detecting it could help mitigate its adverse effects. In this research, we developed a CNN-based approach to detect frustration through facial expressions. The scarcity of specific datasets for this task led us to create an experimental protocol to generate our dataset. This classification task presents a high degree of difficulty due to the variability in facial expressions among different participants when feeling frustrated. 
Despite this, our new model achieved an F1-score of 0.8080, thus obtaining an adequate baseline model.<\/jats:p>","DOI":"10.3390\/bdcc9080195","type":"journal-article","created":{"date-parts":[[2025,7,24]],"date-time":"2025-07-24T07:54:26Z","timestamp":1753343666000},"page":"195","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":0,"title":["Discovering the Emotions of Frustration and Confidence During the Application of Cognitive Tests in Mexican University Students"],"prefix":"10.3390","volume":"9","author":[{"ORCID":"https:\/\/orcid.org\/0000-0003-1028-9197","authenticated-orcid":false,"given":"Marco A.","family":"Moreno-Armend\u00e1riz","sequence":"first","affiliation":[{"name":"Center for Computing Research, Instituto Polit\u00e9cnico Nacional, Mexico City 07738, Mexico"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-7129-7636","authenticated-orcid":false,"given":"Jes\u00fas","family":"Mercado-R\u00edos","sequence":"additional","affiliation":[{"name":"Center for Computing Research, Instituto Polit\u00e9cnico Nacional, Mexico City 07738, Mexico"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-4572-5713","authenticated-orcid":false,"given":"Jos\u00e9 E.","family":"Valdez-Rodr\u00edguez","sequence":"additional","affiliation":[{"name":"Center for Computing Research, Instituto Polit\u00e9cnico Nacional, Mexico City 07738, Mexico"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4454-8791","authenticated-orcid":false,"given":"Rolando","family":"Quintero","sequence":"additional","affiliation":[{"name":"Center for Computing Research, Instituto Polit\u00e9cnico Nacional, Mexico City 07738, Mexico"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5699-0478","authenticated-orcid":false,"given":"Victor H.","family":"Ponce-Ponce","sequence":"additional","affiliation":[{"name":"Center for Computing Research, Instituto Polit\u00e9cnico Nacional, Mexico City 07738, Mexico"}]}],"member":"1968","published-online":{"date-parts":[[2025,7,24]]},"reference":[{"key":"ref_1","first-page":"151","article-title":"Universal facial expressions of emotion","volume":"8","author":"Ekman","year":"1970","journal-title":"Calif. Ment. Health Res. Dig."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"24287","DOI":"10.1007\/s11042-021-10836-w","article-title":"Action unit classification for facial expression recognition using active learning and SVM","volume":"80","author":"Yao","year":"2021","journal-title":"Multimed. Tools Appl."},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Lucey, P., Cohn, J.F., Kanade, T., Saragih, J., Ambadar, Z., and Matthews, I. (2010, January 13\u201318). The extended cohn-kanade dataset (ck+): A complete dataset for action unit and emotion-specified expression. Proceedings of the 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition-Workshops, San Francisco, CA, USA.","DOI":"10.1109\/CVPRW.2010.5543262"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Roy, A.K., Kathania, H.K., Sharma, A., Dey, A., and Ansari, M.S.A. (2024). ResEmoteNet: Bridging Accuracy and Loss Reduction in Facial Emotion Recognition. arXiv.","DOI":"10.36227\/techrxiv.172651476.62062165\/v1"},{"key":"ref_5","unstructured":"Goodfellow, I., Cukierski, W., and Bengio, Y. (2025, July 23). Challenges in Representation Learning: Facial Expression Recognition Challenge. Kaggle. 
Available online: https:\/\/kaggle.com\/competitions\/challenges-in-representation-learning-facial-expression-recognition-challenge."},{"key":"ref_6","unstructured":"Her, M.B., Jeong, J., Song, H., and Han, J.H. (2024). Batch Transformer: Look for Attention in Batch. arXiv."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"18","DOI":"10.1109\/TAFFC.2017.2740923","article-title":"Affectnet: A database for facial expression, valence, and arousal computing in the wild","volume":"10","author":"Mollahosseini","year":"2017","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_8","unstructured":"Grafsgaard, J., Wiggins, J.B., Boyer, K.E., Wiebe, E.N., and Lester, J. (2013, January 6\u20139). Automatically recognizing facial expression: Predicting engagement and frustration. Proceedings of the Educational Data Mining, Memphis, TN, USA."},{"key":"ref_9","unstructured":"(2025, June 16). Facial Action Coding System (FACS)\u2014A Visual Guidebook. Available online: https:\/\/imotions.com\/blog\/learning\/research-fundamentals\/facial-action-coding-system\/."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"323","DOI":"10.1109\/T-AFFC.2012.11","article-title":"Exploring temporal patterns in classifying frustrated and delighted smiles","volume":"3","author":"Hoque","year":"2012","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_11","first-page":"254","article-title":"ULearn: Understanding and reacting to student frustration using deep learning, mobile vision and NLP","volume":"Volume 11018","author":"Grewe","year":"2019","journal-title":"Proceedings of the Signal Processing, Sensor\/Information Fusion, and Target Recognition XXVIII"},{"key":"ref_12","unstructured":"Arriaga, O., Valdenegro-Toro, M., and Pl\u00f6ger, P. (2017). Real-time convolutional neural networks for emotion and gender classification. arXiv."},{"key":"ref_13","unstructured":"(2024, November 16). La Frustraci\u00f3n, \u00bfc\u00f3mo Manejarla?. Available online: https:\/\/www.unisabana.edu.co\/portaldenoticias\/al-dia\/la-frustracion-como-manejarla\/."},{"key":"ref_14","unstructured":"(2024, November 16). Trabajemos en la Tolerancia a la Frustraci\u00f3n. Available online: https:\/\/www.gaceta.unam.mx\/trabajemos-en-la-tolerancia-a-la-frustracion\/#:~:text=La%20frustraci%C3%B3n%20es%20la%20respuesta,mayor%20ser%C3%A1%20la%20frustraci%C3%B3n%20resultante."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"367","DOI":"10.2466\/pms.1977.44.2.367","article-title":"Paced auditory serial-addition task: A measure of recovery from concussion","volume":"44","author":"Gronwall","year":"1977","journal-title":"Percept. Mot. Ski."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"80","DOI":"10.5839\/rcnp.2011.0602.04","article-title":"Estrategias de resoluci\u00f3n del PASAT en pacientes con Esclerosis M\u00faltiple y viabili-dad de una versi\u00f3n corta del test","volume":"6","author":"Cores","year":"2011","journal-title":"Rev. Chil. Neuropsicol."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Baghdadi, G., Towhidkhah, F., and Rajabi, M. (2021). Chapter 7\u2014Assessment methods. Neurocognitive Mechanisms of Attention, Academic Press.","DOI":"10.1016\/B978-0-323-90935-8.00005-6"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"352","DOI":"10.1037\/h0043688","article-title":"Age differences in short-term retention of rapidly changing information","volume":"55","author":"Kirchner","year":"1958","journal-title":"J. Exp. 
Psychol."},{"key":"ref_19","first-page":"70","article-title":"Requisitos metodol\u00f3gicos y estad\u00edsticos para publicaciones cient\u00edficas: Parte I","volume":"66","author":"Castiglia","year":"2000","journal-title":"Rev. Asoc. Argent. Ortop. Traumatol."},{"key":"ref_20","unstructured":"(2025, June 19). Cognitive Software. Available online: https:\/\/drive.google.com\/file\/d\/1kXCgibJm32hVxH0JOFU6oC72GpwtQuIA\/view?usp=sharing."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"40","DOI":"10.1080\/00221309.1935.9920087","article-title":"La naturaleza gen\u00e9rica de los conceptos de est\u00edmulo y respuesta","volume":"12","author":"Skinner","year":"1935","journal-title":"J. Gen. Psychol."},{"key":"ref_22","unstructured":"(2025, June 14). Consent and Rights Session Forms of Frustration Dataset. Available online: https:\/\/drive.google.com\/file\/d\/1eL6n-jTmK-PNocTYgcEey03AMDTqAHel\/view?usp=sharing."},{"key":"ref_23","unstructured":"Google (2024, August 15). Gu\u00eda de Detecci\u00f3n de Puntos de Referencia Facial para Python. Available online: https:\/\/ai.google.dev\/edge\/mediapipe\/solutions\/vision\/face_landmarker\/python?hl=es-419."},{"key":"ref_24","unstructured":"Paredes, L. (2024, October 15). Emociones B\u00e1sicas y Complejas. Available online: https:\/\/psicologalorenaparedes.wordpress.com\/2024\/03\/28\/emociones-basicas-y-complejas\/."},{"key":"ref_25","first-page":"20","article-title":"La estructura de la emoci\u00f3n humana: Un modelo crom\u00e1tico del sistema afectivo","volume":"24","author":"Enrique","year":"2001","journal-title":"Salud Ment."},{"key":"ref_26","unstructured":"(2024, October 20). Transforming and Augmenting Images. Available online: https:\/\/pytorch.org\/vision\/0.12\/transforms.html#transforming-and-augmenting-images."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"2278","DOI":"10.1109\/5.726791","article-title":"Gradient-based learning applied to document recognition","volume":"86","author":"LeCun","year":"2002","journal-title":"Proc. IEEE"},{"key":"ref_28","unstructured":"(2025, July 14). FrustNet. Available online: https:\/\/github.com\/EduardoValdezRdz\/FrustNet."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"1","DOI":"10.5121\/ijdkp.2015.5201","article-title":"A review on evaluation metrics for data classification evaluations","volume":"5","author":"Hossin","year":"2015","journal-title":"Int. J. Data Min. Knowl. Manag. Process"},{"key":"ref_30","first-page":"1","article-title":"Confusion matrix in binary classification problems: A step-by-step tutorial","volume":"6","author":"Amin","year":"2022","journal-title":"J. Eng. Res."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"133982","DOI":"10.1109\/ACCESS.2020.3010715","article-title":"Deep Neural Networks for Human Activity Recognition With Wearable Sensors: Leave-One-Subject-Out Cross-Validation for Model Selection","volume":"8","author":"Gholamiangonabadi","year":"2020","journal-title":"IEEE Access"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Kuhn, M., and Johnson, K. (2013). Over-Fitting and Model Tuning. Applied Predictive Modeling, Springer.","DOI":"10.1007\/978-1-4614-6849-3"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Selvaraju, R.R., Cogswell, M., Das, A., Vedantam, R., Parikh, D., and Batra, D. (2017, January 22\u201329). Grad-cam: Visual explanations from deep networks via gradient-based localization. 
Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy.","DOI":"10.1109\/ICCV.2017.74"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"King, A.P., and Eckersley, R.J. (2019). Chapter 7\u2014Inferential Statistics IV: Choosing a Hypothesis Test. Statistics for Biomedical Engineers and Scientists, Academic Press.","DOI":"10.1016\/B978-0-08-102939-8.00016-5"}],"container-title":["Big Data and Cognitive Computing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2504-2289\/9\/8\/195\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,9]],"date-time":"2025-10-09T18:14:59Z","timestamp":1760033699000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2504-2289\/9\/8\/195"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,7,24]]},"references-count":34,"journal-issue":{"issue":"8","published-online":{"date-parts":[[2025,8]]}},"alternative-id":["bdcc9080195"],"URL":"https:\/\/doi.org\/10.3390\/bdcc9080195","relation":{},"ISSN":["2504-2289"],"issn-type":[{"type":"electronic","value":"2504-2289"}],"subject":[],"published":{"date-parts":[[2025,7,24]]}}}
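The record above is a standard work object from the public Crossref REST API (the `api.crossref.org/works/{DOI}` route). As a minimal sketch of how such a record can be fetched and read back, the snippet below retrieves this DOI and pulls out the fields shown above; the field accesses assume the schema exactly as deposited, and the `mailto` contact in the User-Agent is a hypothetical placeholder, not part of the source.

```python
import json
import urllib.request

# Minimal sketch: fetch the Crossref work record shown above.
# Assumes network access; Crossref asks polite clients to include a contact.
DOI = "10.3390/bdcc9080195"
req = urllib.request.Request(
    f"https://api.crossref.org/works/{DOI}",
    headers={"User-Agent": "example-script (mailto:you@example.org)"},  # placeholder contact
)

with urllib.request.urlopen(req) as resp:
    record = json.load(resp)

# The payload sits under "message", mirroring the structure of the record above.
work = record["message"]
print(work["title"][0])              # article title (Crossref stores titles as lists)
print(work["container-title"][0])    # "Big Data and Cognitive Computing"
print(work["issued"]["date-parts"])  # [[2025, 7, 24]]

# The bibliography is the "reference" list; its length matches "references-count" (34).
assert len(work["reference"]) == work["references-count"]
```

The same structure applies to any DOI served by Crossref, which is why the parsing keys above match the record verbatim.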