{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T02:09:47Z","timestamp":1760234987437,"version":"build-2065373602"},"reference-count":36,"publisher":"MDPI AG","issue":"13","license":[{"start":{"date-parts":[[2021,7,5]],"date-time":"2021-07-05T00:00:00Z","timestamp":1625443200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Sensors"],
"abstract":"<jats:p>People tend to display fake expressions to conceal their true feelings. False expressions are observable by facial micromovements that occur for less than a second. Systems designed to recognize facial expressions (e.g., social robots, recognition systems for the blind, monitoring systems for drivers) may better understand the user\u2019s intent by identifying the authenticity of the expression. The present study investigated the characteristics of real and fake facial expressions of representative emotions (happiness, contentment, anger, and sadness) in a two-dimensional emotion model. Participants viewed a series of visual stimuli designed to induce real or fake emotions and were signaled to produce a facial expression at a set time. From the participant\u2019s expression data, feature variables (i.e., the degree and variance of movement, and vibration level) involving the facial micromovements at the onset of the expression were analyzed. The results indicated significant differences in the feature variables between the real and fake expression conditions. The differences varied according to facial regions as a function of emotions. This study provides appraisal criteria for identifying the authenticity of facial expressions that are applicable to future research and the design of emotion recognition systems.<\/jats:p>",
"DOI":"10.3390\/s21134616","type":"journal-article","created":{"date-parts":[[2021,7,6]],"date-time":"2021-07-06T02:59:47Z","timestamp":1625540387000},"page":"4616","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":6,"title":["The Analysis of Emotion Authenticity Based on Facial Micromovements"],"prefix":"10.3390","volume":"21","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-1242-9264","authenticated-orcid":false,"given":"Sung","family":"Park","sequence":"first","affiliation":[{"name":"School of Design, Savannah College of Art and Design, Savannah, GA 31401, USA"}]},{"given":"Seong Won","family":"Lee","sequence":"additional","affiliation":[{"name":"Department of Human Centered Artificial Intelligence, Sangmyung University, Jongno-gu, Seoul 03016, Korea"}]},{"given":"Mincheol","family":"Whang","sequence":"additional","affiliation":[{"name":"Department of Human Centered Artificial Intelligence, Sangmyung University, Jongno-gu, Seoul 03016, Korea"}]}],"member":"1968","published-online":{"date-parts":[[2021,7,5]]},"reference":[{"key":"ref_1","first-page":"1","article-title":"Nonverbal communication","volume":"30","author":"Patterson","year":"2010","journal-title":"Corsini Encycl. Psychol."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"37","DOI":"10.1037\/h0027993","article-title":"Nonverbal concomitants of perceived and intended persuasiveness","volume":"13","author":"Mehrabian","year":"1969","journal-title":"J. Pers. Soc. Psychol."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"3","DOI":"10.1007\/BF02173410","article-title":"Invited article: A parallel process model of nonverbal communication","volume":"19","author":"Patterson","year":"1995","journal-title":"J. Nonverbal Behav."},
{"key":"ref_4","doi-asserted-by":"crossref","first-page":"288","DOI":"10.1037\/h0036006","article-title":"Detecting deception from the body or face","volume":"29","author":"Ekman","year":"1974","journal-title":"J. Pers. Soc. Psychol."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1429","DOI":"10.1037\/0022-3514.72.6.1429","article-title":"The ability to detect deceit generalizes across different types of high-stake lies","volume":"72","author":"Frank","year":"1997","journal-title":"J. Pers. Soc. Psychol."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"205","DOI":"10.1196\/annals.1280.010","article-title":"Darwin, deception, and facial expression","volume":"1000","author":"Ekman","year":"2003","journal-title":"Ann. N. Y. Acad. Sci."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"243","DOI":"10.1016\/S0010-9452(84)80041-6","article-title":"Asymmetry of Facial Expression in Spontaneous Emotion","volume":"20","author":"Dopson","year":"1984","journal-title":"Cortex"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"145","DOI":"10.1016\/S0010-9452(80)80029-3","article-title":"Sensitivity to Emotional Expressions and Situations in Organic Patients","volume":"16","author":"Cicone","year":"1980","journal-title":"Cortex"},{"unstructured":"Duchenne, G.B., and de Boulogne, G.B. (1990). The Mechanism of Human Facial Expression, Cambridge University Press.","key":"ref_9"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"56","DOI":"10.1002\/mus.880130111","article-title":"Duchenne de boulogne: Electrodiagnosis of poliomyelitis","volume":"13","author":"Reincke","year":"1990","journal-title":"Muscle Nerve"},{"doi-asserted-by":"crossref","unstructured":"Ekman, P., Friesen, W.V., O\u2019Sullivan, M., and Rosenberg, E.L. (2005). Smiles When Lying. What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS), Oxford University Press.","key":"ref_11","DOI":"10.1093\/acprof:oso\/9780195179644.001.0001"},
{"key":"ref_12","doi-asserted-by":"crossref","first-page":"508","DOI":"10.1111\/j.1467-9280.2008.02116.x","article-title":"Reading between the lies: Identifying concealed and falsified emotions in universal facial expressions","volume":"19","author":"Porter","year":"2008","journal-title":"Psychol. Sci."},{"doi-asserted-by":"crossref","unstructured":"Endres, J., and Laidlaw, A. (2009). Micro-expression recognition training in medical students: A pilot study. BMC Med. Educ., 9.","key":"ref_13","DOI":"10.1186\/1472-6920-9-47"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"181","DOI":"10.1007\/s11031-011-9212-2","article-title":"Evidence for training the ability to read microexpressions of emotion","volume":"35","author":"Matsumoto","year":"2011","journal-title":"Motiv. Emot."},{"unstructured":"Ramachandran, V.S. (2012). The Tell-Tale Brain: A Neuroscientist\u2019s Quest for What Makes Us Human, WW Norton & Company.","key":"ref_15"},{"doi-asserted-by":"crossref","unstructured":"Sebe, N., Cohen, I., Gevers, T., and Huang, T.S. (2006, January 20\u201324). Emotion Recognition Based on Joint Visual and Audio Cues. Proceedings of the 18th International Conference on Pattern Recognition (ICPR\u201906), Hong Kong, China.","key":"ref_16","DOI":"10.1109\/ICPR.2006.489"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"1175","DOI":"10.1016\/j.procs.2017.05.025","article-title":"Emotion recognition using facial expressions","volume":"108","author":"Tarnowski","year":"2017","journal-title":"Procedia Comput. Sci."},{"doi-asserted-by":"crossref","unstructured":"See, J., Yap, M.H., Li, J., Hong, X., and Wang, S.-J. (2019, January 14\u201318). MEGC 2019\u2014The Second Facial Micro-Expressions Grand Challenge. Proceedings of the 2019 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019), Lille, France.","key":"ref_18","DOI":"10.1109\/FG.2019.8756611"},
{"doi-asserted-by":"crossref","unstructured":"Liu, Y., Du, H., Zheng, L., and Gedeon, T. (2019, January 14\u201318). A Neural Micro-Expression Recognizer. Proceedings of the 2019 14th IEEE International Conference on Automatic Face & Gesture Recognition (FG 2019), Lille, France.","key":"ref_19","DOI":"10.1109\/FG.2019.8756583"},{"unstructured":"Xie, H.X., Lo, L., Shuai, H.H., and Cheng, W.H. (2020). An Overview of Facial Micro-Expression Analysis: Data, Methodology and Challenge. arXiv.","key":"ref_20"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.vrih.2020.10.003","article-title":"Review of micro-expression spotting and recognition in video sequences","volume":"3","author":"Pan","year":"2021","journal-title":"Virtual Real. Intell. Hardw."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"121549","DOI":"10.1109\/ACCESS.2020.3006958","article-title":"Facial Micro-Expression Recognition Using Two-Dimensional Landmark Feature Maps","volume":"8","author":"Choi","year":"2020","journal-title":"IEEE Access"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"705","DOI":"10.1007\/s11265-020-01523-4","article-title":"Evaluation of the Spatio-Temporal Features and GAN for Micro-Expression Recognition System","volume":"92","author":"Liong","year":"2020","journal-title":"J. Signal Process. Syst."},{"doi-asserted-by":"crossref","unstructured":"Zhang, F., Zhang, T., Mao, Q., and Xu, C. (2018, January 18\u201323). Joint Pose and Expression Modeling for Facial Expression Recognition. Proceedings of the 2018 IEEE\/CVF Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA.","key":"ref_24","DOI":"10.1109\/CVPR.2018.00354"},
{"key":"ref_25","doi-asserted-by":"crossref","first-page":"174517","DOI":"10.1109\/ACCESS.2019.2942358","article-title":"Extended Local Binary Patterns for Efficient and Robust Spontaneous Facial Micro-Expression Recognition","volume":"7","author":"Guo","year":"2019","journal-title":"IEEE Access"},{"unstructured":"Nikolova, D., Petkova, P., Manolova, A., and Georgieva, P. (2018). ECG-based Emotion Recognition: Overview of Methods and Applications. ANNA\u201918; Advances in Neural Networks and Applications, VDE.","key":"ref_26"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"467","DOI":"10.3389\/fpsyg.2018.00467","article-title":"Biometric and Emotion Identification: An ECG Compression Based Method","volume":"9","author":"Ferreira","year":"2018","journal-title":"Front. Psychol."},{"key":"ref_28","first-page":"127","article-title":"Facial feature detection using Haar classifiers","volume":"21","author":"Wilson","year":"2006","journal-title":"J. Comput. Sci. Coll."},{"key":"ref_29","first-page":"1755","article-title":"Dlib-ml: A machine learning toolkit","volume":"10","author":"King","year":"2009","journal-title":"J. Mach. Learn. Res."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"49","DOI":"10.14253\/kjcn.2014.16.2.49","article-title":"Assessing Methods of Heart Rate Variability","volume":"16","author":"Park","year":"2014","journal-title":"Korean J. Clin. Neurophysiol."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"88","DOI":"10.1111\/j.1542-474X.2005.10101.x","article-title":"Heart rate variability: Measurement and clinical utility","volume":"10","author":"Kleiger","year":"2005","journal-title":"Ann. Noninvasive Electrocardiol."},
{"key":"ref_32","first-page":"418","article-title":"Functional atlas of emotional faces processing: A voxel-based meta-analysis of 105 functional magnetic resonance imaging studies","volume":"34","author":"Placentino","year":"2009","journal-title":"J. Psychiatry Neurosci."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"327","DOI":"10.1016\/S0010-9452(78)80061-6","article-title":"Asymmetries in Interpreting and Expressing a Posed Facial Expression","volume":"14","author":"Campbell","year":"1978","journal-title":"Cortex"},{"key":"ref_34","first-page":"e00465","article-title":"Facial micro-expression recognition: A machine learning approach","volume":"8","author":"Adegun","year":"2020","journal-title":"Sci. Afr."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"3067","DOI":"10.1109\/TPAMI.2017.2787130","article-title":"Facial Landmark Detection with Tweaked Convolutional Neural Networks","volume":"40","author":"Wu","year":"2017","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"doi-asserted-by":"crossref","unstructured":"Melinte, D.O., and Vladareanu, L. (2020). Facial Expressions Recognition for Human\u2013Robot Interaction Using Deep Convolutional Neural Networks with Rectified Adam Optimizer. Sensors, 20.","key":"ref_36","DOI":"10.3390\/s20082393"}],
"container-title":["Sensors"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1424-8220\/21\/13\/4616\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T06:26:23Z","timestamp":1760163983000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1424-8220\/21\/13\/4616"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,7,5]]},"references-count":36,"journal-issue":{"issue":"13","published-online":{"date-parts":[[2021,7]]}},"alternative-id":["s21134616"],"URL":"https:\/\/doi.org\/10.3390\/s21134616","relation":{},"ISSN":["1424-8220"],"issn-type":[{"type":"electronic","value":"1424-8220"}],"subject":[],"published":{"date-parts":[[2021,7,5]]}}}