{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T04:13:19Z","timestamp":1760242399215,"version":"build-2065373602"},"reference-count":55,"publisher":"MDPI AG","issue":"7","license":[{"start":{"date-parts":[[2017,6,30]],"date-time":"2017-06-30T00:00:00Z","timestamp":1498780800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Symmetry"],"abstract":"<jats:p>The application of user emotion recognition for fear is expanding in various fields, including the quantitative evaluation of horror movies, dramas, advertisements, and games, and the monitoring of emergency situations in convenience stores (e.g., a clerk threatened by a robber), in addition to criminal psychology. Most existing methods for the recognition of fear refer to a single physiological signal, or recognize circumstances in which users feel fear by selecting the most informative signal among multiple physiological signals. However, both the accuracy and the credibility of these methods are low. Therefore, in this study, data with high credibility were obtained using non-intrusive multimodal sensors of near-infrared and far-infrared light cameras, and were selected based on t-tests and Cohen\u2019s d analysis considering the symmetrical characteristics of the face and facial feature points. The selected data were then combined in a fuzzy system using input and output membership functions of symmetrical shape to derive a new method that can quantitatively show the level of a user\u2019s fear. The proposed method is designed to enhance conventional subjective evaluation (SE) with a fuzzy system based on multiple modalities. 
By using four objective features other than SE and combining these four features in a fuzzy system, our method can produce an accurate level of fear without being affected by the physical, psychological, or fatigue condition of the participants during SE. In a study of 20 subjects of various races and genders, the results indicate that the new method proposed in this study recognizes fear with higher credibility than the methods used in previous studies.<\/jats:p>","DOI":"10.3390\/sym9070102","type":"journal-article","created":{"date-parts":[[2017,6,30]],"date-time":"2017-06-30T10:04:58Z","timestamp":1498817098000},"page":"102","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":7,"title":["Fuzzy System-Based Fear Estimation Based on the Symmetrical Characteristics of Face and Facial Feature Points"],"prefix":"10.3390","volume":"9","author":[{"given":"Kwan","family":"Lee","sequence":"first","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 100-715, Korea"}]},{"given":"Hyung","family":"Hong","sequence":"additional","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 100-715, Korea"}]},{"given":"Kang","family":"Park","sequence":"additional","affiliation":[{"name":"Division of Electronics and Electrical Engineering, Dongguk University, 30 Pildong-ro 1-gil, Jung-gu, Seoul 100-715, Korea"}]}],"member":"1968","published-online":{"date-parts":[[2017,6,30]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Kwon, D.-S., Kwak, Y.K., Park, J.C., Chung, M.J., Jee, E.-S., Park, K.-S., Kim, H.-R., Kim, Y.-M., Park, J.-C., and Kim, E.H. (2007, January 26\u201329). Emotion interaction system for a service robot. 
Proceedings of the 16th IEEE International Conference on Robot and Human Interactive Communication, Jeju, Korea.","DOI":"10.1109\/ROMAN.2007.4415108"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Machot, F.A., Mosa, A.H., Dabbour, K., Fasih, A., Schwarzlm\u00fcller, C., Ali, M., and Kyamakya, K. (2011, January 25\u201327). A novel real-time emotion detection system from audio streams based on Bayesian quadratic discriminate classifier for ADAS. Proceedings of the Joint 3rd International Workshop on Nonlinear Dynamics and Synchronization and 16th International Symposium on Theoretical Electrical Engineering, Klagenfurt, Austria.","DOI":"10.1109\/INDS.2011.6024783"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"889","DOI":"10.1162\/jocn.2006.18.6.889","article-title":"Fear recognition ability predicts differences in social cognitive and neural functioning in men","volume":"18","author":"Corden","year":"2006","journal-title":"J. Cogn. Neurosci."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"470","DOI":"10.1167\/9.8.470","article-title":"On the neural mechanism of fear recognition","volume":"9","author":"Roy","year":"2009","journal-title":"J. Vis."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"50","DOI":"10.1111\/1475-3588.00047","article-title":"Fear recognition and the neural basis of social cognition","volume":"8","author":"Skuse","year":"2003","journal-title":"Child Adolesc. Ment. Health"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Visser-Keizer, A.C., Westerhof-Evers, H.J., Gerritsen, M.J.J., van der Naalt, J., and Spikman, J.M. (2016). To fear is to gain? The role of fear recognition in risky decision making in TBI patients and healthy controls. 
PLoS ONE, 11.","DOI":"10.1371\/journal.pone.0166995"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"4121","DOI":"10.1073\/pnas.062018499","article-title":"Fear recognition in the voice is modulated by unconsciously recognized facial expressions but not by unconsciously recognized affective pictures","volume":"99","author":"Pourtois","year":"2002","journal-title":"Proc. Natl. Acad. Sci. USA"},{"key":"ref_8","unstructured":"(2016, October 17). Facial Recognition Software SHORE\u2122: Fast, Reliable and Real-time Capable. Available online: http:\/\/www.iis.fraunhofer.de\/en\/ff\/bsy\/tech\/bildanalyse\/shore-gesichtsdetektion.html."},{"key":"ref_9","first-page":"356","article-title":"Visual-based emotion detection for natural man-machine interaction","volume":"5243","author":"Strupp","year":"2008","journal-title":"Lect. Notes Artif. Intell."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"94","DOI":"10.1007\/978-3-540-24837-8_10","article-title":"Authentic emotion detection in real-time video","volume":"3058","author":"Sun","year":"2004","journal-title":"Lect. Notes Comput. Sci."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"160","DOI":"10.1016\/S1077-3142(03)00081-X","article-title":"Facial expression recognition from video sequences: Temporal and static modeling","volume":"91","author":"Cohen","year":"2003","journal-title":"Comput. Vis. Image Underst."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"32","DOI":"10.1109\/79.911197","article-title":"Emotion recognition in human-computer interaction","volume":"18","author":"Cowie","year":"2001","journal-title":"IEEE Signal Process. Mag."},{"key":"ref_13","unstructured":"Pal, P., Iyer, A.N., and Yantorno, R.E. (2006, January 14\u201319). Emotion detection from infant facial expressions and cries. 
Proceedings of the IEEE International Conference on Acoustics Speech and Signal Processing, Toulouse, France."},{"key":"ref_14","unstructured":"De Silva, L.C., Miyasato, T., and Nakatsu, R. (1997, January 9\u201312). Facial emotion recognition using multi-modal information. Proceedings of the International Conference on Information, Communications and Signal Processing, Singapore."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"196","DOI":"10.1007\/s10527-008-9045-9","article-title":"Application of vibraimage technology and system for analysis of motor activity and study of functional state of the human body","volume":"42","author":"Minkin","year":"2008","journal-title":"Biomed. Eng."},{"key":"ref_16","unstructured":"Pavlidis, I., Levine, J., and Baukol, P. (2001, January 7\u201310). Thermal image analysis for anxiety detection. Proceedings of the IEEE International Conference on Image Processing, Thessaloniki, Greece."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"427","DOI":"10.5143\/JESK.2012.31.3.427","article-title":"Emotion recognition using facial thermal images","volume":"31","author":"Eom","year":"2012","journal-title":"J. Ergon. Soc. Korea"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Bedoya-Jaramillo, S., Belalcazar-Bola\u00f1os, E., Villa-Ca\u00f1as, T., Orozco-Arroyave, J.R., Arias-Londo\u00f1o, J.D., and Vargas-Bonilla, J.F. (2012, January 12\u201314). Automatic emotion detection in speech using mel frequency cepstral coefficients. Proceedings of the XVII Symposium of Image, Signal Processing, and Artificial Vision, Medell\u00edn, Colombia.","DOI":"10.1109\/STSIVA.2012.6340558"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Sanchez, M.H., Tur, G., Ferrer, L., and Hakkani-T\u00fcr, D. (2010, January 26\u201330). Domain adaptation and compensation for emotion detection. 
Proceedings of the Interspeech 2010, Makuhari, Japan.","DOI":"10.21437\/Interspeech.2010-685"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"102","DOI":"10.1109\/T-AFFC.2011.28","article-title":"ECG pattern analysis for emotion detection","volume":"3","author":"Agrafioti","year":"2012","journal-title":"IEEE Trans. Affect. Comput."},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Lin, Y.-P., Wang, C.-H., Wu, T.-L., Jeng, S.-K., and Chen, J.-H. (2009, January 19\u201324). EEG-based emotion recognition in music listening: a comparison of schemes for multiclass support vector machine. Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing, Taipei, Taiwan.","DOI":"10.1109\/ICASSP.2009.4959627"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Cheemalapati, S., Gubanov, M., Vale, M.D., and Pyayt, A. (2013, January 14\u201316). A real-time classification algorithm for emotion detection using portable EEG. Proceedings of the 14th International Conference on Information Reuse and Integration, San Francisco, CA, USA.","DOI":"10.1109\/IRI.2013.6642541"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"685","DOI":"10.1016\/j.cogbrainres.2005.04.002","article-title":"Electrophysiological ratio markers for the balance between reward and punishment","volume":"24","author":"Schutter","year":"2005","journal-title":"Cogn. Brain Res."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"73","DOI":"10.1016\/j.biopsycho.2009.10.008","article-title":"EEG theta\/beta ratio in relation to fear-modulated response-inhibition, attentional control, and affective traits","volume":"83","author":"Putman","year":"2010","journal-title":"Biol. Psychol."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"419","DOI":"10.1007\/BF02344719","article-title":"Emotion recognition system using short-term monitoring of physiological signals","volume":"42","author":"Kim","year":"2004","journal-title":"Med. 
Biol. Eng. Comput."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"263","DOI":"10.1016\/j.biopsych.2007.05.013","article-title":"Exogenous testosterone enhances responsiveness to social threat in the neural circuitry of social aggression in humans","volume":"63","author":"Hermans","year":"2008","journal-title":"Biol. Psychiatry"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"17507","DOI":"10.3390\/s150717507","article-title":"Evaluation of fear using nonintrusive measurement of multimodal sensors","volume":"15","author":"Choi","year":"2015","journal-title":"Sensors"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"34","DOI":"10.1016\/j.ijpsycho.2005.04.007","article-title":"From emotion perception to emotion experience: Emotions evoked by pictures and classical music","volume":"60","author":"Baumgartner","year":"2006","journal-title":"Int. J. Psychophysiol."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Cheng, K.-S., Chen, Y.-S., and Wang, T. (2012, January 17\u201319). Physiological parameters assessment for emotion recognition. Proceedings of the IEEE EMBS International Conference on Biomedical Engineering and Sciences, Langkawi, Malaysia.","DOI":"10.1109\/IECBES.2012.6498118"},{"key":"ref_30","unstructured":"Chun, J., Lee, H., Park, Y.S., Park, W., Park, J., Han, S.H., Choi, S., and Kim, G.H. (2007, January 17\u201319). Real-time classification of fear\/panic emotion based on physiological signals. 
Proceedings of the Eighth Pan-Pacific Conference on Occupational Ergonomics, Bangkok, Thailand."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"3963","DOI":"10.1080\/03610928908830135","article-title":"The two-sample test versus Satterthwaite\u2019s approximate f-test","volume":"18","author":"Moser","year":"1989","journal-title":"Commun. Statist. Theory Meth."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"591","DOI":"10.1111\/j.1469-185X.2007.00027.x","article-title":"Effect size, confidence interval and statistical significance: a practical guide for biologists","volume":"82","author":"Nakagawa","year":"2007","journal-title":"Biol. Rev."},{"key":"ref_33","unstructured":"(2016, October 14). Tau\u00ae2 Uncooled Cores. Available online: http:\/\/www.flir.com\/cores\/display\/?id=54717."},{"key":"ref_34","unstructured":"(2016, October 14). Webcam C600. Available online: http:\/\/www.logitech.com\/en-us\/support\/5869."},{"key":"ref_35","unstructured":"(2016, October 14). SFH 4550. Available online: http:\/\/www.osram-os.com\/Graphics\/XPic3\/00116140_0.pdf."},{"key":"ref_36","unstructured":"(2016, October 14). Samsung Smart TV. Available online: http:\/\/www.samsung.com\/us\/system\/consumer\/product\/un\/60\/es\/un60es8000fxza\/7654_SlimLED_60_8000_V14.pdf."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"23","DOI":"10.5594\/J15117","article-title":"Research on human factors in ultrahigh-definition television (UHDTV) to determine its specifications","volume":"117","author":"Sugawara","year":"2008","journal-title":"SMPTE Motion Imaging J."},{"key":"ref_38","first-page":"100110C-1","article-title":"Face liveness detection for face recognition based on cardiac features of skin color image","volume":"10011","author":"Suh","year":"2016","journal-title":"Proc. SPIE"},{"key":"ref_39","unstructured":"(2016, November 03). Dlib C++ Library (Real-time face pose estimation). 
Available online: http:\/\/blog.dlib.net\/2014\/08\/real-time-face-pose-estimation.html."},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Kazemi, V., and Sullivan, J. (2014, January 23\u201328). One millisecond face alignment with an ensemble of regression trees. Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA.","DOI":"10.1109\/CVPR.2014.241"},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Suh, K.H., Kim, Y., and Lee, E.C. (2016). Facial feature movements caused by various emotions: differences according to sex. Symmetry-Basel, 8.","DOI":"10.3390\/sym8090086"},{"key":"ref_42","first-page":"164","article-title":"Why trapezoidal and triangular membership functions work so well: Towards a theoretical explanation","volume":"8","author":"Barua","year":"2014","journal-title":"J. Uncertain Syst."},{"key":"ref_43","unstructured":"Dowdy, S., and Wearden, S. (1983). Statistics for Research, John Wiley & Sons."},{"key":"ref_44","doi-asserted-by":"crossref","first-page":"1668","DOI":"10.1016\/j.asoc.2012.01.023","article-title":"Sustainable supplier selection: A ranking model based on fuzzy inference system","volume":"12","author":"Amindoust","year":"2012","journal-title":"Appl. Soft Comput."},{"key":"ref_45","unstructured":"Aboelela, E., and Douligeris, C. (1999, January 18\u201320). Fuzzy temporal reasoning model for event correlation in network management. 
Proceedings of the IEEE International Conference on Local Computer Networks, Lowell, MA, USA."},{"key":"ref_46","doi-asserted-by":"crossref","first-page":"159","DOI":"10.1016\/S0165-0114(97)00337-0","article-title":"Defuzzification: Criteria and classification","volume":"108","author":"Leekwijck","year":"1999","journal-title":"Fuzzy Sets Syst."},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"904","DOI":"10.1016\/j.fss.2005.11.005","article-title":"Fast and accurate center of gravity defuzzification of fuzzy system outputs defined on trapezoidal fuzzy partitions","volume":"157","author":"Broekhoven","year":"2006","journal-title":"Fuzzy Sets Syst."},{"key":"ref_48","unstructured":"(2016, October 21). Epitaph (2007 Film). Available online: https:\/\/en.wikipedia.org\/wiki\/Epitaph_(2007_film)."},{"key":"ref_49","unstructured":"(2016, October 21). The Conjuring. Available online: https:\/\/en.wikipedia.org\/wiki\/The_Conjuring."},{"key":"ref_50","unstructured":"(2016, October 21). Dead Silence. Available online: https:\/\/en.wikipedia.org\/wiki\/Dead_Silence."},{"key":"ref_51","unstructured":"(2016, October 21). Insidious (Film). Available online: https:\/\/en.wikipedia.org\/wiki\/Insidious_(film)."},{"key":"ref_52","unstructured":"Lang, P.J., Bradley, M.M., and Cuthbert, B.N. (2008). International Affective Picture System (IAPS): Affective Ratings of Pictures and Instruction Manual, University of Florida. Technical Report A-8."},{"key":"ref_53","doi-asserted-by":"crossref","first-page":"155","DOI":"10.1037\/0033-2909.112.1.155","article-title":"A power primer","volume":"112","author":"Cohen","year":"1992","journal-title":"Psychol. Bull."},{"key":"ref_54","doi-asserted-by":"crossref","first-page":"71","DOI":"10.1162\/jocn.1991.3.1.71","article-title":"Eigenfaces for recognition","volume":"3","author":"Turk","year":"1991","journal-title":"J. Cogn. Neurosci."},{"key":"ref_55","unstructured":"Friesen, W.V., and Ekman, P. (1984). 
EMFACS-7: Emotional Facial Action Coding System, University of California. unpublished manuscript."}],"container-title":["Symmetry"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2073-8994\/9\/7\/102\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T18:40:56Z","timestamp":1760208056000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2073-8994\/9\/7\/102"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2017,6,30]]},"references-count":55,"journal-issue":{"issue":"7","published-online":{"date-parts":[[2017,7]]}},"alternative-id":["sym9070102"],"URL":"https:\/\/doi.org\/10.3390\/sym9070102","relation":{},"ISSN":["2073-8994"],"issn-type":[{"type":"electronic","value":"2073-8994"}],"subject":[],"published":{"date-parts":[[2017,6,30]]}}}