{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,6,8]],"date-time":"2025-06-08T06:21:25Z","timestamp":1749363685733,"version":"3.37.3"},"reference-count":89,"publisher":"Springer Science and Business Media LLC","issue":"8","license":[{"start":{"date-parts":[[2024,8,1]],"date-time":"2024-08-01T00:00:00Z","timestamp":1722470400000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2024,8,5]],"date-time":"2024-08-05T00:00:00Z","timestamp":1722816000000},"content-version":"vor","delay-in-days":4,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100020959","name":"JST-Mirai Program","doi-asserted-by":"publisher","award":["JPMJMI20D7"],"award-info":[{"award-number":["JPMJMI20D7"]}],"id":[{"id":"10.13039\/501100020959","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100009427","name":"Telecommunications Advancement Foundation","doi-asserted-by":"publisher","id":[{"id":"10.13039\/501100009427","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001691","name":"Japan Society for the Promotion of Science","doi-asserted-by":"publisher","award":["23K17180"],"award-info":[{"award-number":["23K17180"]}],"id":[{"id":"10.13039\/501100001691","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100003790","name":"Hiroshima University","doi-asserted-by":"crossref","id":[{"id":"10.13039\/501100003790","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Int J of Soc Robotics"],"published-print":{"date-parts":[[2024,8]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>The \u201cthinking face\u201d is a facial signal used to convey being in thought. For androids, the thinking face may be important to achieve natural human\u2013robot interaction. 
However, the facial pattern necessary for portraying the thinking face remains unclear and has not yet been investigated in androids. The current study aims to (a) identify the facial patterns that people show when engaged in answering complex questions (i.e., thinking faces) and (b) clarify whether implementing the observed thinking faces in an android can facilitate natural human\u2013robot interaction. In Study 1, we analyzed the facial movements of 40 participants after they were prompted with difficult questions and identified five facial patterns corresponding to thinking faces. In Study 2, we focused on one of the observed patterns, furrowing of the brows combined with narrowing of the eyes, and implemented it in an android. The results showed that the thinking face enhanced the perception of being in thought, genuineness, human-likeness, and appropriateness in the android while decreasing eeriness. The free-description data also revealed that negative emotions were attributed to the thinking face. In Study 3, we compared the thinking and neutral faces in a question\u2013answer situation. The results showed that the android's thinking face facilitated the perception of being in thought and human-likeness. 
These findings suggest that the thinking face of androids can facilitate natural human\u2013robot interaction.<\/jats:p>","DOI":"10.1007\/s12369-024-01163-9","type":"journal-article","created":{"date-parts":[[2024,8,5]],"date-time":"2024-08-05T18:14:24Z","timestamp":1722881664000},"page":"1861-1877","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["How an Android Expresses \u201cNow Loading\u2026\u201d: Examining the Properties of Thinking Faces"],"prefix":"10.1007","volume":"16","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-7768-2738","authenticated-orcid":false,"given":"Shushi","family":"Namba","sequence":"first","affiliation":[]},{"given":"Wataru","family":"Sato","sequence":"additional","affiliation":[]},{"given":"Saori","family":"Namba","sequence":"additional","affiliation":[]},{"given":"Alexander","family":"Diel","sequence":"additional","affiliation":[]},{"given":"Carlos","family":"Ishi","sequence":"additional","affiliation":[]},{"given":"Takashi","family":"Minato","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2024,8,5]]},"reference":[{"key":"1163_CR1","doi-asserted-by":"publisher","first-page":"98","DOI":"10.1075\/gest.00012.bav","volume":"17","author":"J Bavelas","year":"2018","unstructured":"Bavelas J, Chovil N (2018) Some pragmatic functions of conversational facial gestures. Gesture 17:98\u2013127. https:\/\/doi.org\/10.1075\/gest.00012.bav","journal-title":"Gesture"},{"key":"1163_CR2","doi-asserted-by":"publisher","first-page":"165","DOI":"10.1080\/1047840X.2017.1328951","volume":"28","author":"A Scarantino","year":"2017","unstructured":"Scarantino A (2017) How to do things with emotional expressions: the theory of affective pragmatics. Psychol Inq 28:165\u2013185. 
https:\/\/doi.org\/10.1080\/1047840X.2017.1328951","journal-title":"Psychol Inq"},{"key":"1163_CR3","doi-asserted-by":"publisher","first-page":"163","DOI":"10.1080\/08351819109389361","volume":"25","author":"N Chovil","year":"1991","unstructured":"Chovil N (1991) Discourse-oriented facial displays in conversation. Res Lang Soc Interact 25:163\u2013194. https:\/\/doi.org\/10.1080\/08351819109389361","journal-title":"Res Lang Soc Interact"},{"key":"1163_CR4","doi-asserted-by":"publisher","first-page":"1478","DOI":"10.1037\/pspp0000272","volume":"119","author":"J Sun","year":"2020","unstructured":"Sun J, Harris K, Vazire S (2020) Is well-being associated with the quantity and quality of social interactions? J Pers Soc Psychol 119:1478\u20131496. https:\/\/doi.org\/10.1037\/pspp0000272","journal-title":"J Pers Soc Psychol"},{"key":"1163_CR5","doi-asserted-by":"publisher","first-page":"1011","DOI":"10.1037\/0022-3514.63.6.1011","volume":"63","author":"D Watson","year":"1992","unstructured":"Watson D, Clark LA, McIntyre CW, Hamaker S (1992) Affect, personality, and social activity. J Pers Soc Psychol 63:1011\u20131025","journal-title":"J Pers Soc Psychol"},{"key":"1163_CR6","doi-asserted-by":"publisher","DOI":"10.21437\/Interspeech.2017-631","author":"C Ishi","year":"2017","unstructured":"Ishi C, Minato T, Ishiguro H (2017) Motion analysis in vocalized surprise expressions. Interspeech. https:\/\/doi.org\/10.21437\/Interspeech.2017-631","journal-title":"Interspeech"},{"key":"1163_CR7","doi-asserted-by":"publisher","first-page":"1851","DOI":"10.1109\/jproc.2004.835355","volume":"92","author":"T Fukuda","year":"2004","unstructured":"Fukuda T, Jung MJ, Nakashima M et al (2004) Facial expressive robotic head system for human-robot communication and its application in home environment. Proceedings of the IEEE 92:1851\u20131865. 
https:\/\/doi.org\/10.1109\/jproc.2004.835355","journal-title":"Proceedings of the IEEE"},{"key":"1163_CR8","doi-asserted-by":"publisher","unstructured":"Kobayashi H, Hara F (1997) Facial interaction between animated 3D face robot and human beings. 1997 IEEE International conference on systems, man, and cybernetics computational cybernetics and simulation. https:\/\/doi.org\/10.1109\/icsmc.1997.633250","DOI":"10.1109\/icsmc.1997.633250"},{"key":"1163_CR9","doi-asserted-by":"publisher","first-page":"64","DOI":"10.3389\/fbioe.2015.00064","volume":"3","author":"N Lazzeri","year":"2015","unstructured":"Lazzeri N, Mazzei D, Greco A, Rotesi A, Lanat\u00e0 A, De Rossi DE (2015) Can a humanoid face be expressive? a psychophysiological investigation. Front Bioeng Biotechnol 3:64","journal-title":"Front Bioeng Biotechnol"},{"key":"1163_CR10","doi-asserted-by":"publisher","DOI":"10.1037\/0003-066X.48.4.384","volume":"48","author":"P Ekman","year":"1993","unstructured":"Ekman P (1993) Facial expression and emotion. Am Psychol 48:384\u2013392","journal-title":"Am Psychol"},{"issue":"2","key":"1163_CR11","doi-asserted-by":"publisher","first-page":"389","DOI":"10.1007\/s12369-021-00778-6","volume":"14","author":"R Stock-Homburg","year":"2022","unstructured":"Stock-Homburg R (2022) Survey of emotions in human\u2013robot interactions: perspectives from robotic psychology on 20 years of research. Int J Soc Robot 14(2):389\u2013411","journal-title":"Int J Soc Robot"},{"key":"1163_CR12","doi-asserted-by":"crossref","first-page":"51","DOI":"10.1515\/semi.1986.62.1-2.29","volume":"62","author":"MH Goodwin","year":"1986","unstructured":"Goodwin MH, Goodwin C (1986) Gesture and coparticipation in the activity of searching for a word. 
Semiotica 62:51\u201375","journal-title":"Semiotica"},{"key":"1163_CR13","doi-asserted-by":"publisher","first-page":"636671","DOI":"10.3389\/fpsyg.2021.636671","volume":"12","author":"V Heller","year":"2021","unstructured":"Heller V (2021) Embodied displays of \u201cdoing thinking\u201d epistemic and interactive functions of thinking displays in children\u2019s argumentative activities. Front Psychol 12:636671","journal-title":"Front Psychol"},{"key":"1163_CR14","doi-asserted-by":"publisher","first-page":"1017","DOI":"10.3390\/brainsci11081017","volume":"11","author":"N Nota","year":"2021","unstructured":"Nota N, Trujillo JP, Holler J (2021) Facial signals and social actions in multimodal face-to-face interaction. Brain Sci 11:1017. https:\/\/doi.org\/10.3390\/brainsci11081017","journal-title":"Brain Sci"},{"key":"1163_CR15","first-page":"159","volume":"9","author":"PER Bitti","year":"2014","unstructured":"Bitti PER, Bonfiglioli L, Melani P, Caterina R, Garotti P (2014) Expression and communication of doubt\/uncertainty through facial expression. Ricerche di pedagogia e didattica. J Theor Res Educ 9:159\u2013177","journal-title":"J Theor Res Educ"},{"key":"1163_CR16","doi-asserted-by":"publisher","first-page":"369","DOI":"10.1002\/ejsp.2420200502","volume":"20","author":"U Hess","year":"1990","unstructured":"Hess U, Kleck RE (1990) Differentiating emotion elicited and deliberate emotional facial expressions. Eur J Soc Psychol 20:369\u2013385. https:\/\/doi.org\/10.1002\/ejsp.2420200502","journal-title":"Eur J Soc Psychol"},{"key":"1163_CR17","doi-asserted-by":"publisher","first-page":"35","DOI":"10.1007\/s10919-008-0058-6","volume":"33","author":"KL Schmidt","year":"2009","unstructured":"Schmidt KL, Bhattacharya S, Denlinger R (2009) Comparison of deliberate and spontaneous facial movement in smiles and eyebrow raises. J Nonverbal Behav 33:35\u201345. 
https:\/\/doi.org\/10.1007\/s10919-008-0058-6","journal-title":"J Nonverbal Behav"},{"key":"1163_CR18","doi-asserted-by":"publisher","first-page":"593","DOI":"10.1007\/s12144-016-9448-9","volume":"36","author":"S Namba","year":"2017","unstructured":"Namba S, Makihara S, Kabir RS, Miyatani M, Nakao T (2017) Spontaneous facial expressions are different from posed facial expressions: morphological properties and dynamic sequences. Curr Psychol 36:593\u2013605. https:\/\/doi.org\/10.1007\/s12144-016-9448-9","journal-title":"Curr Psychol"},{"key":"1163_CR19","doi-asserted-by":"publisher","first-page":"29","DOI":"10.1177\/00238309050480010201","volume":"48","author":"E Krahmer","year":"2005","unstructured":"Krahmer E, Swerts M (2005) How children and adults produce and perceive uncertainty in audiovisual speech. Lang Speech 48:29\u201353. https:\/\/doi.org\/10.1177\/00238309050480010201","journal-title":"Lang Speech"},{"key":"1163_CR20","doi-asserted-by":"publisher","first-page":"366","DOI":"10.1080\/15475441.2019.1645669","volume":"15","author":"I H\u00fcbscher","year":"2019","unstructured":"H\u00fcbscher I, Vincze L, Prieto P (2019) Children\u2019s signaling of their uncertain knowledge state: prosody, face, and body cues come first. Lang Learn Dev 15:366\u2013389. https:\/\/doi.org\/10.1080\/15475441.2019.1645669","journal-title":"Lang Learn Dev"},{"key":"1163_CR21","doi-asserted-by":"publisher","first-page":"68","DOI":"10.1037\/1528-3542.3.1.68","volume":"3","author":"P Rozin","year":"2003","unstructured":"Rozin P, Cohen AB (2003) High frequency of facial expressions corresponding to confusion, concentration, and worry in an analysis of naturally occurring facial expressions of americans. Emotion 3:68\u201375. 
https:\/\/doi.org\/10.1037\/1528-3542.3.1.68","journal-title":"Emotion"},{"key":"1163_CR22","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/3470742","volume":"11","author":"A Diel","year":"2021","unstructured":"Diel A, Weigelt S, MacDorman KF (2021) A meta-analysis of the uncanny valley\u2019s independent and dependent variables. J Hum Robot Interact 11:1\u201333. https:\/\/doi.org\/10.1145\/3470742","journal-title":"J Hum Robot Interact"},{"key":"1163_CR23","doi-asserted-by":"publisher","first-page":"98","DOI":"10.1109\/MRA.2012.2192811","volume":"19","author":"M Mori","year":"2012","unstructured":"Mori M, MacDorman K, Kageki N (2012) The uncanny valley [from the field]. IEEE Robot Automat Mag 19:98\u2013100. https:\/\/doi.org\/10.1109\/MRA.2012.2192811","journal-title":"IEEE Robot Automat Mag"},{"key":"1163_CR24","doi-asserted-by":"publisher","first-page":"129","DOI":"10.1007\/s12369-016-0380-9","volume":"9","author":"C-C Ho","year":"2017","unstructured":"Ho C-C, MacDorman KF (2017) Measuring the uncanny valley effect: refinements to indices for perceived humanness, attractiveness, and eeriness. Int J Soc Robot 9:129\u2013139. https:\/\/doi.org\/10.1007\/s12369-016-0380-9","journal-title":"Int J Soc Robot"},{"key":"1163_CR25","doi-asserted-by":"publisher","first-page":"741","DOI":"10.1016\/j.chb.2010.10.018","volume":"27","author":"A Tinwell","year":"2011","unstructured":"Tinwell A, Grimshaw M, Nabi DA, Williams A (2011) Facial expression of emotion and perception of the uncanny valley in virtual characters. Comput Hum Behav 27:741\u2013749. https:\/\/doi.org\/10.1016\/j.chb.2010.10.018","journal-title":"Comput Hum Behav"},{"key":"1163_CR26","doi-asserted-by":"publisher","first-page":"1443","DOI":"10.1007\/s12369-020-00726-w","volume":"13","author":"C Thepsoonthorn","year":"2021","unstructured":"Thepsoonthorn C, Ogawa K-i, Miyake Y (2021) The exploration of the uncanny valley from the viewpoint of the robot\u2019s nonverbal behaviour. 
Int J Soc Robot Adv Online Publ 13:1443\u20131455. https:\/\/doi.org\/10.1007\/s12369-020-00726-w","journal-title":"Int J Soc Robot Adv Online Publ"},{"key":"1163_CR27","doi-asserted-by":"publisher","DOI":"10.3389\/fpsyg.2021.800657","volume":"12","author":"W Sato","year":"2021","unstructured":"Sato W, Namba S, Yang D, Nishida S, Ishi C, Minato T (2021) An android for emotional interaction: spatiotemporal validation of its facial expressions. Front Psychol 12:800657. https:\/\/doi.org\/10.3389\/fpsyg.2021.800657","journal-title":"Front Psychol"},{"key":"1163_CR28","doi-asserted-by":"publisher","first-page":"788","DOI":"10.1038\/44565","volume":"401","author":"DD Lee","year":"1999","unstructured":"Lee DD, Seung HS (1999) Learning the parts of objects by non-negative matrix factorization. Nature 401:788\u2013791. https:\/\/doi.org\/10.1038\/44565","journal-title":"Nature"},{"key":"1163_CR29","doi-asserted-by":"publisher","first-page":"3362","DOI":"10.1038\/s41598-021-83077-4","volume":"11","author":"S Namba","year":"2021","unstructured":"Namba S, Matsui H, Zloteanu M (2021) Distinct temporal features of genuine and deliberate facial expressions of surprise. Sci Rep 11:3362. https:\/\/doi.org\/10.1038\/s41598-021-83077-4","journal-title":"Sci Rep"},{"key":"1163_CR30","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0271047","volume":"17","author":"S Namba","year":"2022","unstructured":"Namba S, Nakamura K, Watanabe K (2022) The spatio-temporal features of perceived-as-genuine and deliberate expressions. PLOS ONE 17:e0271047. https:\/\/doi.org\/10.1371\/journal.pone.0271047","journal-title":"PLOS ONE"},{"key":"1163_CR31","doi-asserted-by":"publisher","unstructured":"Perusqu\u00eda-Hern\u00e1ndez M, Dollack F, Tan CK, Namba S, Ayabe-Kanamura S, Suzuki K (2021) Smile action unit detection from distal wearable electromyography and computer vision. In: Vol. 2021 16th IEEE international conference on automatic face and gesture recognition (FG 2021). 
IEEE Publications, 1\u20138. https:\/\/doi.org\/10.1109\/FG52635.2021.9667047","DOI":"10.1109\/FG52635.2021.9667047"},{"key":"1163_CR32","doi-asserted-by":"publisher","first-page":"175","DOI":"10.3758\/bf03193146","volume":"39","author":"F Faul","year":"2007","unstructured":"Faul F, Erdfelder E, Lang AG, Buchner A (2007) G*Power 3: a flexible statistical power analysis program for the social, behavioral, and biomedical sciences. Behav Res Methods 39:175\u2013191. https:\/\/doi.org\/10.3758\/bf03193146","journal-title":"Behav Res Methods"},{"key":"1163_CR33","volume-title":"Statistical power analysis for the behavioral sciences","author":"J Cohen","year":"1988","unstructured":"Cohen J (1988) Statistical power analysis for the behavioral sciences, 2nd edn. Lawrence Erlbaum, Hillsdale, NJ","edition":"2"},{"key":"1163_CR34","doi-asserted-by":"publisher","unstructured":"Baltru\u0161aitis T, Mahmoud M, Robinson P (2015) Cross-dataset learning and person-specific normalisation for automatic action unit detection. In: 11th IEEE international conference and workshops on automatic face and gesture recognition (FG), Vol. 6. IEEE Publications, 1\u20136. https:\/\/doi.org\/10.1109\/FG.2015.7284869","DOI":"10.1109\/FG.2015.7284869"},{"key":"1163_CR35","doi-asserted-by":"publisher","unstructured":"Baltru\u0161aitis T, Zadeh A, Lim YC, Morency LP (2018) Openface 2.0: facial behavior analysis toolkit. In: 13th IEEE international conference on automatic face & gesture recognition (FG 2018). IEEE Publications, 59\u201366. https:\/\/doi.org\/10.1109\/FG.2018.00019","DOI":"10.1109\/FG.2018.00019"},{"key":"1163_CR36","volume-title":"Facial action coding system","author":"P Ekman","year":"2002","unstructured":"Ekman P, Friesen WV, Hager JC (2002) Facial action coding system, 2nd edn. 
Research nexus e-book, Salt Lake City","edition":"2"},{"key":"1163_CR37","volume-title":"What the face reveals","author":"P Ekman","year":"2005","unstructured":"Ekman P, Rosenberg E (2005) What the face reveals, 2nd edn. Oxford University Press, New York","edition":"2"},{"key":"1163_CR38","doi-asserted-by":"publisher","unstructured":"Wood E, Baltru\u0161aitis T, Zhang X, Sugano Y, Robinson P, Bulling A (2015) Rendering of eyes for eye-shape registration and gaze estimation. In: proceedings of the IEEE international conference on computer vision, 3756\u20133764. https:\/\/doi.org\/10.1109\/ICCV.2015.428","DOI":"10.1109\/ICCV.2015.428"},{"key":"1163_CR39","doi-asserted-by":"publisher","first-page":"4164","DOI":"10.1073\/pnas.0308531101","volume":"101","author":"JP Brunet","year":"2004","unstructured":"Brunet JP, Tamayo P, Golub TR, Mesirov JP (2004) Metagenes and molecular pattern discovery using matrix factorization. Proc Natl Acad Sci U S A 101:4164\u20134169. https:\/\/doi.org\/10.1073\/pnas.0308531101","journal-title":"Proc Natl Acad Sci U S A"},{"key":"1163_CR40","doi-asserted-by":"publisher","first-page":"1495","DOI":"10.1093\/bioinformatics\/btm134","volume":"23","author":"H Kim","year":"2007","unstructured":"Kim H, Park H (2007) Sparse non-negative matrix factorizations via alternating non-negativity-constrained least squares for microarray data analysis. Bioinformatics 23:1495\u20131502. https:\/\/doi.org\/10.1093\/bioinformatics\/btm134","journal-title":"Bioinformatics"},{"key":"1163_CR41","doi-asserted-by":"publisher","first-page":"2684","DOI":"10.1093\/bioinformatics\/btn526","volume":"24","author":"LN Hutchins","year":"2008","unstructured":"Hutchins LN, Murphy SM, Singh P, Graber JH (2008) Position-dependent motif characterization using non-negative matrix factorization. Bioinformatics 24:2684\u20132690. https:\/\/doi.org\/10.1093\/bioinformatics\/btn526","journal-title":"Bioinformatics"},{"key":"1163_CR42","unstructured":"Del Re AC (2013) compute.es: compute effect sizes. R package version 0\u20132"},
{"key":"1163_CR43","doi-asserted-by":"publisher","first-page":"367","DOI":"10.1186\/1471-2105-11-367","volume":"11","author":"R Gaujoux","year":"2010","unstructured":"Gaujoux R, Seoighe C (2010) A flexible R package for nonnegative matrix factorization. BMC Bioinformatics 11:367. https:\/\/doi.org\/10.1186\/1471-2105-11-367","journal-title":"BMC Bioinformatics"},{"key":"1163_CR44","unstructured":"Iseki R (2023) Retrieved [Jul 10, 2023] from @@@@"},{"key":"1163_CR45","doi-asserted-by":"publisher","unstructured":"Wickham H, Averick M, Bryan J et al (2019) Welcome to the tidyverse. J Open Source Softw 4:1686. https:\/\/doi.org\/10.21105\/joss.01686","DOI":"10.21105\/joss.01686"},{"key":"1163_CR46","doi-asserted-by":"publisher","first-page":"4222","DOI":"10.3390\/s21124222","volume":"21","author":"S Namba","year":"2021","unstructured":"Namba S, Sato W, Osumi M, Shimokawa K (2021) Assessing automated facial action unit detection systems for analyzing cross-domain facial expression databases. Sensors (Basel) 21:4222. https:\/\/doi.org\/10.3390\/s21124222","journal-title":"Sensors (Basel)"},{"key":"1163_CR47","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1177\/1529100619832930","volume":"20","author":"LF Barrett","year":"2019","unstructured":"Barrett LF, Adolphs R, Marsella S, Martinez AM, Pollak SD (2019) Emotional expressions reconsidered: challenges to inferring emotion from human facial movements. Psychol Sci Public Interest 20:1\u201368. https:\/\/doi.org\/10.1177\/1529100619832930","journal-title":"Psychol Sci Public Interest"},{"key":"1163_CR48","doi-asserted-by":"publisher","first-page":"286","DOI":"10.1177\/0963721411422522","volume":"20","author":"LF Barrett","year":"2011","unstructured":"Barrett LF, Mesquita B, Gendron M (2011) Context in emotion perception. Curr Dir Psychol Sci 20:286\u2013290. 
https:\/\/doi.org\/10.1177\/0963721411422522","journal-title":"Curr Dir Psychol Sci"},{"key":"1163_CR49","doi-asserted-by":"publisher","first-page":"7559","DOI":"10.1073\/pnas.1812250116","volume":"116","author":"Z Chen","year":"2019","unstructured":"Chen Z, Whitney D (2019) Tracking the affective state of unseen persons. Proc Natl Acad Sci U S A 116:7559\u20137564. https:\/\/doi.org\/10.1073\/pnas.1812250116","journal-title":"Proc Natl Acad Sci U S A"},{"key":"1163_CR50","doi-asserted-by":"publisher","first-page":"378","DOI":"10.3389\/fpsyg.2017.00378","volume":"8","author":"Y Majima","year":"2017","unstructured":"Majima Y, Nishiyama K, Nishihara A, Hata R (2017) Conducting online behavioral research using crowdsourcing services in Japan. Front Psychol 8:378. https:\/\/doi.org\/10.3389\/fpsyg.2017.00378","journal-title":"Front Psychol"},{"key":"1163_CR51","doi-asserted-by":"publisher","first-page":"1539","DOI":"10.3758\/s13428-016-0813-2","volume":"49","author":"A Dawel","year":"2017","unstructured":"Dawel A, Wright L, Irons J, Dumbleton R, Palermo R, O\u2019Kearney R, McKone E (2017) Perceived emotion genuineness: normative ratings for popular facial expression stimuli and the development of perceived-as-genuine and perceived-as-fake sets. Behav Res Methods 49:1539\u20131562. https:\/\/doi.org\/10.3758\/s13428-016-0813-2","journal-title":"Behav Res Methods"},{"key":"1163_CR52","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0072589","author":"E Broadbent","year":"2013","unstructured":"Broadbent E, Kumar V, Li X, Sollers J, Stafford RQ, MacDonald BA, Wegner DM (2013) Robots with display screens: a robot with a humanlike face display is perceived to have more mind and a better personality. PLOS ONE. 
https:\/\/doi.org\/10.1371\/journal.pone.0072589","journal-title":"PLOS ONE"},{"key":"1163_CR53","doi-asserted-by":"publisher","first-page":"535","DOI":"10.1177\/0022243718822827","volume":"56","author":"M Mende","year":"2019","unstructured":"Mende M, Scott ML, van Doorn J, Grewal D, Shanks I (2019) Service robots rising: how humanoid robots influence service experiences and elicit compensatory consumer responses. J Mark Res 56:535\u2013556. https:\/\/doi.org\/10.1177\/0022243718822827","journal-title":"J Mark Res"},{"key":"1163_CR54","doi-asserted-by":"publisher","first-page":"97","DOI":"10.1016\/j.obhdp.2017.10.002","volume":"144","author":"A Cheshin","year":"2018","unstructured":"Cheshin A, Amit A, Van Kleef GA (2018) The interpersonal effects of emotion intensity in customer service: perceived appropriateness and authenticity of attendants\u2019 emotional displays shape customer trust and satisfaction. Organ Behav Hum Decis Processes 144:97\u2013111. https:\/\/doi.org\/10.1016\/j.obhdp.2017.10.002","journal-title":"Organ Behav Hum Decis Processes"},{"key":"1163_CR55","doi-asserted-by":"publisher","first-page":"360","DOI":"10.1177\/002224378402100402","volume":"21","author":"GA Churchill","year":"1984","unstructured":"Churchill GA, Peter JP (1984) Research design effects on the reliability of rating scales: a meta-analysis. J Mark Res 21:360\u2013375. https:\/\/doi.org\/10.1177\/002224378402100402","journal-title":"J Mark Res"},{"key":"1163_CR56","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/s0001-6918(99)00050-5","volume":"104","author":"CC Preston","year":"2000","unstructured":"Preston CC, Colman AM (2000) Optimal number of response categories in rating scales: reliability, validity, discriminating power, and respondent preferences. Acta Psychol 104:1\u201315. 
https:\/\/doi.org\/10.1016\/s0001-6918(99)00050-5","journal-title":"Acta Psychol"},{"key":"1163_CR57","first-page":"263","volume-title":"Handbook of survey research 2nd","author":"JA Krosnick","year":"2010","unstructured":"Krosnick JA, Presser S (2010) Question and questionnaire design. In: Marsden PV, Wright JD (eds) Handbook of survey research 2nd. Emerald Publishing Group, Bingley, UK, pp 263\u2013313"},{"key":"1163_CR58","doi-asserted-by":"publisher","first-page":"1","DOI":"10.18637\/jss.v082.i13","volume":"82","author":"A Kuznetsova","year":"2017","unstructured":"Kuznetsova A, Brockhoff PB, Christensen RHB (2017) lmerTest package: tests in linear mixed effects models. J Stat Soft 82:1\u201326","journal-title":"J Stat Soft"},{"key":"1163_CR59","unstructured":"Revelle W, Revelle MW (2015) Package \u201cpsych\u201d. The Compr R Arch Netw 337"},{"key":"1163_CR60","doi-asserted-by":"publisher","first-page":"17","DOI":"10.1007\/s10919-008-0059-5","volume":"33","author":"Z Ambadar","year":"2009","unstructured":"Ambadar Z, Cohn JF, Reed LI (2009) All smiles are not created equal: morphology and timing of smiles perceived as amused, polite, and embarrassed\/nervous. J Nonverbal Behav 33:17\u201334. https:\/\/doi.org\/10.1007\/s10919-008-0059-5","journal-title":"J Nonverbal Behav"},{"key":"1163_CR61","doi-asserted-by":"publisher","first-page":"1566","DOI":"10.1177\/17456916211071083","volume":"17","author":"EG Krumhuber","year":"2022","unstructured":"Krumhuber EG, Kappas A (2022) More what Duchenne smiles do, less what they express. Perspect Psychol Sci 17:1566\u20131575. https:\/\/doi.org\/10.1177\/17456916211071083","journal-title":"Perspect Psychol Sci"},{"key":"1163_CR62","doi-asserted-by":"publisher","first-page":"789","DOI":"10.1080\/02699930701516791","volume":"22","author":"KR Scherer","year":"2008","unstructured":"Scherer KR, Grandjean D (2008) Facial expressions allow inference of both emotions and their components. Cogn Emot 22:789\u2013801. 
https:\/\/doi.org\/10.1080\/02699930701516791","journal-title":"Cogn Emot"},{"key":"1163_CR63","doi-asserted-by":"publisher","first-page":"150","DOI":"10.1037\/1528-3542.3.2.150","volume":"3","author":"G Horstmann","year":"2003","unstructured":"Horstmann G (2003) What do facial expressions convey: feeling states, behavioral intentions, or action requests? Emotion 3:150\u2013166. https:\/\/doi.org\/10.1037\/1528-3542.3.2.150","journal-title":"Emotion"},{"issue":"19","key":"1163_CR64","doi-asserted-by":"publisher","first-page":"1702","DOI":"10.1111\/j.1559-1816.1996.tb00093.x","volume":"26","author":"MK Hui","year":"1996","unstructured":"Hui MK, Zhou L (1996) How does waiting duration information influence customers\u2019 reactions to waiting for services? J Appl Soc Psychol 26(19):1702\u20131717","journal-title":"J Appl Soc Psychol"},{"issue":"4","key":"1163_CR65","doi-asserted-by":"publisher","first-page":"11","DOI":"10.1145\/1165385.317459","volume":"16","author":"BA Myers","year":"1985","unstructured":"Myers BA (1985) The importance of percent-done progress indicators for computer-human interfaces. ACM SIGCHI Bulletin 16(4):11\u201317","journal-title":"ACM SIGCHI Bulletin"},{"issue":"4","key":"1163_CR66","doi-asserted-by":"publisher","first-page":"528","DOI":"10.1177\/0018720809345684","volume":"51","author":"RJ Branaghan","year":"2009","unstructured":"Branaghan RJ, Sanchez CA (2009) Feedback preferences and impressions of waiting. Hum factors 51(4):528\u2013538","journal-title":"Hum factors"},{"key":"1163_CR67","doi-asserted-by":"crossref","unstructured":"Wintersberger P, Klotz T, Riener A (2020, October) Tell me more: transparency and time-fillers to optimize chatbots\u2019 waiting time experience. In: proceedings of the 11th Nordic conference on human-computer interaction: shaping experiences, Shaping society. pp. 
1\u20136","DOI":"10.1145\/3419249.3420170"},{"issue":"85","key":"1163_CR68","doi-asserted-by":"publisher","first-page":"5351","DOI":"10.21105\/joss.05351","volume":"8","author":"JR de Leeuw","year":"2023","unstructured":"de Leeuw JR, Gilbert RA, Luchterhandt B (2023) jsPsych: enabling an open-source collaborative ecosystem of behavioral experiments. J Open Source Softw 8(85):5351. https:\/\/doi.org\/10.21105\/joss.05351","journal-title":"J Open Source Softw"},{"key":"1163_CR69","unstructured":"Ten Bosch L, Oostdijk N, De Ruiter JP (2004) Turn-taking in social talk dialogues: temporal, formal and functional aspects. In: 9th international conference speech and computer (SPECOM'2004),pp 454\u2013461"},{"key":"1163_CR70","first-page":"3113","volume-title":"Intelligent robots and systems","author":"S Sosnowski","year":"2006","unstructured":"Sosnowski S, Bittermann A, Kuhnlenz K, Buss M (2006) (2006) Design and evaluation of emotion-display eddie. Intelligent robots and systems. IEEE\/RSJ international conference on, IEEE, pp 3113\u20133118"},{"key":"1163_CR71","doi-asserted-by":"crossref","unstructured":"Fukuda T, Taguri J, Arai F, Nakashima M, Tachibana D, Hasegawa Y (2002). Facial expression of robot face for human-robot mutual communication. In: proceedings 2002 ieee international conference on robotics and automation, pp 46\u201351","DOI":"10.1109\/ROBOT.2002.1013337"},{"key":"1163_CR72","doi-asserted-by":"crossref","unstructured":"Glas DF, Minato T, Ishi CT, Kawahara T, Ishiguro H (2016) Erica: the erato intelligent conversational android. In:\u00a02016 25th IEEE International symposium on robot and human interactive communication (RO-MAN),\u00a0pp 22\u201329","DOI":"10.1109\/ROMAN.2016.7745086"},{"key":"1163_CR73","unstructured":"Hegel F, Eyssel F, Wrede B (2010) The social robot \u2018flobi\u2019: keyconcepts of industrial design. In: 19th international symposium inrobot and human interactive communication. 
IEEE, pp 107\u2013112"},{"key":"1163_CR74","doi-asserted-by":"publisher","first-page":"63","DOI":"10.1007\/s12369-014-0261-z","volume":"7","author":"N Mirnig","year":"2015","unstructured":"Mirnig N, Strasser E, Weiss A, K\u00fchnlenz B, Wollherr D, Tscheligi M (2015) Can you read my face? a methodological variation for assessing facial expressions of robotic heads. Int J Soc Robot 7:63\u201376","journal-title":"Int J Soc Robot"},{"key":"1163_CR75","doi-asserted-by":"publisher","first-page":"54","DOI":"10.1080\/08351813.2017.1262143","volume":"50","author":"P H\u00f6mke","year":"2017","unstructured":"H\u00f6mke P, Holler J, Levinson SC (2017) Eye blinking as addressee feedback in face-to-face conversation. Res Lang Soc Interact 50:54\u201370. https:\/\/doi.org\/10.1080\/08351813.2017.1262143","journal-title":"Res Lang Soc Interact"},{"key":"1163_CR76","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0208030","volume":"13","author":"P H\u00f6mke","year":"2018","unstructured":"H\u00f6mke P, Holler J, Levinson SC (2018) Eye blinks are perceived as communicative signals in human face-to-face interaction. PLOS ONE 13:e0208030. https:\/\/doi.org\/10.1371\/journal.pone.0208030","journal-title":"PLOS ONE"},{"key":"1163_CR77","doi-asserted-by":"publisher","first-page":"90","DOI":"10.1177\/1754073912451332","volume":"5","author":"HA Elfenbein","year":"2013","unstructured":"Elfenbein HA (2013) Nonverbal dialects and accents in facial expressions of emotion. Emot Rev 5:90\u201396. https:\/\/doi.org\/10.1177\/1754073912451332","journal-title":"Emot Rev"},{"key":"1163_CR78","doi-asserted-by":"publisher","first-page":"75","DOI":"10.1037\/emo0000302","volume":"18","author":"DT Cordaro","year":"2018","unstructured":"Cordaro DT, Sun R, Keltner D, Kamble S, Huddar N, McNeil G (2018) Universals and cultural variations in 22 emotional expressions across five cultures. Emotion 18:75\u201393. 
https:\/\/doi.org\/10.1037\/emo0000302","journal-title":"Emotion"},{"key":"1163_CR79","doi-asserted-by":"publisher","first-page":"471","DOI":"10.1177\/00220221221095208","volume":"53","author":"X Fang","year":"2022","unstructured":"Fang X, Sauter DA, Heerdink MW, van Kleef GA (2022) Culture shapes the distinctiveness of posed and spontaneous facial expressions of anger and disgust. J Cross Cult Psychol 53:471\u2013487. https:\/\/doi.org\/10.1177\/00220221221095208","journal-title":"J Cross Cult Psychol"},{"key":"1163_CR80","doi-asserted-by":"publisher","first-page":"708","DOI":"10.1037\/xge0000162","volume":"145","author":"RE Jack","year":"2016","unstructured":"Jack RE, Sun W, Delis I, Garrod OG, Schyns PG (2016) Four not six: revealing culturally common facial expressions of emotion. J Exp Psychol Gen 145:708\u2013730. https:\/\/doi.org\/10.1037\/xge0000162","journal-title":"J Exp Psychol Gen"},{"key":"1163_CR81","doi-asserted-by":"publisher","first-page":"365","DOI":"10.1037\/0022-3514.94.3.365","volume":"94","author":"T Masuda","year":"2008","unstructured":"Masuda T, Ellsworth PC, Mesquita B, Leu J, Tanida S, Van de Veerdonk E (2008) Placing the face in context: cultural differences in the perception of facial emotion. J Pers Soc Psychol 94:365\u2013381. https:\/\/doi.org\/10.1037\/0022-3514.94.3.365","journal-title":"J Pers Soc Psychol"},{"key":"1163_CR82","doi-asserted-by":"publisher","first-page":"309","DOI":"10.1007\/s41809-020-00066-1","volume":"4","author":"S Namba","year":"2020","unstructured":"Namba S, Rychlowska M, Orlowska A, Aviezer H, Krumhuber EG (2020) Social context and culture influence judgments of non-Duchenne smiles. J Cult Cogn Sci 4:309\u2013321. https:\/\/doi.org\/10.1007\/s41809-020-00066-1","journal-title":"J Cult Cogn Sci"},{"key":"1163_CR83","first-page":"435","volume-title":"The science of facial expression","author":"B Parkinson","year":"2017","unstructured":"Parkinson B (2017) Interpersonal effects and functions of facial activity. 
In: Fern\u00e1ndez-Dols JM, Russell JA (eds) The science of facial expression. Oxford University Press, New York, pp 435\u2013456"},{"key":"1163_CR84","doi-asserted-by":"publisher","DOI":"10.1016\/j.biopsycho.2020.107989","volume":"158","author":"H Kiilavuori","year":"2021","unstructured":"Kiilavuori H, Sariola V, Peltola MJ, Hietanen JK (2021) Making eye contact with a robot: psychophysiological responses to eye contact with a human and with a humanoid robot. Biol Psychol 158:107989","journal-title":"Biol Psychol"},{"key":"1163_CR85","doi-asserted-by":"publisher","first-page":"41","DOI":"10.1177\/1754073912451349","volume":"5","author":"EG Krumhuber","year":"2013","unstructured":"Krumhuber EG, Kappas A, Manstead ASR (2013) Effects of dynamic aspects of facial expressions: a review. Emot Rev 5:41\u201346. https:\/\/doi.org\/10.1177\/1754073912451349","journal-title":"Emot Rev"},{"key":"1163_CR86","doi-asserted-by":"publisher","first-page":"283","DOI":"10.1038\/s44159-023-00172-1","volume":"2","author":"EG Krumhuber","year":"2023","unstructured":"Krumhuber EG, Skora LI, Hill HCH, Lander K (2023) The role of facial movements in emotion recognition. Nat Rev Psychol 2:283\u2013296. https:\/\/doi.org\/10.1038\/s44159-023-00172-1","journal-title":"Nat Rev Psychol"},{"key":"1163_CR87","doi-asserted-by":"publisher","first-page":"2836","DOI":"10.3389\/fpsyg.2019.02836","volume":"10","author":"W Sato","year":"2019","unstructured":"Sato W, Krumhuber EG, Jellema T, Williams JHG (2019) Editorial: dynamic emotional communication. Front Psychol 10:2836. https:\/\/doi.org\/10.3389\/fpsyg.2019.02836","journal-title":"Front Psychol"},{"key":"1163_CR88","doi-asserted-by":"publisher","first-page":"158","DOI":"10.1109\/tbiom.2020.2977225","volume":"2","author":"IO Ertugrul","year":"2020","unstructured":"Ertugrul IO, Cohn JF, Jeni LA, Zhang Z, Yin L, Ji Q (2020) Crossing domains for au coding: perspectives, approaches, and measures. IEEE Trans Biom Behav Identity Sci 2:158\u2013171. 
https:\/\/doi.org\/10.1109\/tbiom.2020.2977225","journal-title":"IEEE Trans Biom Behav Identity Sci"},{"key":"1163_CR89","doi-asserted-by":"publisher","first-page":"407","DOI":"10.1016\/B978-0-12-814601-9.00026-2","volume-title":"Affective facial computing: generalizability across domains, Multimodal behavior analysis in the wild","author":"JF Cohn","year":"2019","unstructured":"Cohn JF, Ertugrul IO, Chu WS, Girard JM, Jeni LA, Hammal Z (2019) Affective facial computing: generalizability across domains, Multimodal behavior analysis in the wild. Academic Press, USA, pp 407\u2013441"}],"container-title":["International Journal of Social Robotics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12369-024-01163-9.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s12369-024-01163-9\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12369-024-01163-9.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,8,28]],"date-time":"2024-08-28T13:26:15Z","timestamp":1724851575000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s12369-024-01163-9"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,8]]},"references-count":89,"journal-issue":{"issue":"8","published-print":{"date-parts":[[2024,8]]}},"alternative-id":["1163"],"URL":"https:\/\/doi.org\/10.1007\/s12369-024-01163-9","relation":{},"ISSN":["1875-4791","1875-4805"],"issn-type":[{"type":"print","value":"1875-4791"},{"type":"electronic","value":"1875-4805"}],"subject":[],"published":{"date-parts":[[2024,8]]},"assertion":[{"value":"22 July 
2024","order":1,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"5 August 2024","order":2,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors have no competing interests to declare that are relevant to the content of this article.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interests"}}]}}