{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,8,30]],"date-time":"2025-08-30T16:55:36Z","timestamp":1756572936334,"version":"3.40.4"},"reference-count":47,"publisher":"Springer Science and Business Media LLC","issue":"3","license":[{"start":{"date-parts":[[2023,10,11]],"date-time":"2023-10-11T00:00:00Z","timestamp":1696982400000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2023,10,11]],"date-time":"2023-10-11T00:00:00Z","timestamp":1696982400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Int J of Soc Robotics"],"published-print":{"date-parts":[[2025,3]]},"abstract":"<jats:title>Abstract<\/jats:title>\n          <jats:p>Gestures, a form of body language, significantly influence how users perceive humanoid robots. Recent data-driven methods for co-speech gestures have successfully enhanced the naturalness of the generated gestures. Moreover, compared to rule-based systems, these methods are more generalizable to unseen speech input. However, many of these methods cannot directly influence people\u2019s perceptions of robots. The primary challenge lies in the intricacy of constructing a dataset with varied impression labels to develop a conditional generation model. In our prior work ([22] Controlling the impression of robots via GAN-based gesture generation. In: Proceedings of the international conference on intelligent robots and systems. IEEE, pp 9288\u20139295), we introduced a heuristic approach for automatic labeling, training a deep learning model to control robot impressions. We demonstrated the model\u2019s effectiveness on both a virtual agent and a humanoid robot. 
In this study, we refined the motion retargeting algorithm for the humanoid robot and conducted a user study using four questions representing different aspects of extroversion. Our results show an improved capability in controlling the perceived degree of extroversion in the humanoid robot compared to previous methods. Furthermore, we discovered that different aspects of extroversion interact uniquely with motion statistics.<\/jats:p>","DOI":"10.1007\/s12369-023-01051-8","type":"journal-article","created":{"date-parts":[[2023,10,11]],"date-time":"2023-10-11T10:04:40Z","timestamp":1697018680000},"page":"457-472","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":4,"title":["Extrovert or Introvert? GAN-Based Humanoid Upper-Body Gesture Generation for Different Impressions"],"prefix":"10.1007","volume":"17","author":[{"given":"Bowen","family":"Wu","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3789-2981","authenticated-orcid":false,"given":"Chaoran","family":"Liu","sequence":"additional","affiliation":[]},{"given":"Carlos Toshinori","family":"Ishi","sequence":"additional","affiliation":[]},{"given":"Jiaqi","family":"Shi","sequence":"additional","affiliation":[]},{"given":"Hiroshi","family":"Ishiguro","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2023,10,11]]},"reference":[{"key":"1051_CR1","doi-asserted-by":"publisher","first-page":"204","DOI":"10.3389\/fpsyg.2015.00204","volume":"6","author":"M Destephe","year":"2015","unstructured":"Destephe M, Brandao M, Kishi T, Zecca M, Hashimoto K, Takanishi A (2015) Walking in the uncanny valley: importance of the attractiveness on the acceptance of a robot as a working partner. 
Front Psychol 6:204","journal-title":"Front Psychol"},{"key":"1051_CR2","doi-asserted-by":"crossref","unstructured":"Yamashita Y, Ishihara H, Ikeda T, Asada M (2017) Appearance of a robot influences causal relationship between touch sensation and the personality impression. In: Proceedings of the international conference on human agent interaction, pp 457\u2013461","DOI":"10.1145\/3125739.3132587"},{"issue":"3","key":"1051_CR3","doi-asserted-by":"publisher","first-page":"253","DOI":"10.1007\/s12369-011-0100-4","volume":"3","author":"R Tamagawa","year":"2011","unstructured":"Tamagawa R, Watson CI, Kuo IH, MacDonald BA, Broadbent E (2011) The effects of synthesized voice accents on user perceptions of robots. Int J Soc Robot 3(3):253\u2013262","journal-title":"Int J Soc Robot"},{"key":"1051_CR4","doi-asserted-by":"crossref","unstructured":"Torre I, Goslin J, White L, Zanatto D (2018) Trust in artificial voices: A \u201ccongruency effect\u201d of first impressions and behavioural experience. In: Proceedings of the technology, mind, and society, pp 1\u20136","DOI":"10.1145\/3183654.3183691"},{"key":"1051_CR5","doi-asserted-by":"crossref","unstructured":"Ryoko S, Chie F, Takatsugu K, Kaori S, Yuki H, Motoyuki O, Natsuki O (2012) Does talking to a robot in a high-pitched voice create a good impression of the robot? In: ACIS. IEEE, pp 19\u201324","DOI":"10.1109\/SNPD.2012.72"},{"issue":"1","key":"1051_CR6","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1038\/s41598-018-25314-x","volume":"8","author":"C Thepsoonthorn","year":"2018","unstructured":"Thepsoonthorn C, Ogawa K-I, Miyake Y (2018) The relationship between robot\u2019s nonverbal behaviour and human\u2019s likability based on human\u2019s personality. Sci Rep 8(1):1\u201311","journal-title":"Sci Rep"},{"key":"1051_CR7","doi-asserted-by":"crossref","unstructured":"Hoffman G, Birnbaum GE, Vanunu K, Sass O, Reis HT (2014) Robot responsiveness to human disclosure affects social impression and appeal. 
In: International conference on human-robot interaction, pp 1\u20138","DOI":"10.1145\/2559636.2559660"},{"key":"1051_CR8","doi-asserted-by":"crossref","unstructured":"Kim H, Kwak SS, Kim M (2008) Personality design of sociable robots by control of gesture design factors. In: International symposium on robot and human interactive communication. IEEE, pp 494\u2013499","DOI":"10.1109\/ROMAN.2008.4600715"},{"key":"1051_CR9","doi-asserted-by":"crossref","unstructured":"Bergmann K, Eyssel F, Kopp, S (2012) A second chance to make a first impression? how appearance and nonverbal behavior affect perceived warmth and competence of virtual agents over time. In: International conference on intelligent virtual agents. Springer, pp 126\u2013138","DOI":"10.1007\/978-3-642-33197-8_13"},{"key":"1051_CR10","doi-asserted-by":"crossref","unstructured":"Cao Z, Simon T, Wei S-E, Sheikh Y (2017) Realtime multi-person 2d pose estimation using part affinity fields. In: IEEE conference on computer vision and pattern recognition, pp 7291\u20137299","DOI":"10.1109\/CVPR.2017.143"},{"key":"1051_CR11","doi-asserted-by":"crossref","unstructured":"G\u00fcler RA, Neverova N, Kokkinos I (2018) DensePose: dense human pose estimation in the wild. In: IEEE conference on computer vision and pattern recognition, pp 7297\u20137306","DOI":"10.1109\/CVPR.2018.00762"},{"key":"1051_CR12","doi-asserted-by":"crossref","unstructured":"Takeuchi K, Kubota S, Suzuki K, Hasegawa D, Sakuta H (2017) Creating a gesture-speech dataset for speech-based automatic gesture generation. In: International conference on human-computer interaction. Springer, pp 198\u2013202","DOI":"10.1007\/978-3-319-58750-9_28"},{"key":"1051_CR13","doi-asserted-by":"crossref","unstructured":"Yoon Y, Ko W-R, Jang M, Lee J, Kim J, Lee G (2019) Robots learn social skills: end-to-end learning of co-speech gesture generation for humanoid robots. In: International conference on robotics and automation. 
IEEE, pp 4303\u20134309","DOI":"10.1109\/ICRA.2019.8793720"},{"key":"1051_CR14","doi-asserted-by":"crossref","unstructured":"Ferstl Y, Neff M, McDonnell R (2019) Multi-objective adversarial gesture generation. In: Motion, interaction and games, pp 1\u201310","DOI":"10.1145\/3359566.3360053"},{"issue":"4","key":"1051_CR15","doi-asserted-by":"publisher","first-page":"3757","DOI":"10.1109\/LRA.2018.2856281","volume":"3","author":"CT Ishi","year":"2018","unstructured":"Ishi CT, Machiyashiki D, Mikata R, Ishiguro H (2018) A speech-driven hand gesture generation method and evaluation in android robots. IEEE Robot Autom Lett 3(4):3757\u20133764","journal-title":"IEEE Robot Autom Lett"},{"key":"1051_CR16","doi-asserted-by":"crossref","unstructured":"Alexanderson S, Henter GE, Kucherenko T, Beskow J (2020) Style-controllable speech-driven gesture synthesis using normalising flows. In: Computer graphics forum, vol 39. Wiley Online Library, pp 487\u2013496","DOI":"10.1111\/cgf.13946"},{"issue":"6","key":"1051_CR17","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/3414685.3417838","volume":"39","author":"Y Yoon","year":"2020","unstructured":"Yoon Y, Cha B, Lee J-H, Jang M, Lee J, Kim J, Lee G (2020) Speech gesture generation from the trimodal context of text, audio, and speaker identity. ACM Trans Graph 39(6):1\u201316","journal-title":"ACM Trans Graph"},{"key":"1051_CR18","doi-asserted-by":"crossref","unstructured":"Taylor S, Windle J, Greenwood D, Matthews I (2021) Speech-driven conversational agents using conditional flow-VAEs. In: European conference on visual media production, pp 1\u20139","DOI":"10.1145\/3485441.3485647"},{"key":"1051_CR19","doi-asserted-by":"crossref","unstructured":"Kucherenko T, Nagy R, Jonell P, Neff M, Kjellstr\u00f6m H, Henter GE (2021) Speech2properties2gestures: gesture-property prediction as a tool for generating representational gestures from speech. 
In: Proceedings of the 21st ACM international conference on intelligent virtual agents, pp 145\u2013147","DOI":"10.1145\/3472306.3478333"},{"issue":"14","key":"1051_CR20","doi-asserted-by":"publisher","first-page":"1300","DOI":"10.1080\/10447318.2021.1883883","volume":"37","author":"T Kucherenko","year":"2021","unstructured":"Kucherenko T, Hasegawa D, Kaneko N, Henter GE, Kjellstr\u00f6m H (2021) Moving fast and slow: analysis of representations and post-processing in speech-driven automatic gesture generation. Int J Hum Comput Interact 37(14):1300\u20131316","journal-title":"Int J Hum Comput Interact"},{"key":"1051_CR21","doi-asserted-by":"crossref","unstructured":"Wu B, Liu C, Ishi CT, Ishiguro H (2021) Probabilistic human-like gesture synthesis from speech using GRU-based WGAN. In: Companion publication of the 2021 international conference on multimodal interaction, pp 194\u2013201","DOI":"10.1145\/3461615.3485407"},{"key":"1051_CR22","doi-asserted-by":"crossref","unstructured":"Wu B, Shi J, Liu C, Ishi CT, Ishiguro H (2022) Controlling the impression of robots via gan-based gesture generation. In: Proceedings of the international conference on intelligent robots and systems. IEEE, pp 9288\u20139295","DOI":"10.1109\/IROS47612.2022.9981535"},{"key":"1051_CR23","doi-asserted-by":"crossref","unstructured":"Hasegawa D, Kaneko N, Shirakawa S, Sakuta H, Sumi K (2018) Evaluation of speech-to-gesture generation using bi-directional LSTM network. In: 18th international conference on intelligent virtual agents, pp 79\u201386","DOI":"10.1145\/3267851.3267878"},{"key":"1051_CR24","doi-asserted-by":"crossref","unstructured":"Kucherenko T, Hasegawa D, Henter GE, Kaneko N, Kjellstr\u00f6m H (2019) Analyzing input and output representations for speech-driven gesture generation. 
In: 19th ACM international conference on intelligent virtual agents, pp 97\u2013104","DOI":"10.1145\/3308532.3329472"},{"key":"1051_CR25","doi-asserted-by":"crossref","unstructured":"Ginosar S, Bar A, Kohavi G, Chan C, Owens A, Malik J (2019) Learning individual styles of conversational gesture. In: Proceedings of the IEEE\/CVF conference on computer vision and pattern recognition, pp 3497\u20133506","DOI":"10.1109\/CVPR.2019.00361"},{"key":"1051_CR26","doi-asserted-by":"crossref","unstructured":"Yoon Y, Park K, Jang M, Kim J, Lee G (2021) SGToolkit: an interactive gesture authoring toolkit for embodied conversational agents. In: The 34th annual ACM symposium on user interface software and technology, pp 826\u2013840","DOI":"10.1145\/3472749.3474789"},{"issue":"3","key":"1051_CR27","doi-asserted-by":"publisher","first-page":"228","DOI":"10.3390\/electronics10030228","volume":"10","author":"B Wu","year":"2021","unstructured":"Wu B, Liu C, Ishi CT, Ishiguro H (2021) Modeling the conditional distribution of co-speech upper body gesture jointly using conditional-GAN and unrolled-GAN. Electronics 10(3):228","journal-title":"Electronics"},{"issue":"2","key":"1051_CR28","doi-asserted-by":"publisher","first-page":"277","DOI":"10.1007\/s10846-019-01100-3","volume":"99","author":"L P\u00e9rez-Mayos","year":"2020","unstructured":"P\u00e9rez-Mayos L, Farr\u00fas M, Adell J (2020) Part-of-speech and prosody-based approaches for robot speech and gesture synchronization. J Intell Robot Syst 99(2):277\u2013287","journal-title":"J Intell Robot Syst"},{"key":"1051_CR29","unstructured":"Robert L (2018) Personality in the human robot interaction literature: a review and brief critique. 
In: Proceedings of the 24th Americas conference on information systems, pp 16\u201318"},{"issue":"3","key":"1051_CR30","doi-asserted-by":"publisher","first-page":"459","DOI":"10.1016\/j.apergo.2012.10.010","volume":"44","author":"J Hwang","year":"2013","unstructured":"Hwang J, Park T, Hwang W (2013) The effects of overall robot shape on the emotions invoked in users and the perceived personalities of robot. Appl Ergon 44(3):459\u2013471","journal-title":"Appl Ergon"},{"key":"1051_CR31","doi-asserted-by":"publisher","first-page":"75","DOI":"10.1016\/j.chb.2014.05.014","volume":"38","author":"B Tay","year":"2014","unstructured":"Tay B, Jung Y, Park T (2014) When stereotypes meet robots: the double-edge sword of robot gender and personality in human-robot interaction. Comput Hum Behav 38:75\u201384","journal-title":"Comput Hum Behav"},{"key":"1051_CR32","doi-asserted-by":"crossref","unstructured":"Robert L, Alahmad R, Esterwood C, Kim S, You S, Zhang Q (2020) A review of personality in human\u2013robot interactions. SSRN 3528496","DOI":"10.2139\/ssrn.3528496"},{"key":"1051_CR33","doi-asserted-by":"crossref","unstructured":"Neff M, Wang Y, Abbott R, Walker M (2010) Evaluating the effect of gesture and language on personality perception in conversational agents. In: International conference on intelligent virtual agents. Springer, pp 222\u2013235","DOI":"10.1007\/978-3-642-15892-6_24"},{"key":"1051_CR34","doi-asserted-by":"crossref","unstructured":"Mileounis A, Cuijpers RH, Barakova EI (2015) Creating robots with personality: the effect of personality on social intelligence. In: International work-conference on the interplay between natural and artificial computation. 
Springer, pp 119\u2013132","DOI":"10.1007\/978-3-319-18914-7_13"},{"key":"1051_CR35","doi-asserted-by":"crossref","unstructured":"Craenen B, Deshmukh A, Foster ME, Vinciarelli A (2018) Shaping gestures to shape personalities: the relationship between gesture parameters, attributed personality traits and godspeed scores. In: 27th IEEE international symposium on robot and human interactive communication. IEEE, pp 699\u2013704","DOI":"10.1109\/ROMAN.2018.8525739"},{"key":"1051_CR36","doi-asserted-by":"crossref","unstructured":"Dou X, Wu C-F, Lin K-C, Tseng T-M (2019) The effects of robot voice and gesture types on the perceived robot personalities. In: International conference on human-computer interaction. Springer, pp 299\u2013309","DOI":"10.1007\/978-3-030-22646-6_21"},{"issue":"2","key":"1051_CR37","doi-asserted-by":"publisher","first-page":"125","DOI":"10.1007\/s12369-010-0071-x","volume":"3","author":"J Li","year":"2011","unstructured":"Li J, Chignell M (2011) Communication of emotion in social robots through simple head and arm movements. Int J Soc Robot 3(2):125\u2013142","journal-title":"Int J Soc Robot"},{"key":"1051_CR38","doi-asserted-by":"crossref","unstructured":"Costa S, Soares F, Santos C (2013) Facial expressions and gestures to convey emotions with a humanoid robot. In: International conference on social robotics. Springer, pp 542\u2013551","DOI":"10.1007\/978-3-319-02675-6_54"},{"issue":"6","key":"1051_CR39","doi-asserted-by":"publisher","first-page":"1493","DOI":"10.1007\/s12369-022-00893-y","volume":"14","author":"A Gjaci","year":"2022","unstructured":"Gjaci A, Recchiuto CT, Sgorbissa A (2022) Towards culture-aware co-speech gestures for social robots. Int J Soc Robot 14(6):1493\u20131506","journal-title":"Int J Soc Robot"},{"key":"1051_CR40","doi-asserted-by":"crossref","unstructured":"Van\u00a0Otterdijk M, Song H, Tsiakas K, Van\u00a0Zeijl I, Barakova E (2022) Nonverbal cues expressing robot personality-a movement analysts perspective. 
In: 2022 31st IEEE international conference on robot and human interactive communication (RO-MAN). IEEE, pp 1181\u20131186","DOI":"10.1109\/RO-MAN53752.2022.9900647"},{"issue":"10","key":"1051_CR41","doi-asserted-by":"publisher","first-page":"4639","DOI":"10.3390\/app11104639","volume":"11","author":"U Zabala","year":"2021","unstructured":"Zabala U, Rodriguez I, Mart\u00ednez-Otzeta JM, Lazkano E (2021) Expressing robot personality through talking body language. Appl Sci 11(10):4639","journal-title":"Appl Sci"},{"key":"1051_CR42","first-page":"2825","volume":"12","author":"F Pedregosa","year":"2011","unstructured":"Pedregosa F, Varoquaux G, Gramfort A, Michel V, Thirion B, Grisel O, Blondel M, Prettenhofer P, Weiss R, Dubourg V, Vanderplas J, Passos A, Cournapeau D, Brucher M, Perrot M, Duchesnay E (2011) Scikit-learn: machine learning in Python. J Mach Learn Res 12:2825\u20132830","journal-title":"J Mach Learn Res"},{"issue":"6","key":"1051_CR43","doi-asserted-by":"publisher","first-page":"531","DOI":"10.1016\/j.specom.2008.03.009","volume":"50","author":"CT Ishi","year":"2008","unstructured":"Ishi CT, Ishiguro H, Hagita N (2008) Automatic extraction of paralinguistic information using prosodic features related to f0, duration and voice quality. Speech Commun 50(6):531\u2013543","journal-title":"Speech Commun"},{"key":"1051_CR44","volume-title":"Advances in neural information processing systems","author":"I Gulrajani","year":"2017","unstructured":"Gulrajani I, Ahmed F, Arjovsky M, Dumoulin V, Courville AC (2017) Improved training of wasserstein GANs. In: Guyon I, Luxburg UV, Bengio S, Wallach H, Fergus R, Vishwanathan S, Garnett R (eds) Advances in neural information processing systems, vol 30. Curran Associates Inc, New York"},{"key":"1051_CR45","doi-asserted-by":"crossref","unstructured":"Girshick R (2015) Fast R-CNN. 
In: IEEE international conference on computer vision, pp 1440\u20131448","DOI":"10.1109\/ICCV.2015.169"},{"issue":"3","key":"1051_CR46","doi-asserted-by":"publisher","first-page":"1748","DOI":"10.1109\/LRA.2017.2700941","volume":"2","author":"CT Ishi","year":"2017","unstructured":"Ishi CT, Minato T, Ishiguro H (2017) Motion analysis in vocalized surprise expressions and motion generation in android robots. IEEE Robot Autom Lett 2(3):1748\u20131754. https:\/\/doi.org\/10.1109\/LRA.2017.2700941","journal-title":"IEEE Robot Autom Lett"},{"key":"1051_CR47","doi-asserted-by":"crossref","unstructured":"Ludewig Y, D\u00f6ring N, Exner N (2012) Design and evaluation of the personality trait extraversion of a shopping robot. In: 2012 IEEE RO-MAN: the 21st IEEE international symposium on robot and human interactive communication. IEEE, pp 372\u2013379","DOI":"10.1109\/ROMAN.2012.6343781"}],"container-title":["International Journal of Social Robotics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12369-023-01051-8.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s12369-023-01051-8\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12369-023-01051-8.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,4,18]],"date-time":"2025-04-18T04:49:17Z","timestamp":1744951757000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s12369-023-01051-8"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,10,11]]},"references-count":47,"journal-issue":{"issue":"3","published-print":{"date-parts":[[2025,3]]}},"alternative-id":["1051"],"URL":"https:\/\/doi.org\/10.1007\/s12369-023-01051-8","relation":{},"ISSN":["1875-4791","1875-4805"],"issn-type":[{"type":"print","value":"1875-4791"},{"type":"electronic","value":"1875-4805"}],"subject":[],"published":{"date-parts":[[2023,10,11]]},"assertion":[{"value":"28 August 2023","order":1,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"11 October 2023","order":2,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors have no relevant financial or non-financial interests to disclose.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}},{"value":"Informed consent was obtained from all individual participants included in the study.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent to participate"}},{"value":"There is no written description or image of personal information included in the manuscript.","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent for publication"}},{"value":"Since our experiment involves human subjects, we acquired approval from the ethics committee of the Advanced Telecommunication Research Institute International (ATR, review number 22-605).","order":5,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics approval"}}]}}