{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,13]],"date-time":"2026-03-13T08:55:26Z","timestamp":1773392126428,"version":"3.50.1"},"reference-count":77,"publisher":"Springer Science and Business Media LLC","issue":"7","license":[{"start":{"date-parts":[[2025,5,12]],"date-time":"2025-05-12T00:00:00Z","timestamp":1747008000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2025,5,12]],"date-time":"2025-05-12T00:00:00Z","timestamp":1747008000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100001659","name":"Deutsche Forschungsgemeinschaft","doi-asserted-by":"publisher","award":["502483052"],"award-info":[{"award-number":["502483052"]}],"id":[{"id":"10.13039\/501100001659","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100002347","name":"Bundesministerium f\u00fcr Bildung und Forschung","doi-asserted-by":"publisher","award":["13N16336"],"award-info":[{"award-number":["13N16336"]}],"id":[{"id":"10.13039\/501100002347","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100008530","name":"European Regional Development Fund","doi-asserted-by":"publisher","award":["ZS\/2023\/12\/182056"],"award-info":[{"award-number":["ZS\/2023\/12\/182056"]}],"id":[{"id":"10.13039\/501100008530","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Complex Intell. Syst."],"published-print":{"date-parts":[[2025,7]]},"abstract":"<jats:title>Abstract<\/jats:title>\n          <jats:p>This paper introduces a new approach to predict human engagement in human\u2013robot interactions (HRI), focusing on eye contact and distance information. Recognising engagement, particularly its decline, is essential for successful and natural interactions. This requires early, real-time user behavior detection. Previous HRI engagement classification approaches use various audiovisual features or adopt end-to-end methods. However, both approaches face challenges: the former risks error accumulation, while the latter suffer from small datasets. The proposed class-sensitive model for capturing engagement in HRI is based on eye contact detection. By analyzing eye contact intensity over time, the model provides a more robust and reliable measure of engagement levels, effectively capturing both temporal dynamics and subtle behavioral changes. Direct eye contact detection, a crucial social signal in human interactions that has not yet been explored as a standalone indicator in HRI, offers a significant advantage in robustness over gaze detection and incorporates additional facial features into the assessment. This approach reduces the number of features from up to over 100 to just two, enabling real-time processing and surpassing state-of-the-art results with 80.73% accuracy and 80.68% F1-Score on the UE-HRI dataset, the primary resource in current engagement detection research. Additionally, cross-dataset testing on a newly recorded dataset with the Tiago robot from Pal Robotics achieved an accuracy of 86.8% and an F1-score of 87.9%. The model employs a sliding window approach and consists of just three fully connected layers for feature fusion and classification, offering a minimalistic yet effective architecture. 
The study reveals that engagement, traditionally relying on extensive feature sets, can be inferred reliably from temporal eye contact dynamics. The results include a detailed analysis of established engagement levels on the UE-HRI dataset using the proposed model. Additionally, models for more nuanced engagement classification are introduced, showcasing the effectiveness of this minimalistic feature set. These models provide a robust foundation for future research, advancing robotic systems and deepening understanding of HRI, for example by improving real-time social cue detection and creating adaptive engagement strategies in HRI.<\/jats:p>","DOI":"10.1007\/s40747-025-01902-z","type":"journal-article","created":{"date-parts":[[2025,5,12]],"date-time":"2025-05-12T07:06:28Z","timestamp":1747033588000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":3,"title":["Eye contact based engagement prediction for efficient human\u2013robot interaction"],"prefix":"10.1007","volume":"11","author":[{"ORCID":"https:\/\/orcid.org\/0009-0006-1353-4185","authenticated-orcid":false,"given":"Magnus","family":"Jung","sequence":"first","affiliation":[]},{"given":"Ahmed","family":"Abdelrahman","sequence":"additional","affiliation":[]},{"given":"Thorsten","family":"Hempel","sequence":"additional","affiliation":[]},{"given":"Basheer","family":"Al-Tawil","sequence":"additional","affiliation":[]},{"given":"Qiaoyue","family":"Yang","sequence":"additional","affiliation":[]},{"given":"Sven","family":"Wachsmuth","sequence":"additional","affiliation":[]},{"given":"Ayoub","family":"Al-Hamadi","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2025,5,12]]},"reference":[{"key":"1902_CR1","doi-asserted-by":"publisher","first-page":"61980","DOI":"10.1109\/ACCESS.2022.3182469","volume":"10","author":"AA Abdelrahman","year":"2022","unstructured":"Abdelrahman AA, Strazdas D, Khalifa A, Hintz J, Hempel T, Al-Hamadi A (2022) Multimodal engagement prediction in multiperson human-robot interaction. IEEE Access 10:61980\u201361991","journal-title":"IEEE Access"},{"key":"1902_CR2","doi-asserted-by":"crossref","unstructured":"Abdelrahman AA, Hempel T, Khalifa A, Al-Hamadi A, Dinges L (2023) L2CS-NET: fine-grained gaze estimation in unconstrained environments. In: 2023 8th international conference on frontiers of signal processing (ICFSP). IEEE, pp 98\u2013102","DOI":"10.1109\/ICFSP59764.2023.10372944"},{"key":"1902_CR3","doi-asserted-by":"crossref","unstructured":"Abedi A, Khan SS (2021) Improving state-of-the-art in detecting student engagement with RESNET and TCN hybrid network. In: 2021 18th conference on robots and vision (CRV). IEEE, pp 151\u2013157","DOI":"10.1109\/CRV52889.2021.00028"},{"key":"1902_CR4","doi-asserted-by":"crossref","unstructured":"Andrist S, Bohus D, Kamar E, Horvitz E (2017) What went wrong and why? Diagnosing situated interaction failures in the wild. In: Social robotics: 9th international conference, ICSR 2017, Tsukuba, Japan, November 22\u201324, 2017, Proceedings 9. Springer, pp 293\u2013303","DOI":"10.1007\/978-3-319-70022-9_29"},{"key":"1902_CR5","doi-asserted-by":"publisher","first-page":"465","DOI":"10.1007\/s12369-015-0298-7","volume":"7","author":"SM Anzalone","year":"2015","unstructured":"Anzalone SM, Boucenna S, Ivaldi S, Chetouani M (2015) Evaluating the engagement with social robots. 
Int J Soc Robot 7:465\u2013478","journal-title":"Int J Soc Robot"},{"key":"1902_CR6","unstructured":"Anzalone SM, Varni G, Zibetti E, Ivaldi S, Chetouani M (2015) Automated prediction of extraversion during human-robot interaction. In: AIRO@ AI* IA, pp 29\u201339"},{"issue":"5","key":"1902_CR7","doi-asserted-by":"publisher","first-page":"7311","DOI":"10.1007\/s40747-024-01544-7","volume":"10","author":"MS Aslam","year":"2024","unstructured":"Aslam MS, Bilal H, Chang W, Yahya A, Badruddin IA, Kamangar S, Hussien M (2024) Indirect adaptive observer control (I-AOC) design for truck-trailer model based on T-S fuzzy system with unknown nonlinear function. Complex Intell Syst 10(5):7311\u20137331","journal-title":"Complex Intell Syst"},{"key":"1902_CR8","doi-asserted-by":"crossref","unstructured":"Ben-Youssef A, Clavel C, Essid S, Bilac M, Chamoux M, Lim A (2017) UE-HRI: a new dataset for the study of user engagement in spontaneous human\u2013robot interactions. In: Proceedings of the 19th ACM international conference on multimodal interaction, pp 464\u2013472","DOI":"10.1145\/3136755.3136814"},{"issue":"3","key":"1902_CR9","doi-asserted-by":"publisher","first-page":"776","DOI":"10.1109\/TAFFC.2019.2898399","volume":"12","author":"A Ben-Youssef","year":"2019","unstructured":"Ben-Youssef A, Clavel C, Essid S (2019) Early detection of user engagement breakdown in spontaneous human\u2013humanoid interaction. IEEE Trans Affect Comput 12(3):776\u2013787","journal-title":"IEEE Trans Affect Comput"},{"key":"1902_CR10","doi-asserted-by":"publisher","first-page":"815","DOI":"10.1007\/s12369-019-00591-2","volume":"11","author":"A Ben-Youssef","year":"2019","unstructured":"Ben-Youssef A, Varni G, Essid S, Clavel C (2019) On-the-fly detection of user engagement decrease in spontaneous human\u2013robot interaction using recurrent and deep neural networks. Int J Soc Robot 11:815\u2013828","journal-title":"Int J Soc Robot"},{"key":"1902_CR11","doi-asserted-by":"crossref","unstructured":"Bi J, Hu F, Wang Y, Luo M, He M (2023) Human engagement intention intensity recognition method based on two states fusion fuzzy inference system. Intell Serv Robot 1\u201316","DOI":"10.1007\/s11370-023-00464-8"},{"key":"1902_CR12","doi-asserted-by":"publisher","first-page":"05","DOI":"10.22967\/HCIS.2024.14.071","volume":"1\u201321","author":"H Bilal","year":"2024","unstructured":"Bilal H, Ahmed F, Aslam M, Li Q, Hou J, Yin B (2024) A blockchain-enabled approach for privacy-protected data sharing in internet of robotic things networks. Hum-Cent Comput Inf Sci 1\u201321:05. https:\/\/doi.org\/10.22967\/HCIS.2024.14.071","journal-title":"Hum-Cent Comput Inf Sci"},{"key":"1902_CR13","doi-asserted-by":"crossref","unstructured":"Bilal H, Aslam MS, Tian Y, Yahya A, Izneide BA (2024) Enhancing trajectory tracking and vibration control of flexible robots with hybrid fuzzy ADRC and input shaping. IEEE Access","DOI":"10.1109\/ACCESS.2024.3453944"},{"key":"1902_CR14","doi-asserted-by":"crossref","unstructured":"Bilal H, Obaidat MS, Aslam MS, Zhang J, Yin B, Mahmood K (2024) Online fault diagnosis of industrial robot using iort and hybrid deep learning techniques: an experimental approach. 
IEEE Internet Things J","DOI":"10.1109\/JIOT.2024.3418352"},{"issue":"12","key":"1902_CR15","doi-asserted-by":"publisher","first-page":"1290","DOI":"10.3390\/bioengineering11121290","volume":"11","author":"H Bilal","year":"2024","unstructured":"Bilal H, Tian Y, Ali A, Muhammad Y, Yahya A, Izneid BA, Ullah I (2024) An intelligent approach for early and accurate predication of cardiac disease using hybrid artificial intelligence techniques. Bioengineering 11(12):1290","journal-title":"Bioengineering"},{"key":"1902_CR16","doi-asserted-by":"crossref","unstructured":"Bohus D, Horvitz E (2009) Dialog in the open world: platform and applications. In: Proceedings of the 2009 international conference on Multimodal interfaces, pp 31\u201338","DOI":"10.1145\/1647314.1647323"},{"key":"1902_CR17","doi-asserted-by":"crossref","unstructured":"Bohus D, Horvitz E (2014) Managing human-robot engagement with forecasts and... um... hesitations. In: Proceedings of the 16th international conference on multimodal interaction, pp 2\u20139","DOI":"10.1145\/2663204.2663241"},{"key":"1902_CR18","doi-asserted-by":"publisher","first-page":"181","DOI":"10.1007\/s40593-015-0069-5","volume":"27","author":"N Bosch","year":"2017","unstructured":"Bosch N, D\u2019Mello S (2017) The affective experience of novice computer programmers. Int J Artif Intell Educ 27:181\u2013206","journal-title":"Int J Artif Intell Educ"},{"issue":"3","key":"1902_CR19","doi-asserted-by":"publisher","first-page":"351","DOI":"10.1111\/j.1468-2958.1984.tb00023.x","volume":"10","author":"JK Burgoon","year":"1984","unstructured":"Burgoon JK, Buller DB, Hale JL, de Turck MA (1984) Relational messages associated with nonverbal behaviors. Hum Commun Res 10(3):351\u2013378","journal-title":"Hum Commun Res"},{"key":"1902_CR20","doi-asserted-by":"crossref","unstructured":"Castellano G, Pereira A, Leite I, Paiva A, McOwan PW (2009) Detecting user engagement with a robot companion using task and social interaction-based features. In: Proceedings of the 2009 international conference on Multimodal interfaces, pp 119\u2013126","DOI":"10.1145\/1647314.1647336"},{"key":"1902_CR21","doi-asserted-by":"crossref","unstructured":"Castellano G, Leite I, Pereira A, Martinho C, Paiva A, McOwan PW (2012) Detecting engagement in HRI: an exploration of social and task-based context. In: 2012 international conference on privacy, security, risk and trust and 2012 international conference on social computing. IEEE, pp 421\u2013428","DOI":"10.1109\/SocialCom-PASSAT.2012.51"},{"key":"1902_CR22","doi-asserted-by":"crossref","unstructured":"Chong E, Clark-Whitney E, Southerland A, Stubbs E, Miller C, Ajodan EL, Silverman MR, Lord C, Rozga A, Jones RM, Rehg JM (2020) Detection of eye contact with deep neural networks is as accurate as human experts. Nat Commun 11","DOI":"10.1038\/s41467-020-19712-x"},{"key":"1902_CR23","doi-asserted-by":"publisher","first-page":"116","DOI":"10.3389\/frobt.2020.00116","volume":"7","author":"F Del Duchetto","year":"2020","unstructured":"Del Duchetto F, Baxter P, Hanheide M (2020) Are you still with me? Continuous engagement assessment from a robot\u2019s point of view. Front Robot AI 7:116","journal-title":"Front Robot AI"},{"key":"1902_CR24","doi-asserted-by":"crossref","unstructured":"Deng J, Guo J, Ververas E, Kotsia I, Zafeiriou S (2020) Retinaface: single-shot multi-level face localisation in the wild. 
In: Proceedings of the IEEE\/CVF conference on computer vision and pattern recognition, pp 5203\u20135212","DOI":"10.1109\/CVPR42600.2020.00525"},{"issue":"1","key":"1902_CR25","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1186\/s40561-018-0080-z","volume":"6","author":"M Dewan","year":"2019","unstructured":"Dewan M, Murshed M, Lin F (2019) Engagement detection in online learning: a review. Smart Learn Environ 6(1):1\u201320","journal-title":"Smart Learn Environ"},{"issue":"12","key":"1902_CR26","doi-asserted-by":"publisher","first-page":"2394","DOI":"10.1587\/transinf.E92.D.2394","volume":"92","author":"P Dybala","year":"2009","unstructured":"Dybala P, Ptaszynski M, Rzepka R, Araki K (2009) Activating humans with humor-a dialogue system that users want to interact with. IEICE Trans Inf Syst 92(12):2394\u20132401","journal-title":"IEICE Trans Inf Syst"},{"issue":"2","key":"1902_CR27","doi-asserted-by":"publisher","first-page":"145","DOI":"10.1016\/j.learninstruc.2011.10.001","volume":"22","author":"S D\u2019Mello","year":"2012","unstructured":"D\u2019Mello S, Graesser A (2012) Dynamics of affective states during complex learning. Learn Instruct 22(2):145\u2013157","journal-title":"Learn Instruct"},{"issue":"5","key":"1902_CR28","doi-asserted-by":"publisher","first-page":"659","DOI":"10.1007\/s12369-017-0414-y","volume":"9","author":"ME Foster","year":"2017","unstructured":"Foster ME, Gaschler A, Giuliani M (2017) Automatically classifying user engagement for dynamic multi-party human-robot interaction. Int J Soc Robot 9(5):659\u2013674","journal-title":"Int J Soc Robot"},{"key":"1902_CR29","doi-asserted-by":"publisher","DOI":"10.3389\/frobt.2021.721317","volume":"8","author":"Y Gao","year":"2022","unstructured":"Gao Y, Huang C-M (2022) Evaluation of socially-aware robot navigation. Front Robot AI 8:721317","journal-title":"Front Robot AI"},{"key":"1902_CR30","unstructured":"Gildenblat J (2021) Contributors. Pytorch library for cam methods. https:\/\/github.com\/jacobgil\/pytorch-grad-cam"},{"issue":"3","key":"1902_CR31","doi-asserted-by":"publisher","first-page":"203","DOI":"10.1561\/1100000005","volume":"1","author":"MA Goodrich","year":"2008","unstructured":"Goodrich MA, Schultz AC et al (2008) Human-robot interaction: a survey. Found Trends Hum Comput Interact 1(3):203\u2013275","journal-title":"Found Trends Hum Comput Interact"},{"key":"1902_CR32","doi-asserted-by":"crossref","unstructured":"Hall L, Woods S, Aylett R, Newall L, Paiva A (2005) Achieving empathic engagement through affective interaction with synthetic characters. In: Affective computing and intelligent interaction: first international conference, ACII 2005, Beijing, China, October 22\u201324, 2005. Proceedings 1. Springer, pp 731\u2013738","DOI":"10.1007\/11573548_94"},{"key":"1902_CR33","doi-asserted-by":"publisher","DOI":"10.1007\/978-0-387-84858-7","volume-title":"The elements of statistical learning: data mining, inference, and prediction","author":"T Hastie","year":"2009","unstructured":"Hastie T, Tibshirani R, Friedman JH, Friedman JH (2009) The elements of statistical learning: data mining, inference, and prediction, vol 2. Springer"},{"key":"1902_CR34","doi-asserted-by":"crossref","unstructured":"He K, Zhang X, Ren S, Sun J (2016) Deep residual learning for image recognition. 
In: Proceedings of the IEEE conference on computer vision and pattern recognition (CVPR)","DOI":"10.1109\/CVPR.2016.90"},{"key":"1902_CR35","doi-asserted-by":"crossref","unstructured":"Hempel T, Dinges L, Al-Hamadi A (2023) Sentiment-based engagement strategies for intuitive human-robot interaction. arXiv preprint arXiv:2301.03867","DOI":"10.5220\/0011772900003417"},{"key":"1902_CR36","doi-asserted-by":"publisher","first-page":"2377","DOI":"10.1109\/TIP.2024.3378180","volume":"33","author":"T Hempel","year":"2024","unstructured":"Hempel T, Abdelrahman AA, Al-Hamadi A (2024) Toward robust and unconstrained full range of rotation head pose estimation. IEEE Trans Image Process 33:2377\u20132387","journal-title":"IEEE Trans Image Process"},{"key":"1902_CR37","doi-asserted-by":"crossref","unstructured":"Hempel T, Jung M, Abdelrahman AA, Al-Hamadi A (2024) NITEC: versatile hand-annotated eye contact dataset for ego-vision interaction. In: Proceedings of the IEEE\/CVF winter conference on applications of computer vision, pp 4437\u20134446. arXiv preprint arXiv:2311.04505","DOI":"10.1109\/WACV57701.2024.00438"},{"key":"1902_CR38","unstructured":"Higashinaka R, Funakoshi K, Kobayashi Y, Inaba M (2016) The dialogue breakdown detection challenge: task description, datasets, and evaluation metrics. In: Proceedings of the tenth international conference on language resources and evaluation (LREC\u201916), pp 3146\u20133150"},{"key":"1902_CR39","doi-asserted-by":"publisher","first-page":"63","DOI":"10.1007\/s12369-016-0357-8","volume":"9","author":"S Ivaldi","year":"2017","unstructured":"Ivaldi S, Lefort S, Peters J, Chetouani M, Provasi J, Zibetti E (2017) Towards engagement models that consider individual factors in HRI: on the relation of extroversion and negative attitude towards robots to gaze and speech during a human-robot assembly task: experiments with the ICUB humanoid. Int J Soc Robot 9:63\u201386","journal-title":"Int J Soc Robot"},{"key":"1902_CR40","doi-asserted-by":"publisher","DOI":"10.1016\/j.rcim.2022.102404","volume":"78","author":"R Jahanmahin","year":"2022","unstructured":"Jahanmahin R, Masoud S, Rickli J, Djuric A (2022) Human-robot interactions in manufacturing: a survey of human behavior modeling. Robot Comput Integr Manuf 78:102404","journal-title":"Robot Comput Integr Manuf"},{"issue":"7","key":"1902_CR41","doi-asserted-by":"publisher","first-page":"2114","DOI":"10.1109\/TCSVT.2019.2912988","volume":"30","author":"Y Ji","year":"2019","unstructured":"Ji Y, Yang Y, Shen F, Shen HT, Li X (2019) A survey of human action analysis in HRI applications. IEEE Trans Circuits Syst Video Technol 30(7):2114\u20132128","journal-title":"IEEE Trans Circuits Syst Video Technol"},{"key":"1902_CR42","doi-asserted-by":"publisher","first-page":"363","DOI":"10.1007\/s10919-020-00333-3","volume":"44","author":"C Jongerius","year":"2020","unstructured":"Jongerius C, Hessels RS, Romijn JA, Smets EMA, Hillen MA (2020) The measurement of eye contact in human interactions: a scoping review. J Nonverb Behav 44:363\u2013389","journal-title":"J Nonverb Behav"},{"key":"1902_CR43","unstructured":"Jung M, Hempel T, Al-Tawil B, Yang Q, Wachsmuth S, Al-Hamadi A (2025) Toward truly intelligent autonomous systems: a taxonomy of LLM integration for everyday automation. In: International conference on robotics, computer vision and intelligent systems. 
Springer"},{"key":"1902_CR44","doi-asserted-by":"crossref","unstructured":"Kellnhofer P, Recasens A, Stent S, Matusik W, Torralba A (2019) GAZE360: physically unconstrained gaze estimation in the wild. In: Proceedings of the IEEE\/CVF international conference on computer vision, pp 6912\u20136921","DOI":"10.1109\/ICCV.2019.00701"},{"key":"1902_CR45","doi-asserted-by":"publisher","first-page":"22","DOI":"10.1016\/0001-6918(67)90005-4","volume":"26","author":"A Kendon","year":"1967","unstructured":"Kendon A (1967) Some functions of gaze-direction in social interaction. Acta Psychol 26:22\u201363","journal-title":"Acta Psychol"},{"key":"1902_CR46","unstructured":"Kingma DP, Ba J (2014) ADAM: a method for stochastic optimization. arXiv preprint arXiv:1412.6980"},{"key":"1902_CR47","doi-asserted-by":"crossref","unstructured":"Kompatsiari K, Ciardo F, De\u00a0Tommaso D, Wykowska A (2019) Measuring engagement elicited by eye contact in human\u2013robot interaction. In: 2019 IEEE\/RSJ international conference on intelligent robots and systems (IROS). IEEE, pp 6979\u20136985","DOI":"10.1109\/IROS40897.2019.8967747"},{"key":"1902_CR48","doi-asserted-by":"publisher","first-page":"525","DOI":"10.1007\/s12369-019-00565-4","volume":"13","author":"K Kompatsiari","year":"2021","unstructured":"Kompatsiari K, Ciardo F, Tikhanoff V, Metta G, Wykowska A (2021) It\u2019s in the eyes: the engaging role of eye contact in HRI. Int J Soc Robot 13:525\u2013535","journal-title":"Int J Soc Robot"},{"key":"1902_CR49","unstructured":"Lala D, Inoue K, Milhorat P, Kawahara T (2017) Detection of social signals for recognizing engagement in human\u2013robot interaction. arXiv preprint arXiv:1709.10257"},{"key":"1902_CR50","volume-title":"Measuring user engagement","author":"M Lalmas","year":"2022","unstructured":"Lalmas M, O\u2019Brien H, Yom-Tov E (2022) Measuring user engagement. Springer, New York"},{"key":"1902_CR51","doi-asserted-by":"crossref","unstructured":"Leite I, McCoy M, Ullman D, Salomons N, Scassellati B (2015) Comparing models of disengagement in individual and group interactions. In: Proceedings of the tenth annual ACM\/IEEE international conference on human\u2013robot interaction, pp 99\u2013105","DOI":"10.1145\/2696454.2696466"},{"key":"1902_CR52","unstructured":"Liu T, Kappas A (2018) Predicting engagement breakdown in HRI using thin-slices of facial expressions. In: Workshops at the thirty-second AAAI conference on artificial intelligence"},{"key":"1902_CR53","doi-asserted-by":"crossref","unstructured":"Mitsuzumi Y, Nakazawa A, Nishida T (2017) Deep eye contact detector: robust eye contact bid detection using convolutional neural network. In: British machine vision conference","DOI":"10.5244\/C.31.59"},{"key":"1902_CR54","doi-asserted-by":"crossref","unstructured":"Mitsuzumi Y, Nakazawa A, Nishida T (2017) Deep eye contact detector: robust eye contact bid detection using convolutional neural network. In: British machine vision conference 2017, BMVC 2017. BMVA Press","DOI":"10.5244\/C.31.59"},{"key":"1902_CR55","doi-asserted-by":"crossref","unstructured":"Nakano YI, Ishii R (2010) Estimating user\u2019s engagement from eye-gaze behaviors in human-agent conversations. 
In: Proceedings of the 15th international conference on intelligent user interfaces, pp 139\u2013148","DOI":"10.1145\/1719970.1719990"},{"issue":"3","key":"1902_CR56","doi-asserted-by":"publisher","first-page":"81","DOI":"10.1007\/s43154-023-00103-1","volume":"4","author":"M Natarajan","year":"2023","unstructured":"Natarajan M, Seraj E, Altundas B, Paleja R, Ye S, Chen L, Jensen R, Chang KC, Gombolay M (2023) Human\u2013robot teaming: grand challenges. Curr Robot Rep 4(3):81\u2013100","journal-title":"Curr Robot Rep"},{"issue":"6","key":"1902_CR57","doi-asserted-by":"publisher","first-page":"938","DOI":"10.1002\/asi.20801","volume":"59","author":"HL O\u2019Brien","year":"2008","unstructured":"O\u2019Brien HL, Toms EG (2008) What is user engagement? A conceptual framework for defining user engagement with technology. J Am Soc Inf Sci Technol 59(6):938\u2013955","journal-title":"J Am Soc Inf Sci Technol"},{"key":"1902_CR58","unstructured":"Pages J, Marchionni L, Ferro F (2016) Tiago: the modular robot that adapts to different research needs. In: International workshop on robot modularity, IROS, vol 290"},{"key":"1902_CR59","doi-asserted-by":"crossref","unstructured":"Peters C, Castellano G, De\u00a0Freitas S (2009) An exploration of user engagement in HCI. In: Proceedings of the international workshop on affective-aware virtual agents and social robots, pp 1\u20133","DOI":"10.1145\/1655260.1655269"},{"key":"1902_CR60","unstructured":"Poggi I (2007) Mind, hands, face and body. A goal and belief view of multimodal communication. Weidler, Berlin"},{"key":"1902_CR61","doi-asserted-by":"publisher","DOI":"10.3389\/fbinf.2022.927312","volume":"2","author":"N Pudjihartono","year":"2022","unstructured":"Pudjihartono N, Fadason T, Kempa-Liehr AW, O\u2019Sullivan JM (2022) A review of feature selection methods for machine learning-based disease risk prediction. Front Bioinform 2:927312","journal-title":"Front Bioinform"},{"key":"1902_CR62","doi-asserted-by":"crossref","unstructured":"Rich C, Ponsler B, Holroyd A, Sidner CL (2010) Recognizing engagement in human\u2013robot interaction. In: 2010 5th ACM\/IEEE international conference on human\u2013robot interaction (HRI). IEEE, pp 375\u2013382","DOI":"10.1109\/HRI.2010.5453163"},{"key":"1902_CR63","unstructured":"Salam H (2021) Distinguishing engagement facets: an essential component for AI-based interactive healthcare. arXiv preprint arXiv:2111.11138"},{"key":"1902_CR64","doi-asserted-by":"crossref","unstructured":"Saleh K, Yu K, Chen F (2021) Improving users engagement detection using end-to-end spatio-temporal convolutional neural networks. In: Companion of the 2021 ACM\/IEEE international conference on human\u2013robot interaction, pp 190\u2013194","DOI":"10.1145\/3434074.3447157"},{"issue":"4","key":"1902_CR65","doi-asserted-by":"publisher","first-page":"2132","DOI":"10.1109\/TAFFC.2022.3188390","volume":"13","author":"AV Savchenko","year":"2022","unstructured":"Savchenko AV, Savchenko LV, Makarov I (2022) Classifying emotions and engagement in online learning based on a single facial expression recognition neural network. IEEE Trans Affect Comput 13(4):2132\u20132143","journal-title":"IEEE Trans Affect Comput"},{"key":"1902_CR66","doi-asserted-by":"publisher","DOI":"10.3389\/frai.2021.663190","volume":"4","author":"E Schellen","year":"2021","unstructured":"Schellen E, Bossi F, Wykowska A (2021) Robot gaze behavior affects honesty in human\u2013robot interaction. 
Front Artif Intell 4:663190","journal-title":"Front Artif Intell"},{"issue":"1\u20132","key":"1902_CR67","doi-asserted-by":"publisher","first-page":"140","DOI":"10.1016\/j.artint.2005.03.005","volume":"166","author":"CL Sidner","year":"2005","unstructured":"Sidner CL, Lee C, Kidd CD, Lesh N, Rich C (2005) Explorations in engagement for humans and robots. Artif Intell 166(1\u20132):140\u2013164","journal-title":"Artif Intell"},{"key":"1902_CR68","doi-asserted-by":"crossref","unstructured":"Srinivasan S, Singh RR, Biradar RR, Revathi SA (2021) COVID-19 monitoring system using social distancing and face mask detection on surveillance video datasets. In: 2021 International conference on emerging smart computing and informatics (ESCI). IEEE, pp 449\u2013455","DOI":"10.1109\/ESCI50559.2021.9396783"},{"key":"1902_CR69","doi-asserted-by":"publisher","unstructured":"Stahle L, Wold S (1989) Analysis of variance (ANOVA). Chemometr Intell Lab Syst 6(4):259\u2013272. https:\/\/doi.org\/10.1016\/0169-7439(89)80095-4","DOI":"10.1016\/0169-7439(89)80095-4"},{"key":"1902_CR70","unstructured":"Strazdas D, Jung M, Marquenie J, Siegert I, Al-Hamadi A (2025) IM here: interaction model for human effort based robot engagement. In: 2025 IEEE conference on cognitive and computational aspects of situation management (CogSIMA). IEEE"},{"key":"1902_CR71","doi-asserted-by":"publisher","first-page":"99","DOI":"10.2307\/3001913","volume":"56","author":"JW Tukey","year":"1949","unstructured":"Tukey JW (1949) Comparing individual means in the analysis of variance. Biometrics 56:99\u2013114","journal-title":"Biometrics"},{"key":"1902_CR72","doi-asserted-by":"publisher","first-page":"4","DOI":"10.1016\/j.robot.2015.01.004","volume":"75","author":"D Vaufreydaz","year":"2016","unstructured":"Vaufreydaz D, Johal W, Combe C (2016) Starting engagement detection towards a companion robot using multimodal features. Robot Autonom Syst 75:4\u201316","journal-title":"Robot Autonom Syst"},{"key":"1902_CR73","unstructured":"Wienke J, Klotz D, Wrede S (2012) A framework for the acquisition of multimodal human\u2013robot interaction data sets with a whole-system perspective. In: LREC 2012 workshop on multimodal corpora for machine learning. Citeseer"},{"issue":"158","key":"1902_CR74","doi-asserted-by":"publisher","first-page":"209","DOI":"10.1080\/01621459.1927.10502953","volume":"22","author":"EB Wilson","year":"1927","unstructured":"Wilson EB (1927) Probable inference, the law of succession, and statistical inference. J Am Stat Assoc 22(158):209\u2013212","journal-title":"J Am Stat Assoc"},{"key":"1902_CR75","doi-asserted-by":"crossref","unstructured":"Xu Q, Liyuan L, Gang W (2013) Designing engagement-aware agents for multiparty conversations. In: Proceedings of the SIGCHI conference on human factors in computing systems, pp 2233\u20132242","DOI":"10.1145\/2470654.2481308"},{"issue":"1","key":"1902_CR76","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/2882970","volume":"6","author":"T Xu","year":"2016","unstructured":"Xu T, Zhang H, Yu C (2016) See you see me: the role of eye contact in multimodal human-robot interaction. 
ACM Trans Interact Intell Syst TIIS 6(1):1\u201322","journal-title":"ACM Trans Interact Intell Syst TIIS"},{"issue":"6","key":"1902_CR77","doi-asserted-by":"publisher","DOI":"10.1007\/s11432-020-3181-9","volume":"65","author":"D Zhang","year":"2022","unstructured":"Zhang D, Wang B, Wang G, Zhang Q, Zhang J, Han J, You Z (2022) Onfocus detection: identifying individual-camera eye contact from unconstrained images. Sci China Inf Sci 65(6):160101","journal-title":"Sci China Inf Sci"}],"container-title":["Complex &amp; Intelligent Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s40747-025-01902-z.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s40747-025-01902-z\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s40747-025-01902-z.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,19]],"date-time":"2025-06-19T11:07:15Z","timestamp":1750331235000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s40747-025-01902-z"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,5,12]]},"references-count":77,"journal-issue":{"issue":"7","published-print":{"date-parts":[[2025,7]]}},"alternative-id":["1902"],"URL":"https:\/\/doi.org\/10.1007\/s40747-025-01902-z","relation":{},"ISSN":["2199-4536","2198-6053"],"issn-type":[{"value":"2199-4536","type":"print"},{"value":"2198-6053","type":"electronic"}],"subject":[],"published":{"date-parts":[[2025,5,12]]},"assertion":[{"value":"15 January 2025","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"4 April 2025","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"12 May 2025","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare that they have no conflict of interest.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}},{"value":"Not applicable, as existing and publicly available data sets were used. This study does not contain any studies with human or animal subjects performed by any of the authors.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics approval, consent to participate and consent for publication"}}],"article-number":"286"}}
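
The abstract in this record describes a lightweight architecture: a sliding window over just two signals (eye-contact intensity and user distance) fed into three fully connected layers for feature fusion and classification. The following is a minimal PyTorch sketch of such a classifier, based only on that description; the window length (30 frames), hidden width (64), framework choice, and binary class setup are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn

class EngagementWindowClassifier(nn.Module):
    """Sketch of a window-based engagement classifier: three fully connected
    layers over a sliding window of two features (eye-contact intensity and
    distance). Window length, hidden width, and class count are assumptions."""
    def __init__(self, window_len: int = 30, n_features: int = 2,
                 hidden: int = 64, n_classes: int = 2):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                                # (B, T, 2) -> (B, T*2)
            nn.Linear(window_len * n_features, hidden),  # FC layer 1: feature fusion
            nn.ReLU(),
            nn.Linear(hidden, hidden),                   # FC layer 2
            nn.ReLU(),
            nn.Linear(hidden, n_classes),                # FC layer 3: class logits
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

if __name__ == "__main__":
    model = EngagementWindowClassifier()
    # Batch of 4 windows: 30 frames of (eye-contact score, distance) each.
    window = torch.rand(4, 30, 2)
    logits = model(window)
    print(logits.shape)  # torch.Size([4, 2])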