{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,8]],"date-time":"2026-04-08T11:55:15Z","timestamp":1775649315673,"version":"3.50.1"},"reference-count":48,"publisher":"Springer Science and Business Media LLC","issue":"2","license":[{"start":{"date-parts":[[2026,3,26]],"date-time":"2026-03-26T00:00:00Z","timestamp":1774483200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2026,4,8]],"date-time":"2026-04-08T00:00:00Z","timestamp":1775606400000},"content-version":"vor","delay-in-days":13,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/100010665","name":"H2020 Marie Sk\u0142odowska-Curie Actions","doi-asserted-by":"publisher","award":["955901"],"award-info":[{"award-number":["955901"]}],"id":[{"id":"10.13039\/100010665","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/100010665","name":"H2020 Marie Sk\u0142odowska-Curie Actions","doi-asserted-by":"publisher","award":["955901"],"award-info":[{"award-number":["955901"]}],"id":[{"id":"10.13039\/100010665","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/100010665","name":"H2020 Marie Sk\u0142odowska-Curie Actions","doi-asserted-by":"publisher","award":["847402"],"award-info":[{"award-number":["847402"]}],"id":[{"id":"10.13039\/100010665","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["J Intell Robot Syst"],"abstract":"<jats:title>Abstract<\/jats:title>\n                  <jats:p>\n                    Despite significant advances in teleoperated systems, optimal human performance remains a challenging task due to one key aspect, i.e.\n                    <jats:italic>reduced situational awareness<\/jats:italic>\n                    . 
Active telepresence is one of the major contributing factors affecting the operator\u2019s situational awareness. Providing the user with multiple perspectives of the remote environment can be very beneficial, although it may also introduce challenges in maintaining effective control. Prior work has largely examined discrete manipulation tasks using single-modality assessments, whereas this study applies a multimodal neuroergonomic approach to a continuous precision-cutting task, providing first insights into control\u2013view alignment and multi-camera perception effects. In this work, we focus on two interaction and interface design factors: (a)\n                    <jats:italic>control interaction factor<\/jats:italic>\n                    , where we compare the effects of fixed and camera view-aligned control user interfaces on task performance; (b)\n                    <jats:italic>perception interface factor<\/jats:italic>\n                    : the impact of different visual feedback configurations on operator attention and workload state, and interaction with the interface. A telerobotic system focused on a cutting task is chosen for this evaluation as it demands precision, depth perception, and continuous feedback. A multimodal bio-sensor network is used to record operator-centric data, i.e., eye-tracking pupil measures, gaze measures, and EEG signals, and to assess the proposed interfaces using a neuroergonomics framework. 
User-study results show that using a view-aligned control frame in a multi-camera interface significantly improves cutting quality compared to using a fixed control frame, while the display of multiple camera perspectives in the same window is preferred to a single camera per window, promoting operator visual attention with no significant impact on the mental workload.\n                  <\/jats:p>","DOI":"10.1007\/s10846-026-02383-z","type":"journal-article","created":{"date-parts":[[2026,3,26]],"date-time":"2026-03-26T12:47:13Z","timestamp":1774529233000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":0,"title":["Human-Centered Evaluation of Control and Perception Interface Factors in a Teleoperational Cutting Task"],"prefix":"10.1007","volume":"112","author":[{"given":"In\u00eas F.","family":"Ramos","sequence":"first","affiliation":[]},{"given":"Keerthi","family":"Sagar","sequence":"additional","affiliation":[]},{"given":"Philip","family":"Long","sequence":"additional","affiliation":[]},{"given":"Maria Chiara","family":"Leva","sequence":"additional","affiliation":[]},{"given":"Ernesto","family":"Damiani","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5186-0199","authenticated-orcid":false,"given":"Gabriele","family":"Gianini","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2026,3,26]]},"reference":[{"key":"2383_CR1","doi-asserted-by":"crossref","unstructured":"Rakita, D., Mutlu, B., Gleicher, M.: Remote telemanipulation with adapting viewpoints in visually complex environments. Robotics: Science and Systems XV. (2019)","DOI":"10.15607\/RSS.2019.XV.068"},{"key":"2383_CR2","doi-asserted-by":"publisher","unstructured":"Rea, D.J., Seo, S.H.: Still Not Solved: A Call for Renewed Focus on User-Centered Teleoperation Interfaces. Frontiers in Robotics and AI. 9, 1\u201313 (2022). 
https:\/\/doi.org\/10.3389\/frobt.2022.704225","DOI":"10.3389\/frobt.2022.704225"},{"key":"2383_CR3","doi-asserted-by":"publisher","unstructured":"Chen, J.Y.C., Haas, E.C., Barnes, M.J.: Human performance issues and user interface design for teleoperated robots. IEEE Trans. Syst. Man Cybernet. Part C Appl. Rev. 37(6), 1231\u20131245 (2007). https:\/\/doi.org\/10.1109\/TSMCC.2007.905819","DOI":"10.1109\/TSMCC.2007.905819"},{"key":"2383_CR4","doi-asserted-by":"publisher","unstructured":"Rakita, D., Mutlu, B., Gleicher, M.: An Autonomous Dynamic Camera Method for Effective Remote Teleoperation. ACM\/IEEE Int. Conf. Hum. Robot Interact., pp. 325\u2013333 (2018). https:\/\/doi.org\/10.1145\/3171221.3171279","DOI":"10.1145\/3171221.3171279"},{"key":"2383_CR5","doi-asserted-by":"crossref","unstructured":"Praveena, P., Molina, L., Wang, Y., Senft, E., Mutlu, B., Gleicher, M.: Understanding Control Frames in Multi-Camera Robot Telemanipulation. In: Proceedings of the 2022 ACM\/IEEE International Conference on Human-Robot Interaction, pp. 432\u2013440. IEEE Press, (2022)","DOI":"10.1109\/HRI53351.2022.9889543"},{"key":"2383_CR6","doi-asserted-by":"crossref","unstructured":"Kent, D., Saldanha, C., Chernova, S.: A Comparison of Remote Robot Teleoperation Interfaces for General Object Manipulation. In: HRI \u201917: Proceedings of the 2017 ACM\/IEEE International Conference on Human-Robot Interaction, pp. 371\u2013379 (2017)","DOI":"10.1145\/2909824.3020249"},{"key":"2383_CR7","doi-asserted-by":"publisher","unstructured":"Lipton, J.I., Fay, A.J., Rus, D.: Baxter\u2019s Homunculus: Virtual Reality Spaces for Teleoperation in Manufacturing. IEEE Robot. Automat. Lett. 3(1), 179\u2013186 (2018). 
https:\/\/doi.org\/10.1109\/LRA.2017.2737046, arXiv:1703.01270","DOI":"10.1109\/LRA.2017.2737046"},{"key":"2383_CR8","doi-asserted-by":"publisher","unstructured":"Sun, D., Kiselev, A., Liao, Q., Stoyanov, T., Loutfi, A.: A New Mixed-Reality-Based Teleoperation System for Telepresence and Maneuverability Enhancement. IEEE Trans. Hum. Mach. Syst. 50(1), 55\u201367 (2020). https:\/\/doi.org\/10.1109\/THMS.2019.2960676","DOI":"10.1109\/THMS.2019.2960676"},{"issue":"5\u20136","key":"2383_CR9","first-page":"322","volume":"38","author":"S Falcone","year":"2022","unstructured":"Falcone, S., Englebienne, G., Van Erp, J., Heylen, D.: Toward Standard Guidelines to Design the Sense of Embodiment in Teleoperation Applications: A Review and Toolbox. Human-Computer Interaction. 38(5\u20136), 322\u2013351 (2022)","journal-title":"Human-Computer Interaction."},{"key":"2383_CR10","doi-asserted-by":"publisher","unstructured":"Bejczy, B., Bozyil, R., Vaiekauskas, E., Petersen, S.B.K., Bogh, S., Hjorth, S.S., Hansen, E.B.: Mixed reality interface for improving mobile manipulator teleoperation in contamination critical applications. Procedia Manuf. 51, 620\u2013626 (2020). https:\/\/doi.org\/10.1016\/J.PROMFG.2020.10.087","DOI":"10.1016\/J.PROMFG.2020.10.087"},{"key":"2383_CR11","doi-asserted-by":"publisher","unstructured":"Naceri, A., Mazzanti, D., Bimbo, J., Tefera, Y.T., Prattichizzo, D., Caldwell, D.G., Mattos, L.S., Deshpande, N.: The Vicarios Virtual Reality Interface for Remote Robotic Teleoperation. J. Intell. Robot. Syst. 101(4) (2021). https:\/\/doi.org\/10.1007\/S10846-021-01311-7","DOI":"10.1007\/S10846-021-01311-7"},{"key":"2383_CR12","doi-asserted-by":"publisher","unstructured":"Su, Y., Chen, X., Zhou, T., Pretty, C., Chase, G.: Mixed reality-integrated 3D\/2D vision mapping for intuitive teleoperation of mobile manipulator. Robot. Comput. Integ. Manuf. 77, 102332 (2022). 
https:\/\/doi.org\/10.1016\/J.RCIM.2022.102332","DOI":"10.1016\/J.RCIM.2022.102332"},{"key":"2383_CR13","doi-asserted-by":"publisher","unstructured":"Lim, Y., Gardi, A., Sabatini, R., Ramasamy, S., Kistan, T., Ezer, N., Vince, J., Bolia, R.: Avionics Human-Machine Interfaces and Interactions for Manned and Unmanned Aircraft. Prog. Aerosp. Sci. 102, 1\u201346 (2018). https:\/\/doi.org\/10.1016\/j.paerosci.2018.05.002","DOI":"10.1016\/j.paerosci.2018.05.002"},{"key":"2383_CR14","doi-asserted-by":"crossref","unstructured":"Taori, S., Kim, S., Lim, S.: Evaluating and predicting cognitive workload in collaborative manufacturing scenarios: Human-human and teleoperator-robot-human. Available at SSRN 5213995. (2025)","DOI":"10.2139\/ssrn.5213995"},{"issue":"11","key":"2383_CR15","doi-asserted-by":"publisher","first-page":"728","DOI":"10.3390\/info15110728","volume":"15","author":"IF Ramos","year":"2024","unstructured":"Ramos, I.F., Gianini, G., Leva, M.C., Damiani, E.: Collaborative intelligence for safety-critical industries: A literature review. Information 15(11), 728 (2024)","journal-title":"Information"},{"issue":"10","key":"2383_CR16","doi-asserted-by":"publisher","first-page":"1009","DOI":"10.3390\/brainsci14101009","volume":"14","author":"B Bjegojevi\u0107","year":"2024","unstructured":"Bjegojevi\u0107, B., Pu\u0161ica, M., Gianini, G., Gligorijevi\u0107, I., Cromie, S., Leva, M.C.: Neuroergonomic attention assessment in safety-critical tasks: EEG indices and subjective metrics validation in a novel task-embedded reaction time paradigm. Brain Sci. 
14(10), 1009 (2024)","journal-title":"Brain Sci."},{"issue":"2","key":"2383_CR17","doi-asserted-by":"publisher","first-page":"149","DOI":"10.3390\/brainsci14020149","volume":"14","author":"M Pu\u0161ica","year":"2024","unstructured":"Pu\u0161ica, M., Kartali, A., Bojovi\u0107, L., Gligorijevi\u0107, I., Jovanovi\u0107, J., Leva, M.C., Mijovi\u0107, B.: Mental workload classification and tasks detection in multitasking: Deep learning insights from EEG study. Brain Sci. 14(2), 149 (2024)","journal-title":"Brain Sci."},{"key":"2383_CR18","doi-asserted-by":"publisher","unstructured":"Giraudet, L., Imbert, J.P., B\u00e9renger, M., Tremblay, S., Causse, M.: The neuroergonomic evaluation of human machine interface design in air traffic control using behavioral and EGG\/ERP measures. Behav. Brain Res. 294, 246\u2013253 (2015). https:\/\/doi.org\/10.1016\/j.bbr.2015.07.041","DOI":"10.1016\/j.bbr.2015.07.041"},{"key":"2383_CR19","doi-asserted-by":"publisher","unstructured":"Li, Y., Elmaghraby, A.S., El-Baz, A., Sokhadze, E.M.: Using physiological signal analysis to design affective VR games. 2015 IEEE International Symposium on Signal Processing and Information Technology, ISSPIT 2015, 57\u201362 (2016). https:\/\/doi.org\/10.1109\/ISSPIT.2015.7394401","DOI":"10.1109\/ISSPIT.2015.7394401"},{"issue":"11","key":"2383_CR20","doi-asserted-by":"publisher","first-page":"995","DOI":"10.3390\/machines11110995","volume":"11","author":"C Caiazzo","year":"2023","unstructured":"Caiazzo, C., Savkovic, M., Pusica, M., Milojevic, D., Leva, M.C., Djapan, M.: Development of a neuroergonomic assessment for the evaluation of mental workload in an industrial human-robot interaction assembly task: A comparative case study. Machines. 
11(11), 995 (2023)","journal-title":"Machines."},{"issue":"8","key":"2383_CR21","doi-asserted-by":"publisher","first-page":"691","DOI":"10.3390\/aerospace11080691","volume":"11","author":"E Mu\u00f1oz-de-Escalona","year":"2024","unstructured":"Mu\u00f1oz-de-Escalona, E., Leva, M.C., Ca\u00f1as, J.J.: Mental workload as a predictor of ATCO\u2019s performance: Lessons learnt from ATM task-related experiments. Aerospace 11(8), 691 (2024)","journal-title":"Aerospace"},{"key":"2383_CR22","first-page":"223","volume":"111","author":"CW Amazu","year":"2024","unstructured":"Amazu, C.W., Demichela, M., Fissore, D., Leva, M.C.: Comparing two control room intervention procedure formats: Preliminary insights from eye tracking measures. Chem. Eng. Trans. 111, 223\u2013228 (2024)","journal-title":"Chem. Eng. Trans."},{"key":"2383_CR23","doi-asserted-by":"publisher","unstructured":"Dehais, F., Lafont, A., Roy, R., Fairclough, S.: A Neuroergonomics Approach to Mental Workload, Engagement and Human Performance. Front. Neurosci. 14, 268 (2020). https:\/\/doi.org\/10.3389\/fnins.2020.00268","DOI":"10.3389\/fnins.2020.00268"},{"key":"2383_CR24","doi-asserted-by":"publisher","unstructured":"McMahan, T., Parberry, I., Parsons, T.D.: Evaluating Player Task Engagement and Arousal Using Electroencephalography. Procedia Manuf. 3, 2303\u20132310 (2015). https:\/\/doi.org\/10.1016\/j.promfg.2015.07.376","DOI":"10.1016\/j.promfg.2015.07.376"},{"key":"2383_CR25","doi-asserted-by":"publisher","unstructured":"Zhai, W., Liao, J., Chen, Z., Su, B., Zhao, X.: A survey of task planning with large language models. Intell. Comput. 4, 0124 (2025). https:\/\/doi.org\/10.34133\/icomputing.0124, https:\/\/spj.science.org\/doi\/pdf\/10.34133\/icomputing.0124","DOI":"10.34133\/icomputing.0124"},{"key":"2383_CR26","doi-asserted-by":"publisher","unstructured":"Alharasees, O., Kale, U.: Human factors and AI in UAV systems: Enhancing operational efficiency through AHP and real-time physiological monitoring. J. 
Intell. Robot. Syst. 111(1) (2024). https:\/\/doi.org\/10.1007\/s10846-024-02188-y","DOI":"10.1007\/s10846-024-02188-y"},{"key":"2383_CR27","doi-asserted-by":"publisher","unstructured":"Andriella, A., Ros, R., Ellinson, Y., Gannot, S., Lemaignan, S.: Dataset and evaluation of automatic speech recognition for multi-lingual intent recognition on social robots. In: Proceedings of the 2024 ACM\/IEEE International Conference on Human-Robot Interaction. HRI \u201924, pp. 865\u2013869. ACM, (2024). https:\/\/doi.org\/10.1145\/3610977.3637473","DOI":"10.1145\/3610977.3637473"},{"key":"2383_CR28","doi-asserted-by":"publisher","unstructured":"Fernandez Rojas, R., Debie, E., Fidock, J., Barlow, M., Kasmarik, K., Anavatti, S., Garratt, M., Abbass, H.: Electroencephalographic workload indicators during teleoperation of an unmanned aerial vehicle shepherding a swarm of unmanned ground vehicles in contested environments. Front. Neurosci. 14 (2020). https:\/\/doi.org\/10.3389\/fnins.2020.00040","DOI":"10.3389\/fnins.2020.00040"},{"key":"2383_CR29","doi-asserted-by":"publisher","unstructured":"Guo, Y., Freer, D., Deligianni, F., Yang, G.-Z.: Eye-tracking for performance evaluation and workload estimation in space telerobotic training. IEEE Trans. Hum. Mach. Syst. 52(1), 1\u201311 (2022). https:\/\/doi.org\/10.1109\/THMS.2021.3107519","DOI":"10.1109\/THMS.2021.3107519"},{"key":"2383_CR30","doi-asserted-by":"crossref","unstructured":"Bethel, C.L., Henkel, Z., Baugus, K.: Conducting Studies in Human-Robot Interaction, pp. 91\u2013124. Springer, (2020)","DOI":"10.1007\/978-3-030-42307-0_4"},{"key":"2383_CR31","doi-asserted-by":"publisher","unstructured":"Leeper, A.E., Hsiao, K., Ciocarlie, M., Takayama, L., Gossow, D.: Strategies for human-in-the-loop robotic grasping. In: HRI\u201912 - Proceedings of the 7th Annual ACM\/IEEE International Conference on Human-Robot Interaction, pp. 1\u20138 (2012). 
https:\/\/doi.org\/10.1145\/2157689.2157691","DOI":"10.1145\/2157689.2157691"},{"key":"2383_CR32","doi-asserted-by":"publisher","unstructured":"Pope, A.T., Bogart, E.H., Bartolome, D.S.: Biocybernetic system evaluates indices of operator engagement in automated task. Biol. Psychol. 40(1\u20132), 187\u2013195 (1995). https:\/\/doi.org\/10.1016\/0301-0511(95)05116-3","DOI":"10.1016\/0301-0511(95)05116-3"},{"key":"2383_CR33","doi-asserted-by":"publisher","unstructured":"Kim, M.K., Kim, M., Oh, E., Kim, S.P.: A review on the computational methods for emotional state estimation from the human EEG. Comput. Math. Methods Med. 2013 (2013). https:\/\/doi.org\/10.1155\/2013\/573734","DOI":"10.1155\/2013\/573734"},{"key":"2383_CR34","doi-asserted-by":"publisher","unstructured":"Mijovi\u0107, P., Kovi\u0107, V., De Vos, M., Ma\u010du\u017ei\u0107, I., Todorovi\u0107, P., Jeremi\u0107, B., Gligorijevi\u0107, I.: Towards continuous and real-time attention monitoring at work: reaction time versus brain response. Ergonomics. 60(2), 241\u2013254 (2017). https:\/\/doi.org\/10.1080\/00140139.2016.1142121","DOI":"10.1080\/00140139.2016.1142121"},{"key":"2383_CR35","doi-asserted-by":"publisher","unstructured":"Dehais, F., Dupr\u00e8s, A., Di Flumeri, G., Verdi\u00e8re, K., Borghini, G., Babiloni, F., Roy, R.: Monitoring Pilot\u2019s Cognitive Fatigue with Engagement Features in Simulated and Actual Flight Conditions Using an Hybrid fNIRS-EEG Passive BCI. In: Proceedings - 2018 IEEE International Conference on Systems, Man, and Cybernetics, SMC 2018, 544\u2013549 (2019). https:\/\/doi.org\/10.1109\/SMC.2018.00102","DOI":"10.1109\/SMC.2018.00102"},{"key":"2383_CR36","doi-asserted-by":"publisher","unstructured":"Mahanama, B., Jayawardana, Y., Rengarajan, S., Jayawardena, G., Chukoskie, L., Snider, J., Jayarathna, S.: Eye movement and pupil measures: A review. Front. Comput. Sci. 3 (2022). 
https:\/\/doi.org\/10.3389\/fcomp.2021.733531","DOI":"10.3389\/fcomp.2021.733531"},{"key":"2383_CR37","doi-asserted-by":"publisher","unstructured":"Delorme, A., Makeig, S.: EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods. 134(1), 9\u201321 (2004). https:\/\/doi.org\/10.1016\/j.jneumeth.2003.10.009","DOI":"10.1016\/j.jneumeth.2003.10.009"},{"key":"2383_CR38","doi-asserted-by":"publisher","unstructured":"Mullen, T.R., Kothe, C.A.E., Chi, Y.M., Ojeda, A., Kerth, T., Makeig, S., Jung, T.-P., Cauwenberghs, G.: Real-time neuroimaging and cognitive monitoring using wearable dry EEG. IEEE Trans. Biomed. Eng. 62(11), 2553\u20132567 (2015). https:\/\/doi.org\/10.1109\/tbme.2015.2481482","DOI":"10.1109\/tbme.2015.2481482"},{"key":"2383_CR39","doi-asserted-by":"publisher","unstructured":"Pion-Tonachini, L., Kreutz-Delgado, K., Makeig, S.: IClabel: An automated electroencephalographic independent component classifier, dataset, and website. NeuroImage. 198, 181\u2013197 (2019). https:\/\/doi.org\/10.1016\/j.neuroimage.2019.05.026","DOI":"10.1016\/j.neuroimage.2019.05.026"},{"key":"2383_CR40","doi-asserted-by":"publisher","unstructured":"Mastropietro, A., Pirovano, I., Marciano, A., Porcelli, S., Rizzo, G.: Reliability of mental workload index assessed by EEG with different electrode configurations and signal pre-processing pipelines. Sensors. 23(3), 1367 (2023). https:\/\/doi.org\/10.3390\/s23031367","DOI":"10.3390\/s23031367"},{"key":"2383_CR41","doi-asserted-by":"publisher","unstructured":"Marcantoni, I., Assogna, R., Del Borrello, G., Di Stefano, M., Morano, M., Romagnoli, S., Leoni, C., Bruschi, G., Sbrollini, A., Morettini, M., Burattini, L.: Ratio indexes based on spectral electroencephalographic brainwaves for assessment of mental involvement: A systematic review. Sensors. 23(13), 5968 (2023). 
https:\/\/doi.org\/10.3390\/s23135968","DOI":"10.3390\/s23135968"},{"key":"2383_CR42","doi-asserted-by":"crossref","unstructured":"Krejtz, K., Szmidt, T., Duchowski, A.T., Krejtz, I.: Entropy-based statistical analysis of eye movement transitions. In: ETRA \u201914: Eye Tracking Research and Applications, pp. 159\u2013166. ACM, (2014)","DOI":"10.1145\/2578153.2578176"},{"key":"2383_CR43","doi-asserted-by":"publisher","unstructured":"Hart, S.G., Staveland, L.E.: Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research. Adv. Psychol. 52(C), 139\u2013183 (1988). https:\/\/doi.org\/10.1016\/S0166-4115(08)62386-9","DOI":"10.1016\/S0166-4115(08)62386-9"},{"key":"2383_CR44","doi-asserted-by":"publisher","unstructured":"Bradley, M.M., Lang, P.J.: Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry. 25(1), 49\u201359 (1994). https:\/\/doi.org\/10.1016\/0005-7916(94)90063-9","DOI":"10.1016\/0005-7916(94)90063-9"},{"key":"2383_CR45","doi-asserted-by":"publisher","unstructured":"Hannum, M.E., Forzley, S., Popper, R., Simons, C.T.: Application of the Engagement Questionnaire (EQ) to compare methodological differences in sensory and consumer testing. Food Research International. 140, 110083 (2021). https:\/\/doi.org\/10.1016\/j.foodres.2020.110083","DOI":"10.1016\/j.foodres.2020.110083"},{"key":"2383_CR46","doi-asserted-by":"publisher","unstructured":"Oh, D.S., Ershad, M., Wee, J.O., Sancheti, M.S., D\u2019Souza, D.M., Herrera, L.J., Schumacher, L.Y., Shields, M., Brown, K., Yousaf, S., Lazar, J.F.: Comparison of global evaluative assessment of robotic surgery with objective performance indicators for the assessment of skill during robotic-assisted thoracic surgery. Surgery. 174(6), 1349\u20131355 (2023). 
https:\/\/doi.org\/10.1016\/j.surg.2023.08.008","DOI":"10.1016\/j.surg.2023.08.008"},{"key":"2383_CR47","doi-asserted-by":"publisher","unstructured":"Faul, F., Erdfelder, E., Buchner, A., Lang, A.-G.: Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses. Behav. Res. Methods. 41(4), 1149\u20131160 (2009). https:\/\/doi.org\/10.3758\/brm.41.4.1149","DOI":"10.3758\/brm.41.4.1149"},{"key":"2383_CR48","doi-asserted-by":"publisher","unstructured":"Rendon-Velez, E., van Leeuwen, P.M., Happee, R., Horv\u00e1th, I., van der Vegte, W.F., de Winter, J.C.F.: The effects of time pressure on driver performance and physiological activity: A driving simulator study. Transp. Res. Part F Traffic Psychol. Behav. 41, 150\u2013169 (2016). https:\/\/doi.org\/10.1016\/j.trf.2016.06.013","DOI":"10.1016\/j.trf.2016.06.013"}],"container-title":["Journal of Intelligent &amp; Robotic Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s10846-026-02383-z","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10846-026-02383-z.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10846-026-02383-z.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,4,8]],"date-time":"2026-04-08T11:03:18Z","timestamp":1775646198000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s10846-026-02383-z"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2026,3,26]]},"references-count":48,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2026,6]]}},"alternative-id":["2383"],"URL":"https:\/\/doi.org\/10.1007\/s10846-026-02383-z","relation":{},"ISSN":["1573-0409"],"issn-type":[{"value"
:"1573-0409","type":"electronic"}],"subject":[],"published":{"date-parts":[[2026,3,26]]},"assertion":[{"value":"21 July 2025","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"3 March 2026","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"26 March 2026","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The experiment received ethical approval REC-20-52C and all subjects gave their informed consent.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics approval and consent to participate"}},{"value":"Not Applicable","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent for publication"}},{"value":"Not Applicable","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest\/Competing interests"}}],"article-number":"47"}}