{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,6]],"date-time":"2026-04-06T10:18:07Z","timestamp":1775470687502,"version":"3.50.1"},"reference-count":50,"publisher":"Springer Science and Business Media LLC","issue":"4","license":[{"start":{"date-parts":[[2022,10,20]],"date-time":"2022-10-20T00:00:00Z","timestamp":1666224000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,10,20]],"date-time":"2022-10-20T00:00:00Z","timestamp":1666224000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100003052","name":"ministry of trade, industry and energy","doi-asserted-by":"publisher","award":["10077553"],"award-info":[{"award-number":["10077553"]}],"id":[{"id":"10.13039\/501100003052","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001537","name":"University of Auckland","doi-asserted-by":"crossref","id":[{"id":"10.13039\/501100001537","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Int J of Soc Robotics"],"published-print":{"date-parts":[[2024,4]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:sec><jats:title>Background<\/jats:title><jats:p>Sentiment expression and detection are crucial for effective and empathetic human-robot interaction. Previous work in this field often focuses on non-verbal emotion expression, such as facial expressions and gestures. Less is known about which specific prosodic speech elements are required in human-robot interaction. 
Our research question was: what prosodic elements are related to emotional speech in human-computer\/robot interaction?<\/jats:p><\/jats:sec><jats:sec><jats:title>Methods<\/jats:title><jats:p>The scoping review was conducted in alignment with the Arksey and O\u2019Malley methods. Literature was identified from the SCOPUS, IEEE Xplore, ACM Digital Library and PsycINFO databases in May 2021. After screening and de-duplication, data were extracted into an Excel coding sheet and summarised.<\/jats:p><\/jats:sec><jats:sec><jats:title>Results<\/jats:title><jats:p>Thirteen papers, published from 2012 to 2020, were included in the review. The most commonly used prosodic elements were tone\/pitch (n\u2009=\u20098), loudness\/volume (n\u2009=\u20096), speech speed (n\u2009=\u20094) and pauses (n\u2009=\u20093). Non-linguistic vocalisations (n\u2009=\u20091) were less frequently used. The prosodic elements were generally effective in helping to convey or detect emotion, but were less effective for negative sentiment (e.g., anger, fear, frustration, sadness and disgust).<\/jats:p><\/jats:sec><jats:sec><jats:title>Discussion<\/jats:title><jats:p>Future research should explore the effectiveness of commonly used prosodic elements (tone, loudness, speed and pauses) in emotional speech, using larger sample sizes and real-life interaction scenarios. The success of prosody in conveying negative sentiment to humans may be improved with additional non-verbal cues (e.g., coloured light or motion). 
More research is needed to determine how these may be combined with prosody and which combination is most effective in human-robot affective interaction.<\/jats:p><\/jats:sec>","DOI":"10.1007\/s12369-022-00913-x","type":"journal-article","created":{"date-parts":[[2022,10,20]],"date-time":"2022-10-20T14:04:07Z","timestamp":1666274647000},"page":"659-670","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":29,"title":["A Scoping Review of the Literature On Prosodic Elements Related to Emotional Speech in Human-Robot Interaction"],"prefix":"10.1007","volume":"16","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-7801-7417","authenticated-orcid":false,"given":"Norina","family":"Gasteiger","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8707-5069","authenticated-orcid":false,"given":"JongYoon","family":"Lim","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-7502-3130","authenticated-orcid":false,"given":"Mehdi","family":"Hellou","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0001-7602-8497","authenticated-orcid":false,"given":"Bruce A.","family":"MacDonald","sequence":"additional","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0001-7418-6280","authenticated-orcid":false,"given":"Ho Seok","family":"Ahn","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,10,20]]},"reference":[{"key":"913_CR1","doi-asserted-by":"crossref","DOI":"10.7551\/mitpress\/1140.001.0001","volume-title":"Affective Computing","author":"R Picard","year":"1997","unstructured":"Picard R (1997) Affective Computing. MIT Press, USA"},{"key":"913_CR2","volume-title":"Robotics, Medicine and Health Sciences, in The Routledge Social Science Handbook of AI, A. 
Elliott, Editor","author":"N Gasteiger","year":"2021","unstructured":"Gasteiger N, Broadbent E (2021) AI, Robotics, Medicine and Health Sciences, in The Routledge Social Science Handbook of AI, A. Elliott, Editor. Routledge, New York"},{"key":"913_CR3","volume-title":"1st workshop on Emotion and Computing-Current Research and Future Impact","author":"M Ochs","year":"2006","unstructured":"Ochs M et al (2006) A computational model of capability-based emotion elicitation for rational agent. 1st workshop on Emotion and Computing-Current Research and Future Impact. Bremen, Germany"},{"issue":"8","key":"913_CR4","doi-asserted-by":"publisher","first-page":"2712","DOI":"10.3390\/s21082712","volume":"21","author":"J Lim","year":"2021","unstructured":"Lim J et al (2021) Subsentence Extraction from Text Using Coverage-Based Deep Learning Language Models. Sensors 21(8):2712","journal-title":"Sensors"},{"key":"913_CR5","doi-asserted-by":"crossref","unstructured":"Antona M et al (2019) My robot is happy today: how older people with mild cognitive impairments understand assistive robots\u2019 affective output, in 12th ACM International Conference on PErvasive Technologies Related to Assistive Environments. p.\u00a0416\u2013424","DOI":"10.1145\/3316782.3322777"},{"issue":"1\/4","key":"913_CR6","first-page":"407","volume":"158","author":"W Thompson","year":"2006","unstructured":"Thompson W, Balkwill L (2006) Decoding speech prosody in five languages. Semiotica 158(1\/4):407\u2013424","journal-title":"Semiotica"},{"key":"913_CR7","doi-asserted-by":"publisher","first-page":"143","DOI":"10.1037\/0033-2909.99.2.143","volume":"99","author":"K Scherer","year":"1986","unstructured":"Scherer K (1986) Vocal affect expression: A review and a model for future research. 
Psychol Bull 99:143\u2013165","journal-title":"Psychol Bull"},{"issue":"3\u20134","key":"913_CR8","doi-asserted-by":"publisher","first-page":"169","DOI":"10.1080\/02699939208411068","volume":"6","author":"P Ekman","year":"1992","unstructured":"Ekman P (1992) An argument for basic emotions. Cogn Emot 6(3\u20134):169\u2013200","journal-title":"Cogn Emot"},{"key":"913_CR9","first-page":"140","volume-title":"NAACL HLT 2010 Workshop on Computational Approaches to Analysis and Generation of Emotion in Text","author":"D Ghazi","year":"2010","unstructured":"Ghazi D, Inkpen D, Szpakowicz S (2010) Hierarchical versus Flat Classification of Emotions in Text. NAACL HLT 2010 Workshop on Computational Approaches to Analysis and Generation of Emotion in Text. LA, California, Association for Computational Linguistics, pp 140\u2013146"},{"issue":"4","key":"913_CR10","doi-asserted-by":"publisher","first-page":"530","DOI":"10.1177\/0018720811425922","volume":"54","author":"R Calix","year":"2011","unstructured":"Calix R, Javadpour L, Knapp G (2011) Detection of Affective States From Text and Speech for Real-Time Human\u2013Computer Interaction. Hum Factors: J Hum Factors Ergon Soc 54(4):530\u2013545","journal-title":"Hum Factors: J Hum Factors Ergon Soc"},{"key":"913_CR11","doi-asserted-by":"crossref","unstructured":"Plutchik R (1980) A general psychoevolutionary theory of emotion. Theories of emotion. Elsevier, pp 3\u201333","DOI":"10.1016\/B978-0-12-558701-3.50007-7"},{"issue":"6","key":"913_CR12","doi-asserted-by":"publisher","first-page":"1161","DOI":"10.1037\/h0077714","volume":"39","author":"J Russell","year":"1980","unstructured":"Russell J (1980) A circumplex model of affect. 
J Personal Soc Psychol 39(6):1161","journal-title":"J Personal Soc Psychol"},{"key":"913_CR13","doi-asserted-by":"crossref","unstructured":"Hutto C, Gilbert E (2014) VADER: A Parsimonious Rule-Based Model for Sentiment Analysis of Social Media Text, in AAAI Conference on Web and Social Media.","DOI":"10.1609\/icwsm.v8i1.14550"},{"issue":"1","key":"913_CR14","doi-asserted-by":"publisher","first-page":"24","DOI":"10.1177\/0261927X09351676","volume":"29","author":"Y Tausczik","year":"2010","unstructured":"Tausczik Y, Pennebaker J (2010) The Psychological Meaning of Words: LIWC and Computerized Text Analysis Methods. J Lang Social Psychol 29(1):24\u201354","journal-title":"J Lang Social Psychol"},{"key":"913_CR15","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-540-74628-7_27","volume-title":"Identifying Expressions of Emotion in Text","author":"S Aman","year":"2007","unstructured":"Aman S, Szpakowicz S (2007) In: Text S, Dialogue V, Matou\u0161ek (eds) Identifying Expressions of Emotion in Text. Springer: Berlin, Editors"},{"key":"913_CR16","volume-title":"The English tone of voice: essays in intonation, prosody and paralanguage. The English tone of voice: essays in intonation, prosody and paralanguage","author":"D Crystal","year":"1975","unstructured":"Crystal D (1975) The English tone of voice: essays in intonation, prosody and paralanguage. The English tone of voice: essays in intonation, prosody and paralanguage. Edward Arnold, London"},{"key":"913_CR17","doi-asserted-by":"crossref","unstructured":"Siqueira H et al (2018) Disambiguating Affective Stimulus Associations for Robot Perception and Dialogue, in IEEE-RAS 18th International Conference on Humanoid Robots (Humanoids). IEEE: Beijing, China","DOI":"10.1109\/HUMANOIDS.2018.8625012"},{"key":"913_CR18","doi-asserted-by":"crossref","unstructured":"Provost E, Shangguan Y, Busso C (2015) Umeme: University of Michigan emotional McGurk effect data set. 
IEEE Transactions on Affective Computing, 6(4): p.\u00a0395\u2013409","DOI":"10.1109\/TAFFC.2015.2407898"},{"key":"913_CR19","doi-asserted-by":"publisher","first-page":"66","DOI":"10.1016\/j.actpsy.2018.04.013","volume":"187","author":"L Aguado","year":"2018","unstructured":"Aguado L et al (2018) Effects of affective and emotional congruency on facial expression processing under different task demands. Acta Psychol 187:66\u201376","journal-title":"Acta Psychol"},{"key":"913_CR20","doi-asserted-by":"crossref","unstructured":"Paradeda R et al (2018) Would You Follow the Suggestions of a Storyteller Robot?, in 11th International Conference on Interactive Digital Storytelling. : Dublin, Ireland","DOI":"10.1007\/978-3-030-04028-4_57"},{"key":"913_CR21","volume-title":"Adaptive emotional chatting behavior to increase the sociability of robots, in Social Robotics (ICSR), A. Kheddar, Editor","author":"I Rodriguez","year":"2017","unstructured":"Rodriguez I et al (2017) Adaptive emotional chatting behavior to increase the sociability of robots, in Social Robotics (ICSR), A. Kheddar, Editor. Springer, Cham"},{"issue":"7","key":"913_CR22","first-page":"1","volume":"6","author":"S Anderson","year":"2008","unstructured":"Anderson S et al (2008) Asking the right questions: Scoping studies in the commissioning of research on the organisation and delivery of health services. Health Res Policy Syst 6(7):1\u201312","journal-title":"Health Res Policy Syst"},{"issue":"69","key":"913_CR23","first-page":"1","volume":"5","author":"D Levac","year":"2010","unstructured":"Levac D, Colquhoun H, O\u2019Brien KK (2010) Scoping studies: Advancing the methodology. Implement Sci 5(69):1\u20139","journal-title":"Implement Sci"},{"key":"913_CR24","doi-asserted-by":"crossref","unstructured":"Arksey H, O\u2019Malley L, Scoping studies: Towards a methodological framework.International Journal of Social Research Methodology 2005. 
8(1): p.19\u201332","DOI":"10.1080\/1364557032000119616"},{"issue":"2","key":"913_CR25","doi-asserted-by":"publisher","first-page":"183","DOI":"10.1016\/j.im.2014.08.008","volume":"52","author":"G Par\u00e9","year":"2015","unstructured":"Par\u00e9 G et al (2015) Synthesizing information systems knowledge: A typology of literature reviews. Inform Manage 52(2):183\u2013199","journal-title":"Inform Manage"},{"issue":"7","key":"913_CR26","doi-asserted-by":"publisher","first-page":"467","DOI":"10.7326\/M18-0850","volume":"169","author":"A Tricco","year":"2018","unstructured":"Tricco A et al (2018) PRISMA extension for scoping reviews (PRISMA-ScR): checklist and explanation. Ann Intern Med 169(7):467\u2013473","journal-title":"Ann Intern Med"},{"issue":"7","key":"913_CR27","doi-asserted-by":"publisher","first-page":"e1000097","DOI":"10.1371\/journal.pmed.1000097","volume":"6","author":"D Moher","year":"2009","unstructured":"Moher D et al (2009) Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med 6(7):e1000097","journal-title":"PLoS Med"},{"issue":"1","key":"913_CR28","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1145\/2133366.2133372","volume":"2","author":"F Eyben","year":"2012","unstructured":"Eyben F, W\u00f6llmer M, Schuller B (2012) A multitask approach to continuous five-dimensional affect sensing in natural speech. ACM Trans Interact Intell Syst 2(1):1\u201329","journal-title":"ACM Trans Interact Intell Syst"},{"issue":"1","key":"913_CR29","doi-asserted-by":"publisher","first-page":"224","DOI":"10.20965\/jrm.2020.p0224","volume":"32","author":"W-F Hsieh","year":"2020","unstructured":"Hsieh W-F, Sato-Shimokawara E, Yamaguchi T (2020) Investigation of Robot Expression Style in Human-Robot Interaction. 
J Robot Mechatron 32(1):224\u2013235","journal-title":"J Robot Mechatron"},{"key":"913_CR30","doi-asserted-by":"crossref","unstructured":"Valenti A et al (2020) Emotion expression in a socially assistive robot for persons with Parkinson\u2019s disease, in Proceedings of the 13th ACM International Conference on PErvasive Technologies Related to Assistive Environments. p.\u00a01\u201310","DOI":"10.1145\/3389189.3389190"},{"key":"913_CR31","doi-asserted-by":"crossref","unstructured":"Tsiourti C et al (2017) Designing Emotionally Expressive Robots, in Proceedings of the 5th International Conference on Human Agent Interaction. p.\u00a0213\u2013222","DOI":"10.1145\/3125739.3125744"},{"key":"913_CR32","doi-asserted-by":"crossref","unstructured":"\u0218erban O et al (2017) Interactive narration with a child: impact of prosody and facial expressions, in Proceedings of the 19th ACM International Conference on Multimodal Interaction. p.\u00a023\u201331","DOI":"10.1145\/3136755.3136797"},{"key":"913_CR33","doi-asserted-by":"crossref","unstructured":"Crumpton J, Bethel C (2014) Conveying Emotion in Robotic Speech: Lessons Learned, in The 23rd IEEE International Symposium on Robot and Human Interactive Communication. IEEE: Edinburgh, Scotland, UK","DOI":"10.1109\/ROMAN.2014.6926265"},{"key":"913_CR34","doi-asserted-by":"crossref","unstructured":"Juszkiewicz L (2014) Improving Speech Emotion Recognition System for a Social Robot with Speaker Recognition, in 19th International Conference on Methods and Models in Automation and Robotics. IEEE: Miedzyzdroje, Poland","DOI":"10.1109\/MMAR.2014.6957480"},{"key":"913_CR35","doi-asserted-by":"crossref","unstructured":"Aly A, Tapus A (2015) Multimodal Adapted Robot Behavior Synthesis within a Narrative Human-Robot Interaction, in International Conference on Intelligent Robots and Systems. 
IEEE\/RSJ: Hamburg, Germany","DOI":"10.1109\/IROS.2015.7353789"},{"key":"913_CR36","doi-asserted-by":"crossref","unstructured":"Rabiei M, Gasparetto A (2016) System and method for recognizing human emotion state based on analysis of speech and facial feature extraction; Applications to Human-Robot Interaction, in International Conference on Robotics and Mechatronics. IEEE: Tehran, Iran","DOI":"10.1109\/ICRoM.2016.7886857"},{"key":"913_CR37","doi-asserted-by":"crossref","unstructured":"Yamamoto K et al (2018) Analysis of Emotional Expression by Visualization of the Human and Synthesized Speech Signal Sets, in 2018 International Workshop on Advanced Image Technology (IWAIT). IEEE: Chiang Mai, Thailand","DOI":"10.1109\/IWAIT.2018.8369809"},{"issue":"20","key":"913_CR38","doi-asserted-by":"publisher","first-page":"1030","DOI":"10.1080\/01691864.2019.1667872","volume":"33","author":"Y Li","year":"2019","unstructured":"Li Y et al (2019) Expressing reactive emotion based on multimodal emotion recognition for natural conversation in human\u2013robot interaction. Adv Robot 33(20):1030\u20131041","journal-title":"Adv Robot"},{"key":"913_CR39","doi-asserted-by":"crossref","unstructured":"Li Y, Ishi C, Ward N (2017) Emotion Recognition by Combining Prosody and Sentiment Analysis for Expressing Reactive Emotion by Humanoid Robot, in APSIPA Annual Summit and Conference. APSIPA: Malaysia","DOI":"10.1109\/APSIPA.2017.8282243"},{"key":"913_CR40","unstructured":"SoftBank Robotics. NAO. Available from: https:\/\/www.softbankrobotics.com\/emea\/en\/nao"},{"key":"913_CR41","unstructured":"TU Wien. HOBBIT - THE MUTUAL CARE ROBOT. n.d.; Available from: http:\/\/hobbit.acin.tuwien.ac.at"},{"key":"913_CR42","unstructured":"Guizzo E (2014) How Aldebaran Robotics Built its Friendly Humanoid Robot, Pepper. Available from: https:\/\/spectrum.ieee.org\/robotics\/home-robots\/how-aldebaran-robotics-built-its-friendly-humanoid-robot-pepper"},{"key":"913_CR43","unstructured":"SoftBank Robotics. 
Pepper. n.d.; Available from: https:\/\/www.softbankrobotics.com\/emea\/en\/pepper"},{"key":"913_CR44","doi-asserted-by":"crossref","unstructured":"L\u00f6ffler D, Schmidt N, Tscharn R(2018) Multimodal Expression of Artificial Emotion in Social Robots Using Color, Motion and Sound, in ACM\/IEEE International Conference on Human-Robot Interaction.","DOI":"10.1145\/3171221.3171261"},{"key":"913_CR45","doi-asserted-by":"publisher","first-page":"84","DOI":"10.1016\/j.actpsy.2016.06.012","volume":"170","author":"T Sutton","year":"2016","unstructured":"Sutton T, Altarriba J (2016) Finding the positive in all of the negative: Facilitation for color-related emotion words in a negative priming paradigm. Acta Psychol 170:84\u201393","journal-title":"Acta Psychol"},{"key":"913_CR46","doi-asserted-by":"crossref","unstructured":"Fugate J, Franco C (2019) What Color Is Your Anger? Assessing Color-Emotion Pairings in English Speakers. Front. Psychol","DOI":"10.3389\/fpsyg.2019.00206"},{"key":"913_CR47","doi-asserted-by":"crossref","unstructured":"Gasteiger N et al (2021) Older Adults\u2019 Experiences and Perceptions of Living with Bomy, an Assistive Dailycare Robot: A Qualitative Study. Assistive Technology","DOI":"10.1080\/10400435.2021.1877210"},{"issue":"7","key":"913_CR48","doi-asserted-by":"publisher","first-page":"416","DOI":"10.12788\/jhm.3248","volume":"14","author":"H Sucharew","year":"2019","unstructured":"Sucharew H, Macaluso M, Methods for Research Evidence Synthesis (2019) Scoping Rev Approach J Hosp Med 14(7):416\u2013418","journal-title":"Scoping Rev Approach J Hosp Med"},{"issue":"16","key":"913_CR49","doi-asserted-by":"publisher","first-page":"941","DOI":"10.2147\/CIA.S282709","volume":"2021","author":"N Gasteiger","year":"2021","unstructured":"Gasteiger N et al (2021) Friends from the Future: A Scoping Review of Research into Robots and Computer Agents to Combat Loneliness in Older People. 
Clin Interv Aging 2021(16):941\u2013971","journal-title":"Clin Interv Aging"},{"key":"913_CR50","doi-asserted-by":"publisher","first-page":"e000371","DOI":"10.1136\/bmjpo-2018-000371","volume":"3","author":"J Dawe","year":"2019","unstructured":"Dawe J et al (2019) Can social robots help children in healthcare contexts? A scoping review. BMJ Paediatrics Open 3:e000371","journal-title":"BMJ Paediatrics Open"}],"container-title":["International Journal of Social Robotics"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12369-022-00913-x.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s12369-022-00913-x\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s12369-022-00913-x.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,10,6]],"date-time":"2024-10-06T07:23:29Z","timestamp":1728199409000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s12369-022-00913-x"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,10,20]]},"references-count":50,"journal-issue":{"issue":"4","published-print":{"date-parts":[[2024,4]]}},"alternative-id":["913"],"URL":"https:\/\/doi.org\/10.1007\/s12369-022-00913-x","relation":{},"ISSN":["1875-4791","1875-4805"],"issn-type":[{"value":"1875-4791","type":"print"},{"value":"1875-4805","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,10,20]]},"assertion":[{"value":"1 August 2021","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"2 August 2022","order":2,"name":"revised","label":"Revised","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"3 
August 2022","order":3,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"20 October 2022","order":4,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors have no conflicts of interest to declare that are relevant to the content of this article.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflicts of Interest\/Competing Interests"}},{"value":"Not applicable.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethics Approval"}},{"value":"Not applicable.","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent to Participate"}},{"value":"Not applicable.","order":5,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent for Publication"}}]}}