{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,11,11]],"date-time":"2025-11-11T22:35:05Z","timestamp":1762900505798,"version":"3.37.3"},"reference-count":122,"publisher":"MIT Press","issue":"11","funder":[{"name":"The MARCS Institute of Brain, Behaviour and Development","award":["Doctoral Scholarship"],"award-info":[{"award-number":["Doctoral Scholarship"]}]},{"name":"HEARing Cooperative Research Centre","award":["Doctoral Scholarship"],"award-info":[{"award-number":["Doctoral Scholarship"]}]},{"DOI":"10.13039\/501100003086","name":"Basque Government","doi-asserted-by":"crossref","award":["BERC 2018-2021","PIBA PI-2019-0054"],"award-info":[{"award-number":["BERC 2018-2021","PIBA PI-2019-0054"]}],"id":[{"id":"10.13039\/501100003086","id-type":"DOI","asserted-by":"crossref"}]},{"DOI":"10.13039\/501100004837","name":"Ministerio de Ciencia e Innovaci\u00f3n","doi-asserted-by":"publisher","award":["RYC2018-024284-I"],"award-info":[{"award-number":["RYC2018-024284-I"]}],"id":[{"id":"10.13039\/501100004837","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001602","name":"Science Foundation Ireland","doi-asserted-by":"publisher","award":["13\/RC\/2106_P2"],"award-info":[{"award-number":["13\/RC\/2106_P2"]}],"id":[{"id":"10.13039\/501100001602","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["direct.mit.edu"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2023,11,1]]},"abstract":"<jats:title>Abstract<\/jats:title>\n               <jats:p>In face-to-face conversations, listeners gather visual speech information from a speaker's talking face that enhances their perception of the incoming auditory speech signal. 
This auditory\u2013visual (AV) speech benefit is evident even in quiet environments but is stronger in situations that require greater listening effort such as when the speech signal itself deviates from listeners' expectations. One example is infant-directed speech (IDS) presented to adults. IDS has exaggerated acoustic properties that are easily discriminable from adult-directed speech (ADS). Although IDS is a speech register that adults typically use with infants, no previous neurophysiological study has directly examined whether adult listeners process IDS differently from ADS. To address this, the current study simultaneously recorded EEG and eye-tracking data from adult participants as they were presented with auditory-only (AO), visual-only, and AV recordings of IDS and ADS. Eye-tracking data were recorded because looking behavior to the speaker's eyes and mouth modulates the extent of AV speech benefit experienced. Analyses of cortical tracking accuracy revealed that cortical tracking of the speech envelope was significant in AO and AV modalities for IDS and ADS. However, the AV speech benefit [i.e., AV &gt; (A + V)] was only present for IDS trials. Gaze behavior analyses indicated differences in looking behavior during IDS and ADS trials. Surprisingly, looking behavior to the speaker's eyes and mouth was not correlated with cortical tracking accuracy. Additional exploratory analyses indicated that attention to the whole display was negatively correlated with cortical tracking accuracy of AO and visual-only trials in IDS. 
Our results underscore the nuances involved in the relationship between neurophysiological AV speech benefit and looking behavior.<\/jats:p>","DOI":"10.1162\/jocn_a_02044","type":"journal-article","created":{"date-parts":[[2023,9,7]],"date-time":"2023-09-07T18:38:56Z","timestamp":1694111936000},"page":"1741-1759","update-policy":"https:\/\/doi.org\/10.1162\/mitpressjournals.corrections.policy","source":"Crossref","is-referenced-by-count":2,"title":["Seeing a Talking Face Matters: Gaze Behavior and the Auditory\u2013Visual Speech Benefit in Adults' Cortical Tracking of Infant-directed Speech"],"prefix":"10.1162","volume":"35","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-6388-9746","authenticated-orcid":true,"given":"Sok Hui Jessica","family":"Tan","sequence":"first","affiliation":[{"name":"The MARCS Institute of Brain, Behaviour and Development, Western Sydney University, Australia"},{"name":"Science of Learning in Education Centre, Office of Education Research, National Institute of Education, Nanyang Technological University, Singapore"}]},{"given":"Marina","family":"Kalashnikova","sequence":"additional","affiliation":[{"name":"The Basque Center on Cognition, Brain and Language"},{"name":"IKERBASQUE, Basque Foundation for Science"}]},{"given":"Giovanni M.","family":"Di Liberto","sequence":"additional","affiliation":[{"name":"ADAPT Centre, School of Computer Science and Statistics, Trinity College Institute of Neuroscience, Trinity College, The University of Dublin, Ireland"}]},{"given":"Michael J.","family":"Crosse","sequence":"additional","affiliation":[{"name":"SEGOTIA, Galway, Ireland"},{"name":"Trinity Center for Biomedical Engineering, Department of Mechanical, Manufacturing & Biomedical Engineering, Trinity College Dublin, Dublin, Ireland"}]},{"given":"Denis","family":"Burnham","sequence":"additional","affiliation":[{"name":"The MARCS Institute of Brain, Behaviour and Development, Western Sydney University, 
Australia"}]}],"member":"281","published-online":{"date-parts":[[2023,11,1]]},"reference":[{"key":"2023101014123366300_bib1","doi-asserted-by":"publisher","first-page":"520","DOI":"10.1037\/a0013552","article-title":"Comprehension of familiar and unfamiliar native accents under adverse listening conditions","volume":"35","author":"Adank","year":"2009","journal-title":"Journal of Experimental Psychology: Human Perception and Performance"},{"key":"2023101014123366300_bib2","doi-asserted-by":"publisher","first-page":"13367","DOI":"10.1073\/pnas.201400998","article-title":"Speech comprehension is correlated with temporal response patterns recorded from auditory cortex","volume":"98","author":"Ahissar","year":"2001","journal-title":"Proceedings of the National Academy of Sciences, U.S.A."},{"key":"2023101014123366300_bib3","doi-asserted-by":"publisher","first-page":"6108","DOI":"10.1523\/JNEUROSCI.2476-21.2022","article-title":"Differential auditory and visual phase-locking are observed during audio-visual benefit and silent lip-reading for speech perception","volume":"42","author":"Aller","year":"2022","journal-title":"Journal of Neuroscience"},{"key":"2023101014123366300_bib4","doi-asserted-by":"publisher","first-page":"1472","DOI":"10.3758\/S13414-016-1109-4","article-title":"High visual resolution matters in audiovisual speech perception, but only for some","volume":"78","author":"Alsius","year":"2016","journal-title":"Attention, Perception, & Psychophysics"},{"key":"2023101014123366300_bib5","doi-asserted-by":"publisher","first-page":"13445","DOI":"10.1523\/JNEUROSCI.3194-09.2009","article-title":"Dual neural routing of visual facilitation in speech processing","volume":"29","author":"Arnal","year":"2009","journal-title":"Journal of Neuroscience"},{"key":"2023101014123366300_bib6","doi-asserted-by":"publisher","first-page":"797","DOI":"10.1038\/nn.2810","article-title":"Transitions in neural oscillations reflect prediction errors generated in audiovisual 
speech","volume":"14","author":"Arnal","year":"2011","journal-title":"Nature Neuroscience"},{"key":"2023101014123366300_bib7","doi-asserted-by":"publisher","first-page":"339","DOI":"10.1348\/000712601162220","article-title":"Bisensory augmentation: A speechreading advantage when speech is clearly audible and intact","volume":"92","author":"Arnold","year":"2001","journal-title":"British Journal of Psychology"},{"key":"2023101014123366300_bib8","doi-asserted-by":"publisher","first-page":"31","DOI":"10.1016\/j.cognition.2013.09.006","article-title":"Degrading phonetic information affects matching of audiovisual speech in adults, but not in infants","volume":"130","author":"Baart","year":"2014","journal-title":"Cognition"},{"key":"2023101014123366300_bib9","doi-asserted-by":"publisher","first-page":"153","DOI":"10.1016\/s0065-2407(02)80041-6","article-title":"Intersensory redundancy guides early perceptual and cognitive development","volume-title":"Advances in child development and behavior","author":"Bahrick","year":"2002"},{"key":"2023101014123366300_bib10","doi-asserted-by":"publisher","first-page":"422","DOI":"10.3389\/fnhum.2015.00422","article-title":"Audiovisual cues benefit recognition of accented speech in noise but not perceptual adaptation","volume":"9","author":"Banks","year":"2015","journal-title":"Frontiers in Human Neuroscience"},{"key":"2023101014123366300_bib11","doi-asserted-by":"publisher","first-page":"34","DOI":"10.3389\/fnins.2013.00034","article-title":"Auditory perceptual learning for speech perception can be enhanced by audiovisual training","volume":"7","author":"Bernstein","year":"2013","journal-title":"Frontiers in Neuroscience"},{"key":"2023101014123366300_bib155","doi-asserted-by":"publisher","first-page":"2225","DOI":"10.1111\/j.1460-9568.2004.03670.x","article-title":"Bimodal speech: Early suppressive visual effects in human auditory cortex","volume":"20","author":"Besle","year":"2004","journal-title":"European Journal of 
Neuroscience"},{"key":"2023101014123366300_bib13","doi-asserted-by":"publisher","first-page":"1314","DOI":"10.1080\/23273798.2020.1762905","article-title":"Highly proficient L2 speakers still need to attend to a talker's mouth when processing L2 speech","volume":"35","author":"Birul\u00e9s","year":"2020","journal-title":"Language, Cognition, & Neuroscience"},{"key":"2023101014123366300_bib14","doi-asserted-by":"publisher","first-page":"33","DOI":"10.1007\/S11257-015-9167-1","article-title":"Automatic gaze-based user-independent detection of mind wandering during computerized reading","volume":"26","author":"Bixler","year":"2016","journal-title":"User Modeling and User-Adapted Interaction"},{"key":"2023101014123366300_bib15","doi-asserted-by":"publisher","first-page":"3135","DOI":"10.1044\/2019_JSLHR-S-18-0392","article-title":"Second language learners' listener impressions of foreigner-directed speech","volume":"62","author":"Bobb","year":"2019","journal-title":"Journal of Speech, Language, and Hearing Research"},{"key":"2023101014123366300_bib16","doi-asserted-by":"publisher","first-page":"e0162177","DOI":"10.1371\/journal.pone.0162177","article-title":"Infant directed speech enhances statistical learning in newborn infants: An ERP study","volume":"11","author":"Bosseler","year":"2016","journal-title":"PLoS One"},{"key":"2023101014123366300_bib17","doi-asserted-by":"publisher","first-page":"1151","DOI":"10.1007\/S10936-015-9396-9","article-title":"The temporal dynamics of spoken word recognition in adverse listening conditions","volume":"45","author":"Brouwer","year":"2016","journal-title":"Journal of Psycholinguistic Research"},{"key":"2023101014123366300_bib18","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1080\/17470910601043644","article-title":"Spatial statistics of gaze fixations during dynamic face processing","volume":"2","author":"Buchan","year":"2007","journal-title":"Social 
Neuroscience"},{"key":"2023101014123366300_bib19","doi-asserted-by":"publisher","first-page":"162","DOI":"10.1016\/j.brainres.2008.06.083","article-title":"The effect of varying talker identity and listening conditions on gaze behaviour during audiovisual speech perception","volume":"1242","author":"Buchan","year":"2008","journal-title":"Brain Research"},{"key":"2023101014123366300_bib20","doi-asserted-by":"publisher","first-page":"1435","DOI":"10.1126\/science.1069587","article-title":"What's new, pussycat? On talking to babies and animals","volume":"296","author":"Burnham","year":"2002","journal-title":"Science"},{"key":"2023101014123366300_bib21","doi-asserted-by":"publisher","first-page":"40","DOI":"10.1016\/j.specom.2022.03.011","article-title":"Seeing lexical tone: Head and face motion in production and perception of Cantonese lexical tones","volume":"141","author":"Burnham","year":"2022","journal-title":"Speech Communication"},{"key":"2023101014123366300_bib22","doi-asserted-by":"publisher","first-page":"2175","DOI":"10.1109\/ICSLP.1996.607235","article-title":"About the relationship between eyebrow movements and Fo variations","volume-title":"Proceeding of the Fourth International Conference on Spoken Language Processing (ICSLP '96)","author":"Cav\u00e9","year":"1996"},{"key":"2023101014123366300_bib23","doi-asserted-by":"publisher","first-page":"e1000436","DOI":"10.1371\/journal.pcbi.1000436","article-title":"The natural statistics of audiovisual speech","volume":"5","author":"Chandrasekaran","year":"2009","journal-title":"PLoS Computational Biology"},{"key":"2023101014123366300_bib24","doi-asserted-by":"publisher","first-page":"211","DOI":"10.1002\/ICD.286","article-title":"Three facial expressions mothers direct to their infants","volume":"12","author":"Chong","year":"2003","journal-title":"Infant and Child 
Development"},{"key":"2023101014123366300_bib25","doi-asserted-by":"publisher","first-page":"181","DOI":"10.1017\/S0140525X12000477","article-title":"Whatever next? Predictive brains, situated agents, and the future of cognitive science","volume":"36","author":"Clark","year":"2013","journal-title":"Behavioural and Brain Sciences"},{"key":"2023101014123366300_bib26","doi-asserted-by":"publisher","first-page":"1584","DOI":"10.1111\/J.1467-8624.1990.tb02885.x","article-title":"Preference for infant-directed speech in the first month after birth","volume":"61","author":"Cooper","year":"1990","journal-title":"Child Development"},{"key":"2023101014123366300_bib27","doi-asserted-by":"publisher","first-page":"14195","DOI":"10.1523\/JNEUROSCI.1829-15.2015","article-title":"Congruent visual speech enhances cortical entrainment to continuous auditory speech in noise-free conditions","volume":"35","author":"Crosse","year":"2015","journal-title":"Journal of Neuroscience"},{"key":"2023101014123366300_bib28","doi-asserted-by":"publisher","first-page":"604","DOI":"10.3389\/fnhum.2016.00604","article-title":"The multivariate temporal response function (mTRF) toolbox: A MATLAB toolbox for relating neural signals to continuous stimuli","volume":"10","author":"Crosse","year":"2016","journal-title":"Frontiers in Human Neuroscience"},{"key":"2023101014123366300_bib29","doi-asserted-by":"publisher","first-page":"9888","DOI":"10.1523\/JNEUROSCI.1396-16.2016","article-title":"Eye can hear clearly now: Inverse effectiveness in natural audiovisual speech processing relies on long-term crossmodal temporal integration","volume":"36","author":"Crosse","year":"2016","journal-title":"Journal of Neuroscience"},{"key":"2023101014123366300_bib30","doi-asserted-by":"publisher","first-page":"705621","DOI":"10.3389\/fnins.2021.705621","article-title":"Linear modeling of neurophysiological responses to speech and other continuous stimuli: Methodological considerations for applied 
research","volume":"15","author":"Crosse","year":"2021","journal-title":"Frontiers in Neuroscience"},{"key":"2023101014123366300_bib31","doi-asserted-by":"publisher","first-page":"555","DOI":"10.1016\/j.specom.2010.02.006","article-title":"Prosody off the top of the head: Prosodic contrasts can be discriminated by head motion","volume":"52","author":"Cvejic","year":"2010","journal-title":"Speech Communication"},{"key":"2023101014123366300_bib32","doi-asserted-by":"publisher","first-page":"3565","DOI":"10.1093\/cercor\/bhab032","article-title":"Pupil dilation and the slow wave ERP reflect surprise about choice outcome resulting from intrinsic variability in decision confidence","volume":"31","author":"de Gee","year":"2021","journal-title":"Cerebral Cortex"},{"key":"2023101014123366300_bib33","doi-asserted-by":"publisher","first-page":"9","DOI":"10.1016\/j.jneumeth.2003.10.009","article-title":"EEGLAB: An open source toolbox for analysis of single-trial EEG dynamics including independent component analysis","volume":"134","author":"Delorme","year":"2004","journal-title":"Journal of Neuroscience Methods"},{"key":"2023101014123366300_bib34","doi-asserted-by":"publisher","first-page":"70","DOI":"10.1016\/j.heares.2017.02.015","article-title":"Indexing cortical entrainment to natural speech at the phonemic level: Methodological considerations for applied research","volume":"348","author":"Di Liberto","year":"2017","journal-title":"Hearing Research"},{"key":"2023101014123366300_bib35","doi-asserted-by":"publisher","first-page":"70","DOI":"10.1016\/j.neuroimage.2018.03.072","article-title":"Atypical cortical entrainment to speech in the right hemisphere underpins phonemic deficits in dyslexia","volume":"175","author":"Di Liberto","year":"2018","journal-title":"Neuroimage"},{"key":"2023101014123366300_bib36","doi-asserted-by":"publisher","first-page":"78","DOI":"10.1152\/jn.00297.2011","article-title":"Neural coding of continuous speech in auditory cortex during monaural 
and dichotic listening","volume":"107","author":"Ding","year":"2012","journal-title":"Journal of Neurophysiology"},{"key":"2023101014123366300_bib37","doi-asserted-by":"publisher","first-page":"11854","DOI":"10.1073\/pnas.1205381109","article-title":"Emergence of neural encoding of auditory objects while listening to competing speakers","volume":"109","author":"Ding","year":"2012","journal-title":"Proceedings of the National Academy of Sciences, U.S.A."},{"key":"2023101014123366300_bib38","doi-asserted-by":"publisher","first-page":"5728","DOI":"10.1523\/JNEUROSCI.5297-12.2013","article-title":"Adaptive temporal encoding leads to a background-insensitive cortical representation of speech","volume":"33","author":"Ding","year":"2013","journal-title":"Journal of Neuroscience"},{"key":"2023101014123366300_bib39","doi-asserted-by":"publisher","first-page":"311","DOI":"10.3389\/fnhum.2014.00311","article-title":"Cortical entrainment to continuous speech: Functional roles and interpretations","volume":"8","author":"Ding","year":"2014","journal-title":"Frontiers in Human Neuroscience"},{"key":"2023101014123366300_bib40","doi-asserted-by":"publisher","first-page":"423","DOI":"10.1044\/jshr.1202.423","article-title":"Interaction of audition and vision in the recognition of oral speech stimuli","volume":"12","author":"Erber","year":"1969","journal-title":"Journal of Speech and Hearing Research"},{"key":"2023101014123366300_bib41","doi-asserted-by":"publisher","first-page":"120","DOI":"10.1016\/j.jecp.2013.03.003","article-title":"The relationship between auditory\u2013visual speech perception and language-specific speech perception at the onset of reading instruction in English-speaking children","volume":"116","author":"Erdener","year":"2013","journal-title":"Journal of Experimental Child Psychology"},{"key":"2023101014123366300_bib42","doi-asserted-by":"publisher","DOI":"10.1121\/1.4784597","article-title":"Intelligibility of foreign-accented speech in noise for younger and 
older adults","volume":"125","author":"Ferguson","year":"2009","journal-title":"Journal of the Acoustical Society of America"},{"key":"2023101014123366300_bib43","doi-asserted-by":"publisher","first-page":"209","DOI":"10.1037\/0012-1649.27.2.209","article-title":"Prosody and focus in speech to infants and adults","volume":"27","author":"Fernald","year":"1991","journal-title":"Developmental Psychology"},{"key":"2023101014123366300_bib44","doi-asserted-by":"publisher","first-page":"477","DOI":"10.1017\/S0305000900010679","article-title":"A cross-language study of prosodic modifications in mothers' and fathers' speech to preverbal infants","volume":"16","author":"Fernald","year":"1989","journal-title":"Journal of Child Language"},{"key":"2023101014123366300_bib45","doi-asserted-by":"publisher","first-page":"33","DOI":"10.1016\/j.neuroimage.2018.10.057","article-title":"Late cortical tracking of ignored speech facilitates neural selectivity in acoustically challenging conditions","volume":"186","author":"Fiedler","year":"2019","journal-title":"Neuroimage"},{"key":"2023101014123366300_bib46","doi-asserted-by":"publisher","first-page":"379","DOI":"10.1007\/S10936-008-9097-8","article-title":"Regional and foreign accent processing in English: Can listeners adapt?","volume":"38","author":"Floccia","year":"2009","journal-title":"Journal of Psycholinguistic Research"},{"key":"2023101014123366300_bib47","doi-asserted-by":"publisher","first-page":"1060","DOI":"10.1162\/jocn_a_00764","article-title":"Cortical representations sensitive to the number of perceived auditory objects emerge between 2 and 4 months of age: Electrophysiological evidence","volume":"27","author":"Folland","year":"2015","journal-title":"Journal of Cognitive Neuroscience"},{"key":"2023101014123366300_bib48","doi-asserted-by":"publisher","first-page":"1207","DOI":"10.1080\/01690965.2012.701758","article-title":"Seeing the initial articulatory gestures of a word triggers lexical 
access","volume":"28","author":"Fort","year":"2013","journal-title":"Language and Cognitive Processes"},{"key":"2023101014123366300_bib49","doi-asserted-by":"publisher","first-page":"51","DOI":"10.1037\/a0030217","article-title":"Mind wandering in sentence reading: Decoupling the link between mind and eye","volume":"67","author":"Foulsham","year":"2013","journal-title":"Canadian Journal of Experimental Psychology"},{"key":"2023101014123366300_bib50","doi-asserted-by":"publisher","first-page":"127","DOI":"10.1038\/nrn2787","article-title":"The free-energy principle: A unified brain theory?","volume":"11","author":"Friston","year":"2010","journal-title":"Nature Reviews Neuroscience"},{"key":"2023101014123366300_bib51","doi-asserted-by":"publisher","first-page":"511","DOI":"10.1038\/nn.3063","article-title":"Cortical oscillations and speech processing: Emerging computational principles and operations","volume":"15","author":"Giraud","year":"2012","journal-title":"Nature Neuroscience"},{"key":"2023101014123366300_bib52","doi-asserted-by":"publisher","first-page":"444","DOI":"10.1121\/1.3397409","article-title":"Recognition of accented English in quiet by younger normal-hearing listeners and older listeners with normal-hearing and hearing loss","volume":"128","author":"Gordon-Salant","year":"2010","journal-title":"Journal of the Acoustical Society of America"},{"key":"2023101014123366300_bib53","doi-asserted-by":"publisher","first-page":"EL200","DOI":"10.1121\/1.3486199","article-title":"Short-term adaptation to accented English by younger and older adults","volume":"128","author":"Gordon-Salant","year":"2010","journal-title":"Journal of the Acoustical Society of America"},{"key":"2023101014123366300_bib54","doi-asserted-by":"publisher","first-page":"33","DOI":"10.1007\/978-3-030-10461-0_3","article-title":"Toward a model of auditory\u2013visual speech intelligibility","volume-title":"Multisensory 
processes","author":"Grant","year":"2019"},{"key":"2023101014123366300_bib156","doi-asserted-by":"publisher","first-page":"1197","DOI":"10.1121\/1288668","article-title":"The use of visible speech cues for improving auditory detection of spoken sentences","volume":"108","author":"Grant","year":"2000","journal-title":"Journal of the Acoustical Society of America"},{"key":"2023101014123366300_bib55","doi-asserted-by":"publisher","first-page":"1529","DOI":"10.1044\/1092-4388(2010\/09-0005)","article-title":"Lip movement exaggerations during infant-directed speech","volume":"53","author":"Green","year":"2010","journal-title":"Journal of Speech, Language, and Hearing Research"},{"key":"2023101014123366300_bib57","doi-asserted-by":"publisher","first-page":"1333","DOI":"10.3758\/S13414-014-0821-1","article-title":"A link between individual differences in multisensory speech perception and eye movements","volume":"77","author":"Gurler","year":"2015","journal-title":"Attention, Perception, & Psychophysics"},{"key":"2023101014123366300_bib58","doi-asserted-by":"publisher","first-page":"96","DOI":"10.1016\/j.neuroimage.2013.10.067","article-title":"On the interpretation of weight vectors of linear models in multivariate neuroimaging","volume":"87","author":"Haufe","year":"2014","journal-title":"Neuroimage"},{"key":"2023101014123366300_bib59","doi-asserted-by":"publisher","first-page":"617","DOI":"10.1016\/j.neuroimage.2018.07.052","article-title":"Cortical tracking of multiple streams outside the focus of attention in naturalistic auditory scenes","volume":"181","author":"Hausfeld","year":"2018","journal-title":"Neuroimage"},{"key":"2023101014123366300_bib60","article-title":"How does foreigner-directed speech differ from other forms of listener-directed clear speaking styles?","volume-title":"18th International Congress of Phonetic 
Sciences","author":"Hazan","year":"2015"},{"key":"2023101014123366300_bib61","doi-asserted-by":"publisher","first-page":"850","DOI":"10.3758\/PBR.16.5.850","article-title":"Searching in the dark: Cognitive relevance drives attention in real-world scenes","volume":"16","author":"Henderson","year":"2009","journal-title":"Psychonomic Bulletin & Review"},{"key":"2023101014123366300_bib62","doi-asserted-by":"publisher","first-page":"20095","DOI":"10.1073\/pnas.1213390109","article-title":"Frequency modulation entrains slow neural oscillations and optimizes human listening behaviour","volume":"109","author":"Henry","year":"2012","journal-title":"Proceedings of the National Academy of Sciences, U.S.A."},{"key":"2023101014123366300_bib63","doi-asserted-by":"publisher","first-page":"25","DOI":"10.1186\/s40101-015-0063-5","article-title":"Analysis of physiological signals for recognition of boredom, pain, and surprise emotions","volume":"34","author":"Jang","year":"2015","journal-title":"Journal of Physiological Anthropology"},{"key":"2023101014123366300_bib64","doi-asserted-by":"publisher","first-page":"1563","DOI":"10.1080\/17470218.2012.658822","article-title":"Predicting foreign-accent adaptation in older adults","volume":"65","author":"Janse","year":"2012","journal-title":"Quarterly Journal of Experimental Psychology"},{"key":"2023101014123366300_bib65","doi-asserted-by":"publisher","first-page":"116060","DOI":"10.1016\/j.neuroimage.2019.116060","article-title":"Quantifying the individual auditory and visual brain response in 7-month-old infants watching a brief cartoon movie","volume":"202","author":"Jessen","year":"2019","journal-title":"Neuroimage"},{"key":"2023101014123366300_bib67","doi-asserted-by":"publisher","first-page":"13745","DOI":"10.1038\/s41598-018-32150-6","article-title":"Infant-directed speech facilitates seven-month-old infants' cortical tracking of speech","volume":"8","author":"Kalashnikova","year":"2018","journal-title":"Scientific 
Reports"},{"key":"2023101014123366300_bib68","doi-asserted-by":"publisher","first-page":"1352","DOI":"10.1121\/1.4892770","article-title":"The influence of visual speech information on the intelligibility of English consonants produced by non-native speakers","volume":"136","author":"Kawase","year":"2014","journal-title":"Journal of the Acoustical Society of America"},{"key":"2023101014123366300_bib70","doi-asserted-by":"publisher","first-page":"85","DOI":"10.1207\/S15327078IN0401_5","article-title":"Pitch and communicative intent in mother's speech: Adjustments for age and sex in the first year","volume":"4","author":"Kitamura","year":"2003","journal-title":"Infancy"},{"key":"2023101014123366300_bib71","doi-asserted-by":"publisher","first-page":"372","DOI":"10.1016\/S0163-6383(02)00086-3","article-title":"Universality and specificity in infant-directed speech: Pitch modifications as a function of infant age and sex in a tonal and non-tonal language","volume":"24","author":"Kitamura","year":"2001","journal-title":"Infant Behaviour and Development"},{"key":"2023101014123366300_bib72","doi-asserted-by":"publisher","first-page":"1669","DOI":"10.21437\/interspeech.2007-29","article-title":"Acoustic and affective comparisons of natural and imaginary infant-, foreigner- and adult-directed speech","volume":"3","author":"Knoll","year":"2007","journal-title":"International Speech Communication Association - 8th Annual Conference of the International Speech Communication Association, Interspeech 2007"},{"key":"2023101014123366300_bib73","doi-asserted-by":"publisher","first-page":"845","DOI":"10.1080\/0144929X.2011.577192","article-title":"Using the internet for speech research: An evaluative study examining affect in speech","volume":"30","author":"Knoll","year":"2011","journal-title":"Behaviour & Information Technology"},{"key":"2023101014123366300_bib74","doi-asserted-by":"publisher","first-page":"110","DOI":"10.1111\/DESC.12098","article-title":"Audio\u2013visual speech 
perception: A developmental ERP investigation","volume":"17","author":"Knowland","year":"2014","journal-title":"Developmental Science"},{"article-title":"U.S. patent application no. 14\/895,440","year":"2014","author":"Kothe","key":"2023101014123366300_bib75"},{"key":"2023101014123366300_bib76","doi-asserted-by":"publisher","first-page":"1111","DOI":"10.1037\/XGE0000411","article-title":"Gaze-based signatures of mind wandering during real-world scene processing","volume":"147","author":"Krasich","year":"2018","journal-title":"Journal of Experimental Psychology"},{"key":"2023101014123366300_bib77","doi-asserted-by":"publisher","first-page":"279","DOI":"10.1016\/j.neuron.2006.12.011","article-title":"Neuronal oscillations and multisensory interaction in primary auditory cortex","volume":"53","author":"Lakatos","year":"2007","journal-title":"Neuron"},{"key":"2023101014123366300_bib78","doi-asserted-by":"publisher","first-page":"110","DOI":"10.1126\/science.1154735","article-title":"Entrainment of neuronal oscillations as a mechanism of attentional selection","volume":"320","author":"Lakatos","year":"2008","journal-title":"Science"},{"key":"2023101014123366300_bib79","doi-asserted-by":"publisher","first-page":"526","DOI":"10.1044\/jslhr.4203.526","article-title":"Attention to facial regions in segmental and prosodic visual speech perception tasks","volume":"42","author":"Lansing","year":"1999","journal-title":"Journal of Speech, Language, and Hearing Research"},{"key":"2023101014123366300_bib80","doi-asserted-by":"publisher","first-page":"1431","DOI":"10.1073\/pnas.1114783109","article-title":"Infants deploy selective attention to the mouth of a talking face when learning speech","volume":"109","author":"Lewkowicz","year":"2012","journal-title":"Proceedings of the National Academy of Sciences, U.S.A."},{"key":"2023101014123366300_bib82","doi-asserted-by":"publisher","first-page":"52","DOI":"10.3389\/fpsyg.2016.00052","article-title":"Differential gaze patterns on eyes 
and mouth during audiovisual speech segmentation","volume":"7","author":"Lusk","year":"2016","journal-title":"Frontiers in Psychology"},{"key":"2023101014123366300_bib83","doi-asserted-by":"publisher","first-page":"173","DOI":"10.2307\/3588329","article-title":"The effects of nonnative accents on listening comprehension: Implications for ESL assessment","volume":"36","author":"Major","year":"2002","journal-title":"TESOL Quarterly"},{"key":"2023101014123366300_bib84","doi-asserted-by":"publisher","first-page":"8530","DOI":"10.1523\/JNEUROSCI.0555-20.2020","article-title":"Crossmodal phase reset and evoked responses provide complementary mechanisms for the influence of visual speech in auditory cortex","volume":"40","author":"M\u00e9gevand","year":"2020","journal-title":"Journal of Neuroscience"},{"key":"2023101014123366300_bib85","doi-asserted-by":"publisher","first-page":"8546","DOI":"10.1523\/JNEUROSCI.4527-14.2015","article-title":"Neuro-oscillatory phase alignment drives speeded multisensory response times: An electro-corticographic investigation","volume":"35","author":"Mercier","year":"2015","journal-title":"Journal of Neuroscience"},{"key":"2023101014123366300_bib86","doi-asserted-by":"publisher","first-page":"359","DOI":"10.3389\/fpsyg.2013.00359","article-title":"Gated audiovisual speech identification in silence vs. 
noise: Effects on time and accuracy","volume":"4","author":"Moradi","year":"2013","journal-title":"Frontiers in Psychology"},{"key":"2023101014123366300_bib87","doi-asserted-by":"publisher","first-page":"531","DOI":"10.3389\/fpsyg.2014.00531","article-title":"Gated auditory speech perception: Effects of listening conditions and cognitive capacity","volume":"5","author":"Moradi","year":"2014","journal-title":"Frontiers in Psychology"},{"key":"2023101014123366300_bib88","doi-asserted-by":"publisher","first-page":"1640","DOI":"10.1037\/dev0000750","article-title":"Selective attention to the mouth of talking faces in monolinguals and bilinguals aged 5 months to 5 years","volume":"55","author":"Morin-Lessard","year":"2019","journal-title":"Developmental Psychology"},{"key":"2023101014123366300_bib89","doi-asserted-by":"publisher","first-page":"133","DOI":"10.1111\/J.0963-7214.2004.01502010.X","article-title":"Visual prosody and speech intelligibility: Head movement improves auditory speech perception","volume":"15","author":"Munhall","year":"2004","journal-title":"Psychological Science"},{"key":"2023101014123366300_bib90","doi-asserted-by":"publisher","first-page":"73","DOI":"10.1111\/J.1467-1770.1995.tb00963.x","article-title":"Foreign accent, comprehensibility, and intelligibility in the speech of second language learners","volume":"45","author":"Munro","year":"1995","journal-title":"Language Learning"},{"key":"2023101014123366300_bib91","doi-asserted-by":"publisher","first-page":"1272","DOI":"10.1121\/1.4944634","article-title":"Speech rate and pitch characteristics of infant-directed speech: Longitudinal and cross-linguistic observations","volume":"139","author":"Narayan","year":"2016","journal-title":"Journal of the Acoustical Society of America"},{"key":"2023101014123366300_bib92","doi-asserted-by":"publisher","first-page":"4","DOI":"10.1007\/S00426-005-0031-5","article-title":"Hearing lips in a second language: Visual articulatory information enables the 
perception of second language sounds","volume":"71","author":"Navarra","year":"2007","journal-title":"Psychological Research"},{"key":"2023101014123366300_bib93","doi-asserted-by":"publisher","first-page":"3282","DOI":"10.1111\/ejn.14425","article-title":"Look at me when I'm talking to you: Selective attention at a multisensory cocktail party can be decoded using stimulus reconstruction and alpha power modulations","volume":"50","author":"O'Sullivan","year":"2019","journal-title":"European Journal of Neuroscience"},{"key":"2023101014123366300_bib94","doi-asserted-by":"publisher","first-page":"15689","DOI":"10.1155\/2011\/156869","article-title":"FieldTrip: Open source software for advanced analysis of MEG, EEG, and invasive electrophysiological data","volume":"2011","author":"Oostenveld","year":"2011","journal-title":"Computational Intelligence and Neuroscience"},{"key":"2023101014123366300_bib95","doi-asserted-by":"publisher","first-page":"381","DOI":"10.1044\/jshr.2803.381","article-title":"Visemes observed by hearing-impaired and normal-hearing adult viewers","volume":"28","author":"Owens","year":"1985","journal-title":"Journal of Speech and Hearing Research"},{"key":"2023101014123366300_bib96","doi-asserted-by":"publisher","first-page":"553","DOI":"10.3758\/BF03194582","article-title":"Gaze behavior in audiovisual speech perception: The influence of ocular fixations on the McGurk effect","volume":"65","author":"Par\u00e9","year":"2003","journal-title":"Perception & Psychophysics"},{"key":"2023101014123366300_bib97","doi-asserted-by":"publisher","first-page":"e14521","DOI":"10.7554\/eLife.14521","article-title":"Lip movements entrain the observers' low-frequency brain oscillations to facilitate speech intelligibility","volume":"5","author":"Park","year":"2016","journal-title":"eLife"},{"key":"2023101014123366300_bib98","doi-asserted-by":"publisher","first-page":"320","DOI":"10.3389\/fpsyg.2012.00320","article-title":"Neural oscillations carry speech rhythm 
through to comprehension","volume":"3","author":"Peelle","year":"2012","journal-title":"Frontiers in Psychology"},{"key":"2023101014123366300_bib99","doi-asserted-by":"publisher","first-page":"34273","DOI":"10.1038\/srep34273","article-title":"Mature neural responses to infant-directed speech but not adult-directed speech in pre-verbal infants","volume":"6","author":"Peter","year":"2016","journal-title":"Scientific Reports"},{"key":"2023101014123366300_bib100","doi-asserted-by":"publisher","first-page":"105","DOI":"10.1016\/j.tics.2006.12.002","article-title":"Do people use language production to make predictions during comprehension?","volume":"11","author":"Pickering","year":"2007","journal-title":"Trends in Cognitive Sciences"},{"key":"2023101014123366300_bib101","doi-asserted-by":"publisher","first-page":"1073","DOI":"10.1044\/1092-4388(2009\/07-0276)","article-title":"Auditory event-related potentials (ERPs) in audiovisual speech perception","volume":"52","author":"Pilling","year":"2009","journal-title":"Journal of Speech Language and Hearing Research"},{"key":"2023101014123366300_bib102","doi-asserted-by":"publisher","first-page":"1558","DOI":"10.3758\/S13414-019-01946-7","article-title":"When processing costs impact predictive processing: The case of foreign-accented speech and accent experience","volume":"82","author":"Porretta","year":"2020","journal-title":"Attention, Perception, & Psychophysics"},{"key":"2023101014123366300_bib103","doi-asserted-by":"publisher","first-page":"1832","DOI":"10.1037\/xlm0000674","article-title":"Influencing the time and space of lexical competition: The effect of gradient foreign accentedness","volume":"45","author":"Porretta","year":"2019","journal-title":"Journal of Experimental Psychology: Learning, Memory, and Cognition"},{"key":"2023101014123366300_bib104","doi-asserted-by":"publisher","first-page":"1","DOI":"10.1016\/j.wocn.2016.05.006","article-title":"The influence of gradient foreign accentedness and listener 
experience on word recognition","volume":"58","author":"Porretta","year":"2016","journal-title":"Journal of Phonetics"},{"key":"2023101014123366300_bib105","doi-asserted-by":"publisher","first-page":"1306","DOI":"10.1109\/JPROC.2003.817150","article-title":"Recent advances in the automatic recognition of audiovisual speech","volume":"91","author":"Potamianos","year":"2003","journal-title":"Proceedings of the IEEE"},{"key":"2023101014123366300_bib106","doi-asserted-by":"publisher","first-page":"193","DOI":"10.1016\/j.cognition.2018.05.015","article-title":"Is infant-directed speech interesting because it is surprising?\u2014Linking properties of IDS to statistical learning and attention at the prosodic level","volume":"178","author":"R\u00e4s\u00e4nen","year":"2018","journal-title":"Cognition"},{"key":"2023101014123366300_bib107","doi-asserted-by":"publisher","first-page":"1159","DOI":"10.1044\/jshr.3906.1159","article-title":"Point-light facial displays enhance comprehension of speech in noise","volume":"39","author":"Rosenblum","year":"1996","journal-title":"Journal of Speech, Language, and Hearing Research"},{"key":"2023101014123366300_bib108","doi-asserted-by":"publisher","first-page":"2329","DOI":"10.1111\/J.1460-9568.2011.07685.X","article-title":"The development of multisensory speech perception continues into the late childhood years","volume":"33","author":"Ross","year":"2011","journal-title":"European Journal of Neuroscience"},{"key":"2023101014123366300_bib109","doi-asserted-by":"publisher","first-page":"1147","DOI":"10.1093\/cercor\/bhl024","article-title":"Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments","volume":"17","author":"Ross","year":"2007","journal-title":"Cerebral Cortex"},{"key":"2023101014123366300_bib110","unstructured":"Ru, P. (2001). Multiscale multirate spectro-temporal auditory model [Unpublished doctoral dissertation]. 
University of Maryland College Park."},{"key":"2023101014123366300_bib111","doi-asserted-by":"publisher","first-page":"329","DOI":"10.1518\/hfes.45.2.329.27237","article-title":"Bimodal displays improve speech comprehension in environments with multiple speakers","volume":"45","author":"Rudmann","year":"2003","journal-title":"Human Factors"},{"key":"2023101014123366300_bib112","doi-asserted-by":"publisher","first-page":"B69","DOI":"10.1016\/j.cognition.2004.01.006","article-title":"Seeing to hear better: Evidence for early audio-visual interactions in speech identification","volume":"93","author":"Schwartz","year":"2004","journal-title":"Cognition"},{"key":"2023101014123366300_bib113","doi-asserted-by":"publisher","first-page":"e1003743","DOI":"10.1371\/journal.pcbi.1003743","article-title":"No, there is no 150 ms lead of visual speech on auditory speech, but a range of audiovisual asynchronies varying from small audio lead to large audio lag","volume":"10","author":"Schwartz","year":"2014","journal-title":"PLoS Computational Biology"},{"key":"2023101014123366300_bib114","doi-asserted-by":"publisher","first-page":"840","DOI":"10.21437\/SpeechProsody.2016-172","article-title":"Identifying visual prosody: Where do people look?","volume-title":"Proceedings of the International Conference on Speech Prosody","author":"Simonetti","year":"2016"},{"key":"2023101014123366300_bib115","doi-asserted-by":"publisher","first-page":"501","DOI":"10.1016\/j.dr.2007.06.002","article-title":"Beyond babytalk: Re-evaluating the nature and content of speech input to preverbal infants","volume":"27","author":"Soderstrom","year":"2007","journal-title":"Developmental Review"},{"key":"2023101014123366300_bib116","doi-asserted-by":"publisher","first-page":"104214","DOI":"10.1016\/j.cognition.2020.104214","article-title":"Infants' expectations about the recipients of infant-directed and adult-directed 
speech","volume":"198","author":"Soley","year":"2020","journal-title":"Cognition"},{"volume-title":"The merging of the senses","year":"1993","author":"Stein","key":"2023101014123366300_bib117"},{"key":"2023101014123366300_bib118","doi-asserted-by":"publisher","first-page":"212","DOI":"10.1121\/1.1907309","article-title":"Visual contribution to speech intelligibility in noise","volume":"26","author":"Sumby","year":"1954","journal-title":"Journal of the Acoustical Society of America"},{"key":"2023101014123366300_bib119","doi-asserted-by":"publisher","first-page":"119217","DOI":"10.1016\/j.neuroimage.2022.119217","article-title":"Seeing a talking face matters: The relationship between cortical tracking of continuous auditory\u2013visual speech and gaze behaviour in infants, children and adults","volume":"256","author":"Tan","year":"2022","journal-title":"Neuroimage"},{"key":"2023101014123366300_bib120","doi-asserted-by":"publisher","first-page":"2","DOI":"10.1016\/j.specom.2006.10.003","article-title":"Do you speak E-NG-L-I-SH? 
A comparison of foreigner- and infant-directed speech","volume":"49","author":"Uther","year":"2007","journal-title":"Speech Communication"},{"key":"2023101014123366300_bib121","doi-asserted-by":"publisher","first-page":"1181","DOI":"10.1073\/pnas.0408949102","article-title":"Visual speech speeds up the neural processing of auditory speech","volume":"102","author":"van Wassenhove","year":"2005","journal-title":"Proceedings of the National Academy of Sciences, U.S.A."},{"key":"2023101014123366300_bib122","doi-asserted-by":"publisher","first-page":"555","DOI":"10.1006\/jpho.2002.0165","article-title":"Linking facial animation, head motion and speech acoustics","volume":"30","author":"Yehia","year":"2002","journal-title":"Journal of Phonetics"},{"key":"2023101014123366300_bib123","doi-asserted-by":"publisher","first-page":"471","DOI":"10.1044\/1092-4388(2012\/10-0288)","article-title":"Gaze patterns and audiovisual speech enhancement","volume":"56","author":"Yi","year":"2013","journal-title":"Journal of Speech, Language, and Hearing Research"},{"key":"2023101014123366300_bib124","doi-asserted-by":"publisher","first-page":"449","DOI":"10.1002\/acp.3632","article-title":"Wandering eyes: Eye movements during mind wandering in video lectures","volume":"34","author":"Zhang","year":"2020","journal-title":"Applied Cognitive Psychology"},{"key":"2023101014123366300_bib125","doi-asserted-by":"publisher","first-page":"1417","DOI":"10.1523\/JNEUROSCI.3675-12.2013","article-title":"Visual input enhances selective speech envelope tracking in auditory cortex at a \u201ccocktail party\u201d","volume":"33","author":"Zion Golumbic","year":"2013","journal-title":"Journal of Neuroscience"}],"container-title":["Journal of Cognitive 
Neuroscience"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/direct.mit.edu\/jocn\/article-pdf\/35\/11\/1741\/2162555\/jocn_a_02044.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},{"URL":"https:\/\/direct.mit.edu\/jocn\/article-pdf\/35\/11\/1741\/2162555\/jocn_a_02044.pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,10,10]],"date-time":"2023-10-10T14:14:04Z","timestamp":1696947244000},"score":1,"resource":{"primary":{"URL":"https:\/\/direct.mit.edu\/jocn\/article\/35\/11\/1741\/117480\/Seeing-a-Talking-Face-Matters-Gaze-Behavior-and"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023]]},"references-count":122,"journal-issue":{"issue":"11","published-online":{"date-parts":[[2023,11,1]]},"published-print":{"date-parts":[[2023,11,1]]}},"URL":"https:\/\/doi.org\/10.1162\/jocn_a_02044","relation":{},"ISSN":["0898-929X","1530-8898"],"issn-type":[{"type":"print","value":"0898-929X"},{"type":"electronic","value":"1530-8898"}],"subject":[],"published-other":{"date-parts":[[2023]]},"published":{"date-parts":[[2023]]}}}