{
  "status": "ok",
  "message-type": "work",
  "message-version": "1.0.0",
  "message": {
    "indexed": {"date-parts":[[2026,1,10]],"date-time":"2026-01-10T21:28:10Z","timestamp":1768080490428,"version":"3.49.0"},
    "reference-count": 39,
    "publisher": "MIT Press - Journals",
    "issue": "1",
    "content-domain": {"domain":["direct.mit.edu"],"crossmark-restriction":true},
    "short-container-title": [],
    "published-print": {"date-parts":[[2016,1,1]]},
    "abstract": "<jats:title>Abstract<\/jats:title>\n               <jats:p>Auditory speech perception can be altered by concurrent visual information. The superior temporal cortex is an important combining site for this integration process. This area was previously found to be sensitive to audiovisual congruency. However, the direction of this congruency effect (i.e., stronger or weaker activity for congruent compared to incongruent stimulation) has been more equivocal. Here, we used fMRI to look at the neural responses of human participants during the McGurk illusion\u2014in which auditory \/aba\/ and visual \/aga\/ inputs are fused to perceived \/ada\/\u2014in a large homogenous sample of participants who consistently experienced this illusion. This enabled us to compare the neuronal responses during congruent audiovisual stimulation with incongruent audiovisual stimulation leading to the McGurk illusion while avoiding the possible confounding factor of sensory surprise that can occur when McGurk stimuli are only occasionally perceived. We found larger activity for congruent audiovisual stimuli than for incongruent (McGurk) stimuli in bilateral superior temporal cortex, extending into the primary auditory cortex. This finding suggests that superior temporal cortex prefers when auditory and visual input support the same representation.<\/jats:p>",
    "DOI": "10.1162\/jocn_a_00874",
    "type": "journal-article",
    "created": {"date-parts":[[2015,9,9]],"date-time":"2015-09-09T18:10:56Z","timestamp":1441822256000},
    "page": "1-7",
    "update-policy": "https:\/\/doi.org\/10.1162\/mitpressjournals.corrections.policy",
    "source": "Crossref",
    "is-referenced-by-count": 9,
    "title": ["Preference for Audiovisual Speech Congruency in Superior Temporal Cortex"],
    "prefix": "10.1162",
    "volume": "28",
    "author": [
      {"given":"Claudia S.","family":"L\u00fcttke","sequence":"first","affiliation":[]},
      {"given":"Matthias","family":"Ekman","sequence":"additional","affiliation":[]},
      {"given":"Marcel A. J.","family":"van Gerven","sequence":"additional","affiliation":[]},
      {"given":"Floris P.","family":"de Lange","sequence":"additional","affiliation":[]}
    ],
    "member": "281",
    "published-online": {"date-parts":[[2016,1,1]]},
    "reference": [
      {"key":"2021073020565810800_R1","doi-asserted-by":"crossref","first-page":"649","DOI":"10.1016\/j.neunet.2009.12.007","article-title":"Of bits and wows: A Bayesian theory of surprise with applications to attention","volume":"23","author":"Baldi","year":"2010","journal-title":"Neural Networks"},
      {"key":"2021073020565810800_R2","doi-asserted-by":"crossref","first-page":"377","DOI":"10.1162\/0898929053279586","article-title":"Integration of visual and auditory information by superior temporal sulcus neurons responsive to the sight of actions","volume":"17","author":"Barraclough","year":"2005","journal-title":"Journal of Cognitive Neuroscience"},
      {"key":"2021073020565810800_R3","doi-asserted-by":"crossref","first-page":"1825","DOI":"10.1016\/j.neuroimage.2012.05.034","article-title":"Multisensory speech perception without the left superior temporal sulcus","volume":"62","author":"Baum","year":"2012","journal-title":"Neuroimage"},
      {"key":"2021073020565810800_R4","doi-asserted-by":"crossref","first-page":"1190","DOI":"10.1038\/nn1333","article-title":"Unraveling multisensory integration: Patchy organization within human STS multisensory cortex","volume":"7","author":"Beauchamp","year":"2004","journal-title":"Nature Neuroscience"},
      {"key":"2021073020565810800_R5","doi-asserted-by":"crossref","first-page":"2414","DOI":"10.1523\/JNEUROSCI.4865-09.2010","article-title":"fMRI-guided transcranial magnetic stimulation reveals that the superior temporal sulcus is a cortical locus of the McGurk effect","volume":"30","author":"Beauchamp","year":"2010","journal-title":"Journal of Neuroscience"},
      {"key":"2021073020565810800_R6","doi-asserted-by":"crossref","first-page":"3750","DOI":"10.1152\/jn.00500.2003","article-title":"Functional connections between auditory cortex on Heschl's gyrus and on the lateral superior temporal gyrus in humans","volume":"90","author":"Brugge","year":"2003","journal-title":"Journal of Neurophysiology"},
      {"key":"2021073020565810800_R7","doi-asserted-by":"crossref","first-page":"2619","DOI":"10.1097\/00001756-199908200-00033","article-title":"Response amplification in sensory-specific cortices during crossmodal binding","volume":"10","author":"Calvert","year":"1999","journal-title":"NeuroReport"},
      {"key":"2021073020565810800_R8","doi-asserted-by":"crossref","first-page":"649","DOI":"10.1016\/S0960-9822(00)00513-3","article-title":"Evidence from functional magnetic resonance imaging of crossmodal binding in the human heteromodal cortex","volume":"10","author":"Calvert","year":"2000","journal-title":"Current Biology"},
      {"key":"2021073020565810800_R9","doi-asserted-by":"crossref","first-page":"e1000436","DOI":"10.1371\/journal.pcbi.1000436","article-title":"The natural statistics of audiovisual speech","volume":"5","author":"Chandrasekaran","year":"2009","journal-title":"PLoS Computational Biology"},
      {"key":"2021073020565810800_R10","first-page":"10","article-title":"Modulation of visual responses in the superior temporal sulcus by audio-visual congruency","volume":"4","author":"Dahl","year":"2010","journal-title":"Frontiers in Integrative Neuroscience"},
      {"key":"2021073020565810800_R11","doi-asserted-by":"crossref","first-page":"1175","DOI":"10.1093\/cercor\/bhn161","article-title":"A dual role for prediction error in associative learning","volume":"19","author":"Den Ouden","year":"2009","journal-title":"Cerebral Cortex"},
      {"key":"2021073020565810800_R12","doi-asserted-by":"crossref","first-page":"511","DOI":"10.1016\/j.neuroimage.2007.03.060","article-title":"Assignment of functional activations to probabilistic cytoarchitectonic areas revisited","volume":"36","author":"Eickhoff","year":"2007","journal-title":"Neuroimage"},
      {"key":"2021073020565810800_R13","doi-asserted-by":"crossref","first-page":"263","DOI":"10.1016\/j.brainresbull.2006.06.012","article-title":"Independent component model of the default-mode brain function: Assessing the impact of active thinking","volume":"70","author":"Esposito","year":"2006","journal-title":"Brain Research Bulletin"},
      {"key":"2021073020565810800_R14","doi-asserted-by":"crossref","first-page":"1197","DOI":"10.1121\/1.1288668","article-title":"The use of visible speech cues for improving auditory detection","volume":"108","author":"Grant","year":"2000","journal-title":"Journal of the Acoustical Society of America"},
      {"key":"2021073020565810800_R15","doi-asserted-by":"crossref","first-page":"1544","DOI":"10.1093\/cercor\/bht347","article-title":"Orthographic dependency in the neural correlates of reading: Evidence from audiovisual integration in English readers","volume":"25","author":"Holloway","year":"2015","journal-title":"Cerebral Cortex"},
      {"key":"2021073020565810800_R16","doi-asserted-by":"crossref","first-page":"79","DOI":"10.1002\/(SICI)1096-9861(20000103)416:1<79::AID-CNE6>3.0.CO;2-2","article-title":"Auditory cortex on the human posterior superior temporal gyrus","volume":"416","author":"Howard","year":"2000","journal-title":"Journal of Comparative Neurology"},
      {"key":"2021073020565810800_R17","doi-asserted-by":"crossref","first-page":"125","DOI":"10.1016\/S0304-3940(99)00288-8","article-title":"Attention modulates activity in the primary and the secondary auditory cortex\u202f: A functional magnetic resonance imaging study in human subjects","volume":"266","author":"Jaencke","year":"1999","journal-title":"Neuroscience Letters"},
      {"key":"2021073020565810800_R18","doi-asserted-by":"crossref","first-page":"351","DOI":"10.1007\/s12021-012-9152-3","article-title":"Development of PowerMap: A software package for statistical power calculation in neuroimaging studies","volume":"10","author":"Joyce","year":"2012","journal-title":"Neuroinformatics"},
      {"key":"2021073020565810800_R19","doi-asserted-by":"crossref","first-page":"19","DOI":"10.1016\/j.cub.2009.10.068","article-title":"Visual enhancement of the information representation in auditory cortex","volume":"20","author":"Kayser","year":"2010","journal-title":"Current Biology"},
      {"key":"2021073020565810800_R20","doi-asserted-by":"crossref","first-page":"R309","DOI":"10.1016\/j.cub.2014.02.007","article-title":"Temporal prediction errors in visual and auditory cortices","volume":"24","author":"Lee","year":"2014","journal-title":"Current Biology"},
      {"key":"2021073020565810800_R21","doi-asserted-by":"crossref","first-page":"249","DOI":"10.1002\/hbm.10082","article-title":"Human prefrontal and sensory cortical activity during divided attention tasks","volume":"18","author":"Loose","year":"2003","journal-title":"Human Brain Mapping"},
      {"key":"2021073020565810800_R22","doi-asserted-by":"crossref","first-page":"725","DOI":"10.1016\/j.neuroimage.2003.09.049","article-title":"Spatial and temporal factors during processing of audiovisual speech: A PET study","volume":"21","author":"Macaluso","year":"2004","journal-title":"Neuroimage"},
      {"key":"2021073020565810800_R23","doi-asserted-by":"crossref","first-page":"746","DOI":"10.1038\/264746a0","article-title":"Hearing lips and seeing voices","volume":"264","author":"McGurk","year":"1976","journal-title":"Nature"},
      {"key":"2021073020565810800_R24","doi-asserted-by":"crossref","first-page":"781","DOI":"10.1016\/j.neuroimage.2011.07.024","article-title":"A neural basis for interindividual differences in the McGurk effect, a multisensory speech illusion","volume":"59","author":"Nath","year":"2012","journal-title":"Neuroimage"},
      {"key":"2021073020565810800_R25","doi-asserted-by":"crossref","first-page":"598","DOI":"10.1093\/cercor\/bhm091","article-title":"The effect of prior visual information on recognition of speech and sounds","volume":"18","author":"Noppeney","year":"2008","journal-title":"Cerebral Cortex"},
      {"key":"2021073020565810800_R26","doi-asserted-by":"crossref","first-page":"e68959","DOI":"10.1371\/journal.pone.0068959","article-title":"An fMRI study of audiovisual speech perception reveals multisensory interactions in auditory cortex","volume":"8","author":"Okada","year":"2013","journal-title":"PLoS One"},
      {"key":"2021073020565810800_R27","doi-asserted-by":"crossref","first-page":"797","DOI":"10.1002\/hbm.21254","article-title":"Neural correlates of audio-visual object recognition: Effects of implicit spatial congruency","volume":"33","author":"Plank","year":"2012","journal-title":"Human Brain Mapping"},
      {"key":"2021073020565810800_R28","doi-asserted-by":"crossref","first-page":"669","DOI":"10.1006\/nimg.2000.0714","article-title":"Probabilistic mapping and volume measurement of human primary auditory cortex","volume":"13","author":"Rademacher","year":"2001","journal-title":"Neuroimage"},
      {"key":"2021073020565810800_R29","doi-asserted-by":"crossref","first-page":"676","DOI":"10.1073\/pnas.98.2.676","article-title":"A default mode of brain function","volume":"98","author":"Raichle","year":"2001","journal-title":"Proceedings of the National Academy of Sciences, U.S.A."},
      {"key":"2021073020565810800_R38","doi-asserted-by":"crossref","first-page":"1147","DOI":"10.1093\/cercor\/bhl024","article-title":"Do you see what I am saying? Exploring visual enhancement of speech comprehension in noisy environments","volume":"17","author":"Ross","year":"2007","journal-title":"Cerebral Cortex"},
      {"key":"2021073020565810800_R30","doi-asserted-by":"crossref","first-page":"e1003743","DOI":"10.1371\/journal.pcbi.1003743","article-title":"No, there is no 150 ms lead of visual speech on auditory speech, but a range of audiovisual asynchronies varying from small audio lead to large audio lag","volume":"10","author":"Schwartz","year":"2014","journal-title":"PLoS Computational Biology"},
      {"key":"2021073020565810800_R39","doi-asserted-by":"crossref","first-page":"17977","DOI":"10.1121\/1.401660","article-title":"McGurk effect in non-English listeners: few visual effects for Japanese subjects hearing Japanese syllables of high auditory intelligibility","volume":"90","author":"Sekiyama","year":"1991","journal-title":"Journal of the Acoustical Society of America"},
      {"key":"2021073020565810800_R31","doi-asserted-by":"crossref","first-page":"2387","DOI":"10.1093\/cercor\/bhl147","article-title":"Hearing lips and seeing voices: How cortical areas supporting speech production mediate audiovisual speech perception","volume":"17","author":"Skipper","year":"2007","journal-title":"Cerebral Cortex"},
      {"key":"2021073020565810800_R32","doi-asserted-by":"crossref","first-page":"3308","DOI":"10.1016\/j.neuroimage.2009.12.001","article-title":"Neural processing of asynchronous audiovisual speech perception","volume":"49","author":"Stevenson","year":"2010","journal-title":"Neuroimage"},
      {"key":"2021073020565810800_R33","doi-asserted-by":"crossref","first-page":"1339","DOI":"10.1016\/j.neuroimage.2010.12.063","article-title":"Discrete neural substrates underlie complementary audiovisual speech integration processes","volume":"55","author":"Stevenson","year":"2011","journal-title":"Neuroimage"},
      {"key":"2021073020565810800_R34","doi-asserted-by":"crossref","first-page":"95","DOI":"10.3389\/fnhum.2012.00095","article-title":"Examining the McGurk illusion using high-field 7 Tesla functional MRI","volume":"6","author":"Szycik","year":"2012","journal-title":"Frontiers in Human Neuroscience"},
      {"key":"2021073020565810800_R35","doi-asserted-by":"crossref","first-page":"11","DOI":"10.1186\/1471-2202-11-11","article-title":"fMR-adaptation indicates selectivity to audiovisual content congruency in distributed clusters in human superior temporal cortex","volume":"11","author":"Van Atteveldt","year":"2010","journal-title":"BMC Neuroscience"},
      {"key":"2021073020565810800_R36","doi-asserted-by":"crossref","first-page":"271","DOI":"10.1016\/j.neuron.2004.06.025","article-title":"Integration of letters and speech sounds in the human brain","volume":"43","author":"Van Atteveldt","year":"2004","journal-title":"Neuron"},
      {"key":"2021073020565810800_R37","doi-asserted-by":"crossref","first-page":"598","DOI":"10.1016\/j.neuropsychologia.2006.01.001","article-title":"Temporal window of integration in auditory-visual speech perception","volume":"45","author":"Van Wassenhove","year":"2007","journal-title":"Neuropsychologia"}
    ],
    "container-title": ["Journal of Cognitive Neuroscience"],
    "original-title": [],
    "language": "en",
    "link": [
      {"URL":"http:\/\/direct.mit.edu\/jocn\/article-pdf\/28\/1\/1\/1950138\/jocn_a_00874.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},
      {"URL":"http:\/\/direct.mit.edu\/jocn\/article-pdf\/28\/1\/1\/1950138\/jocn_a_00874.pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}
    ],
    "deposited": {"date-parts":[[2021,7,31]],"date-time":"2021-07-31T00:47:29Z","timestamp":1627692449000},
    "score": 1,
    "resource": {"primary":{"URL":"https:\/\/direct.mit.edu\/jocn\/article\/28\/1\/1\/28418\/Preference-for-Audiovisual-Speech-Congruency-in"}},
    "subtitle": [],
    "short-title": [],
    "issued": {"date-parts":[[2016,1,1]]},
    "references-count": 39,
    "journal-issue": {"issue":"1","published-online":{"date-parts":[[2016,1,1]]},"published-print":{"date-parts":[[2016,1,1]]}},
    "URL": "https:\/\/doi.org\/10.1162\/jocn_a_00874",
    "relation": {},
    "ISSN": ["0898-929X","1530-8898"],
    "issn-type": [{"value":"0898-929X","type":"print"},{"value":"1530-8898","type":"electronic"}],
    "subject": [],
    "published-other": {"date-parts":[[2016,1]]},
    "published": {"date-parts":[[2016,1,1]]}
  }
}