Using EEG and eye-tracking to investigate the prediction of speech in naturalistic virtual environments
Poster A14 in Poster Session A, Tuesday, October 24, 10:15 am - 12:00 pm CEST, Espace Vieux-Port
Also presenting in Lightning Talks A, Tuesday, October 24, 10:00 - 10:15 am CEST, Auditorium
Eleanor Huizeling1, Phillip M. Alday2, David Peeters3, Peter Hagoort1,4; 1Max Planck Institute for Psycholinguistics, 2Beacon Biosignals Inc., 3Tilburg University, 4Donders Institute for Brain, Cognition and Behaviour, Radboud University
EEG and eye-tracking provide complementary information when investigating language comprehension. Evidence that speech processing may be facilitated by prediction comes from the observation that a listener’s eye gaze moves towards a referent before it is mentioned when the remainder of the spoken sentence is predictable. However, changes in the trajectory of anticipatory fixations could result from a change in prediction or from a shift in attention. In contrast, N400 amplitudes and concurrent spectral power provide information about the ease of word processing at the moment the word is perceived. In a proof-of-principle investigation, we combined EEG and eye-tracking to study linguistic prediction in naturalistic, virtual environments. Participants (n=32) listened to sentences spoken by a virtual agent (pre-recorded by a native Dutch speaker) during a tour of eight virtual scenes (e.g., office, street) while their eye movements and EEG were recorded. The spoken stimuli were 128 subject-verb-object sentences that were either predictable or unpredictable: the verb was related either to a single object in the scene (predictable) or to multiple objects (unpredictable). The objects mentioned were either visible in or absent from the scene, confirming or disconfirming participants’ predictions, respectively. Increased processing resources, as reflected in increased theta power, were observed at verb onset when the verb was predictive of the noun, and at noun onset when it was not. Alpha power was higher in response to predictive verbs and to unpredictable nouns. We replicated the greater proportion of anticipatory fixations towards the target object in predictable compared with unpredictable sentences, and the greater N400 response to the noun when referents were absent from rather than visible in the scene. However, predictability had no effect on the N400. Lastly, anticipatory fixations were predictive of spectral power during noun processing, and the proportion of anticipatory fixations could in turn be predicted by theta power at verb onset. In conclusion, the rich visual context that accompanied speech partly altered the EEG findings relative to previous reports, in that the visual context eased the processing of unpredictable nouns. Our findings provide strong evidence that the N400 reflects integration rather than prediction. Overall, we show that combining EEG and eye-tracking offers a promising method for answering novel research questions about the prediction of upcoming linguistic input, for example whether changes in the trajectory of anticipatory fixations reflect a change in prediction or a shift in attention.
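The abstract does not specify the analysis pipeline. As a minimal illustrative sketch, assuming epoched EEG locked to verb onset and a per-trial proportion of anticipatory fixations, the reported link between verb-onset theta power and anticipatory fixations could be examined along the following lines; all channel names, time windows, and variables are hypothetical, and simulated arrays stand in for the real recordings.

# Sketch only, not the authors' pipeline: per-trial theta power at verb onset
# is related to the proportion of anticipatory fixations with a mixed model.
import numpy as np
import pandas as pd
import mne
from mne.time_frequency import tfr_morlet
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Simulated stand-in for verb-onset-locked epochs (hypothetical montage/sfreq)
n_trials, n_channels, sfreq = 128, 32, 500
info = mne.create_info([f"EEG{i:02d}" for i in range(n_channels)], sfreq, "eeg")
data = rng.normal(scale=1e-6, size=(n_trials, n_channels, int(sfreq * 1.5)))
epochs = mne.EpochsArray(data, info, tmin=-0.5, verbose=False)

# Theta power (4-7 Hz) via Morlet wavelets, averaged over channels and the
# 0-0.5 s window after verb onset, yielding one value per trial
freqs = np.arange(4.0, 8.0)
power = tfr_morlet(epochs, freqs=freqs, n_cycles=freqs / 2.0,
                   return_itc=False, average=False, verbose=False)
theta = power.copy().crop(tmin=0.0, tmax=0.5).data.mean(axis=(1, 2, 3))

# Simulated per-trial eye-tracking outcome and condition labels (hypothetical)
df = pd.DataFrame({
    "theta": (theta - theta.mean()) / theta.std(),
    "anticipatory_prop": rng.uniform(0, 1, n_trials),
    "predictable": rng.integers(0, 2, n_trials),   # sentence predictability
    "subject": rng.integers(0, 32, n_trials).astype(str),
})

# Mixed-effects model: does verb-onset theta predict anticipatory fixations,
# and does this depend on sentence predictability? (random intercepts only)
model = smf.mixedlm("anticipatory_prop ~ theta * predictable", df,
                    groups=df["subject"]).fit()
print(model.summary())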
Topic Areas: Meaning: Discourse and Pragmatics, Meaning: Lexical Semantics