
Exploring Brain Activity and Multimodal Input: Effects on First and Second Language Learning and the Role of Working Memory

Poster A31 in Poster Session A - Sandbox Series, Thursday, October 24, 10:00 - 11:30 am, Great Hall 4
This poster is part of the Sandbox Series.

Mayumi Kajiura¹, Chunlin Liu², Diego Dardon³, Motoaki Sugiura², Hyeonjeong Jeong²; ¹Nagoya City University, ²Tohoku University, ³Miyagi University of Education

[Introduction] In second language (L2) learning, multimodal input, such as information from video captions and audio, can mitigate prediction errors through its richness of information, thereby enhancing comprehension and acquisition (Perez, 2020). However, because L2 processing is not automated, multimodal input can also strain the limited capacity of working memory (WM), making it difficult to process information simultaneously and splitting attention (Pociask & Morrison, 2004), which may hinder learning. The processing of multimodal input in the automated first language (L1) may differ substantially from that in L2. For instance, reading speeds are typically faster than speech rates (Brysbaert, 2019), implying that audio input may be suppressed in multimodal situations. This fMRI study aims to elucidate the brain mechanisms associated with processing multimodal input in both L1 and L2, and to identify effective input presentation methods for L2 acquisition by considering the individual factors that influence their effectiveness.

[Method] Thirty-one Japanese university students learning English participated in this study. Participants completed a semantic judgment task, in which they judged the truthfulness of statements (e.g., "It is true that rabbits usually have two ears."), in both their L1 (Japanese: 2 sessions, 72 trials) and L2 (English: 2 sessions, 72 trials) under three conditions: audio-only (A), visual (text)-only (V), and simultaneous audio-and-visual (AV) presentation. Participants also took Oxford listening and reading tests to assess English proficiency, a digit span test to measure WM capacity, and an anti-saccade task to assess their ability to suppress attention to stimuli.

[Data Analysis] Behavioral data (accuracy and response time) and brain activity during the semantic judgment task will be compared across the conditions above. To isolate brain activity specific to the AV condition, the contrast AV - (A + V) will be calculated for both L1 and L2. Correlations between individual English proficiency, WM capacity, and attention-inhibition ability and both the behavioral data (accuracy and response time) and brain activity in each condition will also be examined.

[Expected Results] According to Kajiura et al. (2021), the superior temporal gyrus (STG) and angular gyrus (AG) are involved in integrating auditory input with prior knowledge gained from reading transcripts. While audio might aid reading comprehension in L2, leading to integration of the two types of information, it might interfere in L1, where reading speeds surpass listening speeds. We hypothesized that in the AV condition, participants with higher attention-inhibition ability would show increased activity in brain areas involved in attention suppression (e.g., the caudate nucleus) in L1, whereas participants with higher WM would exhibit increased activity in the STG and AG in L2, reflecting the integration of auditory and visual information. If this hypothesis is confirmed, it would suggest that integrating information from multiple modalities can enhance comprehension in non-automated languages such as the L2, with learners possessing higher WM benefiting more. Conversely, focusing on information from a single modality might be more efficient for automated languages such as the L1, with learners exhibiting higher attention-inhibition ability achieving better outcomes. Thus, the effectiveness of multimodal input may vary depending on the type of stimuli presented and the characteristics of the learners.
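
As a concrete illustration of the planned contrast (not part of the original abstract), the minimal Python sketch below shows one way the AV - (A + V) comparison could be expressed as a weight vector over per-condition regressors in a standard GLM. The regressor names, their ordering, and the helper function are assumptions made here for illustration only, not the authors' analysis pipeline.

import numpy as np

# Hypothetical first-level regressor order: one regressor per condition
# and per language. Names and ordering are illustrative assumptions.
regressors = ["L1_A", "L1_V", "L1_AV", "L2_A", "L2_V", "L2_AV"]

def av_minus_a_plus_v(language):
    """Weight vector for the AV - (A + V) contrast within one language."""
    weights = np.zeros(len(regressors))
    weights[regressors.index(f"{language}_A")] = -1.0
    weights[regressors.index(f"{language}_V")] = -1.0
    weights[regressors.index(f"{language}_AV")] = 1.0
    return weights

print(av_minus_a_plus_v("L1"))  # [-1. -1.  1.  0.  0.  0.]
print(av_minus_a_plus_v("L2"))  # [ 0.  0.  0. -1. -1.  1.]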

Topic Areas: Multisensory or Sensorimotor Integration, Speech Perception
