
Investigating syntactic attention in the brain

Poster Session B, Friday, October 25, 10:00 - 11:30 am, Great Hall 3 and 4

Yushi Sugimoto (Osaka University), Ryo Yoshida (University of Tokyo), Hyeonjeong Jeong (Tohoku University), Akitake Kanno (Tohoku University), Masatoshi Koizumi (Tohoku University), Yohei Oseki (University of Tokyo)

[INTRO] In recent computational psycho/neurolinguistics, Transformers have been shown to capture human reading effort more precisely than Recurrent Neural Networks (RNNs), implying the cognitive plausibility of selective attention to the words in the context (Merkx and Frank, 2021). On the other hand, Recurrent Neural Network Grammars (RNNGs), which integrate RNNs with explicit syntactic structures, have been shown to achieve better psychometric predictive power than vanilla RNNs, suggesting that an architecture that constructs syntactic structures behind the words is more human-like (Hale et al., 2018; Brennan et al., 2020). However, previous studies have not investigated the integration of these two architectures, i.e., selective attention to syntactic structures, which we call syntactic attention. In this study, we employed Composition Attention Grammars (CAGs; Yoshida and Oseki, 2022) as a neural architecture with syntactic attention and evaluated them against a novel MEG dataset, in order to investigate the cognitive plausibility of selective attention to constructed syntactic structures.

[METHODS] We collected MEG data from 41 Japanese speakers. Using rapid serial visual presentation, 20 Japanese newspaper articles selected from the Balanced Corpus of Contemporary Written Japanese were presented segment by segment for 500 ms, each segment followed by a blank screen for 500 ms. For statistical analysis, the information-theoretic metric surprisal (Hale, 2001; Levy, 2008) was calculated for each segment of the newspaper articles using the targeted computational models. We conducted two-stage regression analyses in which each regression model included one predictor of interest (surprisal from Long Short-Term Memory networks (LSTMs), surprisal from RNNGs, surprisal from GPT-2 small (available from Hugging Face), or surprisal from CAGs), as well as the control predictors word length, word frequency, segment position, and sentence position.
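As background, the surprisal of a segment is its negative log conditional probability under a language model. A minimal sketch, using made-up probabilities rather than estimates from the actual LSTM/RNNG/GPT-2/CAG models:

```python
import math

def surprisal_bits(p: float) -> float:
    """Surprisal in bits: -log2 P(segment | preceding context)."""
    return -math.log2(p)

# Hypothetical per-segment probabilities from a language model
# (illustrative values only, not from the models used in the study).
segment_probs = [("segment-1", 0.05), ("segment-2", 0.02), ("segment-3", 0.25)]
for segment, p in segment_probs:
    print(f"{segment}\t{surprisal_bits(p):.2f} bits")
```

Less probable segments yield higher surprisal, which is the quantity entered as the predictor of interest in each regression model.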
Then, for each regression model, we performed spatio-temporal cluster permutation tests over a 200-650 ms time window in regions defined based on the results of a previous study (Pallier et al., 2011). Next, we defined a functional ROI (fROI) based on the results of the spatio-temporal cluster permutation tests. Finally, neural activities were averaged over space and time within the fROI, and likelihood-ratio tests were conducted for nested model comparisons.

[RESULTS] In the spatio-temporal cluster permutation tests, there was a significant cluster in the temporal lobe, including the left middle temporal gyrus (MTG), only for surprisal from CAGs (425-615 ms). In the nested model comparisons, the regression model that included surprisal from CAGs together with all baseline predictors had a statistically significant effect against the baseline regression model in the fROI. Moreover, the regression model that included all predictors had an effect above and beyond the model that included all predictors but surprisal from CAGs in the fROI.

[DISCUSSION] First, our results align with previous studies in which the left MTG is involved in hierarchical complexity, especially in the construction of phrases and sentences (Sheng et al., 2019; Woolnough et al., 2023). Second, and more importantly, combining our results with previous studies, sentence processing in this region may be engaged in a syntactic attention-like operation: constructing syntactic structures and selectively attending to the constructed representations.
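The nested model comparisons above are standard likelihood-ratio tests. A minimal sketch for nested models differing by a single parameter (df = 1), with hypothetical log-likelihoods in place of the study's actual fits:

```python
import math

def lr_test_df1(ll_reduced: float, ll_full: float) -> tuple[float, float]:
    """Likelihood-ratio test for nested models differing by one parameter.

    Returns the LR statistic and its p-value under a chi-square(1) null;
    for df = 1, chi2.sf(x, 1) equals erfc(sqrt(x / 2)).
    """
    lr = 2.0 * (ll_full - ll_reduced)
    p = math.erfc(math.sqrt(lr / 2.0))
    return lr, p

# Hypothetical log-likelihoods: baseline model vs. baseline + CAG surprisal
# (illustrative numbers, not the study's actual fits).
lr, p = lr_test_df1(ll_reduced=-1520.3, ll_full=-1512.8)
print(f"LR = {lr:.2f}, p = {p:.4g}")
```

A significant p-value indicates that adding the predictor of interest (here, CAG surprisal) improves fit beyond the reduced model.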

Topic Areas: Syntax and Combinatorial Semantics, Computational Approaches
