
Neural representation of sensorimotor features in language-motor areas during auditory and visual language perception

Poster Session B, Friday, October 25, 10:00 - 11:30 am, Great Hall 3 and 4

Min Xu1, Yuanyi Zheng1, Jianfeng Zhang1, Yang Yang2; 1School of Psychology, Shenzhen University, Shenzhen, China, 2Department of Psychology, University of Chinese Academy of Sciences, Beijing, China

Language processing involves a complex interplay between sensory and motor systems in the brain. Accumulating evidence has demonstrated that listening to speech involves not only auditory processing areas but also brain areas traditionally defined as parts of the speech production system, even in the absence of explicit motor tasks. Recent research has extended our understanding of sensory and motor interaction to the domain of visual word processing, suggesting a cooperative network between the visual form processing system in the ventral occipitotemporal area and a handwriting gesture system along the dorsal pathway from the ventral premotor to the parietal lobe. The specific information encoded within these language-motor areas during language perception, however, remains poorly understood. In this study, we employed functional magnetic resonance imaging (fMRI) and representational similarity analysis (RSA) to investigate whether the language-motor areas represent motoric or sensory attributes of language stimuli during auditory and visual language perception. We recruited 36 Chinese-speaking adults who performed tasks involving the perception of spoken syllables and written characters. Auditory stimuli comprised eight consonant-vowel syllables (i.e., /ba/, /pa/, /da/, /ta/, /ga/, /ka/, /sa/, /ca/) differing in articulatory and acoustic characteristics. Visual stimuli included 24 Chinese characters varying in visual-form and stroke-motoric features. Participants also completed articulation and handwriting tasks to localize speech-motor and writing-motor areas. RSA was applied to determine if activity patterns within the language-motor and sensory areas could be predicted by the motoric and sensory attributes of language stimuli. Results indicated that in the auditory domain, speech-motoric similarity predicted neural similarity in the right precentral gyrus, left Heschl's gyrus, and left superior temporal gyrus. 
Additionally, whereas high-level auditory features were represented in the left inferior frontal and superior temporal gyri, low-level auditory features were represented only in the left Heschl's gyrus. In the visual domain, significant correlations between stroke-motoric similarity and neural similarity were found in the left superior frontal and right superior occipital gyri. However, visual-feature patterns were not correlated with neural patterns in the motor or visual processing regions. These results reveal that language-motor areas, along with parts of the perceptual regions, encode motoric information across both auditory and visual modalities, supporting the notion of a shared motoric representation. Additionally, modality-specific patterns were observed: the speech-motor areas (along with the auditory cortex) represented acoustic features of speech stimuli, whereas visual and motor areas did not exhibit a similar pattern in visual-character processing. The findings underscore the multimodal and modality-specific encoding capabilities of language-motor systems and provide insights into the complex interplay between sensory and motor processes in language perception.
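The core RSA logic described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: it assumes one region's voxel activity patterns and one set of stimulus attributes (e.g., articulatory features), builds a representational dissimilarity matrix (RDM) from each, and correlates their upper triangles with Spearman's rho. All variable names and the toy data are hypothetical.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr


def rsa_correlation(neural_patterns, feature_vectors):
    """Correlate a neural RDM with a model RDM built from stimulus features.

    neural_patterns: (n_stimuli, n_voxels) activity patterns for one region
    feature_vectors: (n_stimuli, n_features) motoric or sensory attributes
    Returns (rho, p): Spearman correlation between the two RDMs.
    """
    # Condensed upper-triangle RDMs: 1 - Pearson r for neural patterns,
    # Euclidean distance for the stimulus feature space.
    neural_rdm = pdist(neural_patterns, metric="correlation")
    model_rdm = pdist(feature_vectors, metric="euclidean")
    rho, p = spearmanr(neural_rdm, model_rdm)
    return rho, p


# Toy example: 8 "syllables" with random voxel patterns and features.
rng = np.random.default_rng(0)
patterns = rng.standard_normal((8, 100))   # hypothetical voxel patterns
features = rng.standard_normal((8, 5))     # hypothetical articulatory features
rho, p = rsa_correlation(patterns, features)
```

In a real analysis, the neural RDM would be computed per searchlight or region of interest from trial-wise GLM estimates, and significance would be assessed with permutation tests across participants rather than a single parametric p-value.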

Topic Areas: Reading, Speech Perception
