Symposia

Leveraging intracranial recordings for detailed insights into language processing: Bridging gaps and advancing understanding

Friday, October 25, 1:00 - 3:00 pm, Great Hall 1 and 2

Organizer: Jill Kries1; 1Stanford University
Presenters: Matthew Leonard, Gregory Cogan, Stephanie Ries, Anna Mai

Intracranial recordings (iEEG, ECoG) uniquely provide both high spatial and high temporal resolution, enabling researchers to study local neural populations at a scale that approaches what is possible in animal models for behaviors other than language. Such precision is necessary for understanding the complex representations, computations, and architecture underlying language processing. Symposium speakers will present recent advances in intracranial research, addressing (1) theoretical psycholinguistic debates, (2) the interface between acoustic/motor encoding and higher-level conceptual and cognitive processing, and (3) muscle-artifact-free speech production data that enable BCI applications. The panel discussion will address future directions, including questions such as: How can intracranial research contribute to furthering our theoretical understanding of language in ways that no other technique can? How can intracranial language studies support and augment non-invasive research? What promise does intracranial research hold for applications in individuals with language disorders? This symposium will identify future directions for intracranial language research and strengthen integration with non-invasive methods.

Presentations

Speech encoding in single neurons and neural populations in human cortex

Matthew Leonard1; 1University of California San Francisco

The development of high-resolution direct neural recording techniques in humans has enabled a detailed understanding of how local neural populations and single neurons process spoken input. The emerging picture of speech encoding in the brain is one in which key regions like the superior temporal gyrus (STG) represent multiple phonological, prosodic, and word-level features in parallel. However, we lack an understanding of how these diverse features are integrated into coherent percepts, and how both local and network-level neural codes contribute to listeners’ perceptual experience of speech comprehension. By combining recording techniques (ECoG, Neuropixels, and direct cortical stimulation), naturalistic and controlled stimuli, and both linear and non-linear encoding models, it is now possible to address fundamental questions about the interface between auditory speech input and language comprehension.
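As a rough illustration of the kind of linear encoding model mentioned above (not the speaker's actual analysis pipeline), the minimal sketch below fits a ridge-regression temporal receptive field that predicts a single electrode's high-gamma activity from time-lagged stimulus features; the feature matrix, lag window, sampling rate, and regularization strength are all assumed for illustration.

```python
# Minimal sketch of a linear encoding model (temporal receptive field).
# Assumes X_feat is a (time x features) matrix of stimulus features
# (e.g., spectrogram bins or phonetic features) and y is one electrode's
# high-gamma envelope at the same sampling rate. Data here are random
# stand-ins; names and parameters are illustrative, not the study's.
import numpy as np
from sklearn.linear_model import Ridge

def build_lagged_design(X_feat, n_lags):
    """Stack time-lagged copies of the features so the model can
    integrate stimulus information over a short preceding window."""
    T, F = X_feat.shape
    lagged = np.zeros((T, F * n_lags))
    for lag in range(n_lags):
        lagged[lag:, lag * F:(lag + 1) * F] = X_feat[:T - lag, :]
    return lagged

rng = np.random.default_rng(0)
X_feat = rng.standard_normal((5000, 32))    # stand-in stimulus features
y = rng.standard_normal(5000)               # stand-in high-gamma envelope

X = build_lagged_design(X_feat, n_lags=40)  # ~400 ms window at an assumed 100 Hz
train, test = slice(0, 4000), slice(4000, 5000)

trf = Ridge(alpha=100.0).fit(X[train], y[train])
print("held-out R^2:", trf.score(X[test], y[test]))

# Reshaped to (lags x features), the coefficients approximate the
# electrode's feature/spectrotemporal receptive field.
weights = trf.coef_.reshape(40, 32)
```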

More is Better: Studying the Speaking Brain using Micro-ECoG

Gregory Cogan1; 1Duke University

Direct brain recordings made possible by neurosurgical patients have provided new insights into how the human brain works. This opportunity has also driven technological advances for recording from large areas of the brain with fine spatial detail. These ‘high-definition’ neural recordings are particularly important for the uniquely human capacity of speech and language. In this talk, I will discuss our recent work using high-density, high-channel-count micro-electrocorticography (micro-ECoG) to study the speaking brain. I will show the benefits of increased spatial sampling for decoding speech from neural signals for brain-computer interfaces (BCIs). I will also show that we can delineate neural sub-processes associated with both the planning and execution of speech. Together, these results highlight opportunities from direct brain recordings that, combined with advances in neural recording technology, will open new vistas for understanding and treating the human brain.

Cortical Interactions Supporting Cognitive Control in Language Production

Stephanie Ries1, Yusheng Wang1,2, Ashkan Ashrafi1, Sharona Ben-Haim2, Jerry Shih2; 1San Diego State University, 2University of California San Diego

Although producing language seems easy, cognitive control processes help us efficiently transform ideas into language output and avoid errors. Regions in the lateral and medial prefrontal cortex (PFC) play critical roles in supporting cognitive control within and outside of language. However, little is known about how these regions interact with core language regions to allow us to retrieve and produce words so efficiently as we speak. We used graph signal processing, specifically graph learning, to analyze intracranial electroencephalography data recorded directly within brain regions of interest while participants named pictures. Our results show that distant frontal and temporal brain regions are functionally connected during picture naming. In particular, time-locked to vocal onset, a higher number of functional connections involve the caudal anterior cingulate cortex than the superior temporal gyrus. This supports the involvement of a domain-general medial PFC conflict-monitoring mechanism during language production and speech monitoring.
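The abstract's graph-learning analysis is not reproduced here, but the toy sketch below illustrates the general idea of building a functional-connectivity graph over iEEG channels and counting connections that involve particular regions of interest. The channel data, region labels, correlation threshold, and the use of a simple correlation graph (rather than the study's graph signal processing / graph learning approach) are all illustrative assumptions.

```python
# Toy functional-connectivity sketch over iEEG channels.
# Random data stand in for high-gamma activity time-locked to vocal onset;
# region labels, threshold, and the correlation-based graph are assumptions,
# not the study's graph-learning method.
import numpy as np

rng = np.random.default_rng(1)
n_channels, n_samples = 60, 500
data = rng.standard_normal((n_channels, n_samples))
regions = rng.choice(["cACC", "STG", "other"], size=n_channels, p=[0.2, 0.2, 0.6])

# Build a functional graph: an edge links channels whose activity correlates
# above an assumed threshold (toy value chosen so random data yield some edges).
corr = np.corrcoef(data)
np.fill_diagonal(corr, 0.0)
adjacency = np.abs(corr) > 0.05
upper = np.triu(adjacency, k=1)  # upper triangle avoids double-counting edges

# Count functional connections involving each region of interest.
for roi in ("cACC", "STG"):
    in_roi = regions == roi
    touches = in_roi[:, None] | in_roi[None, :]
    n_edges = int((upper & touches).sum())
    print(f"{roi}: {n_edges} functional connections involving this region")
```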

Linguistic Structure on Both Sides of the Braincase

Anna Mai1; 1Max Planck Institute for Psycholinguistics

Explaining structured variability within and across languages is a central goal of linguistic research, and the unparalleled spatiotemporal resolution of intracranial recordings is invaluable to explaining this intricate structure at the neural level. In this talk, I will discuss work showcasing how intracranial recordings have enabled such study as well as work highlighting the importance of bridging intracranial work with other neuroimaging literatures. In particular, I will discuss how some recent intracranial work has substantiated long-standing theoretical claims about the structure of phonological contrasts, their relationship to speech acoustics, and their specificity to particular languages (Mai et al. 2024). I will additionally show how aspects of this work have been validated in a less-invasive neuroimaging context and how bandlimited intracranial responses to speech vary across distances in gray and white matter. In this way, this talk will illustrate both the present and future promise of intracranial recordings in language research.
