Specificity of Motor Contributions to Statistical Language Learning
Poster C78 in Poster Session C, Wednesday, October 25, 10:15 am - 12:00 pm CEST, Espace Vieux-Port
Eleonore Smalle¹, Sam Boeve², Riikka Möttönen³; ¹Postdoc, Ghent University / Assistant professor, Tilburg University; ²PhD student, Ghent University; ³Associate professor, University of Helsinki
Statistical learning is critical for detecting and extracting patterned information from continuous sensory signals, including speech. Coupling of frontal motor regions to speech sound sequences plays an important role in auditory statistical language learning (Assaneo et al., 2019). However, it remains an open question how specific these motor contributions are. In the present study, we aimed to characterize the specificity of motor contributions to auditory statistical learning. In Experiment 1, we tested the specificity of the motor processes contributing to learning statistical regularities in speech sound sequences. Participants performed either a linguistic (i.e., whispering syllables) or a non-linguistic (i.e., clapping hands) motor task during exposure to structured speech sequences. In Experiment 2, we focused on auditory specificity and tested whether a linguistic motor task (i.e., whispering) affects learning of statistical regularities in speech and tone sequences equally. Finally, in Experiment 3, we examined whether statistical learning performance correlates across tasks with different auditory stimulus types (i.e., speech sounds versus tones). In all experiments, statistical learning performance was assessed with a forced-choice recognition task. Whispering, but not clapping, impaired learning of statistical regularities in speech sequences in Experiment 1, whereas whispering had no effect on learning statistical regularities in non-speech (tone) sequences in Experiment 2. Moreover, no correlation between performance on the speech and tone tasks was found in Experiment 3. Overall, our findings show that auditory statistical language learning is supported by domain-specific auditory-motor processes. These results support the idea that learning statistical regularities in speech versus non-speech relies on distinct mechanisms, and that the speech motor system contributes to auditory statistical learning in a highly specific manner.
Topic Areas: Speech Perception, Multisensory or Sensorimotor Integration