Auditory-Motor Synchronization and Perception Suggest Partially Distinct Time Scales in Speech and Music
Poster E69 in Poster Session E, Thursday, October 26, 10:15 am - 12:00 pm CEST, Espace Vieux-Port
Johanna Rimmele1,2, Molly Henry3, Claire Pelofi4, Alice Vivien Barchet1; 1Max Planck Institute for Empirical Aesthetics, 2Max Planck NYU Center for Language, Music, and Emotion, Frankfurt am Main, Germany, New York, 3Toronto Metropolitan University, 4New York University
Whether the different dominant rhythmic structures of speech (~4.5 Hz) and music (~2 Hz) emerge from distinct auditory or motor-related neural mechanisms is debated. We provide a new perspective by investigating the effect of different articulator systems and their auditory-motor cortex coupling (as estimated by perception-production synchronization strength) on rate-specific processing in both domains. In a behavioral protocol (n = 60), a perception task and a synchronization task were conducted at slow (~2 Hz) and faster (~4.5 Hz) rates, using speech (syllable sequences) and music (piano tone sequences) stimuli and articulators typically investigated in the speech and music domains (whispering and tapping). In the synchronization task, participants were instructed to synchronize their production to the presented stimuli. In the perception task, participants detected temporal deviations in sequences of isochronous tones and syllables. The normalized phase-locking value (PLV) was used to quantify auditory-motor synchronization performance. In a linear mixed effects model (LMM), synchronization strength was predicted from stimulus type, articulator type, and presentation rate. To extract independent auditory-motor synchronization components, principal component analysis (PCA) was conducted on the normalized PLVs. Finally, a generalized linear mixed effects model (GLMM) was used to predict accuracy in the perception task from stimulus type, articulator type, presentation rate, and the retrieved PCA components, including their interactions. Although synchronization performance was generally better at slow rates, the results suggest domain-specific rate preferences. Tapping synchronization was superior to whispering synchronization at slow but not at faster rates. Synchronization was domain-dependent at slow rates, but highly correlated across domains at faster rates.
Syllable and tone deviance detection in the perception task was optimal at different rates (faster and slower, respectively) and was predicted by synchronization strength. Our data suggest partially independent, articulator-system-related mechanisms mediating an impact of the motor system on the optimal processing rates for speech and music.
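The phase-locking value mentioned above quantifies how consistently a produced rhythm tracks the phase of a presented stimulus. A minimal sketch of the core computation is given below; the study's normalization of PLVs, and the function and variable names used here, are not taken from the abstract and are purely illustrative.

```python
import numpy as np

def phase_locking_value(stim_phase, prod_phase):
    """PLV = |mean(exp(i * (phi_stim - phi_prod)))|.

    Ranges from 0 (no consistent phase relationship between stimulus
    and production) to 1 (perfect synchronization).
    """
    dphi = np.asarray(stim_phase) - np.asarray(prod_phase)
    return np.abs(np.mean(np.exp(1j * dphi)))

# Illustration with synthetic phase series (hypothetical, not the study's data):
rng = np.random.default_rng(0)
t = np.arange(0, 10, 0.01)                            # 10 s at 100 Hz
stim = 2 * np.pi * 2.0 * t                            # 2 Hz stimulus phase
synced = stim + 0.3 * rng.standard_normal(t.size)     # jittered follower
unsynced = 2 * np.pi * rng.random(t.size)             # unrelated random phases

print(phase_locking_value(stim, synced))              # near 1: strong locking
print(phase_locking_value(stim, unsynced))            # near 0: no locking
```

In practice, instantaneous phase would first be extracted from the stimulus envelope and the recorded production (e.g. via band-pass filtering and a Hilbert transform) before applying this computation.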
Topic Areas: Speech Motor Control, Speech Perception