Neural selectivity for sign-based phonological units: Evidence from fMRI adaptation
Poster C15 in Poster Session C, Friday, October 25, 4:30 - 6:00 pm, Great Hall 4
Stephen McCullough1, Karen Emmorey1; 1SDSU
Linguistic and psycholinguistic studies have amply demonstrated that sign languages exhibit phonological structure that is parallel to, but not identical to, that of spoken languages. The manual parameters of handshape, place of articulation (location), and movement constitute phonological units that must be retrieved and assembled during sign production, and sign perception involves the detection and segmentation of these visual-manual phonological units. However, very little is known about how the brain recognizes and represents phonological structure in a sign language. We used fMRI adaptation to investigate neural selectivity for sign-based phonological units – handshape and place of articulation (POA) on the body – in American Sign Language (ASL). Given that signers exhibit categorical perception effects for handshapes (in contrast to non-signers), we hypothesized that neural regions involved in the perception of body parts (e.g., the extrastriate body area, EBA) become selectively tuned to ASL handshapes in signers. It is less clear whether neural adaptation effects will be found for POA because categorical perception effects have not been observed for sign location. We used an adaptor-probe, event-related design in which each trial consisted of two consecutively presented video clips of signs. The adaptor and probe signs were presented in four conditions: identity (e.g., BIRD BIRD), handshape-only overlap (e.g., NEWSPAPER BIRD), location-only overlap (e.g., ORANGE BIRD), and no phonological overlap (e.g., WEST BIRD). None of the adaptor-probe sign pairs were semantically related, and all videos were the same length (2 seconds). The participants' task was to detect an occasional grooming gesture (e.g., adjusting clothing, rubbing eyes). This task was chosen because it can be performed accurately by both signers and non-signers, and grooming gestures do not engage sublexical or lexical processing in signers.
Thus far, 11 deaf ASL signers and 4 hearing non-signers have participated (data collection is ongoing). Preliminary whole-brain results from the ASL signers revealed repetition suppression (reduced BOLD response) for handshape overlap (compared to no phonological overlap) in parietal cortex (left superior parietal lobule, SPL; right inferior parietal lobule, IPL) and, surprisingly, in the left anterior temporal lobe (ATL). Repetition suppression for place of articulation was observed bilaterally in the EBA and in the right fusiform gyrus. Repetition suppression for identical sign probes (compared to no phonological overlap) was found in bilateral SPL and in the EBA and fusiform gyri, bilaterally. This pattern of results suggests that neuronal assemblies within parietal cortex (the dorsal stream) and in inferior temporal cortex (the ventral stream) become tuned to linguistically relevant handshapes and places of articulation, respectively. We hypothesize that non-signers process signs relatively holistically and have coarser neural selectivity for specific handshapes or locations, and we predict that the non-signers will exhibit weak or no adaptation for sign pairs with sublexical overlap. Contrary to our predictions for signers, our preliminary results suggest that the EBA becomes neurally tuned to where the hands are positioned on the body (place of articulation) rather than to hand configurations. Neural representations of linguistic handshapes appear to reside in parietal cortex (and possibly in the left ATL).
Topic Areas: Phonology, Signed Language and Gesture