
The sound of silence: Congenitally deaf individuals rely more on visual motion to derive object sound knowledge

Poster Session C, Friday, October 25, 4:30 - 6:00 pm, Great Hall 3 and 4

Shuyue Wang¹, Ziyi Xiong¹, Xiaosha Wang¹, Yanchao Bi¹; ¹Beijing Normal University

How the human brain derives rich knowledge about the world through multiple sensory modalities and language is a major question in cognitive neuroscience, yet the underlying mechanism remains elusive given the intricate interactions and associations among different types of experience. For instance, an object sound (e.g., the common sounds of a cat or of a hammer hitting a nail) is primarily perceived through the auditory modality, yet it can also be derived to varying extents from other sensory modalities that are richly associated with or indicative of sound (e.g., tactile vibration) and from language (e.g., sound words). Studying sound knowledge in populations deprived of auditory experience thus provides a unique opportunity to disentangle the unique contributions, redundancy, and interactions among different types of experience. In this study, by asking congenitally deaf and hearing participants whether an object has common sounds, we aimed to reveal the cognitive and neural mechanisms by which object sound knowledge is constructed from auditory, non-auditory (object size/weight, tactile, visual motion, taxonomy), and language (estimated by large language models) experiences. First, 46 deaf and 50 hearing individuals completed a behavioral survey judging whether an object (animal or artifact) has common sounds. The deaf and hearing groups produced highly correlated response patterns across objects (that is, they tended to judge similarly whether an object has common sounds), despite an overall underestimation in the deaf group. Regression models were then used to predict the sound knowledge response (yes/no) for a given object from the above variables in each group. Results showed that sound judgments were positively associated with auditory experience and language encoding in both groups, and that the deaf group relied more on visual motion than the hearing group. In a picture-naming fMRI experiment, we first localized the brain regions whose activity was significantly modulated by object sound properties in the two groups. Among these regions, the superior temporal regions were modulated by visual motion in deaf participants to a greater extent than in hearing participants, which may explain the deaf group's behavioral overreliance on visual motion in object sound judgments. In summary, auditory deprivation alters how different sensory cues are weighted to infer object sound knowledge. These findings suggest that the human brain adopts different ways of constructing knowledge when the available sensory experience changes, with visual motion highlighted as a substitute for auditory properties.
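
The group comparison described above can be sketched as a logistic regression with a group-by-visual-motion interaction. The snippet below is a minimal illustration using simulated data; the variable names, predictor coding, and use of statsmodels are assumptions for exposition, as the abstract does not specify the exact model.

    # Illustrative sketch only: predicts whether an object is judged to have a
    # common sound from experience-based predictors, and tests whether the weight
    # on visual motion differs between deaf and hearing groups.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n_objects = 200

    # Hypothetical per-object predictors (e.g., z-scored ratings / model scores).
    objects = pd.DataFrame({
        "auditory":      rng.normal(size=n_objects),   # auditory association
        "visual_motion": rng.normal(size=n_objects),   # visual-motion rating
        "tactile":       rng.normal(size=n_objects),   # tactile rating
        "language":      rng.normal(size=n_objects),   # LLM-based language score
    })

    # Duplicate the object set for two groups and simulate yes/no judgments,
    # with a stronger visual-motion weight in the deaf group (for illustration).
    df = pd.concat([objects.assign(group="hearing"),
                    objects.assign(group="deaf")], ignore_index=True)
    motion_weight = np.where(df["group"] == "deaf", 1.0, 0.3)
    logit = (1.2 * df["auditory"] + 0.8 * df["language"]
             + motion_weight * df["visual_motion"])
    df["has_sound"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    # Logistic regression with a group x visual-motion interaction term.
    model = smf.logit(
        "has_sound ~ auditory + tactile + language + visual_motion * C(group)",
        data=df,
    ).fit()
    print(model.summary())

In this sketch, a reliable visual_motion:C(group) interaction would indicate that the two groups weight visual motion differently when judging whether an object has common sounds.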

Topic Areas: Multisensory or Sensorimotor Integration, Meaning: Lexical Semantics
