Investigating Embodied Cognition in Deaf ASL Users: EEG and Virtual Reality for STEM Education
Poster B112 in Poster Session B, Tuesday, October 24, 3:30 - 5:15 pm CEST, Espace Vieux-Port
This poster is part of the Sandbox Series.
Carly Leannah1, Lorna Quandt1; 1Gallaudet University
Hands-on physical learning experiences with science lead to increased activation in sensorimotor regions, indicating an embodied understanding of scientific concepts (Kontra et al., 2015). Recent research on Immersive Virtual Reality (IVR) suggests it can enhance learning through gamified experiences, increased presence, and self-efficacy. However, little is known about the IVR learning experience for individuals who are deaf or hard of hearing (DHH) and have extensive familiarity with American Sign Language (ASL). ASL is a manual language that uses the visuospatial modality, making it an embodied language that engages sensorimotor cortical regions (Emmorey et al., 2014). DHH ASL users show earlier and more robust sensorimotor responses when processing human movement than hearing non-signers do (Quandt et al., 2021). Building on this work, our study explores how DHH learners understand and process STEM content during IVR educational activities. Our experiment will examine the behavioral and neural impacts of two types of technological learning experiences: 1) immersive, interactive VR and 2) passive video watching. We will compare learning outcomes before and after the two experiences, assessed through measures of knowledge retention, transfer, and motivation. Additionally, we will identify the neural correlates of immersive learning among DHH ASL users using EEG time-frequency analysis, focusing on frontal midline theta power and sensorimotor mu rhythm desynchronization before and after each type of learning. Greater desynchronization in the IVR condition would indicate heightened engagement with, and embodiment of, the learning content. To achieve our research goals, we will recruit 30 DHH individuals aged 18-40 who are fluent in ASL.
Participants will be divided into two groups: one completing a 20-minute chemistry reaction-balancing activity in IVR (N=15) and the other passively watching a video of the same content (N=15). Pre- and post-learning assessments will measure learning outcomes, and EEG data will be collected before and after the learning experience. The experiment will involve 120 stimulus trials consisting of alphanumeric and pictorial molecular equations, some learned during the IVR activity and others unfamiliar; participants will indicate whether each equation is correct or incorrect. Through EEG analyses, we will compare the impact of the two learning environments on sensorimotor neural processing, shedding light on embodied learning processes. Little research has examined how lifelong experience with, and fluency in, an embodied signed language such as ASL affects cognitive processing across learning contexts. Measuring theta oscillations and sensorimotor mu rhythms in DHH learners, a unique group with an embodied language, will allow us to investigate how they embody concepts during hands-on STEM learning. This research will provide insight into how IVR and video learning compare for embodied learning of STEM content. Link to references: https://docs.google.com/document/d/1xUZNZPoDn8JZiIuCH_ETh1jjiVw93nkRHihaW9MwYbs/edit?usp=sharing
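The planned analysis compares band power in the frontal midline theta (roughly 4-8 Hz) and sensorimotor mu (roughly 8-13 Hz) ranges before and after learning. As a minimal sketch of that kind of computation, the following Python snippet estimates band power with Welch's method and expresses mu desynchronization as a percent change from baseline; the sampling rate, band edges, and synthetic signals are illustrative assumptions, not the study's actual parameters or data.

```python
# Hedged sketch: band-power and desynchronization estimates on synthetic EEG.
# All signal parameters here are illustrative assumptions, not study values.
import numpy as np
from scipy.signal import welch

FS = 250  # assumed sampling rate (Hz)
RNG = np.random.default_rng(0)

def band_power(signal, fs, lo, hi):
    """Mean spectral power within [lo, hi] Hz, via Welch's PSD estimate."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def erd_percent(baseline, task):
    """Event-related desynchronization as % power change from baseline.
    Negative values indicate desynchronization (a power decrease)."""
    return 100.0 * (task - baseline) / baseline

# Synthetic stand-in for sensorimotor EEG: a 10 Hz mu oscillation whose
# amplitude drops during the task, embedded in broadband noise.
t = np.arange(0, 10, 1 / FS)
baseline_eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + RNG.normal(0, 1, t.size)
task_eeg = 0.8 * np.sin(2 * np.pi * 10 * t) + RNG.normal(0, 1, t.size)

mu_base = band_power(baseline_eeg, FS, 8, 13)
mu_task = band_power(task_eeg, FS, 8, 13)
print(f"mu ERD: {erd_percent(mu_base, mu_task):.1f}%")  # negative => desync
```

A full pipeline would of course operate on epoched, artifact-cleaned multichannel recordings (e.g., via a toolbox such as MNE-Python) rather than single synthetic traces, but the baseline-normalized band-power contrast shown here is the core quantity behind the desynchronization measure described above.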
Topic Areas: Signed Language and Gesture, Multisensory or Sensorimotor Integration