Abstract

Deaf infants born to hearing parents are at risk of language deprivation, which can have life-long impacts on linguistic, cognitive, and socio-emotional development. It remains challenging for hearing parents to provide meaningful, linguistically rich interaction with their deaf and hard of hearing (DHH) children, due to a lack of sign language fluency and insufficient communication strategies. In this study, we present a proof-of-concept visual augmentation prototype based on the Augmented Reality (AR) lamp metaphor that aims to support context-aware, non-intrusive parent-child interaction in American Sign Language (ASL), adapting joint-attention strategies to match the child's communication modality. The proposed prototype enables future studies to collect in-depth design critiques and preliminary usability evaluations from domain experts, novice ASL learners, and hearing parents of DHH children.
