Signable: improving accessibility to American Sign Language through gesture recognition

Publication Type honors thesis
School or College College of Engineering
Department Kahlert School of Computing
Faculty Mentor Thomas C. Henderson
Creator Davis, Charles Foulger
Title Signable: improving accessibility to American Sign Language through gesture recognition
Date 2024
Description American Sign Language (ASL) is the predominant form of communication used by deaf communities throughout the United States. Despite the prevalence of its use, accessing means of learning ASL can be difficult, inconvenient, and costly. People who wish to communicate via ASL, such as those with deaf family members, coworkers, or friends, often must hire a professional interpreter or enroll in formal lessons to gain a basic understanding of the language. The web application Signable addresses this problem by providing intuitive online lessons designed to teach users the fundamentals of ASL using real-time feedback from computer vision (CV) models trained to detect gestures from webcam input. Several CV models designed to detect static hand gestures already exist and are readily available; however, ASL often requires complex hand motions to convey meaning properly. This thesis explores methods for training more robust CV models capable of classifying complex, movement-based hand gestures efficiently enough to provide real-time user feedback.
Type Text
Publisher University of Utah
Subject deaf communities
Language eng
Rights Management © Charles Foulger Davis
Format Medium application/pdf
Permissions Reference URL https://collections.lib.utah.edu/ark:/87278/s647pxpj
ARK ark:/87278/s68heqxq
Setname ir_htoa
ID 2640403
Reference URL https://collections.lib.utah.edu/ark:/87278/s68heqxq