"An American Sign Language Translation System Based on Kinect" by Vitaliy Sinyuk, Sam Yokoyama et al.
 
An American Sign Language Translation System Based on Kinect

An American Sign Language Translation System Based on Kinect

Description

In this project, students implemented a computer-vision-based translation system for American Sign Language (ASL). A set of ASL signs, including “Hello”, “How are you”, “Good”, “Thank you”, “Yes”, “What”, “Blue”, “Noise”, and “Tall”, was recorded with a computer program using the Kinect sensor. A Hidden Markov Model (HMM) was used as the machine learning classifier to recognize the ASL signs. The HMM classifier was implemented in C# and trained on the recorded data. Unfortunately, the experimental results were disappointing: the recognition accuracy of the ASL signs was not acceptable. We conclude that an alternative template-based method might be more suitable for ASL sign recognition. Nevertheless, the project provided a good learning experience for the students, and the recorded ASL signs will be valuable for future research.
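To make the classification approach concrete, the sketch below scores a quantized gesture sequence against a discrete HMM using the forward algorithm in log space; classification would then pick the sign whose model yields the highest score. This is a minimal illustration in C#, not the project's actual implementation: the class name, model parameters, and discrete observation symbols are hypothetical placeholders, and in practice the Kinect joint trajectories would first have to be quantized into such symbols.

using System;
using System.Linq;

// Sketch: one discrete HMM per ASL sign; an observation sequence is
// assigned to the sign whose model gives the highest log-likelihood.
// All parameters here are illustrative placeholders, not trained values.
class HmmSignScorer
{
    readonly double[] logPi;   // initial state log-probabilities
    readonly double[,] logA;   // transition log-probabilities [from, to]
    readonly double[,] logB;   // emission log-probabilities [state, symbol]

    public HmmSignScorer(double[] pi, double[,] a, double[,] b)
    {
        logPi = pi.Select(Math.Log).ToArray();
        logA = Apply(a, Math.Log);
        logB = Apply(b, Math.Log);
    }

    // Forward algorithm in the log domain: returns log P(obs | model).
    public double LogLikelihood(int[] obs)
    {
        int n = logPi.Length;
        var alpha = new double[n];
        for (int s = 0; s < n; s++)
            alpha[s] = logPi[s] + logB[s, obs[0]];

        for (int t = 1; t < obs.Length; t++)
        {
            var next = new double[n];
            for (int j = 0; j < n; j++)
            {
                double sum = double.NegativeInfinity;
                for (int i = 0; i < n; i++)
                    sum = LogAdd(sum, alpha[i] + logA[i, j]);
                next[j] = sum + logB[j, obs[t]];
            }
            alpha = next;
        }
        return alpha.Aggregate(double.NegativeInfinity, LogAdd);
    }

    // Numerically stable log(exp(x) + exp(y)).
    static double LogAdd(double x, double y) =>
        x == double.NegativeInfinity ? y :
        y == double.NegativeInfinity ? x :
        Math.Max(x, y) + Math.Log(1 + Math.Exp(-Math.Abs(x - y)));

    static double[,] Apply(double[,] m, Func<double, double> f)
    {
        var r = new double[m.GetLength(0), m.GetLength(1)];
        for (int i = 0; i < m.GetLength(0); i++)
            for (int j = 0; j < m.GetLength(1); j++)
                r[i, j] = f(m[i, j]);
        return r;
    }
}

In a complete system, one such model would be trained per sign (for example, with the Baum-Welch algorithm), and a recorded Kinect skeleton trajectory would be converted into the discrete observation symbols before scoring.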

Publication Date

9-28-2013
