Hand Gesture Recognition Using CNN & Publication of World’s Largest ASL Database
Sign language is used throughout the world by the hearing impaired to communicate. Recent advancements in computer vision and deep learning have given rise to many machine-learning-based translators. In this research report, a solution for recognizing the English alphabet presented as static signs in American Sign Language (ASL) is proposed. Classification is performed by a four-layer CNN. The model is trained and tested on a dataset created for this project. This dataset will be published as a contribution to the community and is currently the world's largest ASL database, consisting of 624,000 images. Split into two sections, the database contains images in both the IR and RGB spectra. Classification on both sets of data achieves state-of-the-art results compared to similar research: accuracies of 99.89% and 99.91% are achieved on the IR and RGB datasets respectively.
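The report's four-layer CNN is not specified in detail in this abstract, so the following is only a minimal sketch of the core operations such a network stacks (convolution, ReLU activation, max-pooling), implemented in pure Python on a toy single-channel image; all values and sizes are illustrative assumptions, not the author's architecture.

```python
# Illustrative building blocks of a CNN layer: convolution, ReLU, max-pooling.
# Toy sizes only; a real ASL classifier would use learned kernels, many
# channels, and a final fully connected layer over 26 letter classes.

def conv2d(image, kernel):
    """Valid 2D cross-correlation of a single-channel image with a kernel."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            s = 0.0
            for di in range(kh):
                for dj in range(kw):
                    s += image[i + di][j + dj] * kernel[di][dj]
            row.append(s)
        out.append(row)
    return out

def relu(fmap):
    """Element-wise rectified linear activation."""
    return [[max(0.0, v) for v in row] for row in fmap]

def max_pool(fmap, size=2):
    """Non-overlapping max-pooling with a size x size window."""
    out = []
    for i in range(0, len(fmap) - size + 1, size):
        row = []
        for j in range(0, len(fmap[0]) - size + 1, size):
            row.append(max(fmap[i + di][j + dj]
                           for di in range(size) for dj in range(size)))
        out.append(row)
    return out

# Toy 4x4 "image" with a vertical edge, and a 3x3 vertical-edge kernel.
image = [[0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1],
         [0, 0, 1, 1]]
kernel = [[-1, 0, 1],
          [-1, 0, 1],
          [-1, 0, 1]]

fmap = relu(conv2d(image, kernel))   # 2x2 feature map, edge responses
pooled = max_pool(fmap)              # 1x1 pooled summary
```

Stacking several such convolution/activation/pooling stages, then flattening into a classifier head, yields the familiar CNN pattern the abstract refers to.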
Language
- English
Degree
- Master of Engineering
Program
- Electrical and Computer Engineering
Granting Institution
- Ryerson University
LAC Thesis Type
- MRP