Please use this identifier to cite or link to this item:
http://arks.princeton.edu/ark:/88435/dsp01wh246v763
Title: An Interactive American Sign Language Dictionary: Using Neural Networks to Evaluate ASL
Authors: O'Neill, Meaghan
Advisors: Fish, Robert S.
Department: Computer Science
Certificate Program: Robotics & Intelligent Systems Program
Class Year: 2017
Abstract: American Sign Language (ASL) is the language of the Deaf and hard-of-hearing in the United States, and those who wish to communicate with this community need to learn ASL. In lieu of a real teacher, there is a wealth of online resources one can consult when learning sign language -- module-based learning platforms, video dictionaries, YouTube tutorials -- but nearly all of them are references in the English-to-ASL direction. This project is an attempt to tackle the much larger, much grander concept of an ASL-to-English video dictionary: one that would accept videos of isolated ASL signs and output their meaning in English. My thesis takes on a small chunk of this larger project -- it trains and tests a very simple Convolutional Neural Network on frames of 10 different ASL signs that undergo little preprocessing. As neural networks have not yet been applied, in their pure form and without the help of a Microsoft Kinect or LeapMotion Controller, to American Sign Language data, this project is the first of its kind.
URI: http://arks.princeton.edu/ark:/88435/dsp01wh246v763
Type of Material: Princeton University Senior Theses
Language: en_US
Appears in Collections: Computer Science, 1988-2020
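The thesis itself is not reproduced in this record, but the pipeline the abstract describes -- raw, lightly preprocessed video frames fed through a very simple CNN -- is built from a standard convolution/ReLU/pooling block. As a rough illustration only (the frame size, kernel size, and function names below are assumptions, not details taken from the thesis), one such block can be sketched in plain NumPy:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation of a grayscale frame with a single filter."""
    h, w = image.shape
    kh, kw = kernel.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Elementwise rectified linear activation."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling; trims edges that don't divide evenly."""
    h, w = x.shape[0] - x.shape[0] % size, x.shape[1] - x.shape[1] % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))

# One hypothetical frame through one conv block: 32x32 in, 3x3 filter,
# 2x2 pooling. In a real CNN the kernel weights would be learned.
frame = np.random.rand(32, 32)        # stand-in for a preprocessed sign frame
kernel = np.random.randn(3, 3) * 0.1  # one filter, randomly initialized here
features = max_pool(relu(conv2d(frame, kernel)))
print(features.shape)  # (15, 15)
```

A full classifier would stack a few such blocks, flatten the resulting feature maps, and end with a softmax layer over the 10 sign classes.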
Files in This Item:
| File | Size | Format | |
|---|---|---|---|
| M-ONEILL-THESIS-FINAL.pdf | 639.41 kB | Adobe PDF | Request a copy |
Items in Dataspace are protected by copyright, with all rights reserved, unless otherwise indicated.