32 million children worldwide suffer from disabling hearing loss, and 97% of the world's population growth comes from developing countries. I created this device to teach children sign language in developing countries or anywhere without access to computers.

Normally, when we think of sophisticated computer vision systems, we assume sophisticated hardware to run them on. At this 48-hour ACM hackathon, I built a sign language reading device that uses less than 32 KB of memory.

The Pixy camera sensor launched on Kickstarter in 2013. Developed by Carnegie Mellon and Charmed Labs, it runs a color feature detection algorithm on its own dedicated processor, which lets an Arduino track objects at 50 frames per second. To complement the vision system, I also prototyped several gloves with color codes on them.

By the end of the competition, I had programmed a device that can identify the letters {A, B, C, D, E, F, I, L, V, W} and the gestures {"hello", "J"} on a vision system that cost less than $100 to build.
Built With
- arduino
- pixy