Helping people who are visually or hearing impaired have better, safer interactions.

What it does

The sensor beeps when the user comes too close to an object or to a hot beverage or food. The sign language recognition system translates sign language from a hearing-impaired individual into English for a caregiver. The glasses capture pictures of the surroundings and convert them into speech for a visually impaired user.
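The proximity alert described above boils down to a threshold check on the sensor reading. A minimal sketch of that logic is below; the threshold values and the separate hot-item mode are illustrative assumptions, not the project's actual calibration.

```python
# Illustrative sketch of the proximity alert: beep when a measured
# distance falls below a threshold. Both thresholds are assumed
# values, not the project's real calibration.

OBSTACLE_THRESHOLD_CM = 50   # assumed: warn when any object is this close
HOT_ITEM_THRESHOLD_CM = 15   # assumed: tighter threshold for hot food/drink

def should_beep(distance_cm: float, is_hot_item: bool = False) -> bool:
    """Return True when the sensor reading warrants an audible alert."""
    threshold = HOT_ITEM_THRESHOLD_CM if is_hot_item else OBSTACLE_THRESHOLD_CM
    return distance_cm < threshold
```

In a real build, `should_beep` would be called in the sensor's polling loop and drive the buzzer pin.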

How we built it

We used Microsoft Azure's Computer Vision API, OpenCV, scikit-learn, NumPy, and Django with the REST Framework to build the technology.
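Before a camera frame reaches the classifier, it needs to be turned into a fixed-length feature vector. The sketch below shows one plausible NumPy-only preprocessing step (grayscale, downsample, flatten); the target size and the exact pipeline the team used are assumptions for illustration.

```python
import numpy as np

def preprocess_frame(frame: np.ndarray, size: int = 32) -> np.ndarray:
    """Turn an RGB frame into a flat grayscale feature vector.

    Assumed pipeline (illustrative): grayscale -> nearest-neighbour
    downsample to size x size -> scale to [0, 1] -> flatten.
    """
    # Standard BT.601 luminance weights for RGB -> grayscale
    gray = frame @ np.array([0.299, 0.587, 0.114])
    h, w = gray.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    small = gray[np.ix_(rows, cols)]   # nearest-neighbour downsample
    return (small / 255.0).ravel()
```

The same effect could be achieved with `cv2.cvtColor` and `cv2.resize` from OpenCV; this version avoids the extra dependency for the sketch.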

Challenges we ran into

Training the system to reliably distinguish between the different signs.
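The recognition step above can be sketched with scikit-learn as a nearest-neighbour classifier over flattened hand-image features. The two-sign synthetic data, the labels, and the feature size here are all made up for illustration; the real system would train on labelled camera frames.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for flattened hand-image feature vectors of two
# signs; real training data would come from labelled camera frames.
sign_a = rng.normal(loc=0.2, scale=0.05, size=(20, 16))
sign_b = rng.normal(loc=0.8, scale=0.05, size=(20, 16))

X = np.vstack([sign_a, sign_b])
y = ["hello"] * 20 + ["thanks"] * 20

# k-nearest-neighbours: label a new frame by its closest training examples
clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# A new frame near the second cluster should be labelled accordingly
prediction = clf.predict(np.full((1, 16), 0.8))[0]
```

The challenge in practice is that real signs overlap far more than these clean synthetic clusters, which is why getting consistent recognition was hard.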

Accomplishments that we're proud of

Making a glove with a sensor that helps users navigate their path, recognizing sign language, and converting images of the surroundings to speech.

What we learned

Different technologies such as Azure's Computer Vision API and OpenCV.

What's next for Spectrum Vision

Hoping to gain more funding to increase the scale of the project.
