Inspiration

A decade ago, my grandfather had glaucoma and couldn't watch the news in his later years, so we connected the television audio to his headphones.

What it does

Visually I am paired is written in Python in PictoBlox. After detecting an object with AI, it calls ChatGPT for more information about that object.
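
The detect-then-ask flow can be sketched as below. This is a minimal illustration, not the actual PictoBlox program: `detect_objects`, `build_query`, and the `ask_llm` callback are hypothetical stand-ins for the PictoBlox AI blocks and the ChatGPT call.

```python
def detect_objects(frame):
    """Placeholder for the AI object detector; returns label strings.

    A real build would call PictoBlox's object-detection blocks here.
    """
    return ["cup", "book"]


def build_query(labels):
    """Turn detected labels into a question for the language model."""
    return ("Describe these objects for a visually impaired user: "
            + ", ".join(labels))


def describe_scene(frame, ask_llm):
    """Detect objects in the frame, then ask the model for more information."""
    labels = detect_objects(frame)
    return ask_llm(build_query(labels))


# Example with a stubbed model call (a real build would call ChatGPT):
reply = describe_scene(None, lambda q: "(model answer to: " + q + ")")
```

Keeping detection and the language-model call in separate functions mirrors the project's structure: detection happens first, and ChatGPT is only queried for extra context about what was found.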

How we built it

Using block coding in PictoBlox by STEMpedia.

Challenges we ran into

Sourcing hardware for the magnetic slate was a challenge. Instead, we will be using haptic gloves in XR.

Accomplishments that we're proud of

Simple coding

What we learned

AI and ML in Python

What's next for Visually I am paired

Haptic gloves in mixed reality

Built With

python, pictoblox


Updates


The idea is to give computer vision to the visually impaired. In this upgrade, the AI performs the following functions:

  1. It reads what is in front of the camera
  2. It translates and answers a query in the local language
  3. It recognizes the facial expression of a person other than the viewer

The output of every function can be spoken aloud via text-to-speech, as an audio display.
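
The three functions above, all routed to one audio channel, can be sketched as follows. Every handler here is a placeholder; a real build would wire them to PictoBlox's camera, translation, and face-expression blocks, and `speak` would call an actual text-to-speech engine.

```python
def read_scene(frame):
    """Function 1: describe what is in front of the camera (stubbed)."""
    return "a person holding a cup"


def answer_in_local_language(query, lang="local"):
    """Function 2: translate and answer a query in the local language (stubbed)."""
    return "[" + lang + "] answer to: " + query


def recognize_expression(frame):
    """Function 3: recognize the expression of the person in view (stubbed)."""
    return "smiling"


def speak(text, tts=print):
    """Route any function's result to text-to-speech (stubbed as print)."""
    tts(text)


# Each function's output flows through the same audio display:
for line in (read_scene(None),
             answer_in_local_language("what is near me?"),
             recognize_expression(None)):
    speak(line)
```

The design point is that the three features share one output path: whatever a function returns is handed to the same `speak` routine, so the user always hears the result.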

https://youtube.com/shorts/yT0R2_tI48c?si=A2-iVroyOBIP_jrk
