This project is inspired by a video from the YouTube channel 'Stuff Made Here'.

What it does

Argus helps blind people navigate their surroundings by analyzing the phone camera's view and providing haptic feedback.

How we built it

I first created a prototype React web app that can activate the camera on both desktop and mobile. After that, I extracted each frame, converted it into an image, and fed it to the Azure Computer Vision API.
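A minimal sketch of the two pieces described above: starting the camera with `getUserMedia`, and building the Computer Vision "analyze" request. The function names, the `v3.2` API version, and the credential placeholders are my assumptions, not the project's actual code.

```javascript
// Hypothetical helper: build the request metadata for Azure Computer Vision's
// image-analysis endpoint. Endpoint and key are placeholders, not real values.
function buildAnalyzeRequest(endpoint, key) {
  return {
    url: `${endpoint}/vision/v3.2/analyze?visualFeatures=Description,Objects`,
    headers: {
      'Ocp-Apim-Subscription-Key': key,
      // Raw image bytes are sent in the request body.
      'Content-Type': 'application/octet-stream',
    },
  };
}

// Browser-only sketch: open the rear-facing camera and attach the
// MediaStream to a <video> element so frames can be captured later.
async function startCamera(videoEl) {
  const stream = await navigator.mediaDevices.getUserMedia({
    video: { facingMode: 'environment' },
    audio: false,
  });
  videoEl.srcObject = stream;
  await videoEl.play();
}
```

The captured frame bytes would then be POSTed to `url` with those headers.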

Challenges we ran into

This was my first time working with video and cameras in a web application, and I had not expected it to be so different from working with still images, from styling the video element to converting frames into a format the Azure Computer Vision API can understand.
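The conversion step can be sketched as drawing the current video frame onto a canvas and exporting it as a JPEG Blob, the binary form the API accepts. This is a sketch under my own assumptions (the `1280` max width and `0.8` JPEG quality are arbitrary choices, not the project's):

```javascript
// Hypothetical helper: cap frame width while preserving aspect ratio,
// to keep the uploaded payload small.
function scaleToFit(width, height, maxWidth) {
  if (width <= maxWidth) return { width, height };
  const ratio = maxWidth / width;
  return { width: maxWidth, height: Math.round(height * ratio) };
}

// Browser-only sketch: draw the current <video> frame onto a canvas
// and resolve with a JPEG Blob suitable for an octet-stream upload.
function captureFrame(videoEl, maxWidth = 1280) {
  const { width, height } = scaleToFit(videoEl.videoWidth, videoEl.videoHeight, maxWidth);
  const canvas = document.createElement('canvas');
  canvas.width = width;
  canvas.height = height;
  canvas.getContext('2d').drawImage(videoEl, 0, 0, width, height);
  return new Promise((resolve) => canvas.toBlob(resolve, 'image/jpeg', 0.8));
}
```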

Accomplishments that we're proud of

Progressive Web Apps are relatively new, and I am happy to have leveraged their capabilities to record video and generate haptic feedback on a mobile phone through a web application.
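On the web, haptic feedback comes from the Vibration API (`navigator.vibrate`). A minimal sketch, assuming a hypothetical mapping from detection confidence to pulse length (the project's actual mapping is not described):

```javascript
// Hypothetical mapping: turn a detection confidence in [0, 1] into a
// vibrate/pause/vibrate pattern (milliseconds); higher confidence,
// longer pulses.
function vibrationPattern(confidence) {
  const clamped = Math.min(Math.max(confidence, 0), 1);
  const pulse = Math.round(100 + 300 * clamped);
  return [pulse, 100, pulse];
}

// Browser-only: feature-detect before vibrating, since navigator.vibrate
// is unsupported in some browsers (notably iOS Safari).
function buzz(confidence) {
  if (typeof navigator !== 'undefined' && navigator.vibrate) {
    navigator.vibrate(vibrationPattern(confidence));
  }
}
```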

What we learned

The Azure AI ecosystem, and how to work with video streams and canvases in the browser.

What's next for Argus

The application could interact with the user to better understand their needs.
