Inspiration

Throughout my childhood, I met many people who were visually impaired or blind, and I began to notice the number of obstacles they had to overcome in everyday situations. I believe this app will specifically aid them and ease their daily lives.

What it does

The app has four core functions: object distance detection, surrounding detection, text recognition, and sign language translation.

How we built it

The app is built entirely in Swift using Xcode. To gain camera access, we created an AVCaptureVideoPreviewLayer and added it to the main view. To detect the distance between the user's current location and a road sign, we used ARKit to find the x, y, and z coordinates of each point and calculated the distance between them.
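The distance step described above can be sketched as follows. This is a minimal illustration, not the project's actual code: once ARKit provides two world-space positions (as SCNVector3 values), the distance between them is the Euclidean norm of their difference.

```swift
import SceneKit

// Given two points in ARKit's world space (e.g. the camera's current
// position and a point placed on a road sign), compute the straight-line
// distance between them: sqrt(dx^2 + dy^2 + dz^2).
func distance(from start: SCNVector3, to end: SCNVector3) -> Float {
    let dx = end.x - start.x
    let dy = end.y - start.y
    let dz = end.z - start.z
    return sqrt(dx * dx + dy * dy + dz * dz)
}
```

In practice the two endpoints would come from ARKit hit tests or anchor transforms; the function itself is plain vector math, which is why it works on any pair of x/y/z points.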

Challenges we ran into

One challenge we ran into was plotting the points accurately in ARKit.

Accomplishments that we're proud of

The main accomplishment we are proud of is making the distance ruler work, as it caused most of the trouble.

What we learned

Through our experience at the hackathon, we were able to learn the fundamentals of ARKit and other Apple libraries.

What's next for BlindHelp

In the next version of BlindHelp, we will merge the separate camera screens into a single screen that serves all of the functionality, and we will build a better dataset for sign language recognition.

Built With

  • arkit
  • coreml
  • scenekit
  • swift
  • vision