Problem Statement

Healthcare & Accessibility Tech - As technology improves quality of life for the masses, how can we ensure these advancements are equitable and inclusive?

Inspiration

People who are visually impaired may have difficulty navigating places they are not familiar with. Tackling the healthcare and accessibility problem statement, we set out to create an app that alerts the visually impaired to objects around them so they can avoid them more easily. Through our app, we strive to give them greater independence in their daily lives. This problem is pertinent because there are almost no apps that help the blind navigate in Singapore.

What it does

The app uses the phone's cameras to measure the user's distance from nearby objects and vibrates to warn them when an object is close. The phone is worn around the torso so that it can detect both the ground and overhead obstacles.

How we built it

We made use of the iPhone's built-in depth-sensing hardware to detect nearby objects. The app works best on iPhones with dual cameras, which can record depth information; it uses this depth data to judge the distance between the user and surrounding objects. Using the iPhone's haptics, we make the phone vibrate to warn the user.
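The core logic can be sketched independently of the capture API: given per-pixel distances (in metres) pulled from a depth frame, find the nearest object and decide whether to fire a haptic warning. This is a minimal sketch; the function names and the 1 m threshold are our illustrative assumptions, not the app's exact code.

```swift
import Foundation

/// Nearest valid distance (metres) in a flattened depth map.
/// Invalid pixels (NaN or non-positive values) are skipped, since real
/// depth buffers often contain holes where no depth could be measured.
func nearestDistance(in depthMap: [Float]) -> Float? {
    depthMap.filter { $0.isFinite && $0 > 0 }.min()
}

/// Decide whether to warn the user: true when the nearest object is
/// closer than the warning threshold (1 m in the current app).
func shouldWarn(depthMap: [Float], threshold: Float = 1.0) -> Bool {
    guard let nearest = nearestDistance(in: depthMap) else { return false }
    return nearest < threshold
}
```

On a real device, `shouldWarn` would run on each depth frame from the camera, and a `true` result would trigger the iPhone's haptics to vibrate the phone.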

Challenges we ran into

During ideation, we considered how we could build this product with what we had on hand. It was not easy, as we did not have dedicated cameras or vibration motors. We realised we could use our phones, which already have both, but none of us were familiar with iOS development. Learning Swift, the language used for iOS apps, was one of our main challenges during this project.

Accomplishments that we're proud of

We are proud of creating a solution that meets the needs of the visually impaired, namely their desire to be independent and to get around without drawing attention. The app is also easy to use: the user simply downloads it onto their iPhone and does not need a separate device.

What we learned

We learned how to use Swift to build an iOS app.

What's next for iSight

iSight can be further improved with the LiDAR scanner in iPhone Pro models, which produces a depth map that is much more accurate for judging distances. Furthermore, due to time constraints, the app currently does not indicate how far the user is from an object, only that there is an object within 1 m in front of them. With more time, we believe iSight could use varying vibration intensities to indicate distance.
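The graded warning described above could be a simple mapping from distance to haptic intensity. Here is a minimal sketch, assuming a linear ramp from full intensity at contact down to zero at the 1 m range; the function name, the linear falloff, and the range are hypothetical design choices for illustration.

```swift
import Foundation

/// Map a distance (metres) to a haptic intensity in [0, 1]:
/// 1.0 when the object is at the user, fading linearly to 0.0
/// at maxRange and beyond. Linear falloff is an illustrative choice;
/// a logarithmic or stepped ramp could feel more natural in practice.
func hapticIntensity(forDistance distance: Float, maxRange: Float = 1.0) -> Float {
    guard distance < maxRange else { return 0 }   // out of range: no vibration
    let clamped = max(distance, 0)                // treat negative readings as contact
    return 1 - clamped / maxRange                 // linear falloff with distance
}
```

The resulting value could then drive the intensity parameter of the iPhone's haptic engine so that the vibration grows stronger as the user approaches an obstacle.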

Built With

Swift
