Inspiration
For the visually impaired people across the world, there is only so much we can do to help. For the blind, one remarkable but expensive aid is the guide dog. These dogs act as the eyes of their owners; unfortunately, they cost so much that only about 2% of blind people have access to one. With this app, we aim to give back as much sight to the blind as we can.
What it does
i9 is an iOS app that aids the visually impaired. It uses the LiDAR sensor on newer iPhones to calculate the distance between the device and the object in front of it. If the user gets too close, the app warns them with a haptic and an audio response. Both responses grow in strength as the user gets closer, giving them a way to perceive how far away an object is.
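One way to implement the scaling described above can be sketched as a pure function that maps the measured distance onto a 0–1 feedback intensity. This is an illustrative sketch, not the app's actual code: the function name and the threshold values are assumptions.

```swift
// Sketch: map a measured distance (in meters) to a feedback intensity in 0...1.
// `warningDistance` (where feedback begins) and `minimumDistance` (where it
// reaches full strength) are illustrative values, not the app's real tuning.
func feedbackIntensity(distance: Float,
                       warningDistance: Float = 2.0,
                       minimumDistance: Float = 0.3) -> Float {
    guard distance < warningDistance else { return 0 }  // far enough away: no feedback
    guard distance > minimumDistance else { return 1 }  // very close: maximum feedback
    // Linear ramp: intensity grows as the user approaches the object.
    return (warningDistance - distance) / (warningDistance - minimumDistance)
}
```

The resulting value could drive both the haptic strength and the audio volume, so the two cues stay in sync as the user moves.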
How we built it
We built i9 using an iterative development process: making small changes to the code and testing the app directly on an iPhone. Once we felt confident that a feature consistently provided a balanced and accurate response, we moved on to the next one.
The app was built with Xcode and a developer-enabled iPhone.
Challenges we ran into
- Extracting the distance data from the LiDAR sensor (by default it provides raw image data)
- Achieving consistent responses (audio and haptic feedback at the correct distances, at a consistent rate, to provide reliable feedback)
- Learning Swift and Xcode; part of our team had no prior experience with the platform
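The first challenge above comes from the fact that ARKit exposes LiDAR data as a per-pixel depth image rather than a single distance. One way to get a usable distance is to sample the center of that depth map; the sketch below shows the general approach (the function name is our own, and this assumes an ARSession running with scene-depth frame semantics on a LiDAR-equipped device):

```swift
import ARKit

// Sketch: read the depth value (in meters) at the center of the LiDAR
// depth map from the current ARFrame. Returns nil if no depth data is
// available (e.g. on devices without a LiDAR sensor).
func centerDistance(from frame: ARFrame) -> Float? {
    guard let depthMap = frame.sceneDepth?.depthMap else { return nil }

    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return nil }
    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)

    // The depth map stores one 32-bit float distance per pixel.
    let centerRow = base.advanced(by: (height / 2) * rowBytes)
        .assumingMemoryBound(to: Float32.self)
    return centerRow[width / 2]
}
```

In practice one might average a small window of pixels around the center instead of reading a single pixel, to smooth out sensor noise.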
Accomplishments that we're proud of
- Created a fully rendered mobile application that uses haptics and audio as responses to input
- The app is not just functional but also useful; we tested it in a setting similar to what we expect for real-world usage
What we learned
We learned how to test a mobile application that integrates a fully functional camera and LiDAR sensor. Two of our team members learned Swift for the first time during this project.
What's next for i9
Voice command integration, an App Store launch, potential object recognition via AI integration, and Android app development. We may also add GPS integration to detect hazard areas (roads, rivers, etc.) and landmarks.