Inspiration

The inspiration for this project came from the group's passion for building health-related apps. While blindness is not something we can heal, it is something we can combat with technology.

What it does

This app gives blind individuals the ability to live life with the same ease as any other person. Using beacon software, we provide users with navigational information in heavily populated areas such as subways or museums. The app uses a simple UI built around distinct numeric swipe and tap sequences that launch specific features, and when the app first opens, the interface is explained in its entirety verbally. One of the most useful features is a camera mode that lets users snap a picture and instantly receive verbal cues describing what is in their environment. The beacon navigation is what we primarily focused on, but as a fail-safe the Lyft API was integrated so users can order a car ride in a worst-case scenario.
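As a rough illustration of the tap-driven interface, the Kotlin sketch below counts taps on a full-screen view and speaks back which feature was launched. The tap counts, feature names, and the 600 ms window are assumptions for the example, not details pulled from the actual app.

```kotlin
// Minimal sketch of a tap-sequence interface, assuming the whole screen is one
// touch target and the tap count (1, 2, 3, ...) selects a feature. Feature names
// and the 600 ms decision window are illustrative only.
import android.os.Handler
import android.os.Looper
import android.speech.tts.TextToSpeech
import android.view.MotionEvent
import android.view.View

class TapSequenceListener(
    private val tts: TextToSpeech,
    private val onFeatureSelected: (String) -> Unit
) : View.OnTouchListener {

    private val handler = Handler(Looper.getMainLooper())
    private var tapCount = 0

    override fun onTouch(v: View, event: MotionEvent): Boolean {
        if (event.action == MotionEvent.ACTION_DOWN) {
            tapCount++
            handler.removeCallbacksAndMessages(null)
            // Wait briefly for more taps before deciding which feature was requested.
            handler.postDelayed({ fireFeature() }, 600)
        }
        return true
    }

    private fun fireFeature() {
        val feature = when (tapCount) {
            1 -> "navigation"
            2 -> "camera description"
            3 -> "call a Lyft"
            else -> "help"
        }
        tapCount = 0
        // Confirm the selection out loud so the user always knows what was launched.
        tts.speak("Opening $feature", TextToSpeech.QUEUE_FLUSH, null, feature)
        onFeatureSelected(feature)
    }
}
```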

How we built it

Challenges we ran into

We ran into several challenges during development. One was attempting to use the Alexa Voice Service API on Android. We wanted to create a skill to be used within the app; however, documentation was sparse and we had minimal time to bring it to fruition. Rather than eliminating the feature altogether, we collaborated on a fully functional voice command system that lets users call a Lyft to their location through the phone itself rather than through Alexa.
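A minimal sketch of how such a voice command could be wired up on Android is shown below. It assumes the platform's built-in SpeechRecognizer and Lyft's public lyft:// deep-link scheme; the trigger-phrase matching is purely illustrative and not the exact logic we shipped.

```kotlin
// Sketch of a voice-command fallback: listen for a phrase, then hand off to the
// Lyft app via its deep link. Requires the RECORD_AUDIO permission.
import android.content.Context
import android.content.Intent
import android.net.Uri
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

class VoiceLyftCommand(private val context: Context) {

    private val recognizer = SpeechRecognizer.createSpeechRecognizer(context)

    fun startListening() {
        recognizer.setRecognitionListener(object : RecognitionListener {
            override fun onResults(results: Bundle) {
                val spoken = results
                    .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                    ?.firstOrNull() ?: return
                // Example trigger phrase; real matching would be more forgiving.
                if (spoken.contains("lyft", ignoreCase = true)) requestLyft()
            }
            // Remaining callbacks left empty for brevity.
            override fun onReadyForSpeech(params: Bundle?) {}
            override fun onBeginningOfSpeech() {}
            override fun onRmsChanged(rmsdB: Float) {}
            override fun onBufferReceived(buffer: ByteArray?) {}
            override fun onEndOfSpeech() {}
            override fun onError(error: Int) {}
            override fun onPartialResults(partialResults: Bundle?) {}
            override fun onEvent(eventType: Int, params: Bundle?) {}
        })
        recognizer.startListening(
            Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).putExtra(
                RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
            )
        )
    }

    private fun requestLyft() {
        // Hands the ride request off to the Lyft app through its deep link.
        val intent = Intent(Intent.ACTION_VIEW, Uri.parse("lyft://ridetype?id=lyft"))
            .addFlags(Intent.FLAG_ACTIVITY_NEW_TASK)
        context.startActivity(intent)
    }
}
```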

Another issue we encountered was with the beacons. In a large area like a realistic public space, such as a subway station, the beacons would be placed far enough apart to be recognized individually. In the confined space we were working in, however, the beacons' detection ranges overlapped, causing the user to receive multiple different directions simultaneously. Rather than using the physical beacons, we leveraged a second mobile application that lets us broadcast virtual beacons from an Android device.
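One simple way to suppress that overlap, sketched below, is to range all visible beacons and speak the prompt only for the nearest one. The sketch assumes the open-source AltBeacon Android Beacon Library and a hypothetical promptFor() mapping from beacon IDs to spoken directions; it is an illustration of the idea rather than the exact code we ran.

```kotlin
// Sketch: range all beacons in view and speak only the nearest one's prompt,
// so overlapping detection ranges don't produce conflicting directions.
// Assumes a recent version of the AltBeacon Android Beacon Library.
import android.content.Context
import android.speech.tts.TextToSpeech
import org.altbeacon.beacon.BeaconManager
import org.altbeacon.beacon.Region

class NearestBeaconGuide(context: Context) {

    private val beaconManager = BeaconManager.getInstanceForApplication(context)
    private val tts = TextToSpeech(context) { /* init status ignored in this sketch */ }
    private var lastSpokenId: String? = null

    fun start() {
        beaconManager.addRangeNotifier { beacons, _ ->
            // Keep only the beacon with the smallest estimated distance.
            val nearest = beacons.minByOrNull { it.distance } ?: return@addRangeNotifier
            val id = nearest.id1.toString()
            if (id != lastSpokenId) {
                lastSpokenId = id
                tts.speak(promptFor(id), TextToSpeech.QUEUE_FLUSH, null, id)
            }
        }
        beaconManager.startRangingBeacons(Region("lantern-all-beacons", null, null, null))
    }

    // Hypothetical mapping from beacon identifiers to spoken navigation prompts.
    private fun promptFor(beaconId: String): String = when (beaconId) {
        "platform-stairs" -> "Ten steps down to the platform ahead."
        "platform-edge" -> "You are close to the platform edge."
        else -> "Continue straight."
    }
}
```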

Accomplishments that we're proud of

As always, we are a team of students who strive to learn something new at every hackathon we attend. We chose to build an ambitious set of applications within a short, concentrated time frame, and the fact that we successfully brought our idea to life is what we are most proud of. We worked around as many of the obstacles in our way as possible. When we found out that Amazon Alexa wouldn't be compatible with our Android app, it was a minor setback to our plan, but we quickly brainstormed a new approach.

Additionally, we were able to develop a fully functional beacon navigation system with built-in voice prompts. We managed to build a UI that is almost entirely nonvisual, using audio as the only interface. Given that our target user is blind, this kind of UI was difficult to design: while we are used to visual cues and the luxury of knowing where to tap buttons on a phone screen, the visually impaired are not. We had to keep this in mind throughout the entire development process, so voice recognition and tap sequences became a primary focus. Reaching outside our own comfort zones to develop an app for a unique user was another challenge we successfully overcame.

What's next for Lantern

With a passion for improving health and creating easier accessibility for those with disabilities, we plan to continue working on this project and building on it. The first thing we want to highlight is how easily adaptable the beacon system is. In this project we focused on navigating subway systems: telling users how many steps lead down to the platform, when they have reached a safe distance from the train, and when the train is approaching. This idea could easily be brought to malls, museums, dorm rooms, and more. Anywhere that poses a concern for the blind could benefit from adapting our beacon system to that location.

The second future project we plan to work on is a smart walking stick that uses sensors and visual recognition to detect and announce what lies ahead, what could potentially be in the user's way, and what their surroundings look like, providing better feedback to ensure the user doesn't get misguided or lose their way.
