Inspiration
For us, moving through the city is an easy task: we cross roads and get anywhere without any issues. For elderly people, however, this can be a challenge. We believe that cities should be equally accessible to everyone, but currently that's not the case. We developed OAsis with that belief in mind.
What it does
OAsis is a smart assistant that helps elderly people avoid the potential risks found in a city, for example barely visible signs and traffic lights. But it doesn't just assist elderly people: it also keeps their relatives informed in case of an emergency or whenever they need help. The device can alert the assistant with a notification containing the wearer's GPS coordinates, and it also offers the option to call them immediately.
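As a rough sketch of the alert flow described above (field names and the device ID are illustrative assumptions, not the actual OAsis schema), the emergency notification the device sends could be assembled like this before being handed to Firebase:

```python
import json
import time

def build_alert_payload(device_id, latitude, longitude):
    """Build an emergency alert message for the assistant app.

    Field names here are hypothetical; the real OAsis payload may differ.
    """
    return {
        "type": "emergency_alert",
        "device_id": device_id,
        "timestamp": int(time.time()),
        # Last known GPS fix of the elderly device.
        "location": {"lat": latitude, "lng": longitude},
        # Lets the assistant app offer an immediate call back to the wearer.
        "allow_call": True,
    }

payload = build_alert_payload("glass-01", 41.3874, 2.1686)
print(json.dumps(payload))
```

The assistant app only needs the coordinates and the call flag to show the wearer on a map and offer a one-tap call.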
How we built it
The project is divided into three components:
- Elderly device: this component takes advantage of the capabilities of the Google Glass hardware. We used the Google Glass's embedded camera together with the image recognition offered by the IBM Watson API.
- Assistant app: this app is built for Android and uses the Firebase API to receive the notifications dispatched by the Google Glass. It also uses the Google Maps API to display the last known position of the elderly device.
- Server: it uses Firebase to relay all notifications between the other two components, and also stores the elderly user's location history in the cloud for emergency purposes.
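The server's bookkeeping described above can be sketched in a few lines. This is a minimal stand-in, assuming a simple per-device list of GPS fixes; the real Firebase database layout is not shown in the write-up, and the dictionary here just plays the role of the cloud store:

```python
import time

# In-memory stand-in for the Firebase database used by the server.
database = {"location_history": {}}

def record_location(db, device_id, latitude, longitude):
    """Append a GPS fix to the device's location history for emergencies."""
    history = db["location_history"].setdefault(device_id, [])
    history.append({"lat": latitude, "lng": longitude, "ts": int(time.time())})
    return history

def last_known_position(db, device_id):
    """Return the most recent fix, which the assistant app can show on a map."""
    history = db["location_history"].get(device_id, [])
    return history[-1] if history else None

record_location(database, "glass-01", 41.3874, 2.1686)
record_location(database, "glass-01", 41.3879, 2.1690)
print(last_known_position(database, "glass-01"))
```

Keeping the full history, rather than only the latest fix, is what lets relatives trace the wearer's path in an emergency.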
Challenges we ran into
This was our first time developing an application for Google Glass, and our first time managing its camera. It was also our first time working with the IBM Watson API for image recognition, which proved difficult to train correctly. We had used Firebase in web apps before, but never in an Android application that receives notifications from the cloud.
Accomplishments that we are proud of
Building a fully functional Google Glass application, controlled by voice input, that successfully interacts with the server and an assistant app.
What we learned
How to build Google Glass apps, and how to make Firebase and Android work together.
What's next for OAsis
Use a faster image recognition system with real-time environment recognition, and increase the number of risks we can recognize, such as road works and traffic signals.