Generally, elderly and visually impaired people walk with someone by their side. But what if nobody is around to help? We were concerned that help may not always be available, so we wanted to make a positive impact on elderly and visually impaired people. Many elderly people also value their independence and do not always want someone around to monitor them, so we designed our app with that independence in mind.
What it does
Our app takes input in the form of camera images, processes them, and gives the user real-time knowledge of their environment using Google Cloud Vision. Our personal assistant relays this information about the surroundings to the user in real time. We also use Azure OCR to read important documents, such as bus tickets, for the user.
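As a rough illustration of the label-detection step, here is a minimal Python sketch of the request body a backend might send to Google Cloud Vision's `images:annotate` REST endpoint. The endpoint URL and feature names come from the public Vision API; the helper function name and the one-byte placeholder image are our own illustration, not the app's actual code.

```python
import base64

# Public Vision REST endpoint (a real API key or OAuth token would be
# attached when the request is actually sent).
VISION_ENDPOINT = "https://vision.googleapis.com/v1/images:annotate"

def build_label_request(image_bytes: bytes, max_results: int = 10) -> dict:
    """Build the JSON body for a single label-detection call.

    The Vision API expects the image content base64-encoded and a list of
    requested features; LABEL_DETECTION returns descriptions of objects
    seen in the image, which the app can read aloud to the user.
    """
    return {
        "requests": [
            {
                "image": {
                    "content": base64.b64encode(image_bytes).decode("ascii")
                },
                "features": [
                    {"type": "LABEL_DETECTION", "maxResults": max_results}
                ],
            }
        ]
    }

# Example with a fake one-byte "image" payload.
body = build_label_request(b"\x00", max_results=5)
```

In the real app, this body would be POSTed to `VISION_ENDPOINT` and the returned label annotations spoken back to the user.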
How we built it
Our native app was built with Flutter. The app communicates with Firebase, using the Cloud Firestore real-time database. We used Microsoft Azure OCR to read documents and Google Cloud Vision to give the user feedback about their surroundings.
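For the document-reading side, the sketch below shows how a request to the Azure Computer Vision Read (OCR) API might be assembled. The `/vision/v3.2/read/analyze` path and the `Ocp-Apim-Subscription-Key` header match the public v3.2 REST API; the endpoint host, key, image URL, and helper name are placeholders for illustration only.

```python
def build_read_request(endpoint: str, subscription_key: str, image_url: str):
    """Build (url, headers, body) for Azure's asynchronous Read call.

    The Read API responds with an Operation-Location header that the
    client then polls to retrieve the recognized text (e.g. the contents
    of a bus ticket).
    """
    url = endpoint.rstrip("/") + "/vision/v3.2/read/analyze"
    headers = {
        "Ocp-Apim-Subscription-Key": subscription_key,
        "Content-Type": "application/json",
    }
    body = {"url": image_url}  # image can also be sent as raw bytes
    return url, headers, body

# Example with placeholder endpoint, key, and document image URL.
url, headers, body = build_read_request(
    "https://example.cognitiveservices.azure.com",
    "PLACEHOLDER_KEY",
    "https://example.com/ticket.jpg",
)
```

The recognized text returned by the poll step could then be written to Firestore or spoken to the user.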
Challenges we ran into
We had never used Flutter before, so developing this app from scratch was a challenge. Getting Azure OCR to communicate with our Firestore database was another.
Accomplishments that we're proud of
We are proud of getting our app to work and building our Azure OCR functionality.
What we learned
We learned how to build a native app using Flutter and became accustomed to various Google Cloud services and Azure.
What's next for Stepify 2.0
We will continue building out our product's functionality and integrating the remaining components.