Inspiration
Elderly and visually impaired people often walk with someone by their side, but what if nobody is around to help? Concerned that help may not always be available, we wanted to make a positive impact on the elderly. Many older adults also value their independence and don't always want someone around monitoring them, so we designed with that independence in mind.
What it does
Our web app takes image input from the Raspberry Pi camera, processes it, and uses the ultrasonic sensor to calculate the distance between the user and nearby objects, which also drives collision warnings. If the user suffers a fall, an emergency call is placed to 911 and to another emergency contact of the user's choice.
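The distance and collision-warning logic can be sketched as below, assuming an HC-SR04-style ultrasonic sensor (the exact sensor model and the warning threshold are our assumptions, not details from the write-up):

```python
# Sketch of the distance/collision logic for a time-of-flight ultrasonic
# sensor. The threshold value is illustrative, not a tuned parameter.

SPEED_OF_SOUND_CM_S = 34300   # speed of sound in air at roughly 20 °C
COLLISION_THRESHOLD_CM = 100  # warn when an object is within 1 m (assumed)

def echo_to_distance_cm(pulse_duration_s: float) -> float:
    """Convert an ultrasonic echo pulse duration to distance in cm.

    The pulse travels to the object and back, so divide by 2.
    """
    return pulse_duration_s * SPEED_OF_SOUND_CM_S / 2

def collision_warning(pulse_duration_s: float) -> bool:
    """Return True when the object is close enough to warn the user."""
    return echo_to_distance_cm(pulse_duration_s) < COLLISION_THRESHOLD_CM
```

On the Pi itself, `pulse_duration_s` would come from timing the sensor's echo pin, e.g. with the RPi.GPIO library.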
How we built it
We started by setting up the Raspberry Pi along with its camera and sensors. Multiple Python scripts gather input from the Pi camera, the ultrasonic sensors, and the accelerometer. Each camera frame is passed to the Clarifai API, which generates tags describing the content of the image, and we refine those results in Python. The readings from the ultrasonic sensors and accelerometer are processed by our Python scripts, and the computed results are returned to the user on the web app.
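The refinement step over the Clarifai tags might look like the sketch below. The input shape (a list of `(name, confidence)` pairs) and the confidence cutoff are simplifying assumptions; the real API returns richer JSON:

```python
# Illustrative refinement: keep only high-confidence tags and order them
# best-first. The 0.9 cutoff is an assumed value.

MIN_CONFIDENCE = 0.9

def refine_tags(concepts, min_confidence=MIN_CONFIDENCE):
    """Filter (name, confidence) pairs; return tag names, best first."""
    kept = [(name, conf) for name, conf in concepts if conf >= min_confidence]
    kept.sort(key=lambda pair: pair[1], reverse=True)
    return [name for name, _ in kept]
```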
Challenges we ran into
Our accelerometer was analog, while the Pi only has digital pins, which sent us on two trips to Micro Center to pick up hardware for the project. It also took us a while to get the Clarifai API up and running, to work out the math behind fall detection, and to integrate all the scripts into one pipeline. Interfacing with the accelerometer and ultrasonic sensor was a challenge because of the proprietary designs implemented by their manufacturers.
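A common threshold-based approach to fall detection, which the math above could resemble, looks for a brief free-fall dip in total acceleration followed by an impact spike. This is a minimal sketch with illustrative, uncalibrated thresholds, not our exact algorithm:

```python
import math

# Thresholds in g; typical ballpark figures, chosen here for illustration.
FREE_FALL_G = 0.4  # magnitude well below 1 g suggests free fall
IMPACT_G = 2.5     # magnitude well above 1 g suggests impact

def magnitude_g(ax: float, ay: float, az: float) -> float:
    """Total acceleration magnitude in g across the three axes."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def detect_fall(samples):
    """Return True if a free-fall dip is followed by an impact spike.

    `samples` is an iterable of (ax, ay, az) readings in g.
    """
    seen_free_fall = False
    for ax, ay, az in samples:
        m = magnitude_g(ax, ay, az)
        if m < FREE_FALL_G:
            seen_free_fall = True
        elif seen_free_fall and m > IMPACT_G:
            return True
    return False
```

When a fall is detected, the script would trigger the emergency call described above.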
Accomplishments that we're proud of
We're proud of the progress we made over the weekend to build the first prototype of our product entirely from scratch. We believe that, moving forward, this could grow into a much more sophisticated system that truly makes life easier for our target audience.
What we learned
We learned how to use a Raspberry Pi to the point where we can now connect virtually any sensor to it. We learned how to use and implement the Clarifai, Adafruit, and Twilio APIs, and became more comfortable with Google Cloud and Firebase. Most importantly, our team learned better teamwork and cooperation.
What's next for Stepify
Our passion for helping the elderly and visually impaired will always remain, and we will continue developing this project into a successful product that changes society for the better.