Side-view of FindR
Front-view of FindR
Web Interface displaying points from live demo
Displaying the limb detection/detected image (Base64) from live demo (Featuring: Aravind)
Hackathon Submission - 3rd Place: Hack the Valley II
Inspiration
FindR was inspired by the everyday heroes among first responders: firefighters, rescuers, and EMS crews. Every day they must navigate smoke-filled, dark, cramped, and above all dangerous areas of structures, trying to rescue people and save lives.
What it does
FindR is an autonomous self-driving robot designed to help people in distress during emergencies. It aims to assist firefighters through advanced person recognition. With robust self-driving capabilities, the robot can navigate corridors and search rooms for people. When someone is found, FindR sends geolocation coordinates along with pictures to a web interface so emergency crews can analyze the situation and respond accordingly. Additionally, FindR records its route, allowing firefighters to follow it through heavy smoke or darkness.
FindR does not stop there: it provides emergency crews with a live video stream and a Myo gesture-control armband to control its movements. With slight hand movements, emergency crews can precisely steer the robot through highly congested and dangerous areas.
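As a rough sketch of the reporting step described above (the function name and field names below are our assumptions, not necessarily the exact schema used in the repos): when a person is detected, the robot can bundle its coordinates with a Base64-encoded snapshot into a payload that a database-backed web interface can display.

```python
import base64
import json
import time


def build_detection_report(lat, lng, image_bytes):
    """Bundle a detection into a JSON-serializable payload.

    The field names here are illustrative; the actual schema used by
    the FINDR web interface may differ.
    """
    return {
        "timestamp": time.time(),
        "location": {"lat": lat, "lng": lng},
        # Base64 lets the snapshot travel as plain text through a
        # JSON database such as Firebase.
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    }


if __name__ == "__main__":
    report = build_detection_report(43.6532, -79.3832, b"\x89PNG...")
    print(json.dumps(report["location"]))
```

The payload is plain JSON, so the same structure works whether it is pushed over Firebase, HTTP, or any other transport.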
Challenges we ran into
There were numerous challenges that we faced and overcame to arrive at the final product we demoed. Fine-tuning the autonomous driving capability was a particular hardship: the hardware we were working with was sensitive, mainly the drive chassis, and the imbalance of the wheels made it difficult to find the ideal values for certain maneuvers. Myo integration was another crucial challenge, particularly establishing communication between the armband and the robot.
Aside from the technical challenges, this was the first hardware-based hackathon project for all of us. With team members coming from different backgrounds and skill sets, working well as a team and integrating the separate parts into the final product was central to the success of the project.
Accomplishments that we're proud of
Definitely all of it, especially the stressful surprise debugging session 30 minutes before judging. Everything had been working as expected and the robot was set aside, only for it to suddenly go haywire; we thought we would have to skip the autonomous portion, which would have severely undersold the true abilities of our project. Thankfully, we were able to tweak it and carry on.
What's next for FindR
We believe that FindR has a lot of potential in the real world. Continuing to develop our self-driving algorithm to adapt to many different situations, and perhaps building a life-size model, would definitely be a start. The hardware limitations we faced prompted us to consider wheel encoders, better wheels, and improved drive shafts to assist in the development process. We would also like to apply swarm-robotics principles, coordinating multiple robots that communicate with one another to carry out these tasks with much greater efficiency and speed. At the end of the day, we want to ensure that first responders and those in distress are able to return to their families alive and well ❤️.
GitHub Organization Navigation:
Our GitHub organization FINDR-HTV2 consists of three repositories.
The HackTheValley repository contains Python scripts that use OpenCV to detect people based on cascades. Run it by executing personDetection.py. Note that this script depends on a live video stream from the Pi and may not run without it.
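As an illustration of the kind of post-processing such a detection script typically needs (the helper below is our sketch, not code from the repository): OpenCV's detectMultiScale returns candidate boxes as (x, y, w, h) tuples, and cascades often fire several overlapping or tiny boxes for one person, so the hits usually get filtered before a detection is reported.

```python
def filter_detections(rects, min_area=2000, overlap_thresh=0.5):
    """Keep plausibly person-sized boxes and drop near-duplicates.

    `rects` is a list of (x, y, w, h) boxes, the format OpenCV's
    detectMultiScale returns. The thresholds here are illustrative
    guesses, not the values used in personDetection.py.
    """
    def iou(a, b):
        # Intersection-over-union of two (x, y, w, h) boxes.
        ax, ay, aw, ah = a
        bx, by, bw, bh = b
        x1, y1 = max(ax, bx), max(ay, by)
        x2, y2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
        inter = max(0, x2 - x1) * max(0, y2 - y1)
        union = aw * ah + bw * bh - inter
        return inter / union if union else 0.0

    kept = []
    # Largest boxes first, so duplicates collapse onto the biggest hit.
    for r in sorted(rects, key=lambda r: r[2] * r[3], reverse=True):
        if r[2] * r[3] < min_area:
            continue  # too small to be a person
        if all(iou(r, k) < overlap_thresh for k in kept):
            kept.append(r)
    return kept
```

The same filtering works regardless of which cascade (full body, upper body, face) produced the boxes.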
The FINDR repository is a web interface that uses the Google Maps API and Firebase to map the geolocation coordinates and display images and the live stream from the Pi. To run it, open index.html.
The moverobot repository contains automation.py, a pure Python script that controls the motors on the Raspberry Pi; it requires a Raspberry Pi with the correct packages installed. mio.py contains the scripts for the Myo armband: it communicates with the Firebase database and relays the instructions to FindR.
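A minimal sketch of how gesture commands might map to motor output (the gesture names follow the Myo SDK's standard pose vocabulary; the speed values and function name are our assumptions, not code from mio.py or automation.py):

```python
# Hypothetical mapping from Myo poses to differential-drive speeds.
# Speeds are (left, right) duty cycles in the range 0.0-1.0.
GESTURE_SPEEDS = {
    "fist": (0.6, 0.6),            # drive forward
    "fingers_spread": (0.0, 0.0),  # stop
    "wave_in": (0.2, 0.6),         # slow the left side to turn left
    "wave_out": (0.6, 0.2),        # slow the right side to turn right
}


def gesture_to_speeds(gesture):
    """Translate a recognized gesture into (left, right) motor speeds.

    On the Pi these values would feed two PWM channels (e.g. via
    RPi.GPIO); here we just return them so the mapping can be
    exercised off-device.
    """
    return GESTURE_SPEEDS.get(gesture, (0.0, 0.0))  # unknown -> stop
```

Defaulting unknown gestures to a full stop is a deliberate safety choice for a robot operating near people.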