Inspiration

Inspired by the increase in natural disasters happening worldwide, we wanted to connect victims with first responders and emergency services as seamlessly as possible, reducing both the mortality rate and the severity of injuries.

What it does

We looked at two sides of the problem: how to help those in need contact emergency services and first responders as easily and accurately as possible, surfacing crucial information that is easily overlooked in a high-pressure situation, and how to help emergency services, first responders, and volunteers assist as many people as possible.

There are four parts to the project: a website, a mobile application, a chatbot, and a drone with its controller, all of which work together to create a fluid, consistent experience tailored to each platform. The mobile app focuses on simplicity, since its users will often be under the most stress. The website provides the same functionality but expands on each aspect: it gives those in need of help a place to find resources that maximize their chances of survival, and it gives emergency services and first responders a more powerful way to help victims. From the website you can see the exact coordinates of users who have pinged for help and manage that database directly. There is also space for the future implementation of drone control, as well as the ability to view the information gathered from the drones.

The chatbot provides an easy way to collect users' vital information and to guide them through their situation, whether that means basic medical triage or advice specific to the natural disaster. The drone gives emergency services and first responders a safe way to scout an area: using visual recognition and machine learning, it takes a picture every few seconds to help determine whether people in need of help are visible, while the operator maintains full control of the drone. Most of the information, including location data and drone data, is shared across the platforms, allowing for a fast, seamless data stream.
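To give a feel for the shared data, here is a minimal sketch of what a help ping passed between the platforms could look like. The field names and the `createHelpPing` helper are hypothetical illustrations, not our exact schema (which lives in the database):

```javascript
// Hypothetical sketch of a help-ping record shared across the platforms.
// Field names are illustrative; the real schema lives in our database.
function createHelpPing(userId, latitude, longitude, note) {
  if (latitude < -90 || latitude > 90 || longitude < -180 || longitude > 180) {
    throw new RangeError("coordinates out of bounds");
  }
  return {
    userId,                            // who is asking for help
    location: { latitude, longitude }, // exact coordinates shown on the website map
    note: note || "",                  // optional details for responders
    status: "open",                    // responders can mark pings resolved from the site
    createdAt: Date.now(),             // timestamp, useful for triage ordering
  };
}

const ping = createHelpPing("user-42", 35.68, 139.69, "trapped on 2nd floor");
console.log(ping.status); // "open"
```

Keeping one small, flat record per ping is what lets the app, website, chatbot, and drone services stay consistent with each other.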

How I built it

The website was created using JavaScript, HTML, and CSS, with the location services backed by a Firebase database. The app is built on React Native and connects to the same Firebase database. The chatbot is built on IBM's Watson Assistant. The drone services are built on IBM's Node-RED, using IBM's Watson Visual Recognition to help find those in need of help and IBM's Cloudant database to share information.
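As a rough sketch of the kind of logic the website's control panel runs on top of that location data, open pings can be sorted by distance from a responder. The helpers below are hypothetical (the real site reads its records from Firebase); the distance math is the standard haversine formula:

```javascript
// Hypothetical helper: sort open help pings by great-circle distance from a
// responder. The real website pulls these records from Firebase; plain objects here.
function haversineKm(a, b) {
  const R = 6371; // mean Earth radius in km
  const toRad = (deg) => (deg * Math.PI) / 180;
  const dLat = toRad(b.latitude - a.latitude);
  const dLon = toRad(b.longitude - a.longitude);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.latitude)) * Math.cos(toRad(b.latitude)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

function nearestPings(responder, pings) {
  return pings
    .filter((p) => p.status === "open") // resolved pings drop off the map
    .sort((p1, p2) => haversineKm(responder, p1.location) - haversineKm(responder, p2.location));
}

const responder = { latitude: 35.0, longitude: 139.0 };
const pings = [
  { id: "a", status: "open", location: { latitude: 36.0, longitude: 139.0 } },
  { id: "b", status: "resolved", location: { latitude: 35.1, longitude: 139.0 } },
  { id: "c", status: "open", location: { latitude: 35.2, longitude: 139.0 } },
];
console.log(nearestPings(responder, pings).map((p) => p.id)); // ["c", "a"]
```

Because all platforms share the same records, the same kind of helper could drive the mobile app or prioritize areas for the drone to scout.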

Challenges I ran into

We ran into a few issues with the drone, particularly in connecting it to Node-RED, along with some general connectivity problems.

Accomplishments that I'm proud of

We're proud of getting the drone up and running and connecting it to IBM's Watson Visual Recognition and Cloudant database, as well as creating multiple user-interactive components that work seamlessly with each other.

What I learned

We learned a lot about JavaScript (and React Native), as well as databases. We also learned a lot about IBM's cloud services, especially the Cloudant database, Watson Visual Recognition, Watson Assistant, and Cloud Continuous Delivery.

What's next for rescueU

We hope to add a real-time live view from the drone, available from the control panel on the website, and to make all of the information the drones capture visible on that control panel as well. We would also like to extend the chatbot with functionality for emergency services and first responders.

Built With

javascript, html, css, firebase, react-native, node-red, ibm-watson, cloudant
