Inspiration

With an increasing number of disasters, which have caused as many as 600,000 deaths in the last 20 years, we felt we needed to build something to tackle this problem; that is how the idea for an automated search system came to life. With research in drone technology and artificial intelligence expanding rapidly, we decided to combine the two to approach the problem in a way it had never been done before.

What it does

Our technology uses drones and facial recognition to find people who have been stranded (for example, during a flood). It then sends their precise coordinates in an SMS. This could be of particular interest to disaster management teams and rescue services, who could deploy several drones when a disaster or emergency is reported and then rescue people based on their exact location. This can potentially save lives as well as costs, and, most importantly, cut the time it takes government bodies to react to a crisis. Another application could be rescuing hikers who are stranded or unable to find their way back.

How I built it

We built the drone control as an iOS app in Swift on top of the DJI API. With the app in place, it was all down to the complex backend development. The Twilio API was used to send SMS messages containing the latitude and longitude of the person found. We trained IBM Watson to recognise whether a picture sent by the drone contains a person. The server itself was written in Node.js and deployed on Heroku and Amazon AWS.
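
As a rough illustration of the detection step, here is a minimal sketch of how a Node.js server could ask a custom-trained Watson Visual Recognition classifier whether a drone photo contains a person. This is written against the `ibm-watson` Node SDK; the classifier ID, threshold, and environment variable names are placeholders, not our exact configuration:

```typescript
import * as fs from 'fs';
import VisualRecognitionV3 from 'ibm-watson/visual-recognition/v3';
import { IamAuthenticator } from 'ibm-watson/auth';

// Placeholder credentials -- substitute your own Watson API key.
const visualRecognition = new VisualRecognitionV3({
  version: '2018-03-19',
  authenticator: new IamAuthenticator({ apikey: process.env.WATSON_API_KEY! }),
});

// Returns true if the custom classifier tags the image as containing a person.
async function imageContainsPerson(imagePath: string): Promise<boolean> {
  const response = await visualRecognition.classify({
    imagesFile: fs.createReadStream(imagePath),
    classifierIds: ['strandedPersonClassifier'], // hypothetical custom classifier ID
    threshold: 0.6,                              // minimum confidence to report a class
  });
  const classes = response.result.images?.[0]?.classifiers?.[0]?.classes ?? [];
  return classes.some((c) => c.class === 'person');
}
```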

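Once a person is detected, the SMS step is small. A sketch along these lines, using the official `twilio` Node package (the environment variable names for credentials and phone numbers are placeholders):

```typescript
import twilio from 'twilio';

// Placeholder credentials -- read from the environment in practice.
const client = twilio(process.env.TWILIO_ACCOUNT_SID, process.env.TWILIO_AUTH_TOKEN);

// Texts the rescue team the coordinates of the person the drone spotted.
async function sendLocationSms(latitude: number, longitude: number): Promise<void> {
  await client.messages.create({
    from: process.env.TWILIO_PHONE_NUMBER!, // your Twilio number
    to: process.env.RESCUE_TEAM_NUMBER!,    // rescue team's number
    body: `Person detected at latitude ${latitude}, longitude ${longitude}.`,
  });
}
```

Roughly, the server ties these two steps together: classify the incoming drone photo and, on a positive match, text out the coordinates.
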
Challenges I ran into

There were a lot of challenging aspects to doing something this advanced. Some of the challenges included:

  1. Learning to deal with the DJI API, which was written entirely in Objective-C with no proper documentation (this included contacting a DJI employee in China on Saturday night, who got back to us on Sunday morning).
  2. Analysing images to ensure detection accuracy, and most importantly, training IBM Watson.
  3. Learning legacy code.
  4. Working with image data, which was complex since most of us had not dealt with it before.

Accomplishments that I'm proud of

  1. Managing to finish the project (given how much there was to learn and the complexity of the task).
  2. Learning to deal with IBM Watson and how it works.

What I learned

  1. Working in high-pressure situations.
  2. Learning to use Objective-C and Swift.
  3. Improving the performance of IBM Watson's predictions.
  4. Using the Twilio API.

What's next for Automated Search Rescue System

  1. Using infrared cameras to detect humans via thermal radiation (this would make it a commercially viable model).
  2. Polishing the interface and adding more features (for example, the ability to send back a picture of the person).