After all the presentations on Friday, we were struck by the realization that at the core of every disaster are people: tracking what is happening to people dictates what actions are taken, what resources are deployed, and a number of other critical decisions.

What it does

Data streams, such as cameras, come online with the disaster network when it is deployed at sites like hospitals, refugee camps, and disaster checkpoints. These are priority sites for connectivity, so we focus on the data streams we can tap into there, along with additional data sets such as census records. We built a data ingestion engine that can be tapped by both civilians and the military. Civilians can talk to a chatbot, upload an image, and find loved ones. The military gets a SitRep dashboard of what is happening, from the current status of missing people to real-time movements.
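The matching step behind the chatbot can be sketched as comparing the embedding of a civilian's uploaded photo against embeddings of photos already captured at the sites. This is a minimal sketch with hypothetical camera IDs and hand-made embedding vectors; in the real system the vectors would come from the image-recognition model.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def best_match(query_embedding, stored):
    """Return the (photo_id, score) of the stored site photo most
    similar to the photo a civilian uploaded via the chatbot."""
    scores = {photo_id: cosine_similarity(query_embedding, emb)
              for photo_id, emb in stored.items()}
    photo_id = max(scores, key=scores.get)
    return photo_id, scores[photo_id]

# Hypothetical embeddings for photos captured at two sites.
stored = {
    "hospital_cam_041": [0.9, 0.1, 0.3],
    "checkpoint_cam_007": [0.2, 0.8, 0.5],
}
query = [0.88, 0.15, 0.28]  # embedding of the uploaded photo
match, score = best_match(query, stored)
```

A high score would let the chatbot tell the family which site their loved one was last seen at.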

How we built it

We built an end-to-end integration: a Raspberry Pi uploads photos to Firebase, and a service watches for recently uploaded photos and feeds them to our machine-learning image-recognition model. On the other end, a chatbot communicates with people looking for loved ones; they upload images to us, we tag them with additional data, and we correlate them against the photos already taken.
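The watcher service in that pipeline can be sketched as a poll loop that picks up photos it has not seen before and hands them to the recognizer. This is a stand-alone sketch: a local directory stands in for the Firebase bucket, and `recognize` is a hypothetical stub for the recognition model.

```python
import tempfile
from pathlib import Path

def find_new_photos(photo_dir, seen):
    """Return photo filenames that appeared since the last poll.
    In the real system this would list recent uploads in the
    Firebase bucket; here a local directory stands in for it."""
    current = {p.name for p in Path(photo_dir).glob("*.jpg")}
    new = sorted(current - seen)
    seen.update(new)
    return new

def recognize(photo_name):
    """Hypothetical stand-in for the image-recognition model."""
    return {"photo": photo_name, "tags": ["person"]}

# Simulate the Raspberry Pi dropping two photos into the bucket.
bucket = tempfile.mkdtemp()
seen = set()
Path(bucket, "cam01_0001.jpg").touch()
Path(bucket, "cam01_0002.jpg").touch()

results = [recognize(name) for name in find_new_photos(bucket, seen)]
```

A second poll with the same `seen` set returns nothing, so each photo is processed exactly once.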

Challenges we ran into

Connecting everything on the backend, especially pushing the images from the camera into our database.

Accomplishments that we're proud of

We got the system working end to end.

What we learned

The solution is dictated by the infrastructure that is available at the time.

What's next for EventGlance

Finding ways to integrate additional data streams and pitching this to the military.
