After learning about how the Cajun Navy and other disaster relief services were relying on one-to-one communication services such as Zello for information in Houston, we decided to come up with a way to help provide the big picture.

What it does

OpenArmsData uses data from Twitter, IoT devices, and text messages to create a modular, comprehensive view of disaster areas for disaster relief personnel. Texts sent to an OpenArmsData phone number are analyzed to determine the resources needed and the location of the sender, allowing for easier relief facilitation. Twitter data is used to create a sentiment graph of the city, expressing the current feelings of the citizens in the area. IoT devices are used to detect natural disasters, making disaster data instantly available over the internet.
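As a rough sketch of the text-triage step, an incoming SMS body could be scanned for known resource keywords and a location mention. The keyword lists, function name, and the simple "at/near" location heuristic below are all illustrative assumptions, not OpenArmsData's actual implementation:

```javascript
// Hypothetical triage of an incoming SMS: match resource keywords
// and pull a rough location mention. Illustrative only.
const RESOURCE_KEYWORDS = {
  water: ["water", "thirsty"],
  food: ["food", "hungry", "meals"],
  rescue: ["rescue", "trapped", "stranded", "boat"],
  medical: ["medical", "medicine", "injured", "insulin"],
};

function triageText(body) {
  const text = body.toLowerCase();
  // Collect every resource category whose keywords appear in the text.
  const resources = Object.keys(RESOURCE_KEYWORDS).filter((r) =>
    RESOURCE_KEYWORDS[r].some((kw) => text.includes(kw))
  );
  // Very rough location grab: whatever follows "at" or "near".
  const locMatch = text.match(/\b(?:at|near)\s+(.+)$/);
  return { resources, location: locMatch ? locMatch[1].trim() : null };
}
```

A real deployment would hang a function like this off a Twilio SMS webhook and geocode the extracted location before writing it to the database.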

How we built it

We built the back-end with Node.js, the front-end with WebGL and Three.js, and used MongoDB for our database. We also used Twilio for sending and receiving texts.
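To illustrate how per-tweet sentiment might feed the city-wide sentiment graph, the sketch below averages tweet sentiment scores per map grid cell; a Three.js layer could then color each cell by its average. The function name, the cell/score input shape, and the grid-cell idea are assumptions for illustration, not the project's actual pipeline:

```javascript
// Hypothetical aggregation: average per-tweet sentiment scores
// (in [-1, 1]) for each map grid cell. Illustrative only.
function aggregateSentiment(tweets) {
  const cells = new Map();
  for (const { cell, score } of tweets) {
    const agg = cells.get(cell) || { sum: 0, n: 0 };
    agg.sum += score;
    agg.n += 1;
    cells.set(cell, agg);
  }
  // Emit a plain object of cell -> mean sentiment.
  const out = {};
  for (const [cell, { sum, n }] of cells) out[cell] = sum / n;
  return out;
}
```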

Challenges we ran into

Integrating 3D graphics with the Google Maps API was quite a struggle. Additionally, finding and managing the data for sentiment analysis and some preliminary visualizations was difficult.

Accomplishments that we're proud of

Designing and implementing such a complex visual application with a variety of inputs is quite an accomplishment in itself. There were lots of components that required tight integration.

What we learned

We learned a lot about web graphics, as well as communication infrastructure. Working collaboratively at such a fast pace was challenging, but we learned a lot about how to divide our work, and when to work together, to avoid development blocks.

What's next for OpenArmsData

We would love to explore better recognition algorithms and interfaces that could better match those in need with those who can help!
