After seeing Project Owl in the initial HackMIT sponsored challenge email, we noted how their solution required "throwing out" all the old systems and replacing them with new ones. However superior Project Owl's transceivers might be, it seemed unlikely that first responders would be eager to discard all of their existing, expensive equipment. Instead, we decided to build a system that enables interoperability without overhauling the preexisting communications infrastructure.
What it does
Our system simulates several first responders on different radio networks, routes traffic between those networks as needed, and streams the audio data back to a central analytics server. Once the data reaches the analytics server, we use IBM Watson Natural Language Understanding to semi-autonomously monitor the first responders and intelligently suggest dispatching additional resources, for example sending a helicopter to a first responder who reports that they have a patient in need of a medevac.
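To illustrate the dispatch-suggestion step, here is a minimal sketch. In our pipeline, keyword signals come from IBM Watson Natural Language Understanding; in this offline example a simple keyword lookup stands in for the NLU call, and names like `suggest_dispatch` and `DISPATCH_RULES` are illustrative, not part of any real API.

```python
# Simplified stand-in for the NLU-driven dispatch logic: map keywords
# found in a transcribed radio message to suggested resources.
# In production, keywords would come from Watson NLU's keyword/entity
# extraction rather than substring matching.

DISPATCH_RULES = {
    "medevac": "helicopter",
    "fire": "fire engine",
    "trapped": "search and rescue team",
}

def suggest_dispatch(transcript: str) -> list[str]:
    """Return suggested resources for one transcribed radio message."""
    text = transcript.lower()
    return [resource for keyword, resource in DISPATCH_RULES.items()
            if keyword in text]

print(suggest_dispatch("We have a patient in need of a medevac"))
```

A real deployment would rank suggestions by confidence scores from the NLU response instead of treating every keyword match equally.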
How we built it
We simulated the movement of the different first responders on the server and manually mocked up audio recordings to represent the radio traffic. We passed this data to our analytics server, queried several NLP and map APIs to collect information about the situation, and then streamed suggestions to a web app.
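The server-side simulation described above can be sketched as follows. This is an assumption-laden toy version: each simulated responder takes a small random step per tick, and each tick's positions are the snapshot we would stream to the analytics server (printed here). The function and variable names are hypothetical.

```python
# Toy version of the responder-movement simulation: random-walk each
# responder's (lat, lon) position once per tick and yield a snapshot
# that would be streamed to the analytics server.
import random

def simulate(responders: dict[str, tuple[float, float]],
             ticks: int, seed: int = 0):
    rng = random.Random(seed)  # seeded for reproducible demo runs
    for _ in range(ticks):
        for name, (lat, lon) in responders.items():
            responders[name] = (lat + rng.uniform(-1e-4, 1e-4),
                                lon + rng.uniform(-1e-4, 1e-4))
        yield dict(responders)  # per-tick snapshot for the server

units = {"EMT-1": (42.3601, -71.0942), "Engine-7": (42.3611, -71.0870)}
for snapshot in simulate(units, ticks=3):
    print(snapshot)
```

In the actual project, the mocked-up audio recordings were attached to these simulated units to represent radio traffic on each network.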
Challenges we ran into
This was our first time building a full-stack web app, and it was the first hackathon for one of our members. As a result, we did not finish integrating all of our features, and the front-end is not yet complete.
Accomplishments that we're proud of
We believe our product idea was very strong, and we all learned a lot this weekend implementing the features we could.