Inspiration
We were inspired by Last Week Tonight with John Oliver - the 911 segment.
What it does
Calls to 911 operators can have a wait time of over 20 minutes. The caller's emergency description is recorded and analyzed to evaluate the severity of the emergency. A trained classification model assigns a priority (HIGH, MEDIUM, LOW) to each call, similar to what an emergency room does during triage.
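A minimal sketch of how such a triage queue could work, assuming calls are ordered first by priority class and then by classifier confidence (the class names, confidence values, and call IDs here are illustrative, not from the project):

```python
import heapq
import itertools

# Lower rank = served first, mirroring emergency-room triage.
PRIORITY_RANK = {"HIGH": 0, "MEDIUM": 1, "LOW": 2}

class CallQueue:
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker: arrival order

    def add_call(self, call_id, priority, confidence):
        # Tuples compare element-wise: priority class first, then
        # higher confidence (negated so it sorts first), then arrival.
        entry = (PRIORITY_RANK[priority], -confidence, next(self._counter), call_id)
        heapq.heappush(self._heap, entry)

    def next_call(self):
        return heapq.heappop(self._heap)[3]

q = CallQueue()
q.add_call("call-1", "LOW", 0.9)
q.add_call("call-2", "HIGH", 0.7)
q.add_call("call-3", "HIGH", 0.95)
assert q.next_call() == "call-3"  # HIGH priority, highest confidence
```

Using a heap keeps insertion and retrieval at O(log n), which matters if many calls arrive during a surge.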
How we built it
We record an incoming call and stream the audio data to Watson's Speech to Text API. The resulting transcript is sent to Watson's Natural Language Classification API, using a classifier we trained with our own data set. The classification result contains a priority label and a confidence score, which we use to place the call in the queue for its class. Furthermore, we utilize Python's sklearn library to cluster similar messages, grouping reports of the same incident together. We vectorize each message and use cosine similarity to calculate how alike two reports are.
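The clustering step can be sketched as follows, assuming TF-IDF vectorization and a greedy grouping pass; the example transcripts and the similarity threshold are made up for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical emergency transcripts (not real data).
reports = [
    "house fire on 5th avenue, smoke everywhere",
    "fire at a house on fifth avenue",
    "car accident on the highway, two vehicles",
]

# Vectorize each report, then compute pairwise cosine similarities.
vectors = TfidfVectorizer().fit_transform(reports)
sim = cosine_similarity(vectors)

# Greedy grouping: a report joins the first group containing any
# report it is sufficiently similar to; otherwise it starts a new one.
THRESHOLD = 0.2  # assumed cut-off, would need tuning on real data
groups = []
for i in range(len(reports)):
    for g in groups:
        if any(sim[i, j] >= THRESHOLD for j in g):
            g.append(i)
            break
    else:
        groups.append([i])

print(groups)  # → [[0, 1], [2]]: the two fire reports share one incident
```

In practice the similarity of near-duplicate reports is limited by wording differences ("5th" vs "fifth"), so the threshold and preprocessing (stemming, stop words) strongly affect grouping quality.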
Challenges we ran into
Emergency data is not available through open data portals. We had to create our own minimal (biased) data set to train the initial classifier.
Accomplishments that we're proud of
Utilizing multiple interfaces powered by Watson.
What we learned
Websockets!
What's next for 911 PQ
Improve the classification by enriching the data set, as well as the clustering process.