Inspiration

This weekend at Deltahacks we were challenged to hack for change and the betterment of our communities. The Innovation Factory Hamilton challenged competitors to look to the future and help build a more connected city, one better able to serve its inhabitants. We used this as inspiration for our application: it has the potential to save lives by cutting down response time in dangerous situations such as a car crash or an active threat. This was our chance to "imagineer" a part of the future.

What it does

Smart Response is designed to minimize the time it takes to notify first responders when a potentially life-threatening incident occurs. Using the CityIQ API, we connected to Hamilton's CityIQ nodes to receive real-time audio and video from around the city. From the audio data, we trained and deployed a Python model that detects when a significant event has occurred and notifies first responders or the public of a threat. The application then geotags the location of the incident with a real-time image and classifies it (car crash, gunshot, etc.) based on the audio properties. First responders can view the image to gauge the severity of the incident and decide what actions to take next.
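The end of that pipeline is a geotagged incident record pushed to responders. The sketch below illustrates what such a record could look like; the names (`Incident`, `make_incident`) and the label set are illustrative assumptions, not the project's actual code.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical label set; the real model's classes may differ.
LABELS = {"car_crash", "gunshot", "glass_break"}

@dataclass
class Incident:
    label: str        # event type inferred from the audio
    latitude: float   # node location, e.g. from CityIQ asset metadata
    longitude: float
    image_url: str    # real-time snapshot for responders to review
    timestamp: str    # ISO 8601, UTC

def make_incident(label: str, latitude: float, longitude: float,
                  image_url: str) -> Incident:
    """Build the geotagged record that would be sent to first responders."""
    if label not in LABELS:
        raise ValueError(f"unknown event label: {label}")
    return Incident(
        label=label,
        latitude=latitude,
        longitude=longitude,
        image_url=image_url,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )

if __name__ == "__main__":
    # Illustrative record near downtown Hamilton.
    inc = make_incident("car_crash", 43.2557, -79.8711,
                        "https://example.com/snapshot.jpg")
    print(inc.label, inc.latitude, inc.longitude)
```

In practice the front-end app would render these records as pins on the map, with the snapshot attached.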

How we built it

We built this system using many technologies.

Front-end app: React Native and Apple Maps

Back-end: Node.js + Express; Python with math and machine learning libraries such as librosa, OpenCV, matplotlib, scikit-learn, and TensorFlow; and PostgreSQL for the database

API: We used the CityIQ API to get live audio and images of Hamilton.
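Talking to the CityIQ platform involves exchanging developer credentials for an OAuth bearer token, then attaching that token (and a zone identifier) to each data request. The sketch below shows the general shape of that handshake using only the standard library; the token URL is a placeholder, and the exact endpoints and header values for the Hamilton deployment are assumptions, not verified values.

```python
import base64
import json
import urllib.request

# Placeholder: the real CityIQ auth URL depends on the deployment.
TOKEN_URL = "https://auth.example.cityiq.io/oauth/token?grant_type=client_credentials"

def basic_auth_header(client_id: str, client_secret: str) -> str:
    """HTTP Basic auth value for the OAuth client-credentials grant."""
    raw = f"{client_id}:{client_secret}".encode("utf-8")
    return "Basic " + base64.b64encode(raw).decode("ascii")

def fetch_token(client_id: str, client_secret: str) -> str:
    """Exchange developer credentials for a bearer token (network call)."""
    req = urllib.request.Request(
        TOKEN_URL,
        headers={"Authorization": basic_auth_header(client_id, client_secret)},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def event_request(url: str, token: str, zone_id: str) -> urllib.request.Request:
    """Build a request for node events, scoped to an asset zone."""
    return urllib.request.Request(
        url,
        headers={
            "Authorization": f"Bearer {token}",
            # Zone header name based on public CityIQ/Predix docs; confirm
            # against the deployment you are targeting.
            "Predix-Zone-Id": zone_id,
        },
    )
```

With a token in hand, the same pattern is used to pull audio events and camera snapshots from individual nodes.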

Challenges we ran into

We ran into many challenges over the course of the project. Connecting to the CityIQ API was difficult at the start, but we persevered and were able to receive data from the nodes. Building and training a model to classify mel-spectrogram images was another challenge, as it was our first time using machine learning and its associated Python libraries. Designing digital filters to remove most of the ambient noise in the audio data was also challenging: finding a filter that suppressed noise while still letting different sounds be classified took considerable trial and error.
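The noise-filtering idea can be illustrated with a minimal band-pass filter: keep only the frequency band where the sounds of interest live and zero out the rest. This pure-NumPy sketch is illustrative only; the cutoff frequencies are assumptions, and the project itself worked with librosa's filtering and mel-spectrogram tools rather than this function.

```python
import numpy as np

def fft_bandpass(signal: np.ndarray, fs: int, low: float, high: float) -> np.ndarray:
    """Crude band-pass: zero out spectral content outside [low, high] Hz."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < low) | (freqs > high)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))

if __name__ == "__main__":
    fs = 16000
    t = np.arange(fs) / fs  # one second of "audio"
    # A 1 kHz event tone mixed with 60 Hz ambient hum.
    sig = np.sin(2 * np.pi * 1000 * t) + np.sin(2 * np.pi * 60 * t)
    # Illustrative cutoffs: pass 300 Hz - 3 kHz, drop the hum.
    clean = fft_bandpass(sig, fs, low=300.0, high=3000.0)
```

After a pass like this, the cleaned signal can be converted to a mel spectrogram (e.g. with `librosa.feature.melspectrogram`) and fed to the classifier.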

Accomplishments that we're proud of

Every group member contributed and learned a great deal in such a short period of time. It was challenging to stay focused when facing so many issues along the way, but we are all proud of staying the course and building the platform. We are proud that what we built has the potential to impact people on a day-to-day basis.

What we learned

Our group stepped out of its comfort zone and everyone learned something new. It was our first time developing an iOS app in React Native, as well as our first time using the Apple Maps API. We also learned the machine learning and signal processing libraries in Python.

What's next for Smart Response

Expand the number of connected nodes to serve more of the community. Also use the nodes to gather more training data for the model, and train it on other common urban sounds so it can accurately analyze a larger range of disturbances.
