Inspiration
Memphis, TN is known for many things, and unfortunately one of them is crime. Technology continues to improve people's lives, but it has overlooked some communities.
What it does
SonaSpot uses Core ML and the Apple Watch microphone to detect gunshots and alert the user, who can then contact their emergency contacts or local police agencies.
How we built it
We developed this application using Swift and Core ML, and used Figma to design the app's flow and screens. We then trained a Core ML model to distinguish the sounds of gunshots from fireworks.
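On watchOS, a trained Core ML sound classifier is typically run against live microphone audio through Apple's SoundAnalysis framework. The sketch below shows that pattern; the model class `GunshotClassifier`, the label strings, and the confidence threshold are assumptions for illustration, not SonaSpot's actual implementation.

```swift
import AVFoundation
import SoundAnalysis

final class ShotDetector: NSObject, SNResultsObserving {
    private let audioEngine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let inputFormat = audioEngine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: inputFormat)
        self.analyzer = analyzer

        // `GunshotClassifier` is a placeholder for the trained Core ML model.
        let request = try SNClassifySoundRequest(mlModel: GunshotClassifier().model)
        try analyzer.add(request, withObserver: self)

        // Stream microphone buffers into the analyzer.
        audioEngine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: inputFormat) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try audioEngine.start()
    }

    // Called by SoundAnalysis whenever the model produces a classification.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first else { return }
        // Alert only on a confident gunshot classification, not fireworks.
        if top.identifier == "gunshot" && top.confidence > 0.8 {
            // Trigger the watch alert / emergency-contact flow here.
        }
    }
}
```

Streaming analysis like this lets the classifier run continuously on the watch rather than on recorded clips, which is what makes a real-time alert possible.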
Challenges we ran into
We had never used machine learning in a project before and didn't know what to expect. We ran into problems training the model on different sounds.
Accomplishments that we're proud of
We are proud of designing an Apple Watch UI and successfully creating a model that can tell the difference between gunshots and fireworks.
What we learned
We learned that machine learning requires a lot of training data and can be very time-consuming. Our team also learned to work together under tight deadlines, which let us pull this project together.
What's next for SonaSpot
We want to get cleared by Apple so we can have access to emergency services. We have started with a landing page and are now letting customers start using our service.