Inspiration

We wanted to help people in dangerous situations using artificial intelligence.

What it does

SafeSound uses voice recognition to alert the police or a user's loved ones whenever the user says a keyword. The app also lets people check how safe a neighborhood is before actually visiting it.
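
As a rough sketch of how the keyword trigger can work on Android (not our exact code): the platform's SpeechRecognizer returns recognized phrases, which are checked against a trigger word before firing the alert. The keyword value and the sendEmergencyAlert() helper below are placeholders, and the app needs the RECORD_AUDIO permission.

```java
import android.content.Context;
import android.content.Intent;
import android.os.Bundle;
import android.speech.RecognitionListener;
import android.speech.RecognizerIntent;
import android.speech.SpeechRecognizer;

import java.util.ArrayList;

public class KeywordListener {
    private static final String KEYWORD = "help";   // placeholder trigger word
    private final SpeechRecognizer recognizer;

    public KeywordListener(Context context) {
        recognizer = SpeechRecognizer.createSpeechRecognizer(context);
        recognizer.setRecognitionListener(new RecognitionListener() {
            @Override public void onResults(Bundle results) {
                ArrayList<String> matches =
                        results.getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION);
                if (matches != null) {
                    for (String phrase : matches) {
                        if (phrase.toLowerCase().contains(KEYWORD)) {
                            sendEmergencyAlert();    // e.g. SMS or Firebase notification
                            break;
                        }
                    }
                }
                startListening();                    // keep listening after each result
            }
            // Remaining RecognitionListener callbacks left empty for brevity.
            @Override public void onReadyForSpeech(Bundle params) {}
            @Override public void onBeginningOfSpeech() {}
            @Override public void onRmsChanged(float rmsdB) {}
            @Override public void onBufferReceived(byte[] buffer) {}
            @Override public void onEndOfSpeech() {}
            @Override public void onError(int error) { startListening(); }
            @Override public void onPartialResults(Bundle partialResults) {}
            @Override public void onEvent(int eventType, Bundle params) {}
        });
    }

    public void startListening() {
        Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
        intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
        recognizer.startListening(intent);
    }

    private void sendEmergencyAlert() {
        // Placeholder: notify emergency contacts (e.g. via SMS or a Firebase message).
    }
}
```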

How we built it

Using Android Studio and our limited experience with Java and app development, we created an app to help people stay out of dangerous situations and to get help to those already in deadly ones. For the crime predictor, we first scraped datasets, primarily Austin's 2018 crime dataset, using Python. We then created and ran an iOS Core ML classification model on this dataset. Lastly, we ran the model over a batch of test data, saved the outputs to a CSV file, and integrated that file with the Android Google Maps API, using markers to designate how dangerous each area is (sketched below).
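
Here is a minimal sketch of that last step, assuming a hypothetical file name (crime_predictions.csv) and a CSV layout of latitude, longitude, danger level: each row becomes a colored marker on the map.

```java
import android.content.Context;

import com.google.android.gms.maps.GoogleMap;
import com.google.android.gms.maps.model.BitmapDescriptorFactory;
import com.google.android.gms.maps.model.LatLng;
import com.google.android.gms.maps.model.MarkerOptions;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

public class CrimeMarkerLoader {

    /** Reads assets/crime_predictions.csv and adds one marker per row. */
    public static void addCrimeMarkers(Context context, GoogleMap map) throws IOException {
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(context.getAssets().open("crime_predictions.csv")))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String[] cols = line.split(",");
                double lat = Double.parseDouble(cols[0]);
                double lng = Double.parseDouble(cols[1]);
                int dangerLevel = Integer.parseInt(cols[2]);   // assumed: 0 = safe, 2 = dangerous

                float hue = dangerLevel >= 2 ? BitmapDescriptorFactory.HUE_RED
                        : dangerLevel == 1 ? BitmapDescriptorFactory.HUE_YELLOW
                        : BitmapDescriptorFactory.HUE_GREEN;

                map.addMarker(new MarkerOptions()
                        .position(new LatLng(lat, lng))
                        .title("Predicted danger level: " + dangerLevel)
                        .icon(BitmapDescriptorFactory.defaultMarker(hue)));
            }
        }
    }
}
```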

Challenges we ran into

Limited experience, lots of conflicts, and constant problems with GitHub and React Native. We first connected the app to Firebase through React Native, but our simulators were incompatible because that setup ran on a Mac and used iOS libraries. So we all switched to Android Studio and started from scratch.

Accomplishments that we're proud of

We got the voice recognition and the maps page with crime predictions working successfully.

What we learned

How to work with Android Studio as well as how to connect to Firebase.

What's next for StayAlert-pages-

Expand the crime predictor to other cities and figure out how to run the app continuously in the background (one possible approach is sketched below).
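
One common way to keep listening in the background on Android is a foreground service. A minimal sketch, not yet part of the app, where the notification text and the KeywordListener hook are placeholders (targets API 26+):

```java
import android.app.Notification;
import android.app.NotificationChannel;
import android.app.NotificationManager;
import android.app.Service;
import android.content.Intent;
import android.os.IBinder;

public class ListeningService extends Service {
    private static final String CHANNEL_ID = "safesound_listening";

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        // A foreground service must show a persistent notification.
        NotificationChannel channel = new NotificationChannel(
                CHANNEL_ID, "SafeSound listening", NotificationManager.IMPORTANCE_LOW);
        getSystemService(NotificationManager.class).createNotificationChannel(channel);

        Notification notification = new Notification.Builder(this, CHANNEL_ID)
                .setContentTitle("SafeSound is listening for your keyword")
                .setSmallIcon(android.R.drawable.ic_btn_speak_now)
                .build();
        startForeground(1, notification);               // keeps the service alive in the background
        // new KeywordListener(this).startListening();  // hook up the recognizer here
        return START_STICKY;                            // ask Android to restart the service if killed
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;                                    // not a bound service
    }
}
```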

Built With

Android Studio, Java, Firebase, Google Maps API, Python, Core ML