Inspiration

The student body of UT Dallas's School of Engineering and Computer Science is predominantly male, which often leaves female freshmen feeling marginalized, as they may not be accustomed to such a gender gap. Furthermore, although the many academic clubs on campus prepare students for their educational endeavors, they often meet at night, discouraging women from attending out of concern for their safety. As a result, female freshmen miss the opportunity to connect with their peers and feel like part of their new school. Artemis Assist helps new female engineering students feel safe walking on campus at night, encouraging them to participate in clubs and adjust to UT Dallas.

What it does

Artemis Assist is a wearable device, integrated with a mobile application, that helps keep women safe while walking alone on campus at night. The sound sensor, attached to a necklace or hair clip, sleeps when not activated. Tapping the attached touch sensor wakes the sound sensor, which records audio and uses an ML algorithm to detect anomalies that may indicate a dangerous situation. When an anomaly is detected, the device begins recording its surroundings to gather evidence of a violent situation. The app also lets the user hold a conversation with a generative AI while walking through an unsafe area, reducing the risk of harm or victimization. All of these functions can be monitored, and the camera footage accessed, through the user-friendly mobile app.
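The tap-to-activate flow described above can be modeled as a small state machine: the device sleeps until touched, listens for anomalies while awake, and switches to evidence-gathering once one is detected. This is an illustrative sketch, not the actual firmware; `ArtemisDevice` and its method names are hypothetical:

```python
from enum import Enum, auto

class DeviceState(Enum):
    SLEEPING = auto()    # sound sensor idle to save power
    LISTENING = auto()   # recording audio, screening for anomalies
    ALERTING = auto()    # anomaly detected: capture evidence, notify the app

class ArtemisDevice:
    """Hypothetical model of the wearable's tap-to-activate behavior."""

    def __init__(self, anomaly_detector):
        self.state = DeviceState.SLEEPING
        self.anomaly_detector = anomaly_detector  # e.g. the ML classifier

    def on_touch(self):
        # A tap on the touch sensor wakes the sound sensor.
        if self.state is DeviceState.SLEEPING:
            self.state = DeviceState.LISTENING

    def on_audio_frame(self, frame):
        # While listening, each audio frame is screened for anomalies.
        if self.state is DeviceState.LISTENING and self.anomaly_detector(frame):
            self.state = DeviceState.ALERTING  # start recording surroundings
```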

How we built it

The user interface of our app was built with the React Native framework, and the front end was written in JavaScript. The danger-detecting machine learning model was programmed in Python using NumPy, Librosa, and scikit-learn's RandomForestClassifier (sklearn.ensemble). Trained on over 5,000 data points, it achieves an accuracy of 94%! The hardware code was written in C++. The Arduino microcontroller was wired with a touch sensor, a sound sensor, and a piezoelectric buzzer. Our hardware component simulates a prototype of a wearable technology such as a necklace, hair clip, or even earrings. For the generative AI calls, we used Respell, an AI API.
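The training pipeline can be sketched as follows. This is a minimal illustration on synthetic audio: two hand-rolled features (RMS energy and zero-crossing rate) stand in for the full Librosa feature set, and the synthetic clips stand in for the 5,000+ labeled data points the real model was trained on:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def extract_features(clip):
    """Simplified stand-ins for the Librosa features: per-clip RMS energy
    and zero-crossing rate. The real model uses a richer feature set."""
    rms = np.sqrt(np.mean(clip ** 2))
    zcr = np.mean(np.abs(np.diff(np.sign(clip)))) / 2
    return np.array([rms, zcr])

rng = np.random.default_rng(0)
# Synthetic stand-in data: "safe" clips are quiet noise, "danger" clips
# are much louder (a very rough proxy for screams or impacts).
safe = [0.05 * rng.standard_normal(2048) for _ in range(200)]
danger = [0.5 * rng.standard_normal(2048) for _ in range(200)]

X = np.stack([extract_features(c) for c in safe + danger])
y = np.array([0] * len(safe) + [1] * len(danger))  # 1 = dangerous

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
```

In the deployed pipeline, `extract_features` would instead compute Librosa features (e.g. MFCCs) over each recorded clip before classification.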

Challenges we ran into

Due to the time limit, we weren't able to get all of the components needed to store sensor data on external SD cards or a miniature Arduino camera to record live video with audio. Instead, the team devised a way for the Arduino IDE's Serial Monitor to communicate with the React Native application and focused on audio recordings to identify danger.
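One way to make that serial bridge robust is to parse each Serial Monitor line into a structured event before handing it to the app. A minimal sketch, where the `SENSOR:VALUE` line format is an assumption for illustration, not the team's actual protocol:

```python
def parse_serial_line(line):
    """Parse one Serial Monitor line into an event dict, or None.

    Assumes a hypothetical 'SENSOR:VALUE' format, e.g. 'TOUCH:1' or
    'SOUND:512'. Debug output and malformed lines are skipped so a stray
    print on the Arduino side can't crash the app-side reader.
    """
    line = line.strip()
    if ":" not in line:
        return None  # ignore debug output and blank lines
    sensor, _, value = line.partition(":")
    try:
        return {"sensor": sensor.lower(), "value": int(value)}
    except ValueError:
        return None  # non-numeric payload
```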

Accomplishments that we're proud of

We're extremely proud that, in 24 hours, we implemented an idea to better the lives of female engineering students entering UT Dallas. We are also proud that we were able to integrate unfamiliar technologies and different disciplines of engineering (software engineering, electrical engineering) to come up with an innovative solution. We created a fully functioning React Native user interface with Google Maps integration, an ML model, generative AI calling, and a hardware simulation of the wearable. Throughout this hackathon, we had to think on our feet to efficiently develop programs for the various technologies within the time constraint. We ideated, learned, developed, refined, and repeated that process multiple times to create the best version of our product.

What we learned

The team learned how to effectively transfer data from external sensors into a mobile application. Our team had three computer science majors, all of whom learned how to work with hardware and integrate it with software. We also learned how to perform efficient data analysis with supervised learning and how to use the Librosa library. Finally, the team learned how to work with Grove sensors and the Grove base shield.

What's next for Artemis Assist

Currently, the sensors and microcontroller are bulky, limiting Artemis Assist's wearability. By designing a PCB (printed circuit board) with these components, the circuit can be confined to a small chip, making it easy to integrate into wearable devices and accessories. In the future, the team will also add more data points to the ML algorithm, ensuring that it can detect dangerous situations with higher accuracy. We also hope to implement computer vision at scale so that a PCB-mounted camera can identify dangerous people or situations around the user.
