Inspiration
This was inspired by our fellow team member, Atuh Fon, who got into a car accident after falling asleep while driving. Everyone has experienced long days at school and work, but many people don't realize the risks of driving tired. This app is meant to tackle drowsy driving.
What it does
The app captures high-frequency camera frames and analyzes facial landmarks in real time to calculate the Eye Aspect Ratio (EAR) and head pitch, determining whether the user is alert or drowsy.
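The EAR itself comes from the standard formula by Soukupová and Čech: the sum of the two vertical eye-landmark distances divided by twice the horizontal distance, so it drops toward zero as the eye closes. A minimal sketch in Python (the exact MediaPipe landmark indices for each eye are omitted here; the six points are assumed to already be extracted in the usual p1..p6 order):

```python
import math

def ear(landmarks):
    """Eye Aspect Ratio: (|p2-p6| + |p3-p5|) / (2 * |p1-p4|).

    landmarks: six (x, y) points around one eye, where p1/p4 are the
    horizontal corners and p2/p6, p3/p5 are vertical pairs."""
    p1, p2, p3, p4, p5, p6 = landmarks
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    vertical = dist(p2, p6) + dist(p3, p5)
    horizontal = dist(p1, p4)
    return vertical / (2.0 * horizontal)

# Toy coordinates for illustration (not real MediaPipe output):
open_eye = [(0, 0), (1, 1), (2, 1), (3, 0), (2, -1), (1, -1)]
closed_eye = [(0, 0), (1, 0.1), (2, 0.1), (3, 0), (2, -0.1), (1, -0.1)]
```

With these toy points, the open eye scores around 0.67 and the nearly closed eye around 0.07, which is why a single threshold on EAR separates the two states well.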
How we built it
The backend is built with Flask and uses MediaPipe Face Mesh to analyze face geometry and predict whether the user is drowsy or alert. The frontend is built with React Native and Expo for easy demoing in the Expo Go application.
Challenges we ran into
Since we were developing on Linux and Windows but testing on iOS, we couldn't compile the custom native modules required by some high-speed camera libraries. We pivoted to a polling strategy using expo-camera and Base64 encoding, optimizing image transfer to maintain a high enough FPS for safety monitoring over an ngrok tunnel. In addition, tuning the thresholds MediaPipe's output was classified against as drowsy or alert required a lot of testing and experimentation with lighting and phone angles.
Accomplishments that we're proud of
This is Sarah's first in-person hackathon! We are also proud of working out some of the math behind the EAR algorithm, and of Sarah and Atuh's Figma mock-ups.
What we learned
For two of our three members, it was their first time using React Native, and they learned a lot. It was also our first time using MediaPipe.
What's next for NoDoze
- Letting users upload their own alarm sounds
- Using a library to force the hardware volume to 100% for maximum alertness
Built With
- figma
- flask
- mediapipe
- ngrok
- react-native