iAwake makes driving safer by using eye detection to recognize driver drowsiness. When drowsiness is detected, an alarm plays. After three alarms, emergency contacts are notified of the driver's location and the driver is directed to a nearby gas station.

Originality: In coming up with the idea for iAwake, our team wanted to challenge ourselves with a technical project that tackles an everyday problem while making the roads safer. With the rise of computer vision and image detection, two frontier fields of computer science, we saw an opportunity to take existing facial recognition technology and apply it to eye-drowsiness detection. With key features such as playing an alarm after a few seconds of eye closure and pinging a driver's ICE (in case of emergency) contacts after they have repeatedly failed to stay awake, we built two apps that run on both desktop and iOS devices. We wanted to emphasize a tangible solution to staying alert at the wheel, with an easy setup using any common smartphone camera. In addition, the desktop version can navigate the driver to the nearest gas station to rest or refuel, a feature that could prove useful to truck drivers and to families on long, overnight road trips. The implementation uses the Twilio, Google Maps, and Apple Vision APIs.
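For the emergency-contact feature, the desktop side relies on Twilio's messaging API. The snippet below is only a minimal sketch of how such an alert could be sent; the environment variable names and the send_ice_alert helper are illustrative assumptions, not iAwake's actual code.

```python
# Minimal sketch: texting an emergency contact with the driver's last known
# location via Twilio's REST API. Credentials, phone numbers, and the helper
# name are placeholders for illustration only.
import os
from twilio.rest import Client

def send_ice_alert(contact_number: str, lat: float, lng: float) -> None:
    client = Client(os.environ["TWILIO_ACCOUNT_SID"], os.environ["TWILIO_AUTH_TOKEN"])
    maps_link = f"https://www.google.com/maps?q={lat},{lng}"
    client.messages.create(
        to=contact_number,
        from_=os.environ["TWILIO_FROM_NUMBER"],
        body=f"iAwake: your contact may be drowsy at the wheel. Last location: {maps_link}",
    )
```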

Impact/Usefulness: According to the National Highway Traffic Safety Administration (NHTSA), over 100,000 automobile accidents each year involve drowsy driving and falling asleep at the wheel. Progress on this problem has stagnated, in part because many people shy away from admitting or acknowledging its root cause: a lack of attention to the road caused by significant sleep or energy deprivation. Designing smarter and safer cars remains one of the industry's leading priorities, especially now that cutting-edge image detection technology can help. By pairing computer vision with a common mobile camera (the system is also compatible with dashboard cameras and runs on desktop), we created a simple, intuitive approach that any driver can use. The alarm played when our application detects eye drowsiness can keep tired drivers alert, improving safety for the driver, passengers, and other vehicles. When a user repeatedly shows signs of sleepiness at the wheel, the app notifies emergency contacts that the driver may be in danger and can even route the driver to a gas station rest stop. iAwake has the potential to save lives and prevent accidents through a personalized user experience, reminding everyone that it is okay to stop and take a break from driving if it means keeping you and the people around you safe.

Feasibility/Practicality: Implementing iAwake on a large scale within the next three to five years is very feasible and practical; in fact, all of the fundamental functionality is already in place. iAwake could easily be adapted to use a driver-facing dashboard camera and a small hardware device such as a Raspberry Pi, or eventually a camera built into the car itself. Further improvements would include higher-precision imaging and faster detection. This would require only a simple hardware setup in the car, and on a larger scale the system could be added by manufacturers down the line. For the iOS implementation, as FaceID and Apple's ARKit continue to improve in accuracy and become more readily available in future products, our app will reach a wider audience.
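As a rough illustration of how little hardware the in-car setup needs, here is a minimal sketch of a Raspberry Pi reading frames from a USB dashboard camera with OpenCV and handing them to a detection routine; the eyes_look_closed stub stands in for the real detector and is purely hypothetical.

```python
# Minimal capture loop for a Raspberry Pi with a USB dashboard camera.
# eyes_look_closed is a stand-in for the actual eye-closure check.
import cv2

def eyes_look_closed(frame) -> bool:
    """Placeholder for the real detection routine."""
    return False

def run_capture(device_index: int = 0) -> None:
    cap = cv2.VideoCapture(device_index)  # e.g. /dev/video0 on the Pi
    try:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break
            if eyes_look_closed(frame):
                print("Drowsiness suspected: play alarm")
    finally:
        cap.release()

if __name__ == "__main__":
    run_capture()
```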

Technical Difficulty: Over the course of the hackathon, we made several pivotal shifts that ultimately led to iAwake. We initially started in a completely different problem space, food waste, but realized by Saturday morning how difficult it would be to crawl the normalizers we needed for food transportation data. We then began with a simple desktop implementation, building on top of an open source computer vision library. We modified parts of that implementation to focus strictly on eye movement, and wrote our own eye-drowsiness algorithm that accounts for the time elapsed since the eyes closed (sketched below). Each of the APIs and algorithms involved had to be learned and built up from scratch. We then took this one step further by creating an iOS app with similar features and no starter code at all. Built with Apple's ARKit, Alamofire, the Twilio and Google APIs, and Adobe Photoshop, the iOS app was challenging precisely because it was a learning experience for all of us. As first-time hackers, we created two apps that we are very proud of and that do their job.
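A minimal sketch of this kind of elapsed-time eye-closure check uses the eye aspect ratio (EAR) computed from six landmarks per eye (as produced by, for example, dlib's 68-point face predictor). The threshold and timing constants below are illustrative assumptions, not the exact values used in iAwake.

```python
# Elapsed-time eye-closure check based on the eye aspect ratio (EAR).
# Assumes per-frame eye landmarks are already available from a face/landmark
# detector; constants here are illustrative only.
import time
from scipy.spatial import distance as dist

EAR_THRESHOLD = 0.25            # below this, the eye is treated as closed
CLOSED_SECONDS_FOR_ALARM = 2.0  # how long the eyes must stay closed

def eye_aspect_ratio(eye):
    """eye: six (x, y) landmark points around one eye."""
    a = dist.euclidean(eye[1], eye[5])
    b = dist.euclidean(eye[2], eye[4])
    c = dist.euclidean(eye[0], eye[3])
    return (a + b) / (2.0 * c)

class DrowsinessMonitor:
    def __init__(self):
        self.closed_since = None

    def update(self, left_eye, right_eye) -> bool:
        """Return True once the eyes have stayed closed long enough to alarm."""
        ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
        now = time.monotonic()
        if ear < EAR_THRESHOLD:
            if self.closed_since is None:
                self.closed_since = now
            return now - self.closed_since >= CLOSED_SECONDS_FOR_ALARM
        self.closed_since = None
        return False
```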

Polish / Design: The hack is fully usable in its current state in both the desktop and iOS implementations. There are no known bugs, and the interface is straightforward and easy to use. Given more time, we would finish implementing automatic place navigation in the iOS app so that users can be directed to the nearest gas station (see the sketch below). With a modern logo and a simple interface, we aim to provide a polished experience for drivers.
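As a sketch of how the nearest-gas-station lookup can work, the snippet below queries the Google Places Nearby Search web service, ranked by distance from the driver's coordinates; the API key handling and the nearest_gas_station helper are illustrative assumptions rather than iAwake's exact code.

```python
# Nearest gas station via the Google Places Nearby Search web service.
# The API key is read from a placeholder environment variable; error
# handling is omitted for brevity.
import os
import requests

PLACES_URL = "https://maps.googleapis.com/maps/api/place/nearbysearch/json"

def nearest_gas_station(lat: float, lng: float):
    params = {
        "location": f"{lat},{lng}",
        "rankby": "distance",     # closest results first
        "type": "gas_station",
        "key": os.environ["GOOGLE_MAPS_API_KEY"],
    }
    results = requests.get(PLACES_URL, params=params, timeout=10).json().get("results", [])
    return results[0] if results else None
```

The first result's name and geometry.location can then be handed off to a maps deep link or turn-by-turn navigation for routing the driver.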
