Inspiration
Dash cams are useful and can provide limited data in simple accidents, but in more complex and confusing accidents (brake-checking incidents, collisions involving multiple cars, etc.) they do little to provide clarity or evidence.
We built Dash Cam Pro to redefine the way drivers use their dash cams: the iPhone itself becomes the dash cam, and its sensors, like LiDAR, the TrueDepth camera, and the accelerometer, capture data a normal dash cam cannot (instantaneous acceleration, for example). Dash Cam Pro can even reconstruct an accurate, interactive 3D model of the accident in real time and save it for later, so interested parties can examine the scene and objects of interest for further evidence and data.
What it does
During a drive, while the phone is mounted on the car's dashboard, Dash Cam Pro records a continuous RGB + D stream (video with per-pixel depth data). Simultaneously, the phone's accelerometer is queried at short intervals to check for high g-force impacts that would indicate a crash. If a crash is detected, the recorded RGB-D stream is sent in real time to a cloud-based Python application that converts it into an interactive 3D model using the video's depth and texture data.
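To give a flavor of the crash check, here is a minimal Swift sketch of polling the accelerometer with CoreMotion and firing a callback on a high g-force reading. The 4 g threshold and 50 ms polling interval are illustrative assumptions, not the app's actual values.

```swift
import Foundation
import CoreMotion

final class CrashDetector {
    private let motion = CMMotionManager()
    private let crashThreshold = 4.0 // g-force; illustrative value, not the app's real threshold

    func start(onCrash: @escaping () -> Void) {
        guard motion.isAccelerometerAvailable else { return }
        motion.accelerometerUpdateInterval = 0.05 // query every 50 ms (assumed interval)
        motion.startAccelerometerUpdates(to: .main) { data, _ in
            guard let a = data?.acceleration else { return }
            // Magnitude of the acceleration vector, in units of g.
            let g = sqrt(a.x * a.x + a.y * a.y + a.z * a.z)
            if g > self.crashThreshold {
                onCrash() // in a real app, debounce so one impact triggers once
            }
        }
    }
}
```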
To save memory, the continuous video stream is stored in a buffer. If the accelerometer doesn't detect a crash within a certain interval, the buffered video is deleted and replaced with new footage. If the accelerometer does detect a crash, the buffered video, plus 10 seconds of footage after the crash, is sent to the remote Python application. This gives the user access to the most critical crash data as close to the time of the crash as possible, with an interactive model of the scene immediately before, during, and immediately after the crash.
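A sketch of that rolling pre-crash buffer is below. The `Frame` type and the 30-second retention window are assumptions for illustration; the app's real buffer length isn't stated above, but the 10-second post-crash window is.

```swift
import Foundation

// One captured frame: RGB pixels plus the matching depth map (types assumed).
struct Frame {
    let timestamp: TimeInterval
    let rgb: Data
    let depth: Data
}

final class RollingBuffer {
    private var frames: [Frame] = []
    private let window: TimeInterval = 30 // keep the last 30 s (assumed length)

    func append(_ frame: Frame) {
        frames.append(frame)
        // Drop frames older than the retention window so memory stays bounded.
        while let first = frames.first, frame.timestamp - first.timestamp > window {
            frames.removeFirst()
        }
    }

    // On a detected crash: snapshot the buffer, keep recording for 10 more
    // seconds, then upload the pre-crash and post-crash frames together.
    func framesForUpload(crashTime: TimeInterval, postCrash: [Frame]) -> [Frame] {
        frames + postCrash.filter { $0.timestamp <= crashTime + 10 }
    }
}
```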
How we built it
The front-end dash cam video capture is built with Swift so we can access native iPhone features like LiDAR, TrueDepth, and the accelerometer.
Apple Kits we used include:
- AVFoundation to access the TrueDepth and LiDAR cameras (see the capture sketch after this list)
- SensorKit and CoreMotion to capture and process accelerometer data
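Here is a minimal sketch of setting up an AVFoundation session that produces both RGB frames and per-pixel depth. The device fallback order and filtering flag are assumptions; in practice the two outputs would also be paired with an AVCaptureDataOutputSynchronizer so each RGB frame arrives with its matching depth map.

```swift
import AVFoundation

func makeDepthSession() -> AVCaptureSession? {
    // Prefer the rear LiDAR camera; fall back to the front TrueDepth camera.
    guard let device = AVCaptureDevice.default(.builtInLiDARDepthCamera, for: .video, position: .back)
            ?? AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front),
          let input = try? AVCaptureDeviceInput(device: device) else { return nil }

    let session = AVCaptureSession()
    session.beginConfiguration()
    guard session.canAddInput(input) else { return nil }
    session.addInput(input)

    let videoOutput = AVCaptureVideoDataOutput()   // RGB frames
    let depthOutput = AVCaptureDepthDataOutput()   // per-pixel depth maps
    depthOutput.isFilteringEnabled = true          // smooth holes in the depth data
    guard session.canAddOutput(videoOutput), session.canAddOutput(depthOutput) else { return nil }
    session.addOutput(videoOutput)
    session.addOutput(depthOutput)

    session.commitConfiguration()
    return session
}
```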
The Python application was built with OpenCV and deployed on AWS SageMaker.
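The reconstruction itself runs server-side in Python/OpenCV, so the following Swift snippet is purely illustrative of the core idea: back-projecting a depth map into a 3D point cloud with the pinhole camera model. The intrinsics fx, fy, cx, cy are assumed inputs (they would come from the capture device's calibration data).

```swift
import simd

// Illustrative only: turns a depth map into 3D points via the pinhole model.
// X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy, where (u, v) is the pixel
// and Z is the depth at that pixel.
func backProject(depth: [[Float]], fx: Float, fy: Float, cx: Float, cy: Float) -> [SIMD3<Float>] {
    var points: [SIMD3<Float>] = []
    for (v, row) in depth.enumerated() {
        for (u, z) in row.enumerated() where z > 0 { // skip pixels with no depth
            let x = (Float(u) - cx) * z / fx
            let y = (Float(v) - cy) * z / fy
            points.append(SIMD3<Float>(x, y, z))
        }
    }
    return points
}
```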
Challenges we ran into
- Accessing TrueDepth and LiDar to stream depth data in addition to RGB data.
- Using the accelerometer to derive instantaneous velocity.