Inspiration
To improve the quality of life and safety of individuals who are blind or vision-impaired, with a new solution that brings greater independence to everyday life.
What it does
A vibrational language that collects real-time visual information (e.g., cars, people, traffic lights) about a user's surroundings and generates unique, identifying vibrations on their smart device to indicate nearby objects. All processing is done on-device, with no connection to the cloud, keeping the user's camera data private and secure. Our implementation paves the way for the app to communicate the location and distance of increasingly complex object libraries as Apple Watch technology improves.
How we built it
Using Joseph Redmon's (aka pjreddie) YOLOv3, converted from Darknet to Keras/TensorFlow and then to Core ML, we created an iPhone app that lets users sense the world around them. The machine learning model detects people, cars, and birds (for demo purposes) and tracks them in real time through the iPhone's iSight camera. The conversion was done with YAD2K and Core ML Community Tools. Through the Taptic Engine on the iPhone 6S and later, users can feel which specific object is around them. The Taptic Engine is much more precise than a regular vibration motor and can be programmed with a variety of combinations and frequencies. We controlled it through the Piano framework, which gives much easier access to the Taptic Engine and the ability to compose symphonies of different vibration styles. The app was built in Xcode 9.4.1 and run on an iPhone 6S running iOS 11.0.
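The per-frame flow described above can be sketched in a framework-free way: the Core ML model returns a set of boxes with class labels and confidences, and only the most prominent confident detection is surfaced so the Taptic Engine isn't asked to play overlapping patterns. The `Detection` fields and the 0.5 threshold are assumptions for illustration, not the app's actual values:

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # class name from the YOLOv3 model, e.g. "person"
    confidence: float  # detection score from the model, 0.0 - 1.0
    area: float        # normalized box area; a rough proxy for proximity

# Hypothetical cutoff; raising it trades recall for fewer false vibrations.
CONFIDENCE_THRESHOLD = 0.5

def object_to_announce(detections):
    """Pick the detection to render as a vibration this frame: the
    highest-confidence object above the threshold, breaking ties
    toward the larger (likely closer) bounding box."""
    confident = [d for d in detections if d.confidence >= CONFIDENCE_THRESHOLD]
    if not confident:
        return None
    return max(confident, key=lambda d: (d.confidence, d.area)).label
```

Selecting a single object per frame is also one lever in the speed-versus-accuracy balance noted below: the model can run on every frame while the haptic output stays sparse enough to be legible.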
Challenges we ran into
- Balancing speed against image-processing accuracy
- Working with Fitbit Studio and its various limitations
- Converting YOLOv3 from Darknet to Keras to Core ML, since Keras is one of the few formats Core ML officially supports for conversion
Accomplishments that we're proud of
- Creating something with the ability to positively impact lives
- Image recognition that was smooth and snappy
What we learned
- Application of ML
- Real time object detection
What's next for 6ixth Sense
Open-sourcing the project so people can contribute different vibration patterns for detecting different objects. Eventually, we want every common object in the database, creating a fully functional vibrational language for the blind.