We didn't come into DragonHacks with a fixed idea, but we knew we wanted to build something for the disabled community. After a few hours of researching the day-to-day challenges that community faces, we decided to focus on the visually impaired. One complication they face is navigating indoor spaces, so we took that as inspiration and set out to build a solution that assists with indoor navigation.
What it does
iCane consists of two products: a white cane and a mobile application. The mobile application uses Bluetooth beacons positioned at points of interest, such as the bathroom and the information desk. The app estimates the distance to each beacon and uses trigonometry to determine the phone's position, then provides directions to the destination, continuously alerting the user through headphones or a speaker. The white cane carries an ultrasonic sensor that vibrates when an object comes within a set threshold, plus a camera that can identify what the user is pointing at.
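The positioning math boils down to two steps: converting each beacon's RSSI to an approximate distance, then intersecting the distance circles. A minimal sketch of that idea (the constants and function names are illustrative assumptions, not our exact app code, which was written in Swift):

```python
import math

def rssi_to_distance(rssi, tx_power=-59, path_loss_exponent=2.0):
    """Estimate distance (meters) from RSSI using the log-distance model.
    tx_power is the expected RSSI at 1 m; both values are assumptions
    that need calibration per beacon and environment."""
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exponent))

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve for the phone's (x, y) given three beacon positions and
    distances. Subtracting the circle equations pairwise linearizes
    the system into two linear equations in x and y."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a = 2 * (x2 - x1)
    b = 2 * (y2 - y1)
    c = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d = 2 * (x3 - x2)
    e = 2 * (y3 - y2)
    f = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a * e - b * d
    return (c * e - b * f) / det, (a * f - c * d) / det
```

With beacons at (0, 0), (10, 0), and (0, 10) and the phone equidistant from all three, the solver returns the center point (5, 5).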
How we built it
We built the mobile application in Swift with Xcode. We used Gimbal beacons to broadcast their namespace and RSSI values. The white cane was built from a cane, a Raspberry Pi, an ultrasonic sensor (HC-SR04), a micro servo, a Logitech camera, and buttons.
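The HC-SR04 works by timing an echo pulse: the sensor pings, the sound bounces off an obstacle, and the round-trip time gives the distance. A sketch of the conversion on the Pi (GPIO wiring code omitted; the 50 cm threshold is an illustrative assumption, not our tuned value):

```python
SPEED_OF_SOUND_CM_S = 34300  # approximate speed of sound at room temperature

def pulse_to_distance_cm(pulse_duration_s):
    """Convert an HC-SR04 echo pulse duration to distance in cm.
    The sound travels to the obstacle and back, so divide by two."""
    return pulse_duration_s * SPEED_OF_SOUND_CM_S / 2

def should_vibrate(distance_cm, threshold_cm=50):
    """Trigger the cane's vibration motor when an object is closer
    than the threshold."""
    return distance_cm < threshold_cm
```

For example, a 1 ms echo pulse corresponds to roughly 17 cm, well within the threshold, so the cane would vibrate.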
Challenges we ran into
We had a lot of trouble with the beacons because of heavy RF interference. We also had trouble setting up the Raspberry Pi: we weren't given SD cards, and the Pi's existing image was corrupt, so we had to re-download and upgrade the OS. Image recognition proved harder than expected; the Clarifai API helped simplify that part.
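One standard way to cope with interference-induced RSSI jitter is to smooth the raw samples before converting them to distance. A hypothetical sketch of an exponential moving average filter (illustrative, not the exact mitigation we shipped):

```python
def smooth_rssi(readings, alpha=0.3):
    """Exponential moving average over raw RSSI samples.
    Lower alpha means heavier smoothing and slower response
    to the user actually moving."""
    smoothed = []
    avg = None
    for r in readings:
        avg = r if avg is None else alpha * r + (1 - alpha) * avg
        smoothed.append(avg)
    return smoothed
```

Averaging trades responsiveness for stability, which is usually the right trade-off for walking-speed indoor navigation.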
Accomplishments that we're proud of
We're proud to have a cheap, working end product that can assist the visually impaired. Given the 24-hour time constraint and the difficulties we faced, we accomplished a lot this weekend, and we were able to apply what we learned in class to a real-world project.
What we learned
We learned how to use beacons and how they interact via the Eddystone-URL and iBeacon formats. We learned more about the Gimbal SDK for iOS and how to use it. We figured out how to recognize an object from an image. Everything we learned this weekend will come in handy in the future.
What's next for iCane
We want to move forward and turn this into a professional product. This is just a prototype, so it doesn't look polished yet, and some things can still be tweaked. What makes it marketable is its design criteria: simple, cheap, and efficient. It would be a big help to the visually impaired community.