Inspiration
We were inspired to work on this project after noticing the lack of ADA-certified infrastructure and walkways throughout many underdeveloped communities in the country. This gap can hinder people with impaired vision from accessing the resources they need throughout their city, town, or neighborhood. As such, we wanted to design a system that travels with the user wherever they go and augments their navigation capabilities in these underdeveloped areas.
What it does
The smart cane uses a small-scale LiDAR sensor to scan the ground up to 6 ft ahead of the cane's position. The cane establishes a base height indicating that the ground being walked on is level; if the LiDAR detects a significant change in the elevation or "flatness" of the ground, it signals a set of vibrating motors on the cane to gently alert the user that the direction they are pointing the cane toward has a curb, a hole, or some other change in ground level.
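The baseline-and-deviation logic above can be sketched as follows. This is a minimal illustration; the function names, sample count, and threshold value are assumptions, not the actual CaneYouSee implementation.

```python
# Sketch of the ground-level change detection described above.
# BASELINE_SAMPLES and THRESHOLD_MM are illustrative assumptions.

BASELINE_SAMPLES = 50   # readings averaged to establish "flat ground"
THRESHOLD_MM = 40       # deviation that counts as a curb or hole

def calibrate(readings):
    """Average the first readings to establish the base ground distance."""
    samples = readings[:BASELINE_SAMPLES]
    return sum(samples) / len(samples)

def detect_change(distance_mm, baseline_mm, threshold_mm=THRESHOLD_MM):
    """Return True if the ground deviates enough to warn the user."""
    return abs(distance_mm - baseline_mm) > threshold_mm

# A reading well beyond the baseline suggests a drop-off ahead -> vibrate.
baseline = calibrate([1200, 1198, 1203] * 20)
assert detect_change(1300, baseline)
assert not detect_change(1205, baseline)
```

In practice the threshold would be tuned against the sensor's noise floor so that normal surface texture does not trigger the motors.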
How we built it
We built our device using distance sensors and vibration motors from the IF Magic company. The distance sensors needed to be positioned at an angle in order to accurately calculate the distance to ground ledges and drop-offs. We then processed the sensor readings, plotted them on the main UI graphs, and drove the vibration motors to immediately alert the user. Our software stack was Python for the backend logic and device communication, JavaScript to handle real-time data flow, and CSS to design the user interface.
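The angled mounting matters because a tilted range sensor reads a longer distance when the ground drops away. A rough sketch of that geometry is below; the mounting height and tilt angle are assumed values for illustration, not our actual measurements.

```python
import math

# Sketch: converting an angled range reading into an estimated ground drop.
# MOUNT_HEIGHT_M and TILT_DEG are illustrative assumptions.

MOUNT_HEIGHT_M = 0.8   # sensor height above flat ground (assumed)
TILT_DEG = 40.0        # sensor tilt measured from vertical (assumed)

def expected_flat_range(height_m=MOUNT_HEIGHT_M, tilt_deg=TILT_DEG):
    """Range the sensor should report when the ground ahead is level."""
    return height_m / math.cos(math.radians(tilt_deg))

def estimated_drop(measured_range_m, tilt_deg=TILT_DEG):
    """Vertical drop implied by a longer-than-expected range reading."""
    extra = measured_range_m - expected_flat_range()
    return extra * math.cos(math.radians(tilt_deg))
```

With these assumed numbers, flat ground reads roughly 1.04 m of range, and any measurement noticeably beyond that maps back to an estimated curb or hole depth.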
Challenges we ran into
The main obstacles to this project were mechanical design, hardware compatibility, and integration.
Mechanical design
It was hard to attach the sensor to the cane in a way that accurately measured the change in ground level. This was because the sensor needed to be angled relative to the cane's axis. We iterated through several designs, but eventually landed on a design that would make MacGyver proud.
Hardware compatibility
We decided to use the IF Magic framework for bridging the physical world (distance sensors, vibration modules) and the software (frontend, algorithm to detect ground-level changes, etc.). The IF Magic team was incredible. We were pushing the limits of the software package and forcing the development team to quickly push updates and bug fixes in response to our use case.
Integration
We had to coordinate many different devices: IF Magic provided us with 2 separate processor modules and 2 different sensors. Streaming the data was relatively straightforward in Python (thanks to their library), but the main challenge was fetching the processed data and displaying it on the frontend in real time. We were stuck for a while because there was a very noticeable 1-second delay between an object being detected and a vibration being triggered. To overcome this, we switched to using IF Magic's 'Equations', while still displaying real-time data via our own dashboard.
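The fix boiled down to separating the alert path from the display path: the vibration decision runs directly in the sensor callback, while UI updates go through a queue so dashboard latency can never delay the alert. A minimal sketch of that split is below; the function names are hypothetical stand-ins, not IF Magic's real API.

```python
import queue

# Sketch of splitting the fast alert path from the slow display path.
# on_sensor_reading and vibrate are hypothetical names, not IF Magic's API.

ui_queue = queue.Queue()

def on_sensor_reading(distance_mm, baseline_mm, vibrate, threshold_mm=40):
    # Fast path: decide and actuate immediately in the callback.
    if abs(distance_mm - baseline_mm) > threshold_mm:
        vibrate()
    # Slow path: hand the sample to the dashboard without blocking.
    ui_queue.put(distance_mm)

def ui_worker(handle_sample):
    """Run on a separate thread: drain samples into the dashboard."""
    while True:
        handle_sample(ui_queue.get())
```

The point of the design is that nothing on the dashboard side sits between a detection and the motor firing, which is what kept us under a usable response time.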
Accomplishments that we're proud of
Despite the challenges, we made a working prototype within 36 hours. It's not perfect, but the proof of concept is there and we see many opportunities for improvement. Also, against all odds, we affixed our sensor to our cane without damage. Finally, we are happy that we could make a project which has a positive impact on the community.
What we learned
Integration is the hardest part of any project. We all were able to complete our individual tasks but in doing so we made many assumptions on the format and structure of the input/output data. It was only when we came together to try and connect the pieces that we had to establish clearer abstractions.
We also learned that physical feedback to the user needs to have a delay of less than 100 ms, otherwise it's unusable.
What's next for CaneYouSee
There are many directions we could take CaneYouSee. There are general improvements involving UI, latency, and the false-positive rate, as well as more involved endeavors such as adding additional sensors and training ML models to decide when to vibrate instead of relying on hard-coded algorithms.
Built With
- ifmagic
- javascript
- python