Inspiration
Technology has evolved by leaps and bounds over the past few decades, and the aim of technological advancement has always been the same: to create a better world. With the goal of making life easier and more accessible, we designed our project to address one of the biggest needs of a visually impaired person: spatial awareness. Through our product, 4Sight, users regain a sense of their surroundings.
What it does
The 4Sight is designed in the shape of a wand. It is fitted with five ultrasonic sensors and five independently driven haptic vibration motors under the grip. Each motor buzzes in the direction, or directions, in which a sensor detects an obstacle within a range of a few meters; the current prototype is accurate to within about a meter. The feedback grows increasingly rapid the closer you are to a surrounding obstacle.
How we built it
We mounted five ultrasonic sensors on a 3D-printed trapezoidal frame so that together they cover virtually every front-facing angle. The sensors connect to a Raspberry Pi Pico, the heart of the project. For the prototype, the Pico is tethered to a laptop, which serves both as its power source and as a console for the distance readouts used in troubleshooting (these outputs are for prototype testing only). The Pico also drives five vibration motors, routed out through the handle and seated in the grip, each paired with one of the five sensors. The code loaded onto the Pico determines which sensor drives which motor, and sets the frequency and duration of each vibration based on the proximity of the detected obstacle.
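The one-to-one sensor-to-motor routing described above could look like the following in outline. The indices and threshold here are hypothetical stand-ins, not the actual wiring or firmware:

```python
# Each of the five sensors drives exactly one motor at the matching grip
# position (indices are illustrative: 0 = far left ... 4 = far right).
SENSOR_TO_MOTOR = {0: 0, 1: 1, 2: 2, 3: 3, 4: 4}

def dispatch(readings_m: list[float], threshold_m: float = 3.0) -> set[int]:
    """Return the set of motor indices that should buzz, given one
    distance reading (in meters) per sensor."""
    return {
        SENSOR_TO_MOTOR[i]
        for i, d in enumerate(readings_m)
        if d < threshold_m
    }
```

A dispatch like this keeps the directional mapping explicit: an obstacle sensed on the left buzzes only the left side of the grip.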
Challenges we ran into
Our initial idea was to attach a single proximity sensor to a guiding cane so that a visually impaired user could get a slightly better sense of their surroundings, with feedback delivered as a beep through an attached speaker. The challenge was to make the product efficient and compact, and a genuine improvement over a walking cane, so we settled on the wand form factor. The next challenge was choosing a main processor. We started with a Raspberry Pi 3, which is too bulky to fit inside the wand without sacrificing the product's sleekness, so we switched to a Raspberry Pi Pico. We were comparatively unfamiliar with it, but sharpened our CircuitPython skills to use it effectively. As we scaled from one sensor to five and moved from audio to haptic feedback via vibration motors, getting the sensors to work asynchronously proved tough, particularly when multiple sensors detected obstacles at once. With some trial and error, we fit each motor's response into its own asynchronous function, all running simultaneously.
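The per-sensor asynchronous loops described above can be sketched with `asyncio` tasks. In this hedged sketch, `read_distance` and `buzz` are simulated stand-ins for the real hardware calls, and the loop runs a fixed number of iterations so the example terminates; the on-device firmware would loop forever:

```python
import asyncio

async def monitor(sensor_id, read_distance, buzz, threshold_m=3.0):
    """One independent loop per sensor: poll the sensor, buzz its
    paired motor if an obstacle is within range."""
    for _ in range(3):  # a real device would loop indefinitely
        d = await read_distance(sensor_id)
        if d < threshold_m:
            await buzz(sensor_id)
        await asyncio.sleep(0)  # yield so the other sensor tasks can run

async def main(readings):
    buzzed = []

    async def read_distance(i):
        return readings[i]  # simulated: fixed reading per sensor

    async def buzz(i):
        buzzed.append(i)  # simulated: record which motor fired

    # Five concurrent tasks, one per sensor, as in the final design.
    await asyncio.gather(*(monitor(i, read_distance, buzz) for i in range(5)))
    return buzzed
```

Structuring each sensor as its own task means a slow echo on one sensor never blocks feedback from the others, which was the crux of the asynchrony problem.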
Accomplishments that we're proud of
We are very proud of how far the project has come since its inception, evolving from a slightly improved walking cane into a true successor to existing visual-aid technology. The product is now just 20 cm long and detects obstacles over a range of a few meters, registering changes in distance as small as 10 cm, which is impressive given the constraints of the available sensors and the development time.
What's next for The 4Sight
While still in its prototype stage, the 4Sight has immense potential for greatness. Future prospects to look forward to:
- Better sensors with longer range, greater sensitivity, and higher accuracy, to refine the product further.
- A temperature-sensor add-on that alerts the user to extremely hot or cold objects, guarding against accidental burns from touching the wrong surface.
- A gyroscopic compass attachment that guides the user over long distances by indicating the cardinal directions, which could evolve into a built-in GPS interface.
- A camera attachment equipped with computer vision that identifies and relays surrounding objects using a machine-learning library.
- An application interface encompassing all of the above, driven by the Google or Siri voice assistant.
The possibilities are endless. The only limit is our imagination.
Built With
- 3dprinting
- circuitpython
- distancesensors
- python
- raspberry-pico
- vibratormotors