Inspiration

We wanted to build a practical, safe system that improves the daily lives of visually impaired people. To help them avoid everyday hazards, we built a system that measures their distance to oncoming surfaces, ledges, and other obstacles and alerts them through a notification service.

What it does

Our application combines software and hardware components to achieve its task. An Arduino program repeatedly samples an ultrasonic sensor to measure the distance between the user and nearby obstacles. If the reading falls within a low-to-high threshold range, the program reports that the user is safe; if it drops below the low threshold, the user is approaching a surface; and if it rises above the high threshold, the user is approaching a ledge, such as a staircase. In addition to this immediate, real-time hazard detection, the application interfaces with a camera via a button trigger to capture a photo each time the user takes a step. The photo is then analyzed with Google Gemini's image recognition to summarize the hazards the user is approaching. Together, both methods guide users safely through hazardous environments and make navigation less stressful.
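The three-way threshold logic above can be sketched in a few lines of Python. The cutoff values here are hypothetical placeholders; the real thresholds were tuned on the device:

```python
# Hypothetical thresholds in centimeters; the real values were tuned on-device.
LOW_THRESHOLD_CM = 30    # closer than this: approaching a surface
HIGH_THRESHOLD_CM = 120  # farther than this: possible ledge or drop-off

def classify_distance(distance_cm: float) -> str:
    """Map an ultrasonic distance reading to a hazard state."""
    if distance_cm < LOW_THRESHOLD_CM:
        return "surface"   # wall or obstacle directly ahead
    if distance_cm > HIGH_THRESHOLD_CM:
        return "ledge"     # floor dropped away, e.g. stairs
    return "safe"          # reading within the low-to-high band

print(classify_distance(15))   # surface
print(classify_distance(60))   # safe
print(classify_distance(200))  # ledge
```

Note that a "ledge" is signaled by an unusually *large* reading: when the floor drops away, the sensor's echo travels farther before returning.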

How we built it

  • We built EdgeSense as a full-stack system with a Python Flask backend that ingests sensor and camera data and sends alerts to users. On the hardware side, we configured an Arduino by wiring an ultrasonic sensor, a button, and a camera module to the appropriate digital and analog pins. We validated the circuit design by building a schematic in Tinkercad and testing it before uploading the program to the Arduino.
  • The Arduino firmware continuously reads sensor inputs and streams structured data over serial to the backend, where we implemented threshold-based hazard classification and persistent logging in JSON. We integrated AWS SES through boto3 for event-driven email alerts, handling authentication, region configuration, and request signing, maintaining a continuous flow from embedded data collection through backend processing to cloud notification.
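The backend pipeline described above can be sketched as three small pieces: parsing a serial line, appending it to a JSON log, and firing an SES email. The serial line format, file path, region, and addresses below are all assumptions for illustration, not the project's actual values:

```python
import json
from datetime import datetime, timezone

# Hypothetical serial line format: "<distance_cm>,<button_state>", e.g. "42.5,1".
def parse_reading(line: str) -> dict:
    """Parse one line streamed from the Arduino into a structured record."""
    distance, button = line.strip().split(",")
    return {
        "distance_cm": float(distance),
        "button_pressed": button == "1",
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

def log_reading(record: dict, path: str = "readings.json") -> None:
    """Append the record to a JSON-lines log (one JSON object per line)."""
    with open(path, "a") as f:
        f.write(json.dumps(record) + "\n")

def send_alert(subject: str, body: str, recipient: str) -> None:
    """Send an event-driven email alert through AWS SES via boto3.
    The region and addresses are placeholders; real values came from config."""
    import boto3  # deferred so the parsing/logging path has no AWS dependency
    ses = boto3.client("ses", region_name="us-east-1")
    ses.send_email(
        Source="alerts@example.com",
        Destination={"ToAddresses": [recipient]},
        Message={
            "Subject": {"Data": subject},
            "Body": {"Text": {"Data": body}},
        },
    )

print(parse_reading("42.5,1")["distance_cm"])  # 42.5
```

In a real deployment the backend would loop over `pyserial` reads, classify each parsed record against the thresholds, and call `send_alert` only on hazard transitions to avoid flooding the user's inbox.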

Challenges we ran into

  • The ultrasonic sensor was quite sensitive and reported inaccurate distances at close range. To work around this, we adjusted the threshold range to account for close-range noise, ensuring the user is alerted to potential hazards earlier.
  • Another challenge was that the sensor and the button sat on two different branches of the same circuit, so their outputs often fell out of sync. To fix this, we minimized the delay between the two branches in the Arduino program while increasing the delay after both branches completed, giving the sensor, the pressure-detecting button, and the camera enough time to process their inputs.
  • When the Gemini image-analysis script was running, capturing an image incurred high latency because the camera was opened and closed inside every method call. To speed this up, we opened the camera once outside the method call and kept it running for as long as the Arduino program ran, so images could be captured and analyzed quickly.
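The camera fix in the last bullet is essentially resource caching: pay the expensive open step once and reuse the handle for every capture. A minimal sketch of the pattern, using a simulated camera class as a stand-in for a real handle such as OpenCV's `VideoCapture` (whose constructor is the slow step):

```python
import time

class SlowCamera:
    """Stand-in for a real camera handle (e.g. cv2.VideoCapture) whose
    open step is expensive but whose per-frame read is cheap."""
    def __init__(self):
        time.sleep(0.05)  # simulate slow device initialization
    def read(self):
        return "frame"    # a real camera would return image bytes here
    def release(self):
        pass              # a real camera would free the device here

def capture_reopening_each_time():
    """The original approach: open and close inside every call (slow)."""
    cam = SlowCamera()
    frame = cam.read()
    cam.release()
    return frame

# The fix: open once at startup and share the handle for the whole session.
CAMERA = SlowCamera()

def capture_with_shared_camera():
    return CAMERA.read()

start = time.perf_counter()
for _ in range(5):
    capture_reopening_each_time()
slow = time.perf_counter() - start

start = time.perf_counter()
for _ in range(5):
    capture_with_shared_camera()
fast = time.perf_counter() - start

print(f"reopen each call: {slow:.2f}s, shared handle: {fast:.2f}s")
```

With five captures, the reopen-per-call version pays the initialization cost five times, while the shared handle pays it once up front.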

Accomplishments that we're proud of

  • We're proud of building a prototype of a tool with high potential to positively impact the lives of visually impaired individuals. With additional resources and development, the system can become smoother and more efficient.

What we learned

  • We learned that software and hardware can intersect to create high-impact user products. While many think of software and hardware as separate domains, building this project showed us that they complement each other well, and neither can stand alone in a highly performant system.

What's next for EdgeSense

  • We'd like to add sound by wiring a buzzer to the Arduino to alert the user more directly. We'd also want to miniaturize the system so it can fit on a shoe. Lastly, we're interested in triggering a flash in dark or dimly lit environments for added visibility and safety.