I recently visited a school for blind children in India, interviewed a few students, and listened to their stories. While researching IoT- and AI-based accessibility solutions, I wanted to give blind people the ability to sense obstacles at longer distances, and even through walls, using simple gadgets such as mobile phones. Many of the popular computer vision projects I looked at were flashy, expensive to compute, and solved fairly trivial problems. So I decided to build a Hyperbraille interface: instead of conveying just one letter or one image at a time on the display, my map conveys information about surrounding obstacles. The braille display can then be extended with RF sensors to detect humans in positions invisible to cameras, such as behind walls inside rooms.

What it does

Assuming there is an AI model or bot that can detect objects (for example, a model trained on ImageNet), my interface maps spatial information from the detected bounding boxes onto a hyperbraille map.
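The core mapping can be sketched as follows. This is a minimal sketch, assuming each detected box arrives as `{x, y, w, h}` in pixels with a top-left origin; the box format and image dimensions are my assumptions, not a specific detector's output.

```javascript
// Sketch: project detector bounding boxes onto a 64x64 binary hyperbraille grid.
// 0 = no obstruction, 1 = obstruction. Box shape {x, y, w, h} is an assumption.
const GRID = 64; // 64x64 hyperbraille grid

function boxesToGrid(boxes, imgW, imgH) {
  // Start with an all-zero grid (no obstructions anywhere).
  const grid = Array.from({ length: GRID }, () => new Array(GRID).fill(0));
  for (const { x, y, w, h } of boxes) {
    // Scale pixel coordinates down to grid cells, clamping to the grid edges.
    const x0 = Math.max(0, Math.floor((x / imgW) * GRID));
    const y0 = Math.max(0, Math.floor((y / imgH) * GRID));
    const x1 = Math.min(GRID - 1, Math.ceil(((x + w) / imgW) * GRID) - 1);
    const y1 = Math.min(GRID - 1, Math.ceil(((y + h) / imgH) * GRID) - 1);
    // Mark every cell covered by the box as an obstruction.
    for (let r = y0; r <= y1; r++) {
      for (let c = x0; c <= x1; c++) grid[r][c] = 1;
    }
  }
  return grid;
}
```

For example, a box covering the top-left quarter of a 640x480 image lights up roughly the top-left quarter of the grid.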

  1. The input is a raw image.
  2. Step 2 of the pipeline detects objects in the image using an assumed cloud-based AI model, easily accessible via an API call.
  3. Step 3 receives the bounding box coordinates and sizes and converts that information into binary form on a 64x64 hyperbraille grid.
  4. Step 4 displays the grid on a web interface: '1' marks an obstruction and '0' marks free space.

How I built it

I built it using JavaScript, HTML, and CSS, with API calls executed using Node.js.

Challenges I ran into

  1. Getting access to a hyperbraille software interface or UI/UX
  2. Training object detection models

Accomplishments that I'm proud of

  1. Building a solution that improves existing communication technology for blind people
  2. Taking AI and computer vision further by bridging the gap between humans and AI

What I learned

  1. I learned about braille display technologies such as taxel displays and BlindPAD, which also use binary signals to present information electronically.
  2. Humans can be sensed without cameras using RF signals, which could let my Hyperbraille map see through walls.

What's next for AI based HyperBraille Mapper

  1. Integrating with Google BrailleBack
  2. Integrating taxel displays and BlindPAD with my real-time grid
  3. Adding RF sensing capabilities to the map so blind people can see through walls

Built With

javascript, html, css, node.js
