Project Story: Empowering Visually Impaired Artists Through Haptics and Innovation

Inspiration

This project was born out of a desire to redefine artistic expression for visually impaired individuals, utilizing the principles of cognitive extension through touch, tactile communication, and multisensory integration. During the 2025 MIT Reality Hackathon, we aimed to create a tool that not only empowers users to create but also enables them to connect with their work in a deeply immersive and accessible way. By incorporating haptic feedback and spatial mapping, we sought to bridge the gap between art and accessibility, ensuring inclusivity for all creators.

What We Learned

This journey taught us how tactile and haptic technologies can transform accessibility in XR experiences:

  • Cognitive Extension Through Touch: Haptic feedback helps users build mental maps of their creations, improving spatial awareness and interaction.
  • Tactile Communication: Haptic devices allow users to feel and interact with objects, fostering deeper engagement with their art.
  • Multisensory Integration: Combining tactile feedback with auditory cues enhanced accessibility, making complex designs more tangible.
  • Memory and Learning Support: Engaging the sense of touch helped translate abstract concepts into meaningful, hands-on experiences.

We also gained valuable insights into designing user-friendly interfaces and systems tailored specifically to visually impaired individuals.

How We Built the Project

Our system transforms user input into a vector-based representation of an image, enabling tactile exploration. Using OpenCV, FastAPI, NumPy, and Render Deployments, we developed a service that outlines an image's silhouette and maps it onto anchor points on a 2D plane; camera frames are sent to the server for processing and the resulting vector data is returned to the client. Careful consideration was given to balancing the number of anchor points: enough to convey detail, yet few enough for users to navigate intuitively.

The Haptikos Exoskeleton (Version 1) was integrated into Unity, enabling precise, per-finger haptic feedback. Users can feel along the edges of their vectorized designs, experiencing their art through touch. This system provides a sense of physical anchoring, grounding users in their virtual environment and reducing disorientation.

To enhance guidance and accuracy, we incorporated the Logitech MX-Ink Stylus, which uses haptic vibrations to alert users when they deviate from intended vertices or anchor points. This real-time cue helps users refine their designs and understand cause-and-effect relationships in the virtual space.
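The stylus alert reduces to a simple decision: vibrate when the pen tip drifts beyond a tolerance from the nearest anchor. The sketch below shows only that decision logic (the `tolerance` value is illustrative); the actual vibration call happens through the stylus integration in Unity.

```python
import math

def deviation_alert(stylus_pos, anchors, tolerance=0.03):
    """Return (should_vibrate, nearest_anchor) for the current stylus position.

    should_vibrate is True when the stylus has drifted more than `tolerance`
    (in normalized plane units) from every anchor point.
    """
    sx, sy = stylus_pos
    nearest = min(anchors, key=lambda a: math.hypot(sx - a[0], sy - a[1]))
    dist = math.hypot(sx - nearest[0], sy - nearest[1])
    return dist > tolerance, nearest
```

Running this check every frame yields the feedback loop described above: silence while the stylus stays on target, a vibration pulse the moment it strays.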

By combining these technologies, we developed a cohesive system where visually impaired artists can "see" their work through touch, encouraging creativity, empowerment, and inclusivity in the digital art world.

Challenges Faced

  • Precision in Feedback: Ensuring haptic sensations were accurate and intuitive was a technical and design challenge.
  • Hardware Integration: Merging the functionality of the Haptikos Exoskeleton, Unity, and the Logitech MX-Ink Stylus into a seamless experience required significant effort.
  • User Accessibility: Designing the system to meet the unique needs of visually impaired users required continuous testing and iterative improvements.

Conclusion

By leveraging haptic feedback, cognitive extension through touch, and multisensory integration, we’ve created a tool that empowers visually impaired individuals to explore art and design in a way that is intuitive, immersive, and meaningful. This platform not only fosters creativity but also promotes inclusivity and accessibility, ensuring that everyone can participate in the digital art space.

The 2025 MIT Reality Hackathon provided the perfect opportunity to bring this vision to life, and we’re proud to have developed a solution that bridges technology and human connection in the most tangible way.

Built With

  • 2dvectormapping
  • c#
  • fastapi
  • hapticfeedbacksystem
  • haptikos
  • haptikosexoskeleton
  • logitechmxinkstylus
  • numpy
  • opencv
  • python
  • renderdeployments
  • unity