Inspiration

We realized that modern IDEs assume two things: you can type efficiently, and you can visually process dense, nested syntax. For developers with motor impairments, RSI, or neurodivergent conditions like autism or dyslexia, a standard wall of code can be overwhelming or physically painful to navigate. We wanted to move coding beyond the screen and keyboard, turning syntax into a multi-sensory experience where logic has a physical "pulse."

What it does

Voice-to-Syntax: Users can speak natural logic (e.g., "Create a loop from 1 to 10"), and our AI engine instantly converts it into perfectly indented Python code.

Haptic Structure Rendering: This is our core innovation. As the code is read aloud, the device vibrates to indicate indentation levels. A simple if statement feels like a short buzz; a nested loop feels like a double-pulse. This allows visually impaired users to "feel" the shape of the logic.
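A minimal sketch of this idea: map a line's leading whitespace to a Vibration API pulse pattern, with one extra pulse per nesting level. The function names, pulse durations, and the 4-space indent assumption are illustrative, not our exact production values.

```typescript
// Map a line's indentation depth to a Vibration API pattern.
// Depth 0 -> one short buzz; each deeper level adds another pulse.
// The 80 ms pulse / 60 ms pause timings are illustrative values.
function indentToPattern(line: string, indentWidth = 4): number[] {
  const leading = line.length - line.trimStart().length;
  const depth = Math.floor(leading / indentWidth);
  const pattern: number[] = [];
  for (let i = 0; i <= depth; i++) {
    pattern.push(80);                 // vibrate for 80 ms
    if (i < depth) pattern.push(60);  // pause 60 ms between pulses
  }
  return pattern;
}

// On devices that support it (mostly Android browsers):
function buzzForLine(line: string): void {
  const nav = (globalThis as any).navigator;
  nav?.vibrate?.(indentToPattern(line));
}
```

So `if x:` at top level produces a single pulse, while a doubly nested line produces three, which is what lets a user "feel" the shape of the logic.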

Error Translation: We stripped away cryptic compiler errors. HaptiCode catches errors and uses AI to explain them in encouraging, plain English.

How we built it

Challenges we ran into

The "Invisible" API: The Vibration API is tricky: it doesn't work on most laptops or iOS devices due to hardware/OS restrictions. We had to develop and test on our laptops, deploy to the cloud, and then debug on Android phones in real time.
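This pushes you toward feature detection with a graceful fallback. A sketch of the pattern, assuming a logging fallback for laptop development (the function name is illustrative):

```typescript
// Feature-detect the Vibration API and fall back gracefully.
// navigator.vibrate exists in Chrome on Android; iOS Safari and most
// desktop browsers either omit it entirely or ignore the call.
function getHapticsBackend(): (pattern: number[]) => boolean {
  const nav = (globalThis as any).navigator;
  if (nav && typeof nav.vibrate === "function") {
    return (pattern) => nav.vibrate(pattern);
  }
  // Fallback for unsupported environments: log instead of buzzing,
  // so the rest of the app can stay oblivious to the hardware.
  return (pattern) => {
    console.log("haptics unavailable; pattern would be:", pattern);
    return false;
  };
}
```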

Prompt Engineering: Getting the AI to output only raw code without conversational filler (like "Here is your code:") was difficult. We had to strictly enforce JSON schemas and system instructions to ensure the code could be inserted directly into the editor.

Async Synchronization: Syncing the text-to-speech audio with the haptic vibration was a challenge: if they fell out of sync, the user would feel the vibration for line 3 while hearing line 4. We solved this with a recursive promise chain.
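The shape of that chain can be sketched as follows: only advance to line N+1 once line N's vibration and audio have both resolved. The `speak`/`vibrate` callbacks here are injected abstractions (in the browser they would wrap `speechSynthesis.speak`, resolving on its `end` event, and `navigator.vibrate`); this is a sketch of the pattern, not our exact code.

```typescript
// Recursive promise chain: vibrate for line N, then speak it, and only
// then recurse to line N + 1, so haptics and audio can never drift.
function playLines(
  lines: string[],
  vibrate: (line: string) => Promise<void>,
  speak: (line: string) => Promise<void>,
  index = 0
): Promise<void> {
  if (index >= lines.length) return Promise.resolve();
  return vibrate(lines[index])
    .then(() => speak(lines[index]))
    .then(() => playLines(lines, vibrate, speak, index + 1));
}
```

Because each step waits on the previous promise, a slow utterance simply delays the next vibration instead of letting the two streams race ahead of each other.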

Accomplishments that we're proud of

The "Pulse" Feature: Successfully mapping whitespace characters (tabs/spaces) to physical vibration patterns. Feeling the code "kick" when a loop starts was a magical moment.

Speed: Building a fully functional IDE with AI integration in under 12 hours.

Accessibility Design: We didn't just add features; we built the UI from the ground up with high contrast, large touch targets, and neuro-inclusive colors.

What we learned

Accessibility is Innovation: By solving for accessibility, we built a tool that is actually helpful for beginners too. "Simplified Errors" isn't just for accessibility; it's a great feature for anyone learning to code.

The Power of Web APIs: Browsers are incredibly powerful. We didn't need external hardware or complex drivers to create a haptic experience; the tools were already there in standard JavaScript.

Built With

  • accessibility
  • artificial-intelligence
  • haptics
  • next.js
  • openai
  • react
  • tailwind-css
  • typescript
  • web-speech-api