"My Voice" is a liberating spelling tool for non-speakers. It provides rich, expressive voices through ElevenLabs for personal expression, a fully customizable communication board, and different visual themes to truly put the user first. Add your own first name, and make it your own! Unlike other text-to-speech assistive devices, this requires no specialized hardware, works on any browser-enabled device, and can be used with the support of a caregiver or aide, or independently for those with more gross and fine motor skill function.

What inspired this?

I had in mind non-speaking individuals who need their voices heard, but for whom expensive tech can get in the way. Specifically, I've been getting to know the beautiful friends profiled on Ky Dickens' "Telepathy Tapes" podcast, and our family recently watched the movie Out of My Mind, in which the main character, Melody, experiences revolutionary change through a computerized tool. Some folks have entire inner worlds to express, but many lack the tools to do so. My own kids are on the spectrum, I've worked with clients who support folks with autism, and I love the idea of a user-centered, creative, expressive tool that puts the person in the driver's seat. I wondered: what if Bolt could build a free, web-based, self-serve, user-centered spelling tool? And lo and behold, it did!

Features: ElevenLabs voice integration, live transcript/captioning room monitoring, customizable boards, personalized UI themes (Gamer, Play, Zen).

How I built it

This was created start-to-finish in Bolt.new, from the opening prompt to the closing credits. Once I could verify simple sentence-building text-to-speech, I went straight to visual theme options, as that aspect of customization was important. My kiddos were my beta testers throughout the process, poking and prodding it and helping me see where it was too simple, or just right. I went through some wacky options: at one point, it visualized an iMessage-style conversation on screen, with room noise appearing as recipient text bubbles and "Speak" transmissions as blue bubbles on the right. It was cute, but a little much! I focused on trimming and cutting back to get it right.

What I Learned

I recognize the assistive device community is way ahead of me on all this! There are great tools out there. I relied on my wife, who is a physiotherapist, to point me towards Speech-Language Pathology communication boards, which became my starting reference. I'd love to keep learning from people with lived experience in this space to understand how to better suit real-world needs.

Challenges

I started with simple text-to-speech using Mac voices. It worked, but it wasn't the dignified experience I had dreamt of. As a non-developer, it was hard to figure out how to call the ElevenLabs API via a Supabase Edge Function and bring it into Bolt, but IT WORKED! I was so happy, I called all my kids over.
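For the curious, an Edge Function along these lines can be sketched roughly as below. This is only an illustration of the pattern, not the project's actual code: the voice ID, model name, placeholder key, and helper names (`buildTtsRequest`, `handler`) are all assumptions, though the ElevenLabs endpoint shape (`/v1/text-to-speech/{voice_id}` with an `xi-api-key` header) follows the public API.

```typescript
// Sketch of a Supabase Edge Function that proxies browser text-to-speech
// requests to ElevenLabs, keeping the API key on the server side.
// All names below are illustrative, not the project's real code.

// Build the ElevenLabs request without sending it, so the call's shape
// is easy to inspect (and test) separately from the network step.
export function buildTtsRequest(voiceId: string, text: string, apiKey: string) {
  return {
    url: `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    init: {
      method: "POST",
      headers: {
        "xi-api-key": apiKey, // ElevenLabs authenticates via this header
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        text,
        model_id: "eleven_multilingual_v2", // illustrative model choice
      }),
    },
  };
}

// Handler: receives { text } from the app and streams back MP3 audio.
// In a real deployment the key would come from a Supabase secret
// (e.g. Deno.env.get("ELEVENLABS_API_KEY")), never be hard-coded.
export async function handler(req: Request): Promise<Response> {
  const { text } = await req.json();
  const apiKey = "YOUR_ELEVENLABS_API_KEY"; // placeholder
  const { url, init } = buildTtsRequest("YOUR_VOICE_ID", text, apiKey);
  const upstream = await fetch(url, init);
  if (!upstream.ok) {
    return new Response("TTS request failed", { status: 502 });
  }
  return new Response(upstream.body, {
    headers: { "Content-Type": "audio/mpeg" },
  });
}
```

Routing the call through an Edge Function like this means the browser never sees the ElevenLabs key, which is the main reason to add the extra hop.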

The future is partnership

This was built with love and some hope by Kevan Gilbert, for the Bolt hackathon in 2025, in under a day. For it to be truly useful, it must be co-created in partnership with folks with lived experience. I recognize this simple tool will have many gaps!

I imagine a future version that...

  • Allows user accounts, to save customizations.
  • Allows custom voices, to suit users' preferences more deeply.
  • Offers more themes and more customization!
  • Builds in LLM support, with a tool like ChatGPT/Claude, for even more remarkable personalization.
  • Includes an onboarding/setup wizard to help populate the board with everything you love.
  • Adds the chance to send text messages one-on-one and in a group chat!
  • Integrates with key hardware tools: switches, eye tracking…the list goes on!

If you or someone you love is a speller or non-speaker and wishes to collaborate on building new, better versions, please get in touch! [myfirstnamelastname]@populargooglemailservice.com


Built With

  • bolt.new
  • elevenlabs
  • supabase