Inspiration
For the past 15 years, Julie has lived with retinitis pigmentosa, a condition that has gradually stolen her sight. Yet her longing to read and absorb the Word of God has never faded. Despite trying various apps and tools, none truly offered her the independence she craved—until now.
Thanks to the power of accessibility in myguidinglight.me, Julie can now invoke God’s Word by voice, effortlessly opening Scripture and engaging with its meaning on her terms. No longer does she need to rely on others to read aloud or navigate clunky interfaces.
This is more than a personal win for Julie. It marks a meaningful breakthrough for thousands like her—individuals who yearn to connect with the Bible independently, reflect deeply on verses, and find guidance without feeling like a burden or needing constant help from others.
Technology, in this case, isn’t just a tool—it’s a bridge to dignity, spiritual growth, and renewed confidence.
What it does
myguidinglight.me is a voice-activated web app that enables visually impaired users to independently access and interact with the Bible using AI and speech technology.
How we built it
This project was built using:
- React and TypeScript
- Vite as the development and build tool
- Tailwind CSS for styling
- ElevenLabs API for speech synthesis
- Gemini API for AI-powered responses
- Web Speech API for speech recognition
- Howler.js for background audio
- Local storage for persisting user settings and chat history
- Netlify for deployment
- Entri integration for custom domains
The project's UX was significantly shaped by discovery and usability tests conducted with individuals who are visually impaired. This inclusive design approach ensured that the application's features and interactions are accessible and intuitive for a broader range of users.
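The local-storage persistence mentioned above can be sketched roughly as follows. This is an illustrative sketch, not the app's actual code: the `ChatStore` class, the `guidinglight.chat` key, and the message shape are all assumptions. Injecting a storage object keeps the sketch runnable outside the browser.

```typescript
// Minimal subset of the Web Storage interface, so the store works with
// window.localStorage in the browser or an in-memory stand-in elsewhere.
interface KeyValueStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

// Hypothetical message shape; the real app's schema may differ.
interface ChatMessage {
  role: "user" | "assistant";
  text: string;
  timestamp: number;
}

// Hypothetical store; key name chosen for illustration only.
class ChatStore {
  constructor(
    private storage: KeyValueStore,
    private key = "guidinglight.chat"
  ) {}

  load(): ChatMessage[] {
    const raw = this.storage.getItem(this.key);
    if (raw === null) return [];
    try {
      return JSON.parse(raw) as ChatMessage[];
    } catch {
      return []; // corrupt data: start fresh rather than crash
    }
  }

  append(message: ChatMessage): void {
    const history = this.load();
    history.push(message);
    this.storage.setItem(this.key, JSON.stringify(history));
  }
}

// In-memory stand-in for localStorage, for environments without a DOM.
class MemoryStore implements KeyValueStore {
  private data = new Map<string, string>();
  getItem(key: string): string | null {
    return this.data.has(key) ? this.data.get(key)! : null;
  }
  setItem(key: string, value: string): void {
    this.data.set(key, value);
  }
}
```

In the browser, the same store would simply be constructed with `window.localStorage`, which satisfies the same interface.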
Challenges we ran into
- Connection with OpenAI API: Despite having a paid account, we were unable to make requests to the OpenAI API, consistently receiving 404 errors.
- Enabling Siri: Siri couldn’t be directly integrated with the web app, requiring a workaround for voice interaction.
- ElevenLabs Tokens: High token consumption led us to refactor our code for efficiency.
- Onboarding Experience: Custom onboarding for visually impaired users presented issues with mobile browsers, especially Safari and Chrome on iOS.
- Implementing Sound Effects: The Web Speech API couldn't cover our sound-effect requirements, so we hosted MP3 files on Supabase instead.
- GitHub Repository Access: Until the repository was made public, only its creator could push updates or create branches.
Accomplishments that we're proud of
We are proud that a tester with 95% vision loss was able to use the app and complete the happy path end to end. We are also proud that we pulled this off in such a short amount of time with the help of APIs (ElevenLabs, Gemini) and Bolt.new, and that we did it as a global team using our talents to drive a common good for mankind.
What we learned
We learned that it's easier to build a working prototype as a web application than as a native application, due to technical complexities such as:
- Bolt compilation issues
- Rendering in Expo
- Voice permissions
- Error handling
What's next for Guiding Light
Our next step is to share the app with:
- Friends
- Family
- Church communities
- Angel investors
Due to the cost of ElevenLabs tokens, we are considering monetizing the application instead of offering it completely free.
Features we plan to build
- Voice-controlled personalized onboarding
- Personalized authentication and conversation history stored on Supabase
- Multi-language support
Built With
- css
- elevenlabs
- entri
- figma
- gemini
- howler.js
- netlify
- react
- supabase
- tailwind
- typescript
- vite