Inspiration
I’m not a hardcore developer. Most of my background is in testing and general IT, and outside of tech, I’m also a farmer who deeply values family life.
I discovered the World’s Largest Hackathon just the night before the deadline. Despite the time crunch, I felt inspired to prove that even someone who’s not a full-time coder can create something meaningful with modern tools.
I’ve always believed technology should simplify life for everyone, including people who might struggle with traditional apps—like those with motor impairments or busy multitaskers. That’s why I chose the Voice AI Challenge: to build a voice assistant that makes productivity effortless and accessible.
What it does
Voice Assistant – Productivity Made Simple is a web app that lets users:
- Add, complete, and manage tasks entirely by voice
- Set voice-activated reminders and alerts
- Get conversational responses using lifelike voices via ElevenLabs
- Choose from multiple English-language voice personalities
- Operate entirely hands-free or switch to manual typing
- Access the app seamlessly on any device with beautiful, accessible design
It’s built to help anyone stay organized and productive—even those who can’t always use a keyboard or screen.
How we built it
I built this app primarily with:
- Bolt.new for rapid development and deployment
- React 18 + TypeScript for modern frontend architecture
- ElevenLabs API for lifelike text-to-speech
- Web Speech API for real-time speech recognition
- Tailwind CSS for styling
- Netlify for deployment
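As a hedged sketch of the ElevenLabs side of this stack: the public v1 text-to-speech endpoint accepts a text payload plus a voice ID and returns audio bytes that the browser can play. The helper below only builds the request; the voice ID, API key, and model name are placeholders, and the exact request shape should be checked against the current ElevenLabs docs.

```typescript
// Builds (but does not send) an ElevenLabs text-to-speech request.
// All identifiers here are illustrative placeholders.
interface TtsRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildTtsRequest(text: string, voiceId: string, apiKey: string): TtsRequest {
  return {
    url: `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    headers: {
      "xi-api-key": apiKey,           // API key goes in a header, not the URL
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ text, model_id: "eleven_monolingual_v1" }),
  };
}

// In the browser, the fetch response body would be fed to an
// HTMLAudioElement (e.g. via a blob URL) for playback.
const req = buildTtsRequest("Task added", "voice-123", "demo-key");
```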
I wrote custom logic for:
- Parsing flexible voice commands
- Fuzzy matching task names
- Scheduling voice-based reminders
- Handling English text-to-speech
- Managing accessibility for diverse users
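The fuzzy-matching idea can be sketched as picking the stored task whose name has the smallest edit distance to what the user said. This is a minimal illustration, not the app's actual code; the acceptance threshold is a tunable guess.

```typescript
// Classic Levenshtein edit distance via dynamic programming.
function editDistance(a: string, b: string): number {
  const dp: number[][] = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) => (i === 0 ? j : j === 0 ? i : 0))
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1,                                      // deletion
        dp[i][j - 1] + 1,                                      // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1)     // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// Returns the closest task name, or undefined if nothing is close enough.
function bestMatch(spoken: string, tasks: string[]): string | undefined {
  let best: string | undefined;
  let bestDist = Infinity;
  for (const t of tasks) {
    const d = editDistance(spoken.toLowerCase(), t.toLowerCase());
    if (d < bestDist) { bestDist = d; best = t; }
  }
  // Accept only reasonably close matches (threshold is a guess, not tuned).
  return bestDist <= Math.max(2, Math.floor(spoken.length / 3)) ? best : undefined;
}
```

This tolerates the small mishearings speech recognition produces ("by groceries" still resolves to "Buy groceries") while rejecting utterances that match nothing.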
All data is stored locally for privacy and offline use.
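Local-only storage can be sketched as serializing the task list to `localStorage` under a single key. The interface and key name below are assumptions for illustration; the in-memory fallback just lets the same logic run outside a browser.

```typescript
// Hypothetical task shape; the real app's model may differ.
interface Task { id: number; title: string; done: boolean; }

// Fall back to an in-memory map when localStorage is unavailable (e.g. Node).
const memory = new Map<string, string>();
const g = globalThis as {
  localStorage?: { getItem(k: string): string | null; setItem(k: string, v: string): void };
};
const store = g.localStorage ?? {
  getItem: (k: string) => memory.get(k) ?? null,
  setItem: (k: string, v: string) => { memory.set(k, v); },
};

function saveTasks(tasks: Task[]): void {
  store.setItem("tasks", JSON.stringify(tasks));
}

function loadTasks(): Task[] {
  const raw = store.getItem("tasks");
  return raw ? (JSON.parse(raw) as Task[]) : [];
}
```

Because nothing leaves the device, the app keeps working offline and no task data is sent to a server.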
Challenges we ran into
- Time pressure: I only had one night to build the entire project, which pushed me to learn and build faster than ever before.
- Learning curve: I was new to Bolt.new and ElevenLabs and had to figure out how to integrate everything from scratch.
- Voice conflicts: Managing audio playback and speech recognition without overlapping was tricky.
- Natural language flexibility: Users say things in many ways, so building robust voice command parsing was challenging.
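One way to handle that flexibility, sketched under assumptions (the patterns and intent names below are illustrative, not the app's real grammar): normalize the transcript, then try a few intent patterns in order.

```typescript
// A tiny intent model for two commands plus a fallback.
type Intent =
  | { kind: "add"; title: string }
  | { kind: "complete"; title: string }
  | { kind: "unknown" };

function parseCommand(transcript: string): Intent {
  const text = transcript.trim().toLowerCase();
  // "add buy milk", "add a task to buy milk", "create task call mom", ...
  const add = text.match(/^(?:add|create|new)(?: a)?(?: task)?(?: to)? (.+)$/);
  if (add) return { kind: "add", title: add[1] };
  // "complete buy milk", "mark buy milk as done", "finish call mom", ...
  const done = text.match(/^(?:complete|finish|done with|mark) (.+?)(?: as done)?$/);
  if (done) return { kind: "complete", title: done[1] };
  return { kind: "unknown" };
}
```

Each optional group absorbs one common filler phrase, so many phrasings collapse to the same intent; anything unmatched falls through to `unknown`, where the assistant can ask the user to rephrase.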
Accomplishments that we're proud of
- Building my first full-stack voice assistant app overnight
- Successfully integrating ElevenLabs to produce natural voice responses
- Implementing fuzzy command matching for flexible voice input
- Creating an app that’s truly accessible for users with different abilities
- Deploying the entire app live on Netlify and including the “Built on Bolt.new” badge
What we learned
- How to rapidly build a full-stack app using Bolt.new
- How to integrate advanced voice technologies like ElevenLabs
- The importance of accessibility and inclusive design
- That with modern tools and determination, anyone can build innovative apps—even on a tight timeline
- How voice interactions can transform usability and user experience
What's next for Voice Assistant – Productivity Made Simple
- Add scheduled voice reminders that trigger automatically at set times
- Expand support for more languages and voices via ElevenLabs
- Connect to external services like calendars or email
- Add richer analytics to help users track their productivity
- Enhance the AI conversation engine for more natural dialogue
- Explore deployment as a PWA for offline usage
- Keep improving accessibility for all users
Built With
- bolt.new
- css
- elevenlabs-api (text-to-speech)
- html
- javascript
- localstorage
- lucide-react
- netlify
- react-18
- react-testing-library
- tailwind-css
- typescript
- vite
- vitest
- wcag-2.1-aa
- web-speech-api (speech recognition)