Inspiration
The inspiration for EasyLetter is my mother. She is an Indonesian immigrant living in the Netherlands, and the Dutch language is difficult for her. When she receives a letter from the government or a company, she often doesn’t fully understand it. This causes her to panic. She feels she’s lost control, and everything feels urgent. She then calls me and asks me to come immediately, even if I’m at work or already have plans.
Because I can imagine how she feels, I usually help her right away. But I’ll admit, sometimes it also causes frustration. I believe many immigrant families deal with the same issue. It’s hard for the parent, who feels panicked and helpless. And it’s hard for the son or daughter, who might feel guilty if they can’t help right away.
Based on this very real problem, I wanted to build a solution.
What It Does
EasyLetter transforms confusing letters into crystal-clear explanations and even reads them aloud using AI-powered audio narration.
Users simply take a photo of a letter. The app then provides:
- Simplified summaries at a 6th-grade reading level.
- Clear action items with deadlines.
- Comforting audio narration in their preferred language.
It’s specifically designed for elderly immigrant users who deserve to understand important letters without panic or confusion.
How We Built It
I built EasyLetter using Bolt.new, which was amazing for rapid development. I used Think Mode as much as possible, and occasionally debugged errors myself or with a programming assistant. But I would estimate that at least 95% was built using Bolt.new’s AI.
Simplifying the Letter
When a user uploads a photo, I use OCR to extract the text. I chose Tesseract, which runs in the browser. This way, the image isn’t sent to a server, protecting the user’s privacy.
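As a rough sketch, the client-side OCR step might look like this. The `extractText` helper and the `'nld+eng'` language setting are my illustrative assumptions; the recognizer is passed in as a parameter so that tesseract.js’s `Tesseract.recognize` (which resolves with `{ data: { text } }`) could be plugged in by the app:

```javascript
// Sketch of the client-side OCR step: the image never leaves the browser.
// `recognize` is injected for illustration; in the app it would be
// tesseract.js's Tesseract.recognize, which resolves with { data: { text } }.
async function extractText(imageFile, recognize) {
  // 'nld+eng' is an assumed language setting for Dutch letters.
  const { data } = await recognize(imageFile, 'nld+eng');
  return data.text.trim();
}
```

Running the OCR in the browser this way means the photo itself is never uploaded anywhere; only the extracted text goes on to the next step.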
Once I have the text, I use OpenAI prompts to simplify the message. These prompts don’t just translate; they:
- Create summaries,
- List clear action points,
- Use language appropriate to the user’s culture,
- Keep the reading level at around 6th grade.
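A minimal sketch of how such a prompt could be assembled; the helper name and exact wording are my illustrative assumptions, not the app’s actual prompts:

```javascript
// Illustrative prompt builder for the simplification step.
// The exact wording is an assumption, not EasyLetter's real prompt.
function buildSimplifyPrompt(letterText, language) {
  return [
    'You are helping an elderly immigrant understand an official letter.',
    `Respond in ${language}, at about a 6th-grade reading level.`,
    'Provide:',
    '1. A short, friendly summary.',
    '2. Clear action items, each with its deadline if one is mentioned.',
    '3. Culturally appropriate, respectful wording.',
    '',
    'Letter text:',
    letterText,
  ].join('\n');
}
```

The resulting string would then be sent as the user message of a chat-completion request.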
The Audio Narration
For the audio, I wrote a special OpenAI prompt that generates a comforting narration script. It’s designed to be clear and emotionally calming. The prompt guides the AI to:
- Use a logical structure,
- Include transitional narration words,
- Sound reassuring and kind, so the listener feels in control.
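As a sketch, the narration prompt could encode those three requirements like this (the wording is my illustrative assumption):

```javascript
// Illustrative builder for the narration-script prompt.
// The requirement phrasing is an assumption, not the app's real prompt.
function buildNarrationPrompt(summary, actionItems) {
  return [
    'Write a short script for a text-to-speech voice reading to an elderly listener.',
    'Requirements:',
    '- Follow a logical structure from greeting to conclusion.',
    '- Use transitional words such as "First", "Next", and "Finally".',
    '- Sound warm and reassuring, so the listener feels in control.',
    '',
    `Summary: ${summary}`,
    `Action items: ${actionItems.join('; ')}`,
  ].join('\n');
}
```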
This was especially important to me because most AI voices sounded cold or unsettling when reading the letters to my mother. That only added to her anxiety. ElevenLabs made a big difference. Their wide range of voice options allowed me to choose warm, reassuring voices for each language. I was genuinely impressed by the quality and emotional tone of their voices. It helped turn something stressful into something calming and supportive.
Customer Journey
To make sure it worked for my mom, I built and tested every step with her.
The app has 4 simple steps:
- Intro – explains the problem and shows how AI can help.
- How It Works – a short and clear overview.
- Upload – the user takes or selects a photo.
- Results – the app shows the summary, action points, and plays the narration.
Users can also share the results or start over.
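The four-step flow can be sketched as a tiny state helper; the step names mirror the screens above, but the real app’s React state handling may differ:

```javascript
// Illustrative model of the four-screen flow.
const STEPS = ['intro', 'how-it-works', 'upload', 'results'];

function nextStep(current) {
  const i = STEPS.indexOf(current);
  // Stay on the last screen; "start over" resets explicitly.
  return i < 0 || i === STEPS.length - 1 ? current : STEPS[i + 1];
}

function startOver() {
  return STEPS[0];
}
```

Keeping one screen per step, with a single obvious next action, is what made the flow manageable for an elderly user.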
Interface Design
I built the UI using React, focusing on elderly-friendly design:
- Big buttons
- Clear text
- Simple navigation
I tried to include these design goals in every Bolt.new prompt. My mom tested every screen. I kept refining the UI until she understood what to do at every step.
Challenges We Ran Into
User-Centered Design Challenge
The biggest challenge was creating a product my mother (and users like her) could actually use.
Initially, I built for desktop, then realized she only uses mobile. I also had everything on one page at first, but that overwhelmed her, so I switched to a step-by-step workflow.
Balancing technical features with real usability was tough. But now my mom can use the app, so we overcame that hurdle.
Audio Implementation Challenge
Getting the audio feature right was another major challenge.
The first 90% worked well. The last 10% took a lot of effort.
I learned that you get the best narration when you specify a structured format in your prompts. So I started requiring this structure as a must-have result in the prompt’s instructions:
- A 1-line intro,
- A clear summary,
- Bullet-pointed action items,
- A calming conclusion.
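Once the model returns those four parts, assembling them into one script is straightforward. This helper is my illustrative sketch, not the app’s actual code:

```javascript
// Illustrative assembly of the four required narration sections.
// Field names are assumptions about how the AI response might be parsed.
function assembleNarration({ intro, summary, actionItems, conclusion }) {
  const bullets = actionItems.map((item) => `- ${item}`).join('\n');
  return [intro, summary, 'What you need to do:', bullets, conclusion].join('\n\n');
}
```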
This made a huge difference. Through debugging and iteration, I got everything working smoothly.
Accomplishments That We're Proud Of
- A working product that solves a real problem.
- Family bonding: the process of testing with my mom brought us closer.
- AI for social good: a real example of AI helping in everyday life.
- A user-focused approach: my mom stayed at the center of every decision.
- A fully functional audio feature.
- Rapid development: thanks to Bolt.new, I went from idea to product quickly.
What We Learned
About Development with Bolt.new
Bolt is incredible for building 90% of a product quickly. But for the final 10%, you need to:
- Slow down and work step by step,
- Think more critically about prompts,
- Debug carefully.

I also learned:
- Don’t burn tokens by rushing to ask the AI before understanding the problem yourself.
- Sometimes AI adds complexity instead of solving the root issue.
- Refactor code often, because it keeps the app more stable. The AI tends to rewrite a whole file, so smaller files mean less chance of introducing errors.
About User-Centered Design
Building for a specific user helps focus the product. If I had thought more about who I was building for at the beginning, I would have gone mobile-first right away, or even built a native app. More upfront analysis could have saved time and effort.
What’s Next for EasyLetter
(Short-term) Improvements
- Make the app more stable: test it with more users, and add automated tests to the code.
- Add more languages to help more people.
- Experiment with other AI models: I currently use OpenAI, but I would like to see whether other models give different results.
- Add a chat feature so users can ask questions about their letter, with the AI answering in spoken audio or video. That way, users like my mom can have a real conversation with the app.
Long-Term Vision
I want to keep this site free for users. Right now, each analysis costs me money, but I know other families deal with the same issues we did. So I want to reach out to charities for donations to keep the site free for everyone.
This problem is real and widespread. If AI can help, it should. Long term, I hope this will be a good showcase of how we can use AI for social good.
