Inspiration

I still remember the first time I watched my close friend's nephew, Alex, struggle to recognize his classmates’ smiles in a playground. His eyes would scan each face, trying to decode emotions that came effortlessly to others. As I helped him navigate these social cues, I thought: what if there was a patient, always-available guide—like Anne Sullivan, teacher of Helen Keller—to walk him through every joyful grin or confused frown? That moment planted the seed for SulliFeel. We set out to blend cutting-edge AI with heartfelt empathy so that every child with ASD can learn social fluency at their own pace—and parents can finally see the journey unfold in real time.

What it does

  • Emotion Recognition Games
    Through playful “guess-the-feel” challenges, kids learn to match facial expressions and real-world scenarios—smiling faces in park photos, surprised looks in video clips, or empathy in animated stories.
  • Interactive Role-Play Scenarios
    Our avatar, “Annie,” engages users in simulated conversations: greeting neighbors, asking to join a game, or apologizing if they accidentally bump someone. Haptic feedback and voice prompts make each interaction feel tangible.
  • Moral Reasoning & Social Rules
    Story-driven modules tackle questions like “Why does saying ‘thank you’ matter?” with branching narratives that show outcomes based on the child’s choices—reinforcing right vs. wrong in a safe space.
  • Tailored Daily Routine Coach
    From brushing teeth to packing a backpack, Annie guides children through morning and evening rituals with gentle reminders, checklists, and animated rewards for completing each step.
  • Parent Insights Dashboard
    Real-time charts track progress: emotion-recognition scores over time, scenario-completion rates, and custom milestones. Parents receive weekly highlights and can adjust lesson difficulty or focus areas with a click.
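Under the hood, the weekly highlights reduce to simple aggregation over session records. A minimal sketch in TypeScript (the record shape and field names here are illustrative, not our actual schema):

```typescript
// Illustrative shape of one game session (not the actual SulliFeel schema).
interface SessionRecord {
  date: string;                               // ISO date, e.g. "2025-05-01"
  module: "emotion" | "scenario" | "routine"; // which lesson type was played
  correct: number;                            // answers the child got right
  total: number;                              // questions asked in the session
}

interface ModuleSummary {
  sessions: number; // sessions played in the period
  accuracy: number; // correct / total across all sessions, 0..1
}

// Roll raw sessions up into the per-module figures the dashboard charts.
function summarizeByModule(records: SessionRecord[]): Record<string, ModuleSummary> {
  const totals: Record<string, { sessions: number; correct: number; total: number }> = {};
  for (const r of records) {
    const t = (totals[r.module] ??= { sessions: 0, correct: 0, total: 0 });
    t.sessions += 1;
    t.correct += r.correct;
    t.total += r.total;
  }
  const out: Record<string, ModuleSummary> = {};
  for (const [module, t] of Object.entries(totals)) {
    out[module] = { sessions: t.sessions, accuracy: t.total ? t.correct / t.total : 0 };
  }
  return out;
}
```

The weekly highlight email is then just this summary filtered to the last seven days of records.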

How we built it

We wanted an agile, collaborative process—so we turned to bolt.new, an AI development platform that felt like having a collaborator in the cloud.

  • Rapid Prototyping with bolt.new
    We sketched avatars and dialogue flows in plain English; bolt.new instantly generated working code snippets for our project, letting us iterate on Annie's voice and expressions without waiting days for manual implementation.
  • Integrating AI Models
    Using bolt.new’s built-in connectors, we plugged in the Tavus API to power digital-human conversation and interaction. No messy pipelines, just clean point-and-click integration.
  • Secure Cloud Backend
    bolt.new’s data modules handled encrypted storage and real-time sync to our React-based parent dashboard—saving us weeks of DevOps work and letting us focus on user experience. Netlify rounded out the stack with fast, agile deployments.
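bolt.new’s connector did the wiring for us, but conceptually the Tavus integration comes down to sending a conversation request whose context encodes Annie’s coaching style. A hypothetical sketch of the payload builder—field names such as `persona_id` and `conversational_context` are assumptions, so check Tavus’s API reference for the real schema:

```typescript
// Identifiers for Annie's Tavus persona and replica (placeholder values,
// not real credentials or IDs).
interface AnnieConfig {
  personaId: string;
  replicaId: string;
}

// Assumed request shape for starting a digital-human conversation.
interface ConversationRequest {
  persona_id: string;
  replica_id: string;
  conversational_context: string;
}

// Build the request body; the context string is where Annie's calm,
// patient coaching style lives.
function buildConversationRequest(cfg: AnnieConfig, childName: string): ConversationRequest {
  return {
    persona_id: cfg.personaId,
    replica_id: cfg.replicaId,
    conversational_context:
      `You are Annie, a calm, kind, and patient social-skills coach. ` +
      `You are talking with ${childName}. Use short, simple sentences, ` +
      `speak slowly, and praise effort, not just correct answers.`,
  };
}
```

In our setup the actual HTTP call (endpoint, auth header) was handled inside bolt.new’s connector, so application code only ever shapes payloads like this one.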

Challenges we ran into

  • Expression Accuracy in the Wild
    The hardest part was providing the right affordances and fine-tuning the digital avatar for children with special needs. After many iterations, we got Annie to speak and interact in a calm, kind, patient, and professional manner.
  • Adapting to Diverse Speech
    Some users speak softly or use alternate phrasing. Tuning our voice interface through bolt.new’s analytics helped us catch misrecognitions and retrain within hours—critical for keeping engagement high.
  • Balancing Simplicity and Depth
    Designing an interface intuitive enough for younger kids, yet rich enough to challenge older ones, required dozens of UX sprints. Our iterative A/B tests—managed end-to-end in bolt.new—guided us to the right mix of visuals, audio cues, and progress feedback.

Accomplishments that we’re proud of

  • MVP in 1 Month
    From idea to beta, we shipped three core modules—Emotion, Scenarios, and Dashboard—thanks to bolt.new’s speed.
  • Adaptive Avatar
    Annie now adjusts lesson pacing in real time—slowing down after errors or speeding up when mastery is detected—powered by bolt.new’s dynamic decision-tree engine and Tavus AI's perception.
  • Seamless Updates
    Post-launch tweaks and new content rollouts are now a breeze: design in Figma, generate code in bolt.new, and push to users without downtime.
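The adaptive pacing above can be pictured as a small decision rule over the child’s most recent answers. A simplified sketch—the three-speed model and the thresholds are illustrative, not bolt.new’s actual decision-tree engine:

```typescript
type Pace = "slow" | "normal" | "fast";

// Decide the next lesson pace from the child's last few answers
// (true = correct, false = error). Slows down after a run of errors,
// speeds up once high recent accuracy suggests mastery.
function nextPace(recentAnswers: boolean[], current: Pace): Pace {
  if (recentAnswers.length === 0) return current; // no signal yet, keep pace
  const correct = recentAnswers.filter(Boolean).length;
  const accuracy = correct / recentAnswers.length;
  if (accuracy < 0.5) return "slow";  // struggling: slow down and repeat
  if (accuracy > 0.85) return "fast"; // mastery: advance more quickly
  return "normal";
}
```

In the real system this signal is combined with Tavus AI’s perception of the child’s engagement, not accuracy alone.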

What we learned

  • Empathy at the Core: Technology must feel warm, not robotic. Personal touches—like Sullivan recalling a child’s favorite snack—create deeper engagement.
  • Iterate Fast, Iterate Often: Early feedback loops with both kids and parents exposed hidden UX gaps. bolt.new’s rapid build-test cycles saved countless redesign headaches.
  • Customization Wins: No two learners are the same. Giving parents control over lesson focus and difficulty transformed our tool from a one-size-fits-all into a truly personalized coach.

What’s next for Sullivan AI – SulliFeel

  1. iOS & Tablet Launch
    Widen our reach with optimized UI for iPad and iPhone—bolt.new’s cross-platform build tools will make this seamless.
  2. Augmented Reality Practice
    Imagine role-playing a checkout line in your living room! AR layers will bring real-world scenarios into safe, guided practice.
  3. Therapist & School Integrations
    APIs for special-ed platforms and telehealth services will let educators and therapists plug SulliFeel directly into their workflows.
  4. Global Language Support
    Expanding NLP modules in Spanish, Mandarin, and beyond—so every child worldwide can learn social fluency with Sullivan by their side.
