Inspiration

As a Network Engineering student, I’ve seen how technology often prioritizes speed over empathy. I wanted to build something for the neurodivergent community, specifically for people who experience "situational anxiety." The "unwritten rules" of places like doctors' offices and job interviews can be overwhelming, and I wanted to create a digital "shield" for navigating them.

What it does

The Gentle Guide is a situational accessibility app that provides a roadmap for high-stress environments. It offers:

- Expectations: a breakdown of the social and physical environment.
- Checklist: a "What to Bring" list so no essential items are forgotten.
- Social Scripts: contextual phrases for check-ins, asking for help, or requesting accommodations.
- Sensory Tips: proactive advice on managing lights, noise, and crowds.
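The four features above map naturally onto a single per-scenario record. A minimal sketch of that structure (the class and field names here are illustrative, not the app's actual code):

```python
from dataclasses import dataclass, field

@dataclass
class SituationGuide:
    """One high-stress scenario and its four kinds of support."""
    name: str
    expectations: list[str] = field(default_factory=list)    # social/physical environment
    checklist: list[str] = field(default_factory=list)       # "What to Bring"
    social_scripts: list[str] = field(default_factory=list)  # ready-made phrases
    sensory_tips: list[str] = field(default_factory=list)    # lights, noise, crowds

# Example entry for a doctor's appointment
doctor_visit = SituationGuide(
    name="Doctor's Office",
    expectations=["You will check in at the front desk before sitting down."],
    checklist=["Health card", "List of current medications"],
    social_scripts=["Hi, I have a 2:00 appointment. Is there a quieter place I can wait?"],
    sensory_tips=["Waiting rooms are often bright; it is okay to wear sunglasses."],
)
```

Keeping each scenario in one flat record like this means the UI can render a full "roadmap" page from a single lookup.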

How we built it

The app is built using Python and the Streamlit framework for a clean, accessible UI. I focused on a "Mock-AI" architecture to ensure 100% reliability and zero latency, which is critical for users already experiencing high stress.
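The core of that "Mock-AI" idea is a deterministic lookup instead of a live model call. A minimal sketch, with hypothetical keys and phrasing (the real app's content will differ):

```python
# "Mock-AI" layer: responses come from a hardcoded table rather than a
# live model, so every lookup is deterministic and instant.
MOCK_RESPONSES = {
    ("doctor", "social_script"): "Hi, I'm here for my appointment. Could I wait somewhere quiet?",
    ("interview", "sensory_tip"): "Arrive 10 minutes early so you can settle into a calm spot.",
}

DEFAULT = "Take a breath. You can ask a staff member for help at any time."

def get_guidance(place: str, kind: str) -> str:
    """Return a canned response; never raises, never waits on a network."""
    return MOCK_RESPONSES.get((place, kind), DEFAULT)
```

Because the function cannot fail and has no latency, a stressed user always gets an answer, which is the whole point of the pivot away from a live model.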

Challenges we ran into

The biggest hurdle was initially trying to integrate live AI models via the Vertex AI API. I faced persistent 404 errors and quota limits during the heat of the hackathon. I had to make a quick "engineering pivot" to hardcoded logic to ensure the user experience remained stable and predictable for the demo.
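That pivot can be generalized into a graceful-degradation pattern: attempt the live model, but fall back to hardcoded content on any error. A sketch under stated assumptions (`call_live_model` is a placeholder, not the real Vertex AI client; here it simulates the 404s hit during the hackathon):

```python
def call_live_model(prompt: str) -> str:
    # Stand-in for a real API call; simulates the endpoint errors
    # encountered mid-hackathon.
    raise ConnectionError("404: model endpoint not found")

FALLBACK = "Here is your saved checklist and social script."

def get_response(prompt: str) -> str:
    """Prefer the live model, but never surface a failure to the user."""
    try:
        return call_live_model(prompt)
    except Exception:
        return FALLBACK
```

With this shape, re-enabling live AI later is a one-function change, while the demo path stays stable and predictable.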

Accomplishments that we're proud of

I’m proud of creating a neurodiversity-first design that reduces cognitive load. I’m also proud of seeing the "Social Scripts" expand into different contextual situations, like asking for a quiet waiting room, which turns the idea into practical accessibility.

What we learned

I learned that in accessibility tech, reliability is a feature. While AI is powerful, a user in a sensory-overload crisis needs a tool that works instantly and offline without failing. I also sharpened my skills in Git version control and Streamlit UI development.

What's next for The Gentle Guide

I plan to integrate Text-to-Speech so the app can "speak" the social scripts for non-verbal users. I also want to add a community feature where users can submit their own "sensory ratings" for local places in Oakville and the GTA, turning the app into a living accessibility map.

Built With

Python, Streamlit