Inspiration

We work at Student Accessibility Services. Most days a student calls before they come over: is the office busy right now? Are the lights bright? Is the elevator working? They're not asking to skip their exam; they're asking if today is the kind of day their body can handle the trip.

We started Sensory because the answer to those calls shouldn't be "come and see."

What it does

Sensory is a map that tells you how a place will feel before you go.

Open it, and the streets you already know are layered with five things Google Maps doesn't show: how loud a place gets, how bright the lights are, how crowded it is, what it smells like, and where the exits are. You set what your body needs (quieter, dimmer, step-free, signed in your language) and the route is picked accordingly. If a venue isn't right today, you'll know before you're out the door.

You can also point your phone at any sign and have it read aloud in a voice you trust - your mom, your sister, whoever you recorded.

How we built it

The map is Google Maps with our data laid on top: 148 venues across USF and Tampa, each with a sensory profile a community can correct. MongoDB stores it, Gemini reads reviews and turns them into noise/light/crowd numbers, ElevenLabs gives us the calm voice. Everything else is Next.js, a lot of Tailwind.
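For the curious, here's a rough sketch of what one venue document and the review-scoring step look like. The field names, prompt, and model choice are illustrative rather than our exact code; the SDK calls are the standard @google/generative-ai ones.

```typescript
import { GoogleGenerativeAI } from "@google/generative-ai";

// Hypothetical shape of one venue document in MongoDB.
// Scores run 0 (calm) to 5 (intense); null means "no one has told us yet."
interface SensoryProfile {
  noise: number | null;
  light: number | null;
  crowd: number | null;
  smell: string | null;   // free text, e.g. "coffee, faint cleaner"
  exits: string[];        // e.g. ["front door", "side door to patio"]
}

interface Venue {
  name: string;
  location: { type: "Point"; coordinates: [number, number] }; // GeoJSON [lng, lat]
  profile: SensoryProfile;
  reportCount: number;    // how many observations back each score
}

const genAI = new GoogleGenerativeAI(process.env.GEMINI_API_KEY!);

// The review -> numbers step: ask Gemini for JSON, parse it, store it.
async function scoreReviews(reviews: string[]): Promise<Partial<SensoryProfile>> {
  const model = genAI.getGenerativeModel({
    model: "gemini-1.5-flash", // illustrative model choice
    generationConfig: { responseMimeType: "application/json" },
  });
  const prompt =
    "Rate these reviews for noise, light, and crowd from 0 (calm) to 5 (intense). " +
    'If a dimension is never mentioned, use null. Reply as JSON: {"noise":..,"light":..,"crowd":..}\n\n' +
    reviews.join("\n---\n");
  const result = await model.generateContent(prompt);
  return JSON.parse(result.response.text());
}
```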

The voice agent grew into a friend you can ask questions. The camera grew into something that can tell you whether you're heading the right way. The map grew a heartbeat: it pulses when you walk into a hard place.
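The heartbeat is the simplest piece: a geolocation watch checked against the venue profiles. A minimal sketch, reusing the Venue shape above; findNearestVenue and pulseMap are hypothetical stand-ins for our map code.

```typescript
const userNeeds = { maxNoise: 2 }; // set in the preferences screen (illustrative threshold)

// Hypothetical helpers standing in for the real lookup and animation code.
declare function findNearestVenue(lat: number, lng: number): Promise<Venue | null>;
declare function pulseMap(): void;

// Pulse whenever the walker enters a venue louder than they asked for.
navigator.geolocation.watchPosition(async (pos) => {
  const venue = await findNearestVenue(pos.coords.latitude, pos.coords.longitude);
  if (venue && venue.profile.noise !== null && venue.profile.noise > userNeeds.maxNoise) {
    pulseMap();
  }
});
```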

Challenges we ran into

Three real ones.

The first was honesty. Sensory data is fragile; getting "quiet" wrong is worse than saying nothing at all. We had to design the whole thing to say "I don't know yet" gracefully, and only score what people actually told us.
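In practice that meant nullable scores everywhere and copy that admits the gap. A tiny sketch of the rule (the helper name and labels are ours for illustration):

```typescript
// A dimension is only scored if someone actually reported it.
type Score = number | null;

// Hypothetical display helper: never guess, never show a fake zero.
function describeNoise(score: Score, reportCount: number): string {
  if (score === null || reportCount === 0) {
    return "We don't know yet. No one has reported on noise here.";
  }
  const labels = ["silent", "quiet", "moderate", "lively", "loud", "very loud"];
  return `${labels[Math.round(score)]}, based on ${reportCount} report${reportCount === 1 ? "" : "s"}`;
}
```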

The second was that haptics on iOS Safari don't exist. We can't fix that, so we layered haptic, voice, and visual cues together, so that no one misses the warning just because one channel is missing.
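The layering itself is plain feature detection; navigator.vibrate really is undefined on iOS Safari, while speechSynthesis ships everywhere. The cue details below are illustrative (in the app the voice comes from ElevenLabs, not the browser's built-in one):

```typescript
// Fire the same alert on every channel we have; missing channels fail silently.
function alertHardPlace(message: string) {
  // Haptic: supported on Android Chrome, absent on iOS Safari.
  if ("vibrate" in navigator) {
    navigator.vibrate([200, 100, 200]);
  }
  // Voice: the Web Speech API as a stand-in for our ElevenLabs playback.
  if ("speechSynthesis" in window) {
    window.speechSynthesis.speak(new SpeechSynthesisUtterance(message));
  }
  // Visual: always pulse the map, so the cue never depends on one sense.
  document.getElementById("map")?.classList.add("pulse"); // hypothetical CSS class
}
```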

The third was the demo. None of this matters if a stranger can't open the URL on their phone and immediately understand. We rebuilt the boot screen, the cards, and the bottom nav about four times until it felt like "welcome" instead of "learn this app."

Accomplishments that we're proud of

That my family member could open it on a phone at the dining table and find the quietest cafe within walking distance without anyone helping him.

That a sponsor asked it about wheelchair access and it answered like someone who'd actually been there.

That if you take a picture of a building and you're heading the wrong way, it tells you, gently, to turn left.

What we learned

The accommodations that change a person's day are not big. They're a sentence: "the back booths are softer." A vibration. A name spoken right.

We spent hours making the map do clever things, and the part people care about is whether the voice sounds like someone they love.

What's next for Sensory

More cities. Right now it knows USF and Tampa because that's where the students I see live. The data model is built so any community can drop their own venues in.
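Concretely, a new community would just insert venue documents in the same shape. A made-up example with the standard MongoDB driver:

```typescript
import { MongoClient } from "mongodb";

const client = new MongoClient(process.env.MONGODB_URI!);
await client.connect();

// Hypothetical seed document for another city, same Venue shape as above.
await client.db("sensory").collection("venues").insertOne({
  name: "Quiet Grounds Coffee",
  location: { type: "Point", coordinates: [-82.46, 27.95] },
  profile: { noise: 1, light: 2, crowd: 1, smell: "espresso", exits: ["front door"] },
  reportCount: 3,
});
```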

And the part I want most is letting people contribute their score not by typing a review but by simply being there for a minute and tapping yes or no. The lived experience of the people who need this should be what powers it, not us.

Built With

next.js, tailwind, mongodb, google-maps, gemini, elevenlabs