Inspiration
It's 2am. You're in pain. Googling symptoms leads you down a rabbit hole of worst-case scenarios. You can't afford the ER ($300+ just to walk in), and the earliest doctor's appointment is next week. You just need to know: is this urgent, or can it wait? We've watched friends choose between paying rent and seeing a doctor. We've seen people wait until it's too late because they couldn't afford early intervention. Healthcare should not cost you your peace of mind. That's why we built SensoryX: it gives everyone instant access to medical insights without the financial worry.
What it does
SensoryX is an AI-powered healthcare companion that supports patients in several ways:
- AI doctor consultation: Chat with our AI doctor for $0-35 (vs. $150-300 for a clinic visit). It offers voice-enabled consultations using ElevenLabs, instant responses, 24/7 availability, and both free and premium tiers.
- Symptom twin matching: Describe symptoms naturally ("Sharp pain behind my left eye when I swallow"). Our AI pipeline converts the description into a mathematical "pain signature" using vector embeddings, then searches hundreds of symptom records in our pre-populated database to find a "symptom twin," showing what conditions similar patients had and what helped them.
- Financial impact: Predicts how much your symptoms could cost if left untreated and shows your medical bankruptcy risk using financial intelligence.
- Predictive health intelligence: Real-time health risk scoring, personalized prevention recommendations, an early warning system for chronic conditions, and 3D data visualization using Three.js.
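The twin-matching step above boils down to nearest-neighbor search over symptom "signatures." Here is a deliberately simplified, in-memory sketch of that idea: the bag-of-words signature and the tiny record set below are illustrative stand-ins for the real embedding model and Pinecone index.

```python
import math
from collections import Counter

def toy_signature(text: str) -> Counter:
    # Stand-in for a real embedding model: a bag-of-words count vector.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Illustrative mini-database; the real system holds hundreds of records.
records = {
    "migraine": "sharp pain behind left eye light sensitivity",
    "strep throat": "sore throat pain when i swallow fever",
    "ankle sprain": "ankle swelling pain when walking",
}

def find_twin(symptom: str) -> str:
    # Return the record whose signature is most similar to the query.
    sig = toy_signature(symptom)
    return max(records, key=lambda k: cosine(sig, toy_signature(records[k])))
```

In production the same query shape runs against a vector index (one approximate nearest-neighbor lookup), so matching stays fast even as the record count grows.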
How we built it
Our frontend is built with React, Three.js, Framer Motion, and Tailwind CSS. The backend is a FastAPI server that communicates with Google's Gemini 2.5 Flash model to power our AI doctor and generates vector embeddings ("symptom signatures") for twin matching. We used Pinecone's vector database to find a symptom twin in seconds, a Snowflake warehouse holding millions of medical records, and Redis to cache conversations for instant responses. The ElevenLabs API handles speech-to-text and text-to-speech. Dedalus coordinates multiple AI specialist agents (triage, cardiology, neurology, etc.) and synthesizes the agents' recommendations into a final assessment.
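The agent-coordination pattern in the last step can be illustrated with a minimal sketch. The rule-based "agents" below are toy stand-ins for the Dedalus-coordinated LLM calls; the function names and keyword rules are purely illustrative.

```python
# Each specialist agent returns a structured report; a synthesizer merges them.

def triage_agent(symptoms: str) -> dict:
    # Toy urgency rule standing in for an LLM triage agent.
    urgent = any(w in symptoms.lower() for w in ("chest pain", "shortness of breath"))
    return {"agent": "triage", "urgency": "high" if urgent else "low"}

def cardiology_agent(symptoms: str) -> dict:
    # Toy relevance rule standing in for an LLM cardiology agent.
    relevant = "chest" in symptoms.lower() or "palpitations" in symptoms.lower()
    return {"agent": "cardiology", "note": "evaluate" if relevant else "n/a"}

def synthesize(symptoms: str) -> dict:
    # Fan out to all agents, then merge their reports into one assessment.
    reports = [triage_agent(symptoms), cardiology_agent(symptoms)]
    return {
        "urgency": next(r["urgency"] for r in reports if r["agent"] == "triage"),
        "specialist_notes": {r["agent"]: r["note"] for r in reports if "note" in r},
    }
```

The real system swaps each rule for a model call, but the shape is the same: independent specialist reports, merged by a final synthesis step.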
Challenges we ran into
We launched the AI doctor feature, and it kept returning generic fallback responses. After hours of debugging, we discovered that Google had deprecated the gemini-pro model name without clearly updating its docs. We had to track down the new model name (gemini-2.5-flash) and update our entire service.
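One way to harden against this class of failure is to keep an ordered preference list of model names and pick the first one the API actually reports as available, instead of hard-coding a single name. This is a sketch of the pattern, not our exact code; the names below are assumptions.

```python
# Newest-first list of model names we are willing to use (illustrative).
PREFERRED_MODELS = ["gemini-2.5-flash", "gemini-1.5-flash"]

def pick_model(available: set[str], preferred: list[str] = PREFERRED_MODELS) -> str:
    """Return the first preferred model the provider currently offers.

    `available` would come from the provider's list-models endpoint at
    startup, so a silent deprecation fails loudly instead of degrading
    into generic fallback responses.
    """
    for name in preferred:
        if name in available:
            return name
    raise RuntimeError(f"No supported model found; available: {sorted(available)}")
```

Failing fast at startup turns a confusing "the AI sounds generic" bug into an explicit, debuggable error.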
Accomplishments that we're proud of
We are proud that we built something genuinely useful, with real potential in the vast field of healthcare.
What we learned
We learned that vector embeddings are incredibly powerful for matching similar medical experiences, and that voice interfaces can make healthcare genuinely accessible. More importantly, we discovered that the financial barrier to healthcare is even worse than we thought—people are literally choosing between medical care and survival, and AI can actually help bridge that gap if we build it right.
What's next for SensoryX
We're deploying to production on DigitalOcean immediately, then partnering with free clinics to get SensoryX into underserved communities. Long-term goal: make it completely free for anyone who can't afford healthcare and actually reduce medical bankruptcies, because healthcare is a right, not a luxury.
Built With
- dedaluslabs
- digitalocean
- elevenlabsapi
- fastapi
- framer
- gemini
- lucide
- photon
- pinecone
- python
- react
- redis
- snowflake
- tailwind
- three.js
- typescript

