Vora AI

Inspiration

I was inspired by the growing disconnect between technology and human emotion in e-commerce. While shopping online, I noticed how stressful it can be when prices feel overwhelming or when you're having a tough day and just need something comforting. I wanted to create a shopping experience that doesn't just sell products, but actually understands how you're feeling and responds with genuine empathy. The idea came from realizing that the best salespeople in physical stores can sense when someone is stressed and offer help - so why can't AI do the same?

What it does

Vora is an empathic shopping assistant that uses Hume AI's Empathic Voice Interface to detect emotions in real time and adapt the shopping experience accordingly. When I speak to Vora, she analyzes my vocal patterns to understand my emotional state - whether I'm calm, engaged, or stressed. Based on these emotions, Vora offers personalized empathy discounts (up to 10% off by default; sellers can raise the cap to 25%) when she detects frustration or stress in my voice.

The platform is entirely voice-driven: I can filter products, add items to cart, checkout, and even provide my delivery address just by speaking naturally. Vora responds with genuine care, offering comfort-driven suggestions and never being pushy about sales.

How we built it

I built Vora using Next.js 16 with TypeScript as the foundation. The core emotion detection comes from Hume AI's EVI SDK, which I integrated to analyze prosody scores from voice input. I calculate emotion discounts by extracting stress-related emotions (distress, frustration, anxiety, sadness) from Hume's response and converting the highest score to a discount percentage - with a maximum cap of 25% that sellers can configure.
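The discount logic described above can be sketched roughly as follows. The exact score-to-percentage mapping is an assumption, and names like `computeEmpathyDiscount` and `EmotionScores` are illustrative, not Hume's API:

```typescript
// Prosody output is a map of emotion names to scores in [0, 1].
type EmotionScores = Record<string, number>;

// The four stress-related emotions extracted from Hume's response.
const STRESS_EMOTIONS = ["distress", "frustration", "anxiety", "sadness"];

/**
 * Convert the highest stress-related score into a discount percentage,
 * capped at a seller-configurable maximum (25 by default here).
 * Assumes a linear mapping: a score of 1.0 earns the full cap.
 */
function computeEmpathyDiscount(scores: EmotionScores, capPct = 25): number {
  const maxStress = Math.max(0, ...STRESS_EMOTIONS.map((e) => scores[e] ?? 0));
  return Math.round(maxStress * capPct);
}

// Example: distress 0.6 dominates, so the discount is 0.6 * 25 = 15%.
computeEmpathyDiscount({ distress: 0.6, joy: 0.9 }); // 15
```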

For the backend, I used Sanity CMS for product management, Stripe for secure payments, and Firebase for order persistence and analytics. State management is handled by Zustand with persistence middleware. I implemented six voice tools that Hume EVI can call: filter_products, add_to_cart, trigger_checkout, apply_discount, collect_address, and navigate_to_orders.

The UI follows a "Zen" design philosophy with glassmorphism effects and emotion-driven color themes (teal for calm, amber for engaged, rose for stressed). I used Framer Motion for smooth animations and Recharts for real-time emotion analytics.
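The emotion-driven theming can be captured with a small lookup like the one below. The state names come from the description above, but the hex values and the `themeFor` helper are assumptions for illustration:

```typescript
// The three emotional states Vora distinguishes in the UI.
type EmotionState = "calm" | "engaged" | "stressed";

// Accent colors for each state (hex values are illustrative choices).
const EMOTION_THEMES: Record<EmotionState, { accent: string; label: string }> = {
  calm: { accent: "#14b8a6", label: "teal" },
  engaged: { accent: "#f59e0b", label: "amber" },
  stressed: { accent: "#f43f5e", label: "rose" },
};

function themeFor(state: EmotionState) {
  return EMOTION_THEMES[state];
}
```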

Challenges we ran into

The biggest challenge I faced was getting the emotion calculation right. Hume returns complex prosody data with dozens of emotion scores, and I had to figure out which emotions actually indicate stress or frustration that would warrant a discount. I experimented with different combinations before settling on the four key stress emotions.

Another major challenge was the voice tool integration with Hume EVI. The tool parameters come as JSON strings that need parsing, and I had to ensure proper error handling when tools fail. Getting the timing right for navigation (waiting for AI to finish speaking before redirecting) was also tricky.
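A guarded dispatcher along these lines handles the parsing problem. The `ToolCall` shape and handler responses here are hypothetical sketches, not the SDK's actual types; the point is that JSON parsing and unknown-tool lookups must fail gracefully:

```typescript
// EVI delivers tool arguments as a JSON-encoded string, not an object.
interface ToolCall {
  name: string;
  parameters: string;
}

type ToolHandler = (args: Record<string, unknown>) => string;

// One stub handler per voice tool (real handlers would mutate app state).
const handlers: Record<string, ToolHandler> = {
  filter_products: (a) => `Showing ${a.category ?? "all"} products`,
  add_to_cart: (a) => `Added ${a.productId} to the cart`,
  trigger_checkout: () => "Starting checkout",
  apply_discount: (a) => `Applied a ${a.percent}% empathy discount`,
  collect_address: (a) => `Saved delivery address: ${a.address}`,
  navigate_to_orders: () => "Opening your orders",
};

function dispatchTool(call: ToolCall): string {
  const handler = handlers[call.name];
  if (!handler) return `error: unknown tool "${call.name}"`;
  try {
    const args = JSON.parse(call.parameters) as Record<string, unknown>;
    return handler(args);
  } catch {
    // Malformed JSON from the model should degrade gracefully, not crash.
    return `error: could not parse parameters for "${call.name}"`;
  }
}
```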

I also struggled with the address formatting - when users say "one twenty three main street," I needed to convert that to "123 Main Street" with proper capitalization and number recognition.
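A simplified version of that normalization is sketched below. Real speech transcripts are messier (it ignores "hundred", ordinals, and unit numbers), and `formatSpokenAddress` is an illustrative name, not the project's actual helper:

```typescript
// Spoken digit and tens words mapped to their digit strings.
const DIGIT_WORDS: Record<string, string> = {
  zero: "0", one: "1", two: "2", three: "3", four: "4",
  five: "5", six: "6", seven: "7", eight: "8", nine: "9",
  twenty: "20", thirty: "30", forty: "40", fifty: "50",
  sixty: "60", seventy: "70", eighty: "80", ninety: "90",
};

/** Turn "one twenty three main street" into "123 Main Street". */
function formatSpokenAddress(spoken: string): string {
  const words = spoken.toLowerCase().split(/\s+/);
  let digits = "";
  let i = 0;
  // Consume leading number words: "one", "twenty", "three" -> "123".
  while (i < words.length && words[i] in DIGIT_WORDS) {
    const value = DIGIT_WORDS[words[i]];
    if (digits.endsWith("0") && value.length === 1 && value !== "0") {
      // "twenty" + "three": replace the trailing zero of "20" with "3".
      digits = digits.slice(0, -1) + value;
    } else {
      digits += value;
    }
    i++;
  }
  // Title-case the remaining street words.
  const street = words
    .slice(i)
    .map((w) => w.charAt(0).toUpperCase() + w.slice(1))
    .join(" ");
  return digits ? `${digits} ${street}` : street;
}

formatSpokenAddress("one twenty three main street"); // "123 Main Street"
```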

Accomplishments that we're proud of

I'm most proud of creating a truly empathetic shopping experience that feels natural and caring. The emotion detection works seamlessly - when I sound frustrated about prices, Vora genuinely offers help and discounts without feeling manipulative.

The voice-first interaction is incredibly smooth - I can complete an entire shopping journey from browsing to checkout using only my voice. The real-time emotion analytics dashboard provides fascinating insights into shopping behavior patterns.

I'm also proud of the technical architecture - the integration between Hume EVI, Stripe, Sanity, and Firebase creates a robust, scalable platform. The glassmorphism design with emotion-driven theming creates a calming, zen-like experience that matches the empathetic concept.

What we learned

I learned that emotion AI is incredibly nuanced - it's not just about detecting "happy" or "sad," but understanding the subtle variations in stress, frustration, and comfort levels. Working with Hume's prosody data taught me how complex human emotion really is.

I also discovered the importance of timing in voice interfaces. Users expect immediate responses, but you also can't interrupt the AI mid-sentence. Finding the right balance between responsiveness and natural conversation flow was crucial.

The project taught me about the ethical considerations of emotion-based pricing. I made sure the discounts are genuinely helpful (capped at 25%) rather than exploitative, and the messaging focuses on comfort rather than manipulation.

What's next for Vora

I want to expand Vora's emotional intelligence by adding more sophisticated emotion patterns - detecting when someone is shopping for a special occasion versus stress-shopping, and adapting the experience accordingly.

I plan to add multi-language support and cultural emotion recognition, since emotional expressions vary across cultures. I also want to implement seller analytics so merchants can understand their customers' emotional journeys and optimize their product offerings.

Long-term, I envision Vora becoming a platform where any e-commerce business can integrate empathic AI, creating a more human-centered shopping ecosystem. I'd also like to explore integration with mental health resources - if Vora detects someone is consistently stressed while shopping, she could gently suggest wellness resources.

The ultimate goal is to prove that technology can be both intelligent and genuinely caring, setting a new standard for human-AI interaction in commerce.

Built With

Next.js · TypeScript · Hume AI EVI · Sanity · Stripe · Firebase · Zustand · Framer Motion · Recharts
