Inspiration

We identified a gap in how customer service currently handles customer queries. The process for someone to get help is far longer than it needs to be, and we realized AI voice agents are the perfect tool to get people the help they need.

What it does

We enable customers to call a real support number and be connected with an agent that can pull live details of their profile and the network. To improve overall customer satisfaction, we also provide network-wide support sentiment monitoring: we track social media sentiment, network quality, and support requests to calculate CSAT scores and a happiness index.
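One way to picture the happiness index is a weighted blend of the three signals we track. This is a minimal sketch; the weights, signal names, and normalization are illustrative assumptions, not the project's actual formula.

```python
# Illustrative sketch: blend three normalized signals into a 0-100
# happiness index. Weights and signal definitions are assumptions.

def happiness_index(social_sentiment: float,
                    network_quality: float,
                    support_load: float) -> float:
    """Blend signals (each normalized to [0, 1]) into a 0-100 index.

    social_sentiment: average sentiment of recent social posts
    network_quality:  fraction of nodes reporting healthy connections
    support_load:     fraction of customers NOT opening support requests
    """
    weights = {"social": 0.4, "network": 0.35, "support": 0.25}
    score = (weights["social"] * social_sentiment
             + weights["network"] * network_quality
             + weights["support"] * support_load)
    return round(100 * score, 1)
```

A dashboard could recompute this on a rolling window so the index reflects the network's current mood rather than its history.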

How we built it

Calls

Calls are routed through the Twilio API with real-time STT and TTS conversion, allowing customers to talk to an agent rather than "press 1 for billing support". The application is deployed through Brev.dev for high performance and AI integration.
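A Twilio voice webhook works by returning TwiML that tells Twilio to transcribe the caller's speech and speak a reply. The sketch below builds that TwiML with the standard library rather than the `twilio` helper package so it runs standalone; the `/agent-reply` action URL is a hypothetical endpoint name, not the project's actual route.

```python
# Hedged sketch of the TwiML a Twilio voice webhook returns to open a
# speech-gathering turn. Built with the stdlib instead of the twilio
# helper library; the /agent-reply endpoint is a hypothetical name.
import xml.etree.ElementTree as ET

def voice_webhook_response(greeting: str) -> str:
    response = ET.Element("Response")
    gather = ET.SubElement(response, "Gather",
                           input="speech",           # Twilio-side STT
                           action="/agent-reply",    # transcript POSTed here
                           speechTimeout="auto")
    say = ET.SubElement(gather, "Say")
    say.text = greeting                               # TTS back to the caller
    return ET.tostring(response, encoding="unicode")

twiml = voice_webhook_response("Hi, how can I help you today?")
```

Each conversational turn repeats this loop: Twilio posts the transcript to the action URL, the agent generates a reply, and the webhook responds with fresh TwiML.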

Agent Layer

An NVIDIA Nemotron model, accessed through OpenRouter and empowered with tool calls, drives the conversation. We tuned the temperature, max tokens, and other parameters to give responses a more human feel, and chose a lightweight model for faster real-time communication and processing of user requests.
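The request to a chat-completions endpoint like OpenRouter's typically bundles the conversation, the sampling parameters, and the tool definitions in one JSON body. In this sketch the model slug, tool name, and parameter values are illustrative assumptions, not the project's exact configuration.

```python
# Hedged sketch of an OpenRouter chat-completions request body with a
# tool definition. Model slug, tool name, and tuning values are assumed.
import json

def build_agent_request(user_text: str) -> str:
    payload = {
        "model": "nvidia/nemotron-nano-9b-v2",   # assumed model slug
        "messages": [
            {"role": "system",
             "content": "You are a phone support agent. Be brief and warm."},
            {"role": "user", "content": user_text},
        ],
        "temperature": 0.7,      # some variety for a human feel
        "max_tokens": 150,       # short turns keep voice latency low
        "tools": [{
            "type": "function",
            "function": {
                "name": "lookup_customer_profile",   # hypothetical tool
                "description": "Fetch the caller's account and plan details.",
                "parameters": {
                    "type": "object",
                    "properties": {"phone_number": {"type": "string"}},
                    "required": ["phone_number"],
                },
            },
        }],
    }
    return json.dumps(payload)

body = build_agent_request("My data stopped working this morning.")
```

When the model responds with a tool call instead of text, the application executes the lookup and feeds the result back as a tool message before the agent speaks.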

Post-Call Analysis

An NVIDIA NeMo ASR model deployed on Brev.dev provides tonality analysis of the user's call experience, adding depth beyond textual sentiment alone. For example, someone saying "thank you for your help" in a sullen tone versus someone saying "my phone isn't working" in a cheerful tone is an important signal to capture if we want to understand the person and provide effective support.
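One way to combine the two modalities is a blend that leans toward the acoustic tone score when it disagrees with the text, so a politely worded but sullen "thank you" doesn't register as positive. This fusion rule is a sketch under the assumption that both scores are normalized to [-1, 1]; it is not the project's actual model.

```python
# Hedged sketch: fuse text sentiment with an acoustic tone score, weighting
# tone more heavily when the two modalities disagree. Both inputs are
# assumed to lie in [-1, 1]; the weighting scheme is illustrative.

def fused_sentiment(text_score: float, tone_score: float) -> float:
    """Lean toward the tone signal when the two modalities disagree."""
    disagreement = abs(text_score - tone_score) / 2   # 0 = agree, 1 = opposite
    tone_weight = 0.5 + 0.4 * disagreement            # ranges 0.5 .. 0.9
    return (1 - tone_weight) * text_score + tone_weight * tone_score

# A polite phrase ("thank you for your help") said in a flat, sullen tone:
score = fused_sentiment(text_score=0.8, tone_score=-0.6)
```

Here the fused score comes out negative, matching the intuition that how something is said can matter more than the words themselves.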

Human Responder Dashboard

If the agent escalates the call to a human customer service representative, an incoming-call notification appears on the human responder dashboard. The CSR can accept the call and is taken to a live page with information on the customer: their transcript with the agent, sentiment scores and analysis for the call, and AI-powered suggestions informed by the company's policies and features, all aimed at providing the best possible service. As the customer and CSR talk and resolve the issue, the sentiment scores continue to update in real time, giving the CSR insight into how the customer feels and how to respond so the customer leaves with a net-positive feeling.
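The escalation handoff can be pictured as a ticket object that carries the context the CSR sees, with the live sentiment smoothed as new turns arrive. Field names and the smoothing factor here are illustrative assumptions, not the project's actual schema.

```python
# Hedged sketch of an escalation record pushed to the responder dashboard.
# Field names and the smoothing constant are assumptions for illustration.
from dataclasses import dataclass, field
import time

@dataclass
class EscalationTicket:
    customer_id: str
    transcript: list                      # agent/caller turns so far
    sentiment: float                      # latest fused sentiment, [-1, 1]
    suggestions: list = field(default_factory=list)   # AI talking points
    created_at: float = field(default_factory=time.time)

    def update_sentiment(self, new_score: float, smoothing: float = 0.3) -> None:
        """Exponentially smooth the live sentiment as the CSR call continues."""
        self.sentiment = (1 - smoothing) * self.sentiment + smoothing * new_score

ticket = EscalationTicket("cust-042", ["Caller: my bill doubled"], sentiment=-0.5)
ticket.update_sentiment(0.4)   # customer calms down after the CSR responds
```

Smoothing keeps the dashboard's sentiment gauge from jumping on every utterance while still trending toward the customer's current mood.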

Network and Social Media Sentiment Analysis

It's also important to track overall network health and flag issues before they get out of hand. Our network health dashboard continuously scrapes Reddit posts about T-Mobile and its subsidiaries, running sentiment analysis on recent posts to flag issues as they occur. Additionally, if calls are dropped or connections become spotty, the affected nodes are flagged and turn red on a map once they cross a certain threshold. This lets the company monitor its network more closely and keep customers from becoming irate and frustrated with their service.
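The node-flagging rule can be sketched as a simple threshold over a window of recent calls. The 10% drop-rate threshold and window shape here are illustrative assumptions, not the dashboard's actual tuning.

```python
# Hedged sketch of the map's node-flagging rule: a node turns red once its
# recent drop rate crosses a threshold. The 10% cutoff is an assumption.

def node_status(recent_calls: list, drop_threshold: float = 0.10) -> str:
    """recent_calls: True = call completed, False = dropped."""
    if not recent_calls:
        return "green"   # no data, nothing to flag
    drop_rate = recent_calls.count(False) / len(recent_calls)
    return "red" if drop_rate > drop_threshold else "green"

status = node_status([True] * 17 + [False] * 3)   # 3 of 20 calls dropped
```

A sliding window like this reacts to fresh outages quickly while letting a node recover its green status as healthy calls accumulate.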

Challenges we ran into

Our biggest challenges involved latency: once we finally connected all the features, running voice-based tonal sentiment detection alongside text-based transformer models from Hugging Face slowed everything down. The Hugging Face transformers also caused hanging imports and large downloads, so we removed them and pivoted to a VADER-based sentiment model. Migrating this across all our systems was hard under the time constraint, but we got it done.

Accomplishments that we're proud of

We're very proud that we built a working voice agent that knows information about the caller, provides help unassisted, and routes information to the appropriate place. This took a lot of effort from every team member, and getting a voice agent working on top of everything else in our project feels amazing. We're also proud that we connected and integrated everything well enough that it runs with no assistance.

What we learned

This project definitely taught us a lot. First, real-time AI is way harder than it looks, and adding sentiment analysis on top only adds to that complexity. We realized that emotion is multimodal: voice tone and text sentiment don't always align, and finding the balance between the two was critical. Much of our project was also cleaning, synchronizing, and integrating our full tech stack, which required a lot of very technical moments, so we had to learn to work together better than we ever have before.

What's next for Net Pulse

We're hoping to take this even further, perhaps building it out into a universal customer service product, providing end-to-end solutions for any business. From our experience, no customer service is ever really "great" over the phone, and we're hoping we can change that.

Built With
