Watching the Hack the North livestream, we couldn't help but notice how little interaction and understanding there was with the chat on YouTube. We thought to ourselves: what if there were a way to make the conversation more engaging by automatically looking at the context of every chat message?

What it does

Airchat lets users chat on livestreams and provides real-time sentiment analysis to show how all the viewers are feeling as the event goes on. We also implemented a mobile interface to bring physical event attendees into the same conversation as online viewers.

How we built it

We used Firebase to handle all our server-side data with a minimal back end, using Node.js and Express to complement it. Sentiment analysis was done by sending an asynchronous request to Indico's machine learning API for each message. We then show a visual 'heatmap' of sentiment using pure JavaScript visual elements.
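As a rough illustration of the heatmap idea, a sentiment score in the style Indico returns (0 = negative, 1 = positive) can be mapped onto a colour for each chat message. This is a hypothetical sketch, not the actual Airchat code; the function name and the red-to-green gradient are our own illustrative choices.

```javascript
// Hypothetical sketch: map a sentiment score in [0, 1] onto a heatmap
// colour, interpolating from red (negative) to green (positive).
function sentimentToColor(score) {
  // Clamp to the valid range in case the API ever returns edge values.
  const s = Math.max(0, Math.min(1, score));
  const red = Math.round(255 * (1 - s));
  const green = Math.round(255 * s);
  return `rgb(${red}, ${green}, 0)`;
}

// A very positive message renders as nearly pure green.
console.log(sentimentToColor(1)); // → "rgb(0, 255, 0)"
```

Each message element's background could then be set from its score, letting the chat column itself act as the heatmap.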

Challenges we ran into

Deciding how best to visualize the data we received, and how often to show the changes, meant balancing the fine line between engagement and over-stimulation. We also ran into a few issues with YouTube's limited live-streaming API, as well as the lack of documentation and developer tools on other platforms (Livestream, etc.).
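One simple way to strike that balance, sketched below under our own assumptions rather than taken from the Airchat source, is to smooth the displayed sentiment over a rolling window of recent messages so the visualization shifts steadily instead of flickering with every new chat line.

```javascript
// Hypothetical sketch: average the last `windowSize` sentiment scores so
// the heatmap updates smoothly rather than on every single message.
function rollingAverage(scores, windowSize) {
  const recent = scores.slice(-windowSize);
  if (recent.length === 0) return 0.5; // neutral when there is no data yet
  const sum = recent.reduce((acc, s) => acc + s, 0);
  return sum / recent.length;
}

// Only the two most recent scores influence the display here.
console.log(rollingAverage([1, 1, 0, 0], 2)); // → 0
```

Tuning the window size trades responsiveness against stability: a short window reacts to every joke or groan, a long one shows the overall mood of the event.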

Accomplishments that we're proud of

Creating a truly browser- and mobile-responsive web app that is optimized for speed, engagement, and interactivity. We were especially impressed by how well the sentiment visualizations worked, and how stimulating they were to us as end viewers.

What we learned

How a simple addition such as sentiment analysis provides a deeper level of engagement and value both to the users and commenters and to the event hosts themselves.

What's next for Airchat

Setting it loose in the wild for the Hack the North closing ceremony!
