We wanted a fun way to visualise tweets. Text is boring, animations are fun!

What it does

It uses the Microsoft Cognitive Services API to identify the facial features of people on Twitter. The user can follow people on Twitter, and when they tweet, a 3D model of the person is generated and speaks the tweet's text. The sentiment of the tweet is calculated using a Microsoft API and reflected in the animation (e.g. negative sentiment bathes the model in red light).
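As a rough sketch of the lighting idea: assuming the sentiment API returns a score between 0 (negative) and 1 (positive), a small helper could map that score to a light color, fading from red toward white as the sentiment improves. The function name and exact color mapping here are our own illustration, not the project's actual code.

```python
def sentiment_to_light(score: float) -> tuple:
    """Map a sentiment score in [0, 1] to an RGB light color.

    0.0 -> pure red (bad sentiment), 1.0 -> white (good sentiment),
    blending linearly in between. Assumes a 0..1 score, as the
    sentiment API we used reports.
    """
    score = max(0.0, min(1.0, score))   # clamp out-of-range scores
    red = 255                           # red channel stays at full
    other = round(255 * score)          # green/blue fade in as sentiment improves
    return (red, other, other)
```

For example, `sentiment_to_light(0.0)` gives `(255, 0, 0)` (full red) and `sentiment_to_light(1.0)` gives `(255, 255, 255)` (neutral white).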

How we built it

We used Node.js to interact with the Twitter API, Python for the AI work (sentiment analysis and facial feature recognition), and Unity for generating and displaying the 3D models.
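To give a sense of how the three apps could hand data to each other: one simple approach is a shared JSON message format, built by the Node.js listener, enriched by the Python analyzer, and consumed by Unity. The field names and helper functions below are hypothetical, shown only to illustrate the shape of such a pipeline.

```python
import json

def make_tweet_event(handle: str, text: str, sentiment: float) -> str:
    """Serialize one tweet event as JSON for the next app in the pipeline.

    The schema (handle/text/sentiment) is an assumed example format,
    not the format the project actually used.
    """
    return json.dumps({
        "handle": handle,          # Twitter handle of the tweeter
        "text": text,              # tweet text the 3D model will speak
        "sentiment": sentiment,    # 0.0 (negative) .. 1.0 (positive)
    })

def parse_tweet_event(raw: str) -> dict:
    """Decode a tweet event on the receiving side."""
    return json.loads(raw)
```

Agreeing on one serialized format like this up front is exactly the kind of glue that saves pain when three separate runtimes need to talk.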

Challenges we ran into

The Twitter API was a pain to work with, and so was getting the JavaScript, Python, and Unity apps to communicate with each other.

Accomplishments that we're proud of

It kinda worked in the end.

What we learned

Get the different components of a project working together first, before developing any one of them much further.

What's next for Twitter fun

We want to turn this into a game where you guess the tweeter based on the tweet and their 3D animated model.
