Imagine walking into a booth and entering a keyword of a topic on a computer screen. Suddenly, all the LEDs around you light up with different colors and intensities. Isn't it beautiful to be able to feel the sentiments of the world on a particular topic? Our project aims to create an environment where users can feel real-time social media sentiments by immersing themselves in a physical setting with LED lights. We want to increase people’s awareness of social issues and give people a sense of social responsibility when it comes to social media interactions.

What it does

  • The Raspberry Pi is used as a controller for an LED strip in which each LED is individually addressable. The user enters one or more space-delimited keywords in the text box of a pop-up UI. After clicking the button, the LED strip lights up in different colors and intensities.
  • Each LED represents an aggregation of a certain number of tweets: the color of the LED encodes an emotional category (sadness as blue, joy as yellow, fear as white, anger as red), and the intensity of the color encodes the intensity of the emotion. The darker the color, the more intense the emotion. Colors are sorted so that similar colors appear next to each other. By observing the number of lit LEDs and their colors, users get an intuitive sense of the real-time social sentiment around that topic.
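The color scheme above can be sketched as a small mapping function. This is an illustrative sketch, not the project's actual code: the base RGB values and the darkening factor are assumptions, chosen to match the write-up's rule that a darker color means a more intense emotion.

```python
# Hypothetical mapping from (emotion category, intensity score in 0-1)
# to an (R, G, B) tuple for a NeoPixel-style LED.
# Base colors follow the write-up's scheme:
# sadness=blue, joy=yellow, fear=white, anger=red.
BASE_COLORS = {
    "sadness": (0, 0, 255),
    "joy": (255, 255, 0),
    "fear": (255, 255, 255),
    "anger": (255, 0, 0),
}

def emotion_to_rgb(emotion, intensity):
    """Darken the base color as intensity grows: per the write-up,
    the darker the color, the more intense the emotion."""
    r, g, b = BASE_COLORS[emotion]
    intensity = max(0.0, min(1.0, intensity))  # clamp to [0, 1]
    scale = 1.0 - 0.7 * intensity              # 0.7 is an assumed factor
    return (int(r * scale), int(g * scale), int(b * scale))
```

Zero intensity yields the full base color; maximum intensity dims it to 30% brightness, which reads as a darker shade on an LED.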

How we built it

We used the GPIO library to interface the Raspberry Pi with the LED strip, and the neopixel library to control each LED on the strip individually. We built the simple yet easy-to-use UI with Python's tkinter module. We fetched real-time tweets with the tweepy library, and analyzed the emotional categories and intensities of the tweets with the IBM Watson Tone Analyzer API.
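The aggregation step — turning many analyzed tweets into one reading per LED — can be sketched as below. This is a minimal sketch under assumptions: it takes tone results as (emotion, score) pairs rather than calling the Watson or tweepy APIs directly, and the function name and bucketing strategy are illustrative, not the project's actual code.

```python
from collections import Counter

def tweets_per_led(tones, num_leds):
    """Group (emotion, score) pairs into num_leds buckets; each LED
    displays its bucket's dominant emotion at that emotion's mean score."""
    if not tones:
        return []
    size = max(1, len(tones) // num_leds)  # tweets aggregated per LED
    buckets = [tones[i:i + size] for i in range(0, len(tones), size)]
    out = []
    for bucket in buckets[:num_leds]:
        # Most frequent emotion in this bucket wins the LED's color.
        dominant = Counter(e for e, _ in bucket).most_common(1)[0][0]
        scores = [s for e, s in bucket if e == dominant]
        out.append((dominant, sum(scores) / len(scores)))
    return out
```

Each resulting (emotion, mean score) pair would then be translated into a color and brightness for one LED on the strip.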

Challenges we ran into

  • We spent quite some time installing all the necessary Python dependencies on the Raspberry Pi because of environment configuration issues, since we first wrote our code on our laptops. It was challenging and tiresome to figure out the different installation paths and how they affected our code.
  • It was hard to integrate the APIs into our code, since IBM's API does not have the best documentation. We spent a lot of painful hours reading through the API's source code to understand its internal behavior.
  • It was also a challenge to figure out how to interface LED hardware with the Raspberry Pi. We went through the hardware documentation comprehensively to understand what it takes to control the hardware from the Raspberry Pi.

Accomplishments that we're proud of

  • We successfully completed a project that consumes external APIs, processes data, and interfaces with hardware, all in one week. Visualizing social sentiment made us reflect on events in the world.

What we learned

  • Sentiment: People feel happy, angry and a little sad about feminism. People feel sad about New Zealand. People feel happy and sad about Trump. People feel happy about mineral water.

  • Dependency Management: Controlling hardware requires administrator privileges, so all dependencies should be installed with sudo. Having a requirements.txt saves the trouble of installing dependencies one by one. Most Python libraries for hardware interfacing target Python 2, while many software libraries target Python 3; the future package helps code run under both.
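The Python 2/3 point above can be illustrated with the standard library's `__future__` imports, which backport Python 3 semantics to Python 2 (the third-party future package builds on the same idea with further shims). The function below is a hypothetical example, not from the project:

```python
# With these imports, the module behaves the same under Python 2 and 3:
# true division and the print() function are enabled in Python 2,
# and both imports are harmless no-ops in Python 3.
from __future__ import division, print_function

def mean_brightness(values):
    # True division even on Python 2: 60 / 3 -> 20.0, not 20.
    return sum(values) / len(values)

print(mean_brightness([10, 20, 30]))  # 20.0
```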

  • API Design: Good API documentation is important for attracting users. Examples are the best way to get users started.

What's next for SentimentLED

Make a booth with more LED strips and monitors to display top tweets, providing an immersive experience of social sentiment towards controversial topics.
