What it does

Using a Kinect, Chaplin periodically determines the number of people in a room, as well as their current emotional state. It then uses this information to generate a playlist that complements the setting of the room. For example, rooms with happier people get happier music, and rooms with angry people get angrier-sounding music. The music also varies with the number of people: rooms with more people get more instrumental music and fewer lyrics, so as not to disturb conversation, while rooms with a lot of people receive more dance music, on the assumption that a crowd means a party worth entertaining.
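The mood-to-music heuristic above could be sketched like this. All thresholds, feature names, and the function itself are illustrative assumptions, not Chaplin's actual tuning:

```python
def playlist_targets(num_people, avg_happiness, avg_anger):
    """Map room state to audio-feature targets (hypothetical heuristic)."""
    # Happier rooms get higher-valence (happier-sounding) music;
    # angrier rooms drag the target down.
    valence = avg_happiness - avg_anger  # roughly in [-1, 1]

    # More people -> less speech-heavy music, so lyrics
    # don't compete with conversation.
    speechiness = max(0.1, 0.6 - 0.05 * num_people)

    # Large groups look like a party: favor danceable tracks.
    danceability = 0.8 if num_people >= 8 else 0.4

    return {
        "valence": valence,
        "speechiness": speechiness,
        "danceability": danceability,
    }
```

These targets would then drive the playlist search described below.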

How I built it

Images are received from a Kinect. Chaplin uses the image and skeleton APIs to capture pictures of people's faces and to count people as they enter the scene. Each person's emotions are determined with Microsoft Project Oxford's Emotion API. After applying some heuristics, Chaplin queries the Echonest API to correlate the people in the room with the music they should get, searching by factors such as valence, danceability, and speechiness. The songs are then located in Spotify's library and streamed to a Bose SoundTouch speaker.
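A minimal sketch of one step in this pipeline: reducing per-face results into a single room-level mood. The response shape (a list of faces, each with a "scores" dict of emotion probabilities) follows the Emotion API's documented JSON; the averaging itself is an illustrative assumption, not necessarily Chaplin's heuristic:

```python
def room_mood(faces):
    """Average each emotion's score across all detected faces.

    `faces` is a list of dicts shaped like the Emotion API response,
    e.g. [{"scores": {"happiness": 0.8, "anger": 0.1, ...}}, ...].
    """
    if not faces:
        return {}
    totals = {}
    for face in faces:
        for emotion, score in face["scores"].items():
            totals[emotion] = totals.get(emotion, 0.0) + score
    # Normalize by the number of faces to get a per-room average.
    return {emotion: total / len(faces) for emotion, total in totals.items()}
```

The averaged scores, together with the head count, would feed the Echonest search parameters mentioned above.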
