Inspiration

We wanted to give people with communication disorders a medium through which they could indicate their emotions, and have fun doing it.

What it does

It uses the Microsoft Band's sensors as a modified polygraph in conjunction with our neural network implementation: the Band's physiological readings drive which songs the network chooses to play.
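
Concretely, the app streams the Band's physiological readings and treats them as a feature vector for the network. Here is a minimal sketch of that sensor hookup, assuming the Microsoft Band Android SDK (the `BandPolygraph` class and `features()` method are illustrative names, not the actual app code):

```java
import android.content.Context;

import com.microsoft.band.BandClient;
import com.microsoft.band.BandClientManager;
import com.microsoft.band.BandInfo;
import com.microsoft.band.ConnectionState;
import com.microsoft.band.sensors.BandHeartRateEvent;
import com.microsoft.band.sensors.BandHeartRateEventListener;
import com.microsoft.band.sensors.BandSkinTemperatureEvent;
import com.microsoft.band.sensors.BandSkinTemperatureEventListener;

public class BandPolygraph {
    private volatile double heartRate; // beats per minute
    private volatile double skinTemp;  // degrees Celsius

    /** Connect to the first paired Band and start streaming sensor events. */
    public void start(Context context) throws Exception {
        BandInfo[] bands = BandClientManager.getInstance().getPairedBands();
        BandClient client = BandClientManager.getInstance().create(context, bands[0]);
        if (client.connect().await() != ConnectionState.CONNECTED) {
            throw new IllegalStateException("Could not connect to the Band");
        }
        // Heart rate streaming requires user consent, granted beforehand
        // via the SDK's requestHeartRateConsent() flow.
        client.getSensorManager().registerHeartRateEventListener(
                new BandHeartRateEventListener() {
                    @Override
                    public void onBandHeartRateChanged(BandHeartRateEvent event) {
                        heartRate = event.getHeartRate();
                    }
                });
        client.getSensorManager().registerSkinTemperatureEventListener(
                new BandSkinTemperatureEventListener() {
                    @Override
                    public void onBandSkinTemperatureChanged(BandSkinTemperatureEvent event) {
                        skinTemp = event.getTemperature();
                    }
                });
    }

    /** Snapshot of the wearer's current state, fed to the neural network. */
    public double[] features() {
        return new double[] { heartRate, skinTemp };
    }
}
```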

How I built it

I created a neural network abstraction that spins up new neurons dynamically for every new song played. The Android app then loops:

1. Read sensor input from the Microsoft Band.
2. Fetch the user's saved songs through the Spotify Web API.
3. Feed the sensor readings through the neural network, which outputs either a song name or a random pick.
4. Play the chosen song with the Spotify Android SDK.
5. Wait for the user to either let the song play or skip it, and use that signal to train the network's neuron for that song (see the sketch below).
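
That skip-or-play feedback is really online training of one tiny classifier per track. Here is a minimal sketch of what the per-song neuron abstraction could look like (the class name, learning rate, and explore probability are illustrative assumptions, not the actual app code): one sigmoid neuron per Spotify track ID, created lazily on first sight, scored against the latest sensor snapshot, and nudged up or down by the user's reaction.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Random;

public class SongNetwork {
    private static final double LEARNING_RATE = 0.05;
    private static final double EXPLORE_PROB = 0.2; // sometimes play a random song

    // Track ID -> weight vector; the last slot is the bias term.
    private final Map<String, double[]> neurons = new HashMap<>();
    private final int featureCount;
    private final Random rng = new Random();

    public SongNetwork(int featureCount) {
        this.featureCount = featureCount;
    }

    /** Sigmoid activation of a song's neuron; a new song gets a fresh neuron. */
    private double activate(String songId, double[] features) {
        double[] w = neurons.computeIfAbsent(songId, id -> new double[featureCount + 1]);
        double z = w[featureCount]; // bias
        for (int i = 0; i < featureCount; i++) {
            z += w[i] * features[i];
        }
        return 1.0 / (1.0 + Math.exp(-z));
    }

    /** Pick the best-scoring song, or a random one so new music still gets tried. */
    public String pick(List<String> songIds, double[] features) {
        if (rng.nextDouble() < EXPLORE_PROB) {
            return songIds.get(rng.nextInt(songIds.size()));
        }
        String best = songIds.get(0);
        double bestScore = Double.NEGATIVE_INFINITY;
        for (String id : songIds) {
            double score = activate(id, features);
            if (score > bestScore) {
                bestScore = score;
                best = id;
            }
        }
        return best;
    }

    /** Online update: letting the song play is a positive label, skipping is negative. */
    public void train(String songId, double[] features, boolean playedThrough) {
        double target = playedThrough ? 1.0 : 0.0;
        double error = target - activate(songId, features); // cross-entropy gradient for a sigmoid
        double[] w = neurons.get(songId);
        for (int i = 0; i < featureCount; i++) {
            w[i] += LEARNING_RATE * error * features[i];
        }
        w[featureCount] += LEARNING_RATE * error; // bias update
    }
}
```

In the app loop this slots between the Spotify pieces: fetch the user's saved tracks from the Spotify Web API (GET /v1/me/tracks), call pick() with the latest Band readings, play the winner through the Spotify Android SDK, then call train() once the user skips or lets the song finish.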

Challenges I ran into

Fighting Spotify, fighting Microsoft, fighting Android. That is all.

Accomplishments that I'm proud of

Getting Spotify, Microsoft, and Android to work together anyway.

What I learned

Don't use Spotify, don't use Microsoft, and don't use Android.

What's next for Intelli-DJ

Built With

android, microsoft-band, spotify
