Listening to songs is one of the most common and favorite ways for teenagers and adults all over the world to spend time or to feel in company. Songs are also a way to transmit culture and remember past experiences, which is why they are meaningful and important in our society. For those who cannot hear songs, being able to see the messages that a song portrays can be just as valuable as listening to the notes themselves. Similarly, when someone is preparing a presentation or giving a talk, visual support is just as important for conveying key concepts as the verbal messages themselves.
We worked with IBM Watson to get the timestamps of every word in the voice files. We then used the Microsoft Azure Text Analytics API to process the lyrics and retain a list of the keywords most relevant to the song. Each keyword was run through the Microsoft Azure Image Search API to randomly pick a related image for it. The final object, a time-ordered sequence of keywords and their images, is returned by the server and loaded by the frontend, which displays the images in sync with the song.
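The merging step can be sketched roughly as follows. This is a minimal illustration, not our production code: the function names and data shapes (`build_image_timeline`, tuples of `(word, start, end)` mirroring Watson's timestamp output, and a keyword-to-image mapping standing in for the Azure lookups) are hypothetical, and the real app fetches these from the IBM Watson and Azure REST APIs rather than from in-memory dictionaries.

```python
import random

def build_image_timeline(word_timestamps, keyword_images, seed=None):
    """Merge word-level timestamps (Watson-style) with keyword -> candidate
    image URLs (Azure-style) into a time-ordered list for the frontend.

    word_timestamps: list of (word, start_sec, end_sec) tuples
    keyword_images:  dict mapping a keyword to a list of candidate image URLs
    """
    rng = random.Random(seed)
    timeline = []
    for word, start, end in word_timestamps:
        candidates = keyword_images.get(word.lower())
        if candidates:  # only words kept as keywords by Text Analytics get an image
            timeline.append({
                "word": word,
                "start": start,
                "end": end,
                # mimic the app's random pick among related images
                "image": rng.choice(candidates),
            })
    return timeline

# Mocked responses; a real run would call the Watson and Azure endpoints.
words = [("the", 0.0, 0.2), ("moon", 0.2, 0.7), ("shines", 0.7, 1.1)]
images = {"moon": ["https://example.com/moon.jpg"]}
print(build_image_timeline(words, images))
```

The frontend then only has to walk this list and swap the displayed image whenever playback time crosses the next `start` value.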
We built an app that lets the user sing a song, or search for it online, and then experience the song played back alongside a sequence of images representing its meaning.