Inspiration

Our team wanted to address Challenge #01 presented by EduHack: "Support for collective musical creation."

The project responds to MusicEduHack's stated need for a tool that supports collective composition through AI techniques.

What it does

MusicAI is proposed as a tool for use in two scenarios:

1. Collective live performance: for example, a DJ assisted by an AI that collects motion parameters from the audience's mobile phones; combined, these parameters generate a MIDI melody that serves as raw material for the music.

The DJ is no longer the only one generating everything that happens musically (or visually): the audience participates by providing sound material or applying some kind of variation (for example, filters) to what the DJ produces, diluting the barrier between the artist (the star) and the public.
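The phone-to-melody idea can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the function name, the pentatonic scale, and the ±10 m/s² accelerometer range are all assumptions. In the real setup the sensor values arrive as OSC messages (e.g. via Sensors2OSC) and the resulting pitches would be sent out as MIDI.

```python
def accel_to_midi(accel, scale=(0, 2, 4, 7, 9), base=60):
    """Map one accelerometer reading (in m/s^2) to a MIDI pitch.

    Clamps the reading to [-10, 10], normalizes it to [0, 1], and picks
    a note from two octaves of a pentatonic scale, so that whatever the
    audience's phones send, the combined material stays consonant.
    """
    a = max(-10.0, min(10.0, accel))
    t = (a + 10.0) / 20.0  # normalized position, 0.0 .. 1.0
    degrees = [base + 12 * octave + step
               for octave in range(2) for step in scale]
    idx = min(int(t * len(degrees)), len(degrees) - 1)
    return degrees[idx]
```

Constraining the mapping to a fixed scale is one design choice that keeps many uncoordinated inputs musically usable.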

2. An improvisation with two or three people, in which the MIDI data generated by the performers drives an automatic melody, producing a dialogue (a very common strategy in musical improvisation) between the performers and a previously trained AI.
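To make the dialogue idea concrete, here is one purely illustrative rule (the project uses a trained Magenta model, not this): answer a performer's phrase by inverting its melodic contour around its first note, a classic call-and-response device. The function name and the representation (lists of MIDI pitches) are assumptions.

```python
def invert_phrase(phrase):
    """Answer a phrase by mirroring each MIDI pitch around the first note.

    An ascending phrase comes back descending by the same intervals,
    which is enough to feel like a response rather than an echo.
    """
    root = phrase[0]
    return [2 * root - p for p in phrase]
```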

This project has focused on generating a melody from several melodies (sequences of notes) using the Magenta melody RNN library.
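As a toy sketch of deriving one melody from several: below, equal-length pitch sequences are condensed by taking the median pitch at each step. The actual project works on Magenta's encoded representations; this pure-Python version only illustrates the principle, and the function name is hypothetical.

```python
import statistics


def condense(melodies):
    """Condense several pitch sequences into a single melody.

    Truncates to the shortest sequence, then takes the per-step median
    pitch, so outlier notes from any one source are voted down.
    """
    length = min(len(m) for m in melodies)
    return [int(statistics.median(m[i] for m in melodies))
            for i in range(length)]
```

The condensed sequence could then serve as a primer melody for an RNN generator.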

How we built it

Using Cycling '74 Max, Python, the Magenta project, and Sensors2OSC.

Challenges we ran into

Understanding how the Magenta project works

Accomplishments that we're proud of

Condensing the encoded data from many samples into guidelines for a single melody

What we learned

The Magenta project's capabilities.

What's next for MusicAI

Applying many more parameters to the sound creation, and making the audience aware (sonically or visually) of how they are modifying the performance.
