Inspiration
The inspiration for MusicGenic came from a problem my teammate Napasorn and I had in common: we both needed music. She wanted unique intro music for each episode of her podcast series, and I wanted instrumental music to help me sleep.
Music production is hard to learn, and there are few accessible resources for it. Most people would rather write lyrics and sing than produce instrumentals, so a tool that easily generates instrumentals would save low-budget artists time and money. Music also comes with many copyright issues, so generated tracks that are completely free to use and distribute are ideal.
That is why we built an app that generates music tailored to individual needs.
What it does
MusicGenic is a web & cross-platform mobile application that uses artificial intelligence to make music production individualized and easy.
How we built it
For app development we used:
- Android Studio
- Dart
- Flutter
For model architecture:
- Google Colab
- TensorFlow (Keras API)
- Matplotlib
Challenges we ran into
Our biggest early challenge was settling on an idea, since at first nothing resonated with us. Another challenge was figuring out how to connect the model to the app; we researched this and now understand the approach. Due to limited time, we weren't able to actually train the model or embed it in the app, but that is a clear next step.
Accomplishments that we're proud of
We started the project on the morning of Saturday, August 13th, because I had another commitment on Friday. Given the extremely limited time (just over 24 hours), we got a lot done: a presentation, the research, and a functional mobile application.
What we learned
We learned about artificial intelligence and which type of model MusicGenic needs: a Variational Autoencoder (VAE), explained briefly in our video.
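To make the idea concrete, here is a minimal NumPy sketch of the core trick inside a VAE: the encoder outputs the mean and log-variance of a Gaussian latent, a sample is drawn via reparameterization, and the decoder maps it back. The dimensions (64 input features, 8 latent dimensions) and the untrained random weights are purely hypothetical stand-ins, not our actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: a short audio snippet encoded as 64 features,
# compressed into an 8-dimensional latent space.
input_dim, latent_dim = 64, 8

def encoder(x, w_mu, w_logvar):
    """Map an input vector to the parameters of a Gaussian latent."""
    mu = x @ w_mu            # latent mean
    log_var = x @ w_logvar   # latent log-variance
    return mu, log_var

def reparameterize(mu, log_var, rng):
    """Sample z = mu + sigma * eps (the reparameterization trick)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

def decoder(z, w_dec):
    """Map a latent sample back to the input space."""
    return z @ w_dec

# Random (untrained) weights, just to show the shapes involved.
w_mu = rng.standard_normal((input_dim, latent_dim)) * 0.1
w_logvar = rng.standard_normal((input_dim, latent_dim)) * 0.1
w_dec = rng.standard_normal((latent_dim, input_dim)) * 0.1

x = rng.standard_normal(input_dim)
mu, log_var = encoder(x, w_mu, w_logvar)
z = reparameterize(mu, log_var, rng)
x_hat = decoder(z, w_dec)

# KL divergence between the approximate posterior and a standard normal
# prior: the regularizer that keeps the latent space smooth enough to
# sample brand-new music from.
kl = -0.5 * np.sum(1.0 + log_var - mu**2 - np.exp(log_var))
```

Sampling fresh latents from the prior and running only the decoder is what would let the app generate new, copyright-free instrumentals.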
What's next for MusicGenic
We would like to finish building, training, and testing our Variational Autoencoder, then use TensorFlow Lite (TFLite) to embed it inside the app. That would be awesome.
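The embedding step we have in mind might look like the sketch below. Since our VAE isn't trained yet, a tiny dense network stands in for the decoder; the latent size of 8 is a placeholder. The converted flatbuffer is what a Flutter app could then load on-device (e.g. via the `tflite_flutter` plugin).

```python
import tensorflow as tf

# Stand-in model: a tiny dense network plays the role of our future
# VAE decoder, just to demonstrate the conversion step.
decoder = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(8,)),          # hypothetical latent size
    tf.keras.layers.Dense(64, activation="tanh"),
])

# Convert the Keras model to a TFLite flatbuffer for on-device inference.
converter = tf.lite.TFLiteConverter.from_keras_model(decoder)
tflite_bytes = converter.convert()

# Write the flatbuffer where the mobile app's assets could pick it up.
with open("decoder.tflite", "wb") as f:
    f.write(tflite_bytes)
```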
Built With
- android
- android-studio
- canva
- dart
- flutter


