Inspiration

We wanted to make music creation accessible to everyone: no instrument needed, no lessons required. We asked: what if you could just describe a sound and instantly play it? The idea of generating any instrument from a single prompt felt like the perfect blend of creativity and AI. We wanted anyone to be able to enjoy making music without owning an instrument, and to play any instrument they can imagine.

What it does

Instament AI lets you type any instrument, real or imaginary, and play it instantly in your browser. You describe it (e.g. "haunted carnival organ" or "grand piano"), Groq's LLaMA model designs a unique sound prompt for each of the 24 keys, ElevenLabs generates real audio for each one, and a fully playable keyboard appears. You can play multiple keys at once, watch note particles fly, and even record your performance.

How we built it

Built with React for the frontend, Groq (LLaMA 3.3 70B) for instrument design, and ElevenLabs Sound Generation API for audio. The keyboard uses the Web Audio API for polyphonic playback and the MediaRecorder API for recording. All 24 note sounds are generated in parallel for speed.
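The fan-out step described above can be sketched roughly as follows: one instrument description becomes 24 per-note prompts, and all generation requests fire in parallel. The helper names, the two-octave C4–B5 range, and the prompt wording are illustrative assumptions, not the app's actual code; in the real app, `generateSound` would wrap a call to the ElevenLabs Sound Generation API.

```javascript
// The 12 semitone names; two octaves give the 24 keys.
const NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"];

// Build one text prompt per key across two octaves (C4..B5).
// The wording here is a hypothetical example of a per-note prompt.
function buildNotePrompts(instrumentDescription) {
  const prompts = [];
  for (const octave of [4, 5]) {
    for (const note of NOTE_NAMES) {
      prompts.push(
        `A single sustained ${note}${octave} note played on a ${instrumentDescription}, clean, no background noise`
      );
    }
  }
  return prompts;
}

// Fire all 24 generation requests at once instead of sequentially.
// `generateSound` is a placeholder for the actual API call.
async function generateAllNotes(instrumentDescription, generateSound) {
  const prompts = buildNotePrompts(instrumentDescription);
  return Promise.all(prompts.map((prompt) => generateSound(prompt)));
}
```

Running the requests through a single `Promise.all` means the total wait is roughly the slowest single request rather than the sum of all 24.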

Challenges we ran into

Getting ElevenLabs to generate sounds that actually felt like a coherent instrument across all 24 keys was tricky: prompt engineering for each note took a lot of iteration. We also had to fix multi-key playback so notes didn't block each other, and getting keyboard shortcuts to work reliably across all keys (including special characters) required a full rewrite of the key mapping logic. API tokens were another big challenge: ElevenLabs and Groq kept requiring new API keys whenever we ran out of credits.
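A rough sketch of the key-mapping approach after the rewrite: rather than scattering per-key handlers, map normalized `event.key` values to note indices through one flat lookup table, so shifted and special characters resolve the same way as letters. The specific row layout below is an assumption for illustration, not the app's exact mapping.

```javascript
// Two rows of 12 physical keys each, covering the 24 notes.
// This layout is a hypothetical example.
const LOWER_ROW = ["z", "s", "x", "d", "c", "v", "g", "b", "h", "n", "j", "m"];
const UPPER_ROW = ["q", "2", "w", "3", "e", "r", "5", "t", "6", "y", "7", "u"];

// One flat table: keyboard character -> note index (0..23).
const KEY_TO_NOTE = new Map(
  [...LOWER_ROW, ...UPPER_ROW].map((key, index) => [key, index])
);

// Normalize the event key so Shift/CapsLock variants still resolve.
function noteIndexForKey(eventKey) {
  const index = KEY_TO_NOTE.get(eventKey.toLowerCase());
  return index === undefined ? null : index;
}
```

A `keydown` handler can then call `noteIndexForKey(event.key)` and trigger the matching note, with `null` meaning "not a playable key".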

Accomplishments that we're proud of

A fully playable AI-generated instrument that works in the browser with zero setup. The fact that you can type "underwater glass harp" and actually play it 60 seconds later still feels kind of amazing to us. You can play from anywhere! Also, we're both freshmen, there are only two of us, and this is our first hackathon ever. We're pretty proud of what we've come up with, and we hope you find it enjoyable and interesting.

What we learned

How powerful ElevenLabs' sound generation API is for creative audio use cases, how to manage parallel async audio loading in React, and how much prompt engineering matters when you're generating 24 variations of the same sound.

What's next for Instament AI

More octaves, sustain pedal support, the ability to share your generated instrument with others via a link, and a beat-making mode where you can layer multiple instruments together.

Built With

react, groq (llama-3.3-70b), elevenlabs, web-audio-api, mediarecorder-api
