I'm inspired by the fusion of traditional K-Pop aesthetics with cutting-edge AI-generated visuals.

My goal was to create a music video that not only showcases the song's powerful message but also pushes the boundaries of what's possible in visual storytelling using AI.

The process of building this project involved several key stages. I used Unreal Engine for the core 3D environments and character animation, leveraging its real-time rendering capabilities. For character replacement, I used Mago Studios, which allowed for seamless integration of AI-generated character performances. Dzine AI handled lip-syncing and facial animation, and Eleven Labs generated the voiceover. The song itself was created with an Akai MPC One and Suno AI. Quick Magic AI provided motion capture, and Adobe Premiere was used for video editing and color grading.

The biggest challenge was seamlessly integrating the AI-generated elements with the live-action footage to create a cohesive and visually stunning narrative.

I've learned a slew of AI platforms that integrate motion capture, music creation, and video restyling.

What's next for Wray_Won_I'm Never gonna stop_MV? His next song and music video will be coming out soon. He has a twin brother, Jae Won, who is also a singer, and they recently collaborated on Jae's new song.

Built With

  • Adobe Premiere
  • Akai MPC One
  • Dzine AI
  • Eleven Labs
  • Mago Studios
  • Quick Magic AI
  • Suno AI
  • Unreal Engine