Inspiration

Humanoid control should feel more intuitive. We wanted to turn human intent, whether expressed through language, animation selection, or teleoperation, into real Unitree G1 motion.

What it does

G1 Control Studio lets us play predefined motions, generate custom motions from prompts, preview them in MuJoCo, and deploy approved motions to the G1. Separately, we use Meta Quest teleoperation to record demonstrations for future training.
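The MuJoCo preview step can be as small as a kinematic playback loop. Here is a minimal sketch, assuming a G1 MJCF file and a motion clip saved as per-frame `qpos` arrays; the file names, clip shape, and frame rate are placeholders, not our actual assets:

```python
# Kinematic preview of a generated clip in the MuJoCo viewer.
# "g1.xml" and "motion_clip.npy" are placeholder file names.
import time
import numpy as np
import mujoco
import mujoco.viewer

FRAME_DT = 1.0 / 30.0  # assumed clip frame rate

model = mujoco.MjModel.from_xml_path("g1.xml")  # assumed G1 model path
data = mujoco.MjData(model)
clip = np.load("motion_clip.npy")               # assumed shape: (frames, model.nq)

with mujoco.viewer.launch_passive(model, data) as viewer:
    for frame in clip:
        data.qpos[:] = frame            # pure playback, no dynamics stepping
        mujoco.mj_forward(model, data)  # recompute derived quantities for rendering
        viewer.sync()
        time.sleep(FRAME_DT)
```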

How we built it

We connected Llama / LLM2Vec, Kimodo-G1, our Kimodo-to-SONIC converter, MuJoCo, and the SONIC / GR00T whole-body controller. Llama / LLM2Vec interprets the prompt, Kimodo-G1 generates the motion, our converter makes it controller-ready, and SONIC stabilizes it into motor commands.
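The glue between these stages is thin. Below is a sketch of the control flow only; every function body is a hypothetical stub standing in for a real module (Llama / LLM2Vec, Kimodo-G1, our converter), and the array shapes are illustrative:

```python
# Control-flow sketch of the prompt-to-G1 pipeline. All stubs and shapes
# are placeholders for the real modules, which we cannot reproduce here.
import numpy as np

def embed_prompt(prompt: str) -> np.ndarray:
    """Stand-in for the Llama / LLM2Vec text encoder."""
    return np.zeros(4096)

def generate_motion(embedding: np.ndarray) -> np.ndarray:
    """Stand-in for Kimodo-G1: returns per-frame joint targets."""
    return np.zeros((120, 29))  # e.g. 120 frames x 29 joints (assumed)

def kimodo_to_sonic(clip: np.ndarray) -> np.ndarray:
    """Stand-in for our converter: retargets/resamples into SONIC's format."""
    return clip

def prompt_to_g1(prompt: str) -> np.ndarray:
    trajectory = kimodo_to_sonic(generate_motion(embed_prompt(prompt)))
    return trajectory  # handed to SONIC / GR00T for stabilized execution

if __name__ == "__main__":
    print(prompt_to_g1("wave with the right hand").shape)
```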

Challenges we ran into

The hardest parts were connecting to the real hardware, setting up DDS networking, and understanding the robot’s different control modes. We also hit a “two policies fighting” issue, where conflicting control paths made the robot stutter and lose stability.
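The fix for the fighting-policies problem amounts to making command ownership explicit: only one control path may publish to the robot at a time. Here is a minimal sketch of that idea, with hypothetical names (`CommandArbiter`, `send_to_robot`) standing in for the real Unitree DDS calls:

```python
# Single-writer arbiter so two control paths cannot both drive the motors.
# Class and function names are hypothetical, not a Unitree API.
import threading

class CommandArbiter:
    """Allows only the currently active source to publish low-level commands."""

    def __init__(self):
        self._lock = threading.Lock()
        self._active_source = None

    def activate(self, source: str) -> None:
        with self._lock:
            self._active_source = source  # e.g. "sonic" or "teleop"

    def publish(self, source: str, command) -> bool:
        with self._lock:
            if source != self._active_source:
                return False  # drop commands from the inactive path
            send_to_robot(command)
            return True

def send_to_robot(command) -> None:
    """Hypothetical stand-in for the DDS command publisher."""
    pass
```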

Accomplishments that we're proud of

We built a working prompt-to-G1 motion pipeline, integrated generated motions with SONIC / GR00T, added a simulation preview and real-robot deployment tooling, and got Meta Quest teleoperation working for data collection.

What we learned

Generating motion is only half the problem. Real humanoid execution depends on timing, stabilization, networking, mode switching, and safety.
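One lesson translates directly into code: never trust a generated target. A small, illustrative clamp (the limits and control rate below are placeholders, not the G1’s real values) that bounds per-step motion before anything reaches the motors:

```python
# Clamp outgoing joint targets against position and velocity limits.
# All constants are illustrative placeholders, not G1 specifications.
import numpy as np

CTRL_DT = 0.002            # assumed 500 Hz control loop
Q_MIN, Q_MAX = -2.5, 2.5   # placeholder joint position limits (rad)
DQ_MAX = 4.0               # placeholder max joint speed (rad/s)

def safe_target(q_prev: np.ndarray, q_cmd: np.ndarray) -> np.ndarray:
    """Limit per-step motion so a bad command cannot jerk the robot."""
    step = np.clip(q_cmd - q_prev, -DQ_MAX * CTRL_DT, DQ_MAX * CTRL_DT)
    return np.clip(q_prev + step, Q_MIN, Q_MAX)
```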

What's next for G1 Control Studio

Next, we want to train and deploy policies from teleoperation datasets: clean the demos, train imitation policies, validate in simulation, then safely deploy autonomous behaviors on the G1.
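For the imitation step, we expect something close to standard behavior cloning over (observation, action) pairs from the teleop demos. A minimal PyTorch sketch, with placeholder dimensions and untuned hyperparameters:

```python
# Behavior-cloning sketch: regress teleop actions from observations.
# OBS_DIM, ACT_DIM, layer sizes, and the learning rate are placeholders.
import torch
import torch.nn as nn

OBS_DIM, ACT_DIM = 64, 29  # assumed observation and G1 action sizes

policy = nn.Sequential(
    nn.Linear(OBS_DIM, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, ACT_DIM),
)
optimizer = torch.optim.Adam(policy.parameters(), lr=3e-4)

def train_step(obs: torch.Tensor, act: torch.Tensor) -> float:
    """One imitation step over a batch of demo pairs."""
    loss = nn.functional.mse_loss(policy(obs), act)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```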

Built With

Llama / LLM2Vec, Kimodo-G1, MuJoCo, SONIC / GR00T, Unitree G1, Meta Quest