Inspiration

LeLamp is a real-life Pixar-style lamp, inspired by Apple's ELEGNT paper. It was developed in three weeks, from mechanical design all the way up to the runtime. It has 5 degrees of freedom, two microphones, a speaker, and an RGB matrix in its head. Everything is controlled by a Raspberry Pi 4 in its base.

What it does

Right now, LeLamp can talk, understand commands, and express emotion with the help of:

  • LiveKit for voice infrastructure
  • Groq with gpt-oss-120b for inference
  • OpenAI for STT and TTS

How we built it

After reading Apple's ELEGNT paper, I began sketching the first design of LeLamp on paper, then turned it into a 3D CAD model in OnShape. My idea was a lamp that can move and, obviously, talk.

For the hardware, I picked STS3215 servos, a ReSpeaker Lite Mic Hat for the speaker/mic, and WS2812B LEDs for lighting. With the components locked in, all I had to do was design a frame that could hold them.

For the software, I wrote a robot implementation in LeRobot. This makes it easy to control everything and opens the door to direct policy training. I then abstracted the LeRobot movement layer and the LED control into services. I based the software design on a simple event-driven architecture so I can hand tool calls off to agents easily.
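The event-driven idea can be sketched as a tiny pub/sub bus: services subscribe to event names, and agent tool calls reduce to publishing events. This is a minimal illustration, not LeLamp's actual code; the event names and services are hypothetical.

```python
from collections import defaultdict

class EventBus:
    """Minimal pub/sub bus: services subscribe to event names,
    and agent tool calls publish events onto the bus."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, event, handler):
        self._handlers[event].append(handler)

    def publish(self, event, **payload):
        for handler in self._handlers[event]:
            handler(**payload)

# Hypothetical movement and LED services, stubbed to record calls.
log = []
bus = EventBus()
bus.subscribe("move", lambda pose: log.append(("move", pose)))
bus.subscribe("set_color", lambda rgb: log.append(("led", rgb)))

# An agent tool call reduces to publishing an event.
bus.publish("move", pose="nod")
bus.publish("set_color", rgb=(255, 120, 0))
```

Because the services never talk to each other directly, swapping the agent (or the hardware behind a service) doesn't touch the rest of the system.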

For voice, I used LiveKit, Groq, and OpenAI as mentioned above. The lamp's movement and color are exposed through the services above, so all I had to do was hook them up to the right tool functions.
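The tool-function wiring follows the shape most agent frameworks use: the model emits a tool name plus JSON arguments, and a dispatcher routes that to a plain function. A minimal sketch; the function names and signatures here are illustrative, not LeLamp's or LiveKit's actual API.

```python
import json

# Hypothetical tool functions exposed to the voice agent.
def set_color(r: int, g: int, b: int) -> str:
    return f"color set to ({r}, {g}, {b})"

def play_animation(name: str) -> str:
    return f"playing {name}"

TOOLS = {"set_color": set_color, "play_animation": play_animation}

def handle_tool_call(call_json: str) -> str:
    """Dispatch a model-issued tool call (name + JSON args)
    to the matching Python function."""
    call = json.loads(call_json)
    return TOOLS[call["name"]](**call["arguments"])

result = handle_tool_call(
    '{"name": "play_animation", "arguments": {"name": "nod"}}'
)
```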

Challenges we ran into

The hardware was hard and I ran into many design problems, but nothing that couldn't be solved with multiple iterations in OnShape. The software's main bottleneck was the lamp's animation system.

As you can see in the demo video, the lamp moves ultra-fluidly. This is thanks to an animation system I wrote that makes animations interruptible and interpolatable. LeLamp also supports idle animations so it is always moving, which makes it feel even more alive.
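The core trick behind interruptible animation can be sketched in a few lines: a new target can arrive at any time, and interpolation always restarts from the *current* pose rather than the old animation's endpoint, so interruptions never snap. This is a simplified sketch, not the actual LeLamp animator.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b for t in [0, 1]."""
    return a + (b - a) * t

class Animator:
    """Sketch of an interruptible animation player over joint angles."""
    def __init__(self, pose):
        self.pose = dict(pose)    # current joint angles
        self.start = dict(pose)   # where the active blend began
        self.target = dict(pose)  # where it is heading
        self.t = 1.0              # blend progress, 1.0 = settled

    def play(self, target):
        # Interrupt: restart interpolation from wherever we are NOW,
        # so a mid-animation switch stays smooth.
        self.start = dict(self.pose)
        self.target = dict(target)
        self.t = 0.0

    def step(self, dt, duration=1.0):
        self.t = min(1.0, self.t + dt / duration)
        for joint in self.pose:
            self.pose[joint] = lerp(self.start[joint],
                                    self.target[joint], self.t)
        return self.pose
```

Idle animation falls out of the same loop: whenever no command is active, just `play()` a small random pose on a timer.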

Accomplishments that we're proud of

I'm super proud of the animation system, but overall building a robot from scratch in such a short time was a huge achievement.

What we learned

A lot. Simply a lot. From 3D design, to controlling Feetech servos over serial, to coordinating all the servos to play animations, to making everything smooth so the robot feels alive.
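Talking to the servos over serial, for example, comes down to building small framed packets. A minimal sketch of a Feetech-style WRITE packet (the framing family is shared with Dynamixel protocol 1.0); the goal-position register address used in the example is an assumption, so check the STS3215 datasheet before relying on it.

```python
def feetech_write_packet(servo_id, address, data):
    """Build a Feetech-style serial WRITE packet:
    0xFF 0xFF | id | length | instruction | params... | checksum,
    where checksum = ~(sum of bytes after the header) & 0xFF."""
    instruction = 0x03  # WRITE
    params = [address] + list(data)
    length = len(params) + 2
    body = [servo_id, length, instruction] + params
    checksum = (~sum(body)) & 0xFF
    return bytes([0xFF, 0xFF] + body + [checksum])

# e.g. command servo 1 to position 2048 (two bytes, little-endian),
# assuming register 42 holds the goal position on STS-series servos
pkt = feetech_write_packet(1, 42, [2048 & 0xFF, 2048 >> 8])
```

In practice this byte string is written to the serial port shared by all daisy-chained servos, and each servo answers only to its own ID.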

What's next for Pixar Lamp Comes To Life With GPT-OSS

I want to make it jump. And to make it jump, I'll need a beefy GPU to simulate the lamp and run RL for it.

Note

I built LeLamp to be completely open source. Everything is hosted under Human Computer Lab, which is my lab. I originally wrote the runtime for gpt-4o-realtime, then branched new development for gpt-oss; that branch also contains the new fluid animation system. This is why you may find LeLamp under both my GitHub and the humancomputerlab GitHub.

LeLamp Runtime with gpt-oss: https://github.com/pham-tuan-binh/lelamp_runtime
LeLamp Design: https://github.com/humancomputerlab/lelamp_runtime
