Inspiration

We wanted to create something unique and crafty with the hardware we had on hand. We didn't have much in the way of sensors, but we had an idea: work from the principle of structured light (recording how light patterns distort around objects) to create a depth map, mimicking a LiDAR sensor, which was too expensive for us. We then had the idea of using it to make a topographic map... which turned into a full-fledged DnD campaign. Our other teammates are artistically and musically talented, so we also wanted to make custom sprites and music.

What it does

Displays a live and immersive DnD board onto a sandbox with voice-recognized commands, such as "Build a village in the top left quadrant named Learned Hall". Equipped with custom-drawn assets and self-composed music for chance encounters and boss fights.

How we built it

A projector and webcam are hooked up to a Python script running the open-source MiDaS project, which uses trained neural-net models to estimate depth from contextual cues within images. The depth map allows for dynamic scene generation, such as rivers in valleys and mountains at high points. The code then listens for voice commands, matching keywords to dynamically build the DnD map and scene with structures and enemy encounters.
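To illustrate how a depth map can drive scene generation, here is a minimal sketch under our own assumptions (the thresholds and terrain names are illustrative, not the project's actual values). MiDaS outputs relative rather than metric depth, so the array is normalized first, then bucketed into terrain bands:

```python
import numpy as np

def classify_terrain(depth: np.ndarray) -> np.ndarray:
    """Bucket a relative depth map into terrain bands.

    MiDaS produces relative (not metric) depth, so normalize to [0, 1]
    and split on illustrative thresholds.
    """
    d = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
    terrain = np.full(d.shape, "plains", dtype=object)
    terrain[d < 0.2] = "river"      # lowest regions become water
    terrain[d > 0.8] = "mountain"   # highest regions become peaks
    return terrain

# Toy 2x2 "depth map": one low point, one high point
demo = np.array([[0.0, 0.5], [0.6, 1.0]])
print(classify_terrain(demo))
```

A real pipeline would run this per-pixel over the full MiDaS output and hand the terrain grid to the renderer that the projector displays.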

Challenges we ran into

The physical limitation of not having the right sensor. Structured light is hard to do in practice with a single camera, so we needed to explore other approaches (MiDaS).

Accomplishments that we're proud of

Completing and competing.

What we learned

We learned how good Claude Code is.

What's next for Dungeon Master

After implementing a dual-IR structured-light algorithm and upgrading to a more advanced projector, a full DnD table setup can be built, allowing for a closer-to-real-time refresh rate.
