Inspiration
We were inspired by the idea of building a physical AI system that truly integrates software with real-world hardware through hands-on collaboration. Our concept was also influenced by Project Hail Mary, specifically the character Rocky, whose ability to navigate confidently through dark, unfamiliar environments pushed us to imagine a robot capable of underground exploration.
What it does
Project SubTerra is an AI-powered cave exploration robot designed to operate offline. It uses onboard sensing and autonomous navigation to detect obstacles, evaluate terrain, and choose safer paths while moving through confined, underground-style environments.
How we built it
We started by developing a Python-based simulation to model navigation and decision-making. This allowed us to prototype path selection and obstacle avoidance before moving to hardware. We then attempted to deploy the system onto a Rubik Pi platform, integrating USB LiDAR for environmental scanning and ultrasonic sensors for short-range detection. The goal was to run autonomous navigation directly on-device with a native runtime, enabling fully offline decision-making.
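The simulation stage can be sketched roughly as follows. This is an illustrative, minimal version of grid-based path selection, not the actual SubTerra code: the function name, the hazard map, and the cost weighting are all assumptions made for the example.

```python
import math

def choose_step(pos, goal, hazard, grid_w, grid_h):
    """Pick the neighboring grid cell that best trades off progress
    toward the goal against terrain hazard (1.0 = impassable obstacle).
    `hazard` maps (x, y) cells to a 0.0-1.0 risk score."""
    best, best_cost = pos, float("inf")
    for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        nx, ny = pos[0] + dx, pos[1] + dy
        if not (0 <= nx < grid_w and 0 <= ny < grid_h):
            continue  # off the map
        if hazard.get((nx, ny), 0.0) >= 1.0:
            continue  # blocked cell
        dist = math.hypot(goal[0] - nx, goal[1] - ny)
        # The hazard weight (5.0) is a tuning knob, not a measured value.
        cost = dist + 5.0 * hazard.get((nx, ny), 0.0)
        if cost < best_cost:
            best, best_cost = (nx, ny), cost
    return best

# Example: an obstacle directly ahead forces a detour.
step = choose_step((0, 0), (2, 0), {(1, 0): 1.0}, grid_w=3, grid_h=3)
print(step)  # → (0, 1): the robot sidesteps the blocked cell at (1, 0)
```

Prototyping at this level of abstraction let us iterate on path-selection logic quickly before touching motors or sensors.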
Challenges we ran into
The biggest challenge wasn't just building the system: it was integrating AI with real hardware under time constraints.

- Bridging simulation to real-world behavior was difficult
- Handling LiDAR data over USB introduced latency and processing issues
- Sensor fusion between LiDAR and ultrasonic data was inconsistent
- Hardware-specific motor control required tuning we didn't have time to fully complete
- Overall, we were rushed, which limited how far we could push full system integration

Even though we had most of the groundwork in place, getting everything to work seamlessly together in real time was significantly harder than expected.
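To illustrate the kind of fusion problem described above, here is one simple way to reconcile a LiDAR range with an ultrasonic range. This is a hedged sketch, not our final fusion logic: the function name and the range limits (typical of HC-SR04-class sensors) are assumptions for the example.

```python
ULTRASONIC_MAX_M = 4.0   # assumed reliable ceiling for a hobby ultrasonic sensor
ULTRASONIC_MIN_M = 0.02  # assumed minimum measurable distance

def fuse_range(lidar_m, ultra_m):
    """Return a single forward-distance estimate in meters,
    conservatively trusting whichever valid reading is closer."""
    ultra_valid = ULTRASONIC_MIN_M <= ultra_m <= ULTRASONIC_MAX_M
    if lidar_m is None:           # LiDAR dropout (e.g. a USB latency spike)
        return ultra_m if ultra_valid else None
    if not ultra_valid:
        return lidar_m
    return min(lidar_m, ultra_m)  # assume the nearer obstacle is real

print(fuse_range(1.8, 0.35))  # → 0.35 (ultrasonic sees a closer obstacle)
print(fuse_range(None, 0.5))  # → 0.5 (fall back to ultrasonic on dropout)
```

Even a conservative rule like this gets tricky in practice, because the two sensors disagree on timing and field of view, which is where our inconsistencies came from.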
Accomplishments that we're proud of
Despite the challenges, we're proud of how much we built in such a short time.

- We created a working simulation for autonomous navigation
- We established a partial pipeline from AI decision-making → hardware execution
- We successfully integrated multiple sensing modalities (LiDAR + ultrasonic)
- Most importantly, we collaborated across disciplines (AI, mechatronics, and electrical engineering) to build a complex system together

Given the time constraints, what we achieved as a team is something we're genuinely proud of.
What we learned
We learned:

- Simulation does not directly translate to real-world performance
- Sensor noise, latency, and calibration matter more than expected
- Embedded systems introduce constraints that fundamentally change how AI must be implemented
- Small integration issues can break an entire autonomous pipeline
What's next for Project SubTerra
Next, we want to:

- Fully stabilize motor control and hardware integration
- Improve sensor fusion and real-time processing
- Implement stronger path planning and mapping
- Add recovery behaviors for failure scenarios
- Move toward a more reliable system for underground search, inspection, and exploration