Inspiration
The primary driver for ADRE was the "Last Frontier" on Earth: the Hadal Zone. At depths of 11,000 meters, the crushing pressure reaches 16,000 PSI, making it nearly impossible for humans or standard tethered drones to explore.
I was inspired to solve the Latency Gap. In the deep ocean, signal delay means a human operator cannot react fast enough to a sudden threat or a rare scientific discovery. I wanted to build an autonomous "Pilot" that doesn't just follow a path but reasons through its environment. By using Gemini, I aimed to transform a piece of hardware into a self-aware explorer that can make survival decisions in the dark, isolated depths where human contact is lost.
What it does
ADRE is a simulation of a spherical titanium ROV that uses Gemini's reasoning capabilities to navigate and analyze the deep ocean without human intervention.

- Multimodal Perception: It processes three distinct visual streams (Normal, Thermal, and LiDAR) to "see" through the darkness of the deep sea.
- AI Classification & Logic: The system classifies deep-sea entities in real time, using behavioral analysis to determine whether a creature is Neutral (triggering a data scan) or Aggressive (triggering evasive maneuvers).
- Autonomous Navigation: Using 6-axis vectored thrusters, the AI calculates optimal paths to shadow targets or maintain safety protocols.
- Predictive Physics: The simulation accounts for high-pressure environments, calculating structural integrity and energy expenditure.
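The Neutral-versus-Aggressive decision described above can be sketched as a small scoring function. This is an illustrative sketch only: the field names, thresholds, and weights are assumptions, not the project's actual logic.

```javascript
// Hypothetical sketch of ADRE's classification-to-action logic: a detected
// entity's behavior score decides between a data scan and an evasive
// maneuver. All names and thresholds here are illustrative assumptions.

const AGGRESSION_THRESHOLD = 0.6; // assumed tuning value

function classifyEntity(behavior) {
  // behavior: { approachSpeed (m/s), headingTowardROV (bool), sizeMeters }
  let score = 0;
  if (behavior.headingTowardROV) score += 0.4;
  if (behavior.approachSpeed > 1.0) score += 0.3;
  if (behavior.sizeMeters > 2.0) score += 0.3;
  return score >= AGGRESSION_THRESHOLD ? "AGGRESSIVE" : "NEUTRAL";
}

function chooseAction(classification) {
  // Neutral contacts get scanned; aggressive contacts trigger evasion.
  return classification === "AGGRESSIVE" ? "EVADE" : "SCAN";
}

const contact = { approachSpeed: 1.8, headingTowardROV: true, sizeMeters: 3.5 };
console.log(chooseAction(classifyEntity(contact))); // "EVADE"
```

In the real system these scores would come from Gemini's behavioral analysis rather than fixed rules; the sketch just shows the classification-to-action mapping.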
How we built it
The Build: a "Smart Simulation." Since we didn't have 3D modeling software, we used a web-native approach to simulate the deep ocean.

- The Engine: Built with Three.js and JavaScript for high-performance 3D rendering in the browser.
- The Brain: We integrated Gemini 3 as the reasoning core. It processes multimodal sensor data (Thermal, LiDAR, and Vision) to classify deep-sea life and dictate ROV movement.
- The Physics: We modeled a spherical titanium hull capable of withstanding extreme pressure. To ensure scientific accuracy, we calculated the buoyancy force needed to maintain a "Hover" state in the Hadal Zone.
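The hover calculation mentioned above follows from Archimedes' principle: a sphere hovers when its buoyancy force equals its weight. A minimal sketch, assuming an illustrative 0.5 m hull radius and a rough seawater density at depth (the project's actual parameters are not given):

```javascript
// Neutral-buoyancy ("Hover") check for a spherical hull.
// Dimensions and densities are illustrative assumptions.

const G = 9.81;                // gravitational acceleration, m/s^2
const SEAWATER_DENSITY = 1050; // kg/m^3, approximate at hadal depth

function buoyancyForce(radiusMeters) {
  const volume = (4 / 3) * Math.PI * radiusMeters ** 3; // displaced water
  return SEAWATER_DENSITY * volume * G; // Archimedes: F = rho * V * g
}

function netVerticalForce(radiusMeters, massKg) {
  return buoyancyForce(radiusMeters) - massKg * G; // > 0 rises, < 0 sinks
}

// A 0.5 m radius sphere displaces ~0.52 m^3 of water, so to hover it
// must mass roughly SEAWATER_DENSITY * volume (about 550 kg here).
const radius = 0.5;
const hoverMass = SEAWATER_DENSITY * (4 / 3) * Math.PI * radius ** 3;
console.log(Math.abs(netVerticalForce(radius, hoverMass)) < 1e-6); // true
```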
Challenges we ran into
- Logical Failures: We faced a bug where the ROV wouldn't "escape" from aggressive entities. We had to refine the reasoning loop to ensure the AI prioritized survival over data collection when a threat was detected.
- Visual Depth: Initially, the simulation was pitch black. We implemented a multi-spectrum HUD (Thermal and LiDAR mesh) to allow the AI (and the user) to "see" using data rather than just light.
- UI Instability: Our telemetry data was crashing and overlapping. We rebuilt the Tactical HUD using a dynamic layering system to handle multiple target classifications simultaneously.
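The survival-over-science fix can be sketched as a strict priority ordering in the decision loop: threats are checked before any scan target is considered. The state shape and action names are hypothetical.

```javascript
// Sketch of the refined reasoning loop: survival is evaluated before any
// science task, so a detected threat always preempts data collection.
// All field and action names here are illustrative assumptions.

function decideNextAction(state) {
  // Priority 1: survival. Evade any aggressive contact first.
  if (state.threats.length > 0) {
    return { action: "EVADE", target: state.threats[0] };
  }
  // Priority 2: science. Shadow and scan neutral entities.
  if (state.scanTargets.length > 0) {
    return { action: "SCAN", target: state.scanTargets[0] };
  }
  // Priority 3: default patrol pattern when nothing is detected.
  return { action: "PATROL", target: null };
}

// Even with a scan target available, the threat wins:
const state = { threats: ["contact-03"], scanTargets: ["contact-12"] };
console.log(decideNextAction(state).action); // "EVADE"
```

The original bug amounted to checking the scan queue first; reordering the checks is what makes survival non-negotiable.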
Accomplishments that we're proud of
- True Autonomous Agency: We successfully moved beyond hard-coded scripts. Using Gemini 3, the ROV makes independent "reasoning" decisions, switching from scientific shadowing to evasive maneuvers based on real-time threat assessment.
- Sensor Fusion Mastery: We implemented a Tri-Spectrum Vision system (Normal, Thermal, and LiDAR). Seeing the AI successfully navigate and "see" in a zero-light Hadal Zone simulation was our biggest technical milestone.
- Semantic Communication: We developed a protocol to compress complex visual data into textual metadata, allowing the ROV to transmit critical discoveries through low-bandwidth acoustic channels.
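The semantic-communication idea can be sketched as serializing a detection into a short delimited text packet instead of raw video: a few dozen bytes that fit an acoustic channel. The packet format and field names below are assumptions for illustration, not the project's actual protocol.

```javascript
// Hedged sketch of "semantic communication": compress a visual detection
// into a compact textual metadata packet suitable for a low-bandwidth
// acoustic link. The format and field names are illustrative assumptions.

function toAcousticPacket(detection) {
  // detection: { species, confidence, depthM, bearingDeg, behavior }
  return [
    detection.species,
    detection.confidence.toFixed(2),
    `${detection.depthM}m`,
    `${detection.bearingDeg}deg`,
    detection.behavior,
  ].join("|"); // tens of bytes, versus megabytes of raw video
}

const packet = toAcousticPacket({
  species: "dumbo-octopus",
  confidence: 0.91,
  depthM: 10740,
  bearingDeg: 30,
  behavior: "NEUTRAL",
});
console.log(packet); // "dumbo-octopus|0.91|10740m|30deg|NEUTRAL"
```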
What we learned
- AI Agency vs. Automation: We learned that Gemini 3 can act as a "Reasoning Pilot." It doesn't just follow scripts; it interprets high-stakes situations to make independent decisions, moving us from basic automation to true AI agency.
- The Necessity of Sensor Fusion: Operating in the pitch-black Hadal Zone taught us that "vision" is secondary to data fusion. Grounding the AI in LiDAR and Thermal streams was the only way to keep navigation accurate and free of hallucinations.
- Physics-First Development: Using Three.js to simulate 16,000 PSI taught us how to translate mathematical constraints into code. We learned that realistic movement in a simulation requires precise calculation of drag and buoyancy.
- The Value of Failure: Debugging "ghosting" entities and "infinite ascent" bugs taught us that environmental guardrails are just as important as the AI's brain. An autonomous system is only reliable if its world has strict physical rules.
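The drag side of the physics work can be sketched with the standard quadratic drag law for a sphere, F = ½ρCdAv². The density and drag coefficient below are textbook approximations, not the project's tuned values.

```javascript
// Quadratic drag on a spherical hull: F = 0.5 * rho * Cd * A * v^2.
// Coefficients are textbook approximations (illustrative assumptions).

const SEAWATER_DENSITY = 1050;  // kg/m^3, approximate at depth
const DRAG_COEFF_SPHERE = 0.47; // typical Cd for a smooth sphere

function dragForce(radiusMeters, speedMps) {
  const area = Math.PI * radiusMeters ** 2; // frontal cross-section
  return 0.5 * SEAWATER_DENSITY * DRAG_COEFF_SPHERE * area * speedMps ** 2;
}

// Drag grows with the square of speed, so doubling speed quadruples the
// thrust (and therefore the energy) needed to hold it.
console.log(dragForce(0.5, 2) / dragForce(0.5, 1)); // 4
```

This quadratic scaling is why the simulation's energy-expenditure model penalizes high-speed maneuvers.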
What's next for ADRE: Aquatic Deep Reasoning Explorer
- "Sea-to-Sky" Integration: Linking ADRE with the Super Brain UAV to create a unified autonomous network. The UAV will act as a surface relay while the ROV handles deep-sea intelligence, sharing a synchronized Gemini reasoning core.
- Gemini Live Interaction: Integrating real-time voice protocols so researchers can query the ROV verbally ("ADRE, analyze the thermal anomaly at 30 degrees") and receive instant, natural-language feedback.
- On-Device Edge AI: Migrating the reasoning engine to local hardware to eliminate satellite latency, ensuring fully autonomous survival even when disconnected from the cloud.
- Swarm Exploration: Deploying multiple units that use Distributed Sensing to map the ocean floor. By coordinating via acoustic pings, they can maximize coverage area.
Built With
- c++
- figma
- google-web-speech-api
- html
- java
- javascript
- kimi.ai
- python