Inspiration
Industrial manufacturing systems generate rich operational data, but most operator experiences are fragmented across static dashboards, technical logs, and siloed tools. We were inspired to design a single spatial interface where machine understanding feels immediate, intuitive, and elegant. The vision became a digital co-pilot for the HP Metal Jet S100: blending real-time telemetry, predictive insight, and contextual AI into one “always-on” control surface.
What it does
Digital Co-Pilot provides a unified digital twin experience with:
- A 2D schematic and 3D digital twin view that switch seamlessly
- Live component monitoring for key subsystems (build unit, printhead, recoater, etc.)
- Contextual component focus cards with status, alerts, and metrics
- A summonable Aether chat that can answer general or part-specific questions
- Voice input via speech-to-text for hands-free operation
- Smooth camera focus and progressive disclosure to inspect internals safely and clearly
How we built it
- Backend: FastAPI service exposing twin, telemetry, and agent endpoints
- Simulation/Data layer: Synthetic degradation simulator + long-horizon validation trajectories
- AI layer: Grounded diagnostic agent with telemetry-aware responses and evidence-linked reasoning
- Frontend: React + TypeScript dashboard with:
  - 3D scene (React Three Fiber / three.js)
  - 2D schematic mode
  - Analytics tiles (trajectory, ranking, alerts)
  - Stateful interaction model (selection, focus, time controls, chat context)
- Voice: Speech-to-text and text-to-speech integrations for operator UX
- State + UX: Zustand store, motion transitions, progressive disclosure, and contextual overlays
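The simulation layer above centers on a synthetic degradation simulator whose forecasts must reflect maintenance resets (a point revisited in the challenges below). As a minimal sketch, assuming a simple health score per component, the function and parameter names here are hypothetical, not the project's actual API:

```python
import random

def simulate_degradation(steps, base_rate=0.004, noise=0.001,
                         maintenance_at=None, seed=42):
    """Simulate component health (1.0 = new, 0.0 = failed) over time.

    Maintenance restores health rather than flattening the curve:
    the trajectory resumes degrading after each reset, which is the
    forecast semantics an operator needs to see.
    """
    rng = random.Random(seed)
    maintenance_at = set(maintenance_at or [])
    health, trajectory = 1.0, []
    for t in range(steps):
        if t in maintenance_at:
            health = min(1.0, health + 0.8)  # maintenance restores most wear
        health = max(0.0, health - base_rate - rng.uniform(0, noise))
        trajectory.append(round(health, 4))
    return trajectory

traj = simulate_degradation(200, maintenance_at=[100])
# health jumps back up at step 100, then continues to decline
```

Long-horizon validation trajectories can then be generated by sweeping seeds and maintenance schedules over this kind of generator.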
Challenges we ran into
- Rendering stability in 3D transitions (black-screen/context lifecycle issues)
- Keeping 2D, 3D, and analytics views perfectly synchronized to one source of truth
- Designing camera behavior that is inspectable but not disorienting
- Communicating forecast semantics clearly (e.g., maintenance/reset effects vs. “flatline” misconceptions)
- Balancing rich intelligence output with concise, operator-friendly explanations
- Integrating proactive alerts, agent context, and temporal simulation into one coherent flow
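Keeping the 2D, 3D, and analytics views synchronized came down to routing every interaction through one store (Zustand in the actual app). As an illustrative sketch of that single-source-of-truth pattern, in Python rather than TypeScript, with hypothetical state keys:

```python
class TwinStore:
    """Minimal single-source-of-truth store: every view (2D schematic,
    3D scene, analytics tiles) subscribes to the same state object, so
    a selection made in one view can never diverge from the others."""

    def __init__(self):
        self._state = {"selected_component": None, "sim_time": 0}
        self._listeners = []

    def subscribe(self, listener):
        self._listeners.append(listener)

    def set(self, **updates):
        self._state = {**self._state, **updates}
        for listener in self._listeners:
            listener(self._state)

    def get(self):
        return dict(self._state)

store = TwinStore()
seen = []
store.subscribe(lambda s: seen.append(s["selected_component"]))  # "2D view"
store.subscribe(lambda s: seen.append(s["selected_component"]))  # "3D view"
store.set(selected_component="printhead")
# both views observe the same selection: seen == ["printhead", "printhead"]
```

The key design choice is that views never hold their own copy of selection or time state; they only render from the store's latest snapshot.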
Accomplishments that we're proud of
- Built a polished digital twin UX that feels like one product, not a set of tools
- Delivered both required dimensions:
  - Functionality: model + simulate + interact
  - Intelligence: predict + explain + recommend + verify
- Achieved smooth cross-view interaction and contextual component drill-down
- Added grounded AI assistance that remains useful during multi-turn troubleshooting
- Created a demo flow that clearly shows operational value, not just technical capability
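The grounded assistance above hinges on tying every claim in a chat answer back to telemetry evidence. A minimal sketch of that idea, assuming a hypothetical telemetry shape and threshold (not the project's actual agent code):

```python
def grounded_answer(component, telemetry, threshold=0.7):
    """Answer only from telemetry evidence and cite the readings used.

    `telemetry` maps component -> {"health": float, "alerts": [str]}.
    Returns (answer, evidence) so the UI can link each claim to data
    instead of letting the agent speculate.
    """
    reading = telemetry.get(component)
    if reading is None:
        return ("No telemetry available for that component.", [])
    evidence = [f"{component}.health={reading['health']:.2f}"]
    evidence += [f"alert:{a}" for a in reading["alerts"]]
    if reading["health"] < threshold or reading["alerts"]:
        answer = f"{component} needs attention."
    else:
        answer = f"{component} is operating normally."
    return (answer, evidence)

telemetry = {"recoater": {"health": 0.55, "alerts": ["uneven powder spread"]}}
answer, evidence = grounded_answer("recoater", telemetry)
# answer flags the recoater; evidence cites the health reading and alert
```

Keeping the evidence list alongside the answer is what lets a multi-turn conversation stay verifiable: each follow-up can reference the same readings.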
What we learned
- In industrial UX, clarity beats feature count: context and prioritization are everything
- Predictive systems must explain why, not just output risk scores
- Temporal simulation is powerful only when tied to concrete action loops
- Spatial interfaces (2D/3D) increase trust when interactions are deterministic and consistent
- “AI co-pilot” value comes from grounding + actionability, not conversational flair
What's next for Digital Co-Pilot for HP Metal Jet S100
- Add scenario comparison (maintenance policy A vs B over future horizons)
- Improve explainability with causal traces and confidence decomposition
- Expand proactive workflows (auto-generated maintenance work orders)
- Add role-based experiences (operator, reliability engineer, plant manager)
- Integrate with real historian/MES streams for production pilots
- Support multi-printer fleet orchestration and plant-level optimization dashboards