Inspiration

We come from the world of biological programming and bioinformatics. Our day-to-day usually involves modeling living systems, cells, or DNA strands, so industrial hardware and 3D printing were quite far from our usual domain. However, we noticed a fascinating parallel: just as a living organism suffers cellular wear and tear from environmental stress, an industrial 3D printer like the HP Metal Jet S100 suffers thermal and mechanical fatigue.

We asked ourselves: What if we applied our experience modeling "living systems" to create a Digital Twin of an industrial machine, and added an Artificial Intelligence assistant to diagnose it? From this leap out of our comfort zone, Casiopea was born.

What it does

This project is an end-to-end Digital Twin that simulates the operational life of an HP Metal Jet S100 printer. Its main features include:

  • Physics and ML Simulation: Models the real-time degradation of the Recoater Blade, Nozzle Plate, and Heating Elements based on environmental stress, temperature, and factory contamination.
  • Historian Database: Logs thousands of hours of simulated telemetry across different scenarios like "Dirty Factory" or "Chaos".
  • Casiopea (AI Co-Pilot): An AI agent based on RAG (Retrieval-Augmented Generation) that reads real-time telemetry and acts as an expert diagnostic assistant.
  • Interactive Dashboard: Operators can talk to Casiopea using their voice, view real-time health graphs, and receive predictive alerts in their native language (English, Spanish, or Catalan).

How we built it

We designed the project architecture in four modular phases:

  1. The Logic Engine (phase1.py): We combined Machine Learning (a GradientBoostingRegressor to detect thermal anomalies) with classical reliability mathematics. Fittingly, some of these models also appear in biological survival analysis: the abrasive wear of the recoater blade follows a Weibull reliability distribution, where the probability of survival at time $t$ is $$R(t) = e^{-(t/\eta)^\beta}$$ with $\eta$ the characteristic life and $\beta$ the shape parameter.
  2. The Simulation Engine (phase2.py): A time-series generator that creates long-term scenarios and stores telemetry in a local SQLite database.
  3. The AI Agent (phase3.py): We integrated the blazing-fast Groq API running the Llama-3.3-70B-versatile model. We built a custom RAG pipeline so the LLM queries the historian.db database and grounds its answers in the machine's actual data.
  4. The Frontend (app.py): We wrapped everything in an interactive dashboard using Streamlit, integrating Plotly for charts and a microphone component for voice commands.
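
The Weibull survival curve from step 1 can be sketched in a few lines. The parameter values below are illustrative placeholders, not the ones tuned in phase1.py:

```python
import math

def weibull_survival(t_hours: float, eta: float, beta: float) -> float:
    """Weibull reliability R(t) = exp(-(t/eta)**beta).

    eta  -- characteristic life (hours at which R(t) falls to ~36.8%)
    beta -- shape parameter (beta > 1 means wear-out accelerates with age)
    """
    return math.exp(-((t_hours / eta) ** beta))

# Hypothetical recoater-blade wear profile (NOT the values from phase1.py):
ETA, BETA = 2000.0, 1.8

print(round(weibull_survival(0.0, ETA, BETA), 3))     # new blade: 1.0
print(round(weibull_survival(2000.0, ETA, BETA), 3))  # at t = eta: 0.368
```

Because $\beta > 1$, the failure rate grows over time, which is what made tuning the decay rates (see "Challenges" below) an exercise in picking $\eta$ and $\beta$ rather than hand-tweaking random noise.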
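
The RAG grounding in step 3 boils down to "query the historian, paste the rows into the prompt before calling the LLM." A minimal sketch, assuming a hypothetical telemetry table and column names (the real schema in historian.db may differ):

```python
import sqlite3

def latest_telemetry(db_path: str = "historian.db", n: int = 5) -> list[tuple]:
    """Fetch the n most recent telemetry rows to ground the LLM's answer.
    Table and column names here are illustrative placeholders."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(
            "SELECT timestamp, component, health_pct "
            "FROM telemetry ORDER BY timestamp DESC LIMIT ?",
            (n,),
        ).fetchall()

def build_prompt(question: str, rows: list[tuple]) -> str:
    """Inject the retrieved rows into the system context so the model must
    answer from the machine's actual data instead of generic advice."""
    context = "\n".join(f"{ts} | {comp} | {health}%" for ts, comp, health in rows)
    return (
        "You are Casiopea, a diagnostic assistant for an HP Metal Jet S100.\n"
        "CRITICAL OVERRIDE: answer ONLY from the telemetry below.\n"
        f"--- TELEMETRY ---\n{context}\n--- END TELEMETRY ---\n"
        f"Operator question: {question}"
    )
```

The resulting string is what gets sent as context to the Groq-hosted model; the "CRITICAL OVERRIDE" wording mirrors the prompt-injection trick described under "Challenges" below.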

Challenges we ran into

  • A completely new domain: This project pushed us far outside our routine. We are not mechanical engineers; we are biological programmers. Understanding the thermodynamics of an extruder or the abrasion of a metal blade, and translating them into code, was a monumental challenge.
  • Balancing the math: Adjusting the decay rates so components didn't fail too quickly or last forever required a lot of iteration to mimic reality.
  • LLM hallucinations and strict context: Getting Llama 3.3 to stick strictly to the telemetry data instead of inventing generic advice, and forcing it to answer in specific languages (like Catalan or Spanish), required injecting "Critical Override" instructions into the system prompt.

Accomplishments that we're proud of

  • Overcoming the barrier of entry to an industry that isn't ours (additive manufacturing) by applying our mathematical foundations.
  • Building a simulation driven by robust mathematical models rather than arbitrary random numbers.
  • Achieving near-instantaneous AI responses by interacting directly with a real-time telemetry database.

What we learned

We discovered that mathematical abstraction is a universal language. The same data modeling tools we use to analyze cellular deterioration or viral behavior can, with the right adjustments, predict when a titanium heating element is going to fail. Additionally, we learned how to integrate cutting-edge LLMs in highly restrictive environments where data accuracy is everything.

What's next

  1. Real IoT Integration: Replace our synthetic simulation engine with live sensor data from a real 3D printer via MQTT.
  2. Multimodal Computer Vision: Integrate a vision AI model that adapts biological/medical image diagnostics to detect physical anomalies in printed parts.