NOTE:
You can view the website, our model, and research + results directly at https://www.projectmarkey.com/
For our demo and presentation we have a live demo! However, on the hosted website (www.projectmarkey.com) the demo is replaced with a direct link to the model on Hugging Face for security reasons. If you would like to try the demo, check out the GitHub; if you would like to see the model itself, here is the link: https://huggingface.co/jungter/markey-v1/tree/main
Inspiration
Ghost guns are untraceable firearms that require no background checks and carry no serial numbers. They are frequently 3D printed at home, and the statistics are alarming. In the past 5 years, there has been almost a 1,600% surge in unregistered 3D-printed firearms. Over 92,000 have been seized since 2017, with more than 1,700 directly tied to homicides. Despite this massive threat, the software running these printers has absolutely no idea what it is manufacturing.
This is no longer just a hypothetical issue. In 2026, states like New York, Washington, and California began actively pushing hardware mandates. Lawmakers are now demanding that 3D printers and slicing software (the software that converts model files into printing instructions, or G-code) ship with built-in algorithms designed specifically to block the production of firearm components.
The problem is that the industry is not ready for these laws. Traditional security tools completely fail at this task because they rely on simple filename checks or basic 2D image recognition. Anyone can bypass those checks by simply renaming a file or slightly altering the outer shell of a 3D model.
We built Markey to prove that true hardware safety has to be meticulous. Instead of looking at the surface, we analyze the actual G-code instructions. By reading the exact physical toolpath, we can identify the high strength structural patterns required for functional weapon parts. This allows us to detect the danger and block the job before the first layer is ever printed. Using G-code also allows for the identification of "hidden" components within the model. Markey was named after Senator Ed Markey, who has pushed legislation for restrictions on (unregistered) 3D-printed guns for almost a decade.
What it does
When a user uploads a 3D model (.stl, .obj, or .glb), the system generates a forensic dashboard. It creates six orthographic renders for visual inspection and slices the model using CuraEngine to generate raw G-code. This G-code is then analyzed by our model, which combines PyTorch and Hugging Face embeddings to determine whether the geometry represents a restricted firearm part. The result is a forensic dashboard that provides a binary "Allowed vs. Restricted" decision backed by a confidence value and other metrics.
Markey is a model fine-tuned from Qwen3 0.6B Embedding. It was trained with a hybrid approach: 29 extracted features combined with a projection layer over text embeddings. In this configuration, feature extraction does most of the heavy lifting, while the text embedding serves as a supplementary classifier.
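The hybrid architecture can be sketched roughly as follows. This is a minimal illustration, not Markey's actual training code: the feature count of 29 comes from the description above, while the embedding dimension, hidden size, and class/variable names are assumptions.

```python
import torch
import torch.nn as nn

class HybridGcodeClassifier(nn.Module):
    """Sketch of a hybrid classifier: hand-extracted G-code features do the
    heavy lifting, while a projected text embedding acts as a supplement."""

    def __init__(self, n_features: int = 29, embed_dim: int = 1024, hidden: int = 64):
        super().__init__()
        # Project the (large) text embedding down to a small supplement vector.
        self.proj = nn.Linear(embed_dim, hidden)
        # Classify on the concatenation of raw features + projected embedding.
        self.head = nn.Sequential(
            nn.Linear(n_features + hidden, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),  # single logit: restricted vs. allowed
        )

    def forward(self, features: torch.Tensor, embedding: torch.Tensor) -> torch.Tensor:
        supplement = torch.relu(self.proj(embedding))
        combined = torch.cat([features, supplement], dim=-1)
        return self.head(combined).squeeze(-1)

model = HybridGcodeClassifier()
logit = model(torch.zeros(1, 29), torch.zeros(1, 1024))
prob = torch.sigmoid(logit)  # probability the part is restricted
```

Concatenating the projected embedding with the raw features, rather than relying on the embedding alone, matches the write-up's point that feature extraction carries most of the signal.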
G-code is like assembly: it is technically human-readable, but reasoning from the raw instructions to what the finished part will look like is very hard. The tokenizer, however, has an easy time with it, since strings such as "G0" already tokenize cleanly, so we could expect reasonable results from Qwen's tokenizer instead of building one from scratch.
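To make the G-code angle concrete, here is a hedged sketch of the kind of coarse signal extraction such a pipeline might perform. The function and feature names are illustrative, not Markey's actual 29 features.

```python
import re

def extract_gcode_features(gcode: str) -> dict:
    """Pull a few coarse signals out of raw G-code text.

    G0/G1 are linear moves; a G1 with an E parameter extrudes material.
    Summing extrusion and counting moves gives a rough proxy for part
    density and toolpath complexity.
    """
    moves = 0
    extruded = 0.0
    last_e = 0.0
    for line in gcode.splitlines():
        line = line.split(";", 1)[0].strip()  # drop G-code comments
        if not re.match(r"G[01]\b", line):
            continue
        moves += 1
        m = re.search(r"\bE([-+]?\d*\.?\d+)", line)
        if m:
            e = float(m.group(1))
            if e > last_e:  # forward extrusion (absolute E mode assumed)
                extruded += e - last_e
            last_e = e
    return {"moves": moves, "extruded_mm": round(extruded, 3)}

sample = "G0 X0 Y0\nG1 X10 Y0 E1.5\nG1 X10 Y10 E3.0 ; wall\n"
print(extract_gcode_features(sample))  # {'moves': 3, 'extruded_mm': 3.0}
```

Features like these are what make G-code "high-signal": a dense, heavily reinforced part leaves an extrusion footprint that renaming the file cannot hide.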
In the end, the model showed very strong accuracy, with only 3 false positives and 0 false negatives out of the 400 eval files, its loss plateauing at around 0.01 and validation accuracy at around 0.9925. These numbers might be a sign of overfitting, but on examples the model had never seen, it achieved accuracy similar to our benchmarked eval accuracy.
We have already built slicer integration, but the model can also be loaded onto 3D printer firmware (such as on the Ender s3).
How we built it
The pipeline is a complex orchestration of five distinct stages. The Next.js frontend handles the upload and communicates with an API route that manages a temporary workspace. We used Python and Trimesh for 3D normalization and rendering. The critical slicing stage uses a custom Node.js wrapper for CuraEngine. For the heavy lifting, we deployed a remote GPU workflow on Vast.ai running a hybrid classifier. This model uses sentence-transformers to embed G-code features and text, providing a high-accuracy classification that is returned to the UI as an auditable JSON payload. In development we also used ngrok to expose the remote classifier to the local Next.js API.
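As an illustration of the final stage, the auditable payload returned to the UI might look something like this. The field names, threshold, and function are hypothetical, not Markey's actual schema.

```python
import json

def build_verdict(prob_restricted: float, threshold: float = 0.5) -> str:
    """Assemble the JSON verdict sent back to the Next.js UI.

    The binary decision is derived from the classifier's confidence, and the
    raw number ships alongside the label so the payload stays auditable.
    """
    payload = {
        "decision": "Restricted" if prob_restricted >= threshold else "Allowed",
        "confidence": round(prob_restricted, 4),
        "threshold": threshold,
        "model": "markey-v1",
    }
    return json.dumps(payload)

print(build_verdict(0.9876))
# {"decision": "Restricted", "confidence": 0.9876, "threshold": 0.5, "model": "markey-v1"}
```

Returning the threshold and model name alongside the decision means any "Restricted" verdict can be re-checked after the fact, which is the point of an auditable payload.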
Challenges we ran into
Our biggest challenge was GETTING THE MODEL TO RUN ON DIFFERENT ENVIRONMENTS. Running a C++ slicer like CuraEngine across different operating systems led to significant linker and library errors, even with attempts at containerization. We had to build a robust local-to-remote bridge, using ngrok to create a tunnel so our web server could talk to our rented Vast.ai GPU instance without being blocked by cloud firewalls. It took many hours to get everything working on both Windows and Mac, as we had originally configured it on Ubuntu.
Accomplishments that we're proud of
We are proud of moving the needle from "simple AI wrapper" to a legitimate engineering pipeline. Successfully taking a raw 3D mesh and turning it into a base64 visual render, a physics-based G-code file, and a remote ML prediction in a single request was a very grueling task. Also, in our testing, frontier models such as gpt-5.4-xhigh, opus-4.6-high, and gemini-3.1-pro hallucinated and gave the wrong answer about what the G-code would print. We managed to beat these frontier LLMs in G-code classification accuracy by a long shot! We know a simple ML model will not stop people from printing unregistered firearms, but we wanted to deter it as much as possible.
What we learned
We learned that G-code is a "high-signal" data source for security. While you can disguise the name or the outside of a 3D model, you cannot easily hide the high-density infill and reinforced walls required for a firearm to function. We also gained deep experience in orchestrating multi-modal systems, learning how to sync data between Next.js, Python, and remote C++ binaries.
What's next for Markey
We want to expand the model to recognize a broader range of components. Ultimately, we see this as a foundation for a general "policy-aware" restriction/identification layer, helping makers navigate the complex ethics of 3D printing while promoting responsible manufacturing. Furthermore, IP theft costs the industrial sector hundreds of millions annually, and many hobbyists use 3D printing to recreate trademarked/copyrighted property. It is a very similar class of problem: accountability for what gets printed. This is a very ambitious goal, but it is the natural next step for Markey!
Built With
- cura
- fastapi
- g-code
- huggingface
- javascript
- jupyter
- next
- ngrok
- node.js
- python
- typescript
- vast.ai
- wsl



