Smart Extrude

Inspiration

Smart Extrude was inspired by challenges I encountered in my role at the CICS Makerspace at UMass, where manual monitoring and delayed intervention during extrusion led to material waste, energy loss, and increased operator overhead. The goal was to build a practical system that improves efficiency and sustainability through automation.


Project Overview

Smart Extrude is a real-time, vision-assisted monitoring system for extrusion workflows. It uses live video and computer vision to detect extrusion states and provide actionable feedback through a web interface, reducing the need for constant human supervision.


Technical Approach

Computer Vision & AI

  • YOLO for real-time object detection
  • OpenCV for image processing and camera handling
  • OpenAI to assist with system reasoning, confidence-threshold tuning, and edge-case analysis

This stack enabled rapid iteration while maintaining robustness in a real-world environment.
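In practice, the detection stage boils down to a small post-processing step: collapsing a frame's YOLO detections into a single extrusion state for the interface to act on. The sketch below illustrates that idea in plain Python; the `Detection` fields, the state labels, and the 0.5 confidence threshold are illustrative assumptions, not the project's actual classes or tuning.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    """One YOLO detection for a frame (fields are illustrative)."""
    label: str          # hypothetical state label, e.g. "normal" or "jam"
    confidence: float   # model confidence in [0, 1]
    bbox: tuple         # (x1, y1, x2, y2) in pixel coordinates


def classify_extrusion_state(detections, conf_threshold=0.5):
    """Reduce per-frame detections to a single extrusion state.

    Drops low-confidence detections, then reports the label of the
    most confident remaining one; "unknown" if nothing survives.
    """
    confident = [d for d in detections if d.confidence >= conf_threshold]
    if not confident:
        return "unknown"
    return max(confident, key=lambda d: d.confidence).label
```

A function like this is also where threshold tuning lives: raising `conf_threshold` trades missed states for fewer false alarms.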

Backend

  • Python + Flask for camera management and low-latency MJPEG video streaming
  • Threaded execution for reliable performance on embedded hardware
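A common pattern for the Flask + threading combination above is a daemon capture thread that always holds the latest JPEG frame, with an MJPEG generator serving it over `multipart/x-mixed-replace`. This is a minimal sketch of that pattern, not the project's exact code; the `_capture` method is a placeholder standing in for a real OpenCV camera read (`cv2.VideoCapture` + `cv2.imencode`).

```python
import threading
import time

from flask import Flask, Response

app = Flask(__name__)


class CameraThread:
    """Background thread that keeps only the latest frame, so slow
    clients never block capture."""

    def __init__(self):
        self._frame = b""              # latest JPEG-encoded frame
        self._lock = threading.Lock()
        self._running = False

    def start(self):
        self._running = True
        threading.Thread(target=self._loop, daemon=True).start()

    def _loop(self):
        while self._running:
            jpeg = self._capture()     # placeholder for an OpenCV read
            with self._lock:
                self._frame = jpeg
            time.sleep(1 / 30)         # ~30 fps capture cadence

    def _capture(self):
        # Placeholder JPEG bytes; a real camera read goes here.
        return b"\xff\xd8placeholder\xff\xd9"

    def latest(self):
        with self._lock:
            return self._frame


camera = CameraThread()


def mjpeg_stream():
    """Yield frames in multipart/x-mixed-replace (MJPEG) format."""
    while True:
        frame = camera.latest()
        if frame:
            yield (b"--frame\r\n"
                   b"Content-Type: image/jpeg\r\n\r\n" + frame + b"\r\n")
        time.sleep(1 / 30)


@app.route("/stream")
def stream():
    return Response(mjpeg_stream(),
                    mimetype="multipart/x-mixed-replace; boundary=frame")
```

Keeping only the newest frame (rather than a queue) is what keeps latency low on embedded hardware: a viewer that falls behind simply skips frames instead of watching a growing backlog.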

Frontend

  • SvelteKit for a responsive, intuitive monitoring interface
  • Live video display optimized for operator use

Key Learnings & Challenges

Key challenges included tuning detection thresholds under varying lighting conditions, optimizing performance on constrained hardware, and ensuring smooth real-time streaming. The project reinforced the importance of careful AI integration in cyber-physical systems where software decisions directly affect material outcomes.
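One common remedy for per-frame flicker under varying lighting is temporal smoothing, e.g. a sliding majority vote over recent frame states so a single misclassified frame cannot flip the reported state. The sketch below shows that technique in general form; the window size is illustrative, and the project may well use a different stabilization mechanism.

```python
from collections import Counter, deque


class StateSmoother:
    """Sliding majority vote over the last `window` frame states."""

    def __init__(self, window=15):
        # deque(maxlen=...) automatically discards the oldest state
        self.history = deque(maxlen=window)

    def update(self, state):
        """Record a new per-frame state; return the smoothed state."""
        self.history.append(state)
        return Counter(self.history).most_common(1)[0][0]
```

With a 15-frame window at 30 fps, a transient misdetection lasting a few frames is absorbed, while a genuine state change still surfaces within half a second.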


Impact & Sustainability

Smart Extrude helps reduce material waste, energy consumption, and operator workload by enabling earlier detection and faster response to extrusion issues. Designed for deployment in the CICS Makerspace, the system demonstrates how applied AI can improve sustainability and operational efficiency in shared fabrication environments.
