🧠 Inspiration

Industrial systems and human operators often face failures and health risks that begin with subtle, invisible changes: tiny vibrations, slight motion shifts, or micro-color variations imperceptible to the naked eye.
Our team wanted to create a unified intelligence system that detects these micro-level patterns before they grow into major issues.
Inspired by predictive maintenance, contactless health monitoring, and AI-powered video magnification, we built VibeSight, a solution that brings human safety and machine reliability together in one intelligent ecosystem.


⚙️ What It Does

VibeSight uses Eulerian Video Magnification, computer vision, and machine learning to analyze video footage of humans or machines.
It amplifies micro-motions and vibrations to detect:

  • Early-stage machinery faults such as imbalance or wear.
  • Human health indicators like heart rate, breathing rate, and fatigue.
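At its core, linear Eulerian magnification band-passes each pixel's intensity over time and adds an amplified copy of that band back into the video. A minimal NumPy sketch (the function names and the FFT-based filter are our illustration, not the exact production pipeline):

```python
import numpy as np

def temporal_bandpass(frames: np.ndarray, fps: float,
                      low_hz: float, high_hz: float) -> np.ndarray:
    """Keep only temporal frequencies in [low_hz, high_hz] for each pixel.

    frames: float array of shape (time, height, width).
    """
    freqs = np.fft.rfftfreq(frames.shape[0], d=1.0 / fps)
    spectrum = np.fft.rfft(frames, axis=0)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0  # zero out-of-band bins
    return np.fft.irfft(spectrum, n=frames.shape[0], axis=0)

def magnify(frames: np.ndarray, fps: float,
            low_hz: float, high_hz: float, alpha: float) -> np.ndarray:
    """Amplify the band-passed micro-motion and add it back to the video."""
    return frames + alpha * temporal_bandpass(frames, fps, low_hz, high_hz)
```

In the full pipeline each frame is first decomposed into a Gaussian or Laplacian pyramid so different spatial scales can be amplified by different factors before recombination.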

The processed data is analyzed, visualized, and transmitted into the supOS Unified Namespace via MQTT, where real-time dashboards display health and vibration metrics.
Through supOS Event Flows, alerts and automation are triggered the moment an anomaly is detected, enabling predictive maintenance and proactive safety monitoring.
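Once metrics are extracted, pushing them into the Unified Namespace is a plain MQTT publish. A hedged sketch using paho-mqtt (the topic hierarchy, broker address, and field names are assumptions; adapt them to your supOS instance):

```python
import json
import time

# The topic layout below is an assumption; match it to your supOS UNS hierarchy.
UNS_PREFIX = "supos/vibesight"

def build_uns_message(asset: str, metrics: dict) -> tuple[str, str]:
    """Build the UNS topic and JSON payload for one analyzed asset."""
    topic = f"{UNS_PREFIX}/{asset}/metrics"
    payload = json.dumps({"timestamp": time.time(), **metrics})
    return topic, payload

if __name__ == "__main__":
    import paho.mqtt.client as mqtt  # pip install paho-mqtt (2.x API shown)

    topic, payload = build_uns_message(
        "pump-01", {"vibration_rms": 0.42, "breathing_rate_bpm": 16.2})
    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)
    client.connect("localhost", 1883)  # broker address is an assumption
    client.publish(topic, payload, qos=1)
    client.disconnect()
```

Keeping the payload as flat JSON under a consistent topic prefix is what lets supOS dashboards and Event Flows subscribe to individual metrics.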


🧩 How We Built It

  • Frontend: Built using React.js for user interaction, video upload, parameter adjustment, and real-time WebSocket logs.
  • Magnification Server (Flask): Uses OpenCV with custom Eulerian and phase-based temporal filters to reveal subtle motion changes frame by frame.
  • Graph Generation Server (Python): Extracts motion data and generates analytical graphs using IO Net Intelligence.
  • Report Generator (Node.js): Calls AI models, aggregates responses, and produces human-readable summaries.
  • AI Models: TensorFlow/Keras for classifying human and machine health states.
  • Integration: Sends analyzed metrics to supOS via MQTT for dashboards and automation.
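For the physiological metrics, the graph-generation step largely reduces to finding the dominant frequency of an averaged intensity trace taken from the magnified video. A sketch of that extraction (the band limits and function name are illustrative):

```python
import numpy as np

def estimate_rate_bpm(trace: np.ndarray, fps: float,
                      low_hz: float = 0.7, high_hz: float = 3.0) -> float:
    """Estimate a periodic rate (e.g. heart rate) from a 1-D intensity trace.

    Returns the strongest frequency inside the physiological band,
    in cycles per minute (42-180 BPM with the defaults).
    """
    trace = trace - np.mean(trace)                 # drop the DC offset
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    power = np.abs(np.fft.rfft(trace)) ** 2
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return 60.0 * freqs[band][np.argmax(power[band])]
```

For breathing rate the same function applies with a lower band (roughly 0.1 to 0.5 Hz).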

🚧 Challenges We Ran Into

  • Implementing real-time video magnification efficiently without GPU acceleration.
  • Achieving cross-device compatibility (Web, Windows, Android).
  • Debugging MQTT data flow between multiple servers and supOS.
  • Calibrating ML models to distinguish natural from abnormal vibrations.

🏆 Accomplishments We're Proud Of

  • Built a complete AI-driven system for detecting both human and machine anomalies through video.
  • Seamlessly integrated analytics with supOS Unified Namespace for real-time monitoring and event automation.
  • Developed a working live magnification prototype.
  • Created a modular, scalable architecture combining Flask, Node.js, and Python services.

📚 What We Learned

  • How to combine AI, computer vision, and IoT orchestration into one functioning platform.
  • Deep understanding of supOS Unified Namespace and Event Flow for automation.
  • The importance of data standardization and modular architecture for predictive industrial AI systems.

🔮 What’s Next for VibeSight

  • Integrate supOS AI Toolkit for anomaly detection inside the supOS interface.
  • Add GPU acceleration and edge deployment for real-time factory monitoring.
  • Build low-code dashboard templates for quick supOS adoption.
  • Expand to multi-factory orchestration and operator fatigue analytics.
  • Evolve into a plug-and-play AI assistant for industrial health monitoring powered by supOS.
