Inspiration

The inspiration behind EdgeInferInator comes from the need to make IoT systems more autonomous, resilient, and efficient by reducing dependence on centralized processing. Traditional home automation solutions like Home Assistant are powerful but can become single points of failure, especially when reliant on cloud-based inference. By enabling real-time anomaly detection and pattern classification directly on an ESP32 microcontroller, this project decentralizes processing, allowing the system to continue functioning independently even if the central server is down.

Additionally, the rise of TinyML presented an exciting opportunity to bring machine learning capabilities to small, low-power devices, expanding the possibilities for edge-based intelligence. The ability to make intelligent decisions directly on-device inspired the concept of using TinyML with ESP32 for tasks like predictive maintenance, security monitoring, and energy-efficient automation. This project demonstrates how affordable, accessible hardware like the ESP32 can transform smart home systems and beyond, offering a robust, low-latency, and scalable solution.

What it does

EdgeInferInator brings real-time anomaly detection and pattern classification to the ESP32 microcontroller, allowing it to function autonomously and make intelligent decisions directly at the edge. Here’s a breakdown of its core functions:

  1. Real-Time Anomaly Detection:
    The ESP32, using a TinyML model trained with Edge Impulse and deployed with TensorFlow Lite Micro, continuously monitors input data from sensors. It detects anomalies based on trained patterns, allowing it to recognize unusual behavior without relying on a central system.

  2. Pattern Classification:
    In addition to anomaly detection, the model can classify specific patterns, such as temperature fluctuations, noise levels, or motion, enabling tailored responses based on different environmental cues.

  3. Home Automation Integration:
    Once an anomaly or pattern is detected, the ESP32 sends real-time notifications to Home Assistant through MQTT. This integration enables automated responses, such as triggering alarms or adjusting settings in connected devices, based on the data received.

  4. Data Backup to MongoDB Atlas:
    Detected events are also sent to MongoDB Atlas via REST API, where data is logged and stored securely for historical analysis. This setup allows for in-depth tracking of patterns over time, supporting insights that can lead to improved model accuracy and smarter automation rules.

  5. Secure Remote Access:
    Using Cloudflare’s Argo Tunnel, the project enables secure, remote access to Home Assistant. This setup ensures that users can monitor and manage their home automation from anywhere, maintaining data privacy and security.

Together, these functions allow EdgeInferInator to create a decentralized, resilient, and intelligent home automation system, reducing dependency on cloud services and centralized controllers while enhancing data-driven automation.
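The decision step between inference and notification can be sketched as plain C++. This is an illustrative sketch only: the label names, threshold value, and JSON shape are assumptions, not taken from the project's actual Edge Impulse model or firmware.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Hypothetical label set and threshold -- the real values come from the
// trained Edge Impulse project, not from this sketch.
static const char* kLabels[] = {"normal", "temp_spike", "motion"};
constexpr float kAnomalyThreshold = 0.60f;  // tuned per deployment

// Turn one inference result (per-class scores plus an anomaly score)
// into the JSON payload the ESP32 would publish over MQTT.
std::string buildPayload(const std::vector<float>& scores, float anomaly) {
    // Pick the highest-scoring class.
    size_t best = 0;
    for (size_t i = 1; i < scores.size(); ++i)
        if (scores[i] > scores[best]) best = i;

    char buf[128];
    if (anomaly > kAnomalyThreshold) {
        std::snprintf(buf, sizeof(buf),
                      "{\"event\":\"anomaly\",\"score\":%.2f}", anomaly);
    } else {
        std::snprintf(buf, sizeof(buf),
                      "{\"event\":\"%s\",\"score\":%.2f}",
                      kLabels[best], scores[best]);
    }
    return std::string(buf);
}
```

Keeping this branch on-device is what lets the system keep classifying and alerting even when Home Assistant is unreachable; the MQTT publish is just a side effect of a decision already made at the edge.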

How we built it

EdgeInferInator was built by combining TinyML, edge computing, and IoT integration on an ESP32 microcontroller to achieve autonomous anomaly detection and pattern classification. Here’s a step-by-step breakdown of how we built the project:

  1. Data Collection and Model Training:
    We used Edge Impulse to collect and preprocess sensor data for anomaly detection and classification. The data was then used to train a machine learning model specifically optimized for edge devices. After achieving satisfactory accuracy, we exported the model as a TensorFlow Lite Micro (.tflite) file.

  2. Deploying the Model on ESP32:
    The trained model was then deployed on the ESP32 microcontroller using TensorFlow Lite Micro, which provides a lightweight, optimized library for running ML models on resource-constrained devices. We used Arduino IDE and PlatformIO to load the model and handle the necessary TensorFlow libraries for ESP32.

  3. Integrating MQTT for Home Assistant Communication:
    To enable real-time communication with Home Assistant, we configured MQTT on the ESP32. This allows the ESP32 to send notifications directly to Home Assistant whenever an anomaly or pattern is detected. By doing this, we achieved seamless integration with the home automation system, enabling automated responses to detected events.

  4. Setting Up REST API for MongoDB Atlas Logging:
    For data storage, we configured the ESP32 to communicate with MongoDB Atlas via REST API. This allows the ESP32 to log classified data and detected anomalies directly to the cloud, ensuring that historical data is securely backed up and accessible for further analysis.

  5. Enabling Secure Remote Access with Cloudflare’s Argo Tunnel:
    To allow remote monitoring and management, we set up Cloudflare Argo Tunnel to expose our Home Assistant instance securely to the internet. We paired this setup with a custom domain from GoDaddy, allowing secure, remote access while maintaining data privacy and integrity.

  6. Testing and Optimization:
    Finally, we conducted testing to ensure the model performed reliably on the ESP32, optimizing for power efficiency and minimal latency. We adjusted MQTT configurations, API calls, and ESP32 parameters to create a robust and efficient solution for edge-based inference.

By combining machine learning with IoT components, EdgeInferInator provides a scalable, autonomous, and decentralized approach to home automation, reducing reliance on centralized systems while enhancing data-driven, real-time decision-making.
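On the Home Assistant side, step 3 typically means declaring an MQTT sensor that parses the ESP32's payload. A minimal sketch, assuming a hypothetical `edgeinferinator/event` topic and a JSON payload with `event` and `score` fields (the project's actual topic names may differ):

```yaml
# configuration.yaml -- illustrative fragment; topic and field names
# are assumptions, not taken from the project's firmware.
mqtt:
  sensor:
    - name: "EdgeInferInator Event"
      state_topic: "edgeinferinator/event"
      value_template: "{{ value_json.event }}"
    - name: "EdgeInferInator Score"
      state_topic: "edgeinferinator/event"
      value_template: "{{ value_json.score }}"
```

Automations can then trigger on state changes of these sensors, e.g. sounding an alarm whenever the event sensor reports `anomaly`.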

Challenges we ran into

Building EdgeInferInator presented several challenges, primarily in optimizing TinyML and integrating various IoT components. Here’s a breakdown of the main challenges we faced:

  1. Model Optimization for ESP32:
    Running machine learning models on a low-power microcontroller like the ESP32 required significant optimization. We needed to reduce the model’s complexity while maintaining accuracy, which involved balancing feature selection and data preprocessing in Edge Impulse. This process required multiple iterations and testing to get the model to perform effectively without overwhelming the ESP32's limited resources.

  2. Managing Memory Constraints:
    The ESP32 has limited memory and computational power, which made it challenging to run inference without causing memory overflows or crashes. We carefully managed the memory footprint of the model and optimized the code to minimize resource usage. TensorFlow Lite Micro also required specific memory allocations, which we had to adjust to avoid issues during real-time processing.

  3. Ensuring Reliable MQTT Communication:
    Establishing reliable communication between the ESP32 and Home Assistant over MQTT was challenging, especially in environments with intermittent connectivity. We had to fine-tune the MQTT configurations to reduce latency and ensure messages were sent and received reliably, as missed messages could impact the accuracy of automated responses in Home Assistant.

  4. Implementing Secure Cloud Integration:
    Setting up secure cloud connections for logging data to MongoDB Atlas was a critical but complex task. We needed to ensure that data was transmitted securely, requiring the setup of encrypted REST API requests on the ESP32. Managing API keys and securing data transfer added another layer of complexity to the project.

  5. Remote Access Security with Cloudflare Argo Tunnel:
    Exposing Home Assistant securely to the cloud using Cloudflare Argo Tunnel required careful configuration. Ensuring a secure, encrypted connection was essential to protect data privacy, but the initial setup and troubleshooting took time. We had to ensure the tunnel was stable, accessible, and reliable for remote monitoring and control.

  6. Power Efficiency and Latency:
    Since the ESP32 operates continuously, maintaining power efficiency while keeping latency low was a key challenge. We experimented with various optimization techniques, including adjusting the frequency of model inference and MQTT messages, to ensure the device could function for extended periods without frequent recharging or lag in response times.

Despite these challenges, we were able to create a fully functional, autonomous system by carefully balancing model size, resource constraints, and secure communication protocols. Each obstacle helped us improve our approach, leading to a more robust and efficient solution.
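The memory-constraint challenge above comes from how TensorFlow Lite Micro works: every tensor is carved out of one fixed, caller-supplied "tensor arena", and if the arena is too small, allocation simply fails at startup. The toy bump allocator below mimics that scheme on the desktop; it is not the real TFLM allocator, just a sketch of why arena sizing on the ESP32 matters.

```cpp
#include <cstddef>
#include <cstdint>

// Toy bump allocator over a fixed buffer, mimicking TFLM's tensor
// arena: allocations either fit or fail, and there is no free().
class ArenaAllocator {
public:
    ArenaAllocator(uint8_t* buf, size_t size) : buf_(buf), size_(size) {}

    void* allocate(size_t bytes, size_t align = 16) {
        size_t start = (used_ + align - 1) & ~(align - 1);  // align up
        if (start + bytes > size_) return nullptr;           // arena full
        used_ = start + bytes;
        return buf_ + start;
    }

    size_t used() const { return used_; }

private:
    uint8_t* buf_;
    size_t size_;
    size_t used_ = 0;
};
```

With an 8 KB arena, two 4 KB allocations succeed and a third request of even one byte fails; on the real device the fix is either a larger arena (if RAM allows) or a smaller model.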

Accomplishments that we're proud of

Building EdgeInferInator was a challenging yet rewarding experience. Here are some of the key accomplishments we’re proud of:

  1. Running Machine Learning Inference on ESP32:
    Successfully deploying a TensorFlow Lite Micro model on the ESP32 for real-time anomaly detection and pattern classification was a major milestone. This required extensive model optimization and memory management, and achieving reliable inference on a microcontroller with limited resources was a testament to our persistence and technical skills.

  2. Seamless Integration with Home Assistant:
    Creating a decentralized, edge-based inference setup that integrates smoothly with Home Assistant for real-time automation was another big achievement. By enabling autonomous decision-making on the ESP32, we reduced the dependency on Home Assistant, making the system more resilient and capable of functioning independently when needed.

  3. Efficient Data Logging with MongoDB Atlas:
    We implemented a secure and scalable solution to log classified data and detected anomalies in MongoDB Atlas via REST API. This integration ensures that all historical data is backed up in the cloud, providing a robust record for future analysis and improving the value of our data-driven automation.

  4. Enhanced Remote Accessibility and Security:
    Configuring Cloudflare Argo Tunnel to provide secure remote access to our Home Assistant instance was a major accomplishment. By securing the setup with end-to-end encryption, we ensured that users can safely monitor and manage their home automation system from anywhere, maintaining high standards of data privacy.

  5. Optimizing for Low Latency and Power Efficiency:
    Balancing low latency and power efficiency on the ESP32 was no easy task. We optimized the frequency of model inference and data transmission, allowing the ESP32 to perform continuous anomaly detection with minimal power consumption. This accomplishment enhances the project’s practicality for real-world, long-term use.

  6. Creating a Fully Autonomous System:
    We’re proud of achieving true autonomy with EdgeInferInator. The ESP32 can make intelligent decisions, trigger automated actions via Home Assistant, and back up data without any human intervention. This accomplishment represents the future of edge-based intelligence in IoT, where systems can operate independently with robust reliability and security.

Through each of these accomplishments, we learned new techniques, strengthened our skills in TinyML, IoT, and cloud integrations, and developed a solution that leverages the power of machine learning at the edge in a meaningful way.

What we learned

Creating EdgeInferInator provided us with valuable insights across several technical domains, as well as some critical lessons in problem-solving and optimization:

  1. Deep Dive into TinyML:
    Working with TinyML on a microcontroller like the ESP32 gave us a solid understanding of the intricacies of deploying machine learning on low-power devices. We learned how to optimize models for resource constraints, balance accuracy and performance, and select features that maximize the efficiency of on-device inference.

  2. Model Optimization and Memory Management:
    The limited memory of the ESP32 required us to make strategic choices in model design and data handling. We gained hands-on experience in optimizing TensorFlow Lite Micro models, including minimizing model size and reducing the computational load without sacrificing accuracy, which will be valuable for future TinyML projects.

  3. IoT Communication Protocols:
    We became proficient in using MQTT and REST APIs to facilitate smooth communication between the ESP32, Home Assistant, and MongoDB Atlas. This experience taught us the importance of choosing the right communication protocol based on the data frequency, reliability, and efficiency needs of an IoT project.

  4. Security Best Practices in IoT:
    Setting up secure remote access via Cloudflare Argo Tunnel highlighted the importance of robust security in IoT applications. We learned how to use encryption, manage API keys, and secure sensitive data effectively, ensuring that user privacy and data integrity are maintained.

  5. Cloud Integration for IoT Systems:
    Integrating MongoDB Atlas as our cloud database for data backup gave us practical experience with cloud databases and RESTful API integration. We learned how to structure and store data for scalability and long-term accessibility, which will be beneficial for any data-driven IoT applications we develop in the future.

  6. Power and Latency Optimization:
    Balancing the power efficiency and responsiveness of the ESP32 was an essential part of this project. We learned how to optimize both inference frequency and data transmission, keeping energy usage low while ensuring timely response times, which is key for practical, long-lasting edge-based systems.

  7. Overcoming Real-World Constraints:
    Perhaps most importantly, this project taught us how to adapt to the challenges of working with constrained devices, including handling memory limitations, power considerations, and maintaining reliable communication. These skills will be invaluable in building efficient, scalable IoT solutions in real-world scenarios.

Through EdgeInferInator, we not only deepened our technical skills but also gained a holistic understanding of designing robust, secure, and autonomous IoT systems. These lessons will shape our approach to future projects, both in TinyML and beyond.

What's next for EdgeInferInator

We have several exciting ideas for expanding and refining EdgeInferInator to make it even more capable, efficient, and user-friendly. Here’s what’s next:

  1. Enhanced Model Complexity and Flexibility:

    • Expand Model Functionality: Improve the model to detect a broader range of patterns and anomalies, making it applicable to a wider variety of environments and use cases.
    • Automatic Model Updates: Develop a way to remotely update the model deployed on the ESP32, allowing for easier upgrades and customization.
  2. Real-Time Data Dashboard:

    • Data Visualization: Create an interactive dashboard to display live and historical data trends from MongoDB Atlas, providing users with intuitive insights into detected anomalies and system performance.
    • Customizable Alerts: Add real-time alert options, allowing users to receive push notifications via Home Assistant or a mobile app whenever a critical anomaly is detected.
  3. Energy Efficiency Optimizations:

    • Sleep Mode Integration: Implement low-power modes when the ESP32 isn’t actively inferring, significantly extending battery life in off-grid or portable applications.
    • Dynamic Inference Frequency: Adjust the inference frequency based on environmental factors or user needs, conserving power without compromising response time.
  4. Improved Edge-to-Cloud Integration:

    • Streamlined Data Processing: Introduce preprocessing on the ESP32 to reduce data sent to MongoDB Atlas, optimizing storage use and bandwidth.
    • Edge Data Aggregation: Allow the ESP32 to aggregate data over time and send summary reports to the cloud, giving users a concise view of overall trends and reducing data load.
  5. User-Friendly Setup and Customization:

    • Simplified Onboarding Process: Develop a step-by-step setup wizard to simplify the installation, making it accessible even for users without technical backgrounds.
    • Customizable Anomaly Thresholds: Enable users to adjust detection sensitivity based on their specific needs, adding flexibility for varied application scenarios.
  6. Additional Protocol Support:

    • Enhanced Connectivity: Integrate support for other IoT communication protocols, such as LoRaWAN or BLE, to broaden the range of connectivity options, especially in low-power environments.

By pursuing these improvements, we aim to make EdgeInferInator a more versatile, accessible, and powerful tool for edge-based anomaly detection and pattern classification. We’re excited about the possibilities ahead and look forward to continuing to push the boundaries of TinyML and IoT.

EdgeInferInator

This project demonstrates the use of TinyML for anomaly detection and pattern classification on an ESP32 microcontroller. The goal is to enable autonomous, edge-based inference on ESP32, reducing reliance on centralized systems like Home Assistant. This setup integrates seamlessly with Home Assistant for home automation and MongoDB Atlas for cloud data backup, while utilizing Edge Impulse, TensorFlow Lite Micro, MQTT, REST APIs, and Cloudflare's Argo Tunnel.

Project Overview

The primary objective of this project is to perform real-time anomaly detection and pattern classification directly on the ESP32, allowing it to operate autonomously from Home Assistant. Data is collected, preprocessed, and trained using Edge Impulse, then deployed to the ESP32 as a TensorFlow Lite Micro (TFLite Micro) model. The ESP32 communicates detected anomalies or classifications to Home Assistant using MQTT, and backs up data to MongoDB Atlas via a REST API.

TinyML - An Introduction

TinyML is the application of machine learning (ML) on small, low-power devices such as microcontrollers (MCUs). By enabling these devices to make intelligent decisions independently, TinyML makes applications like predictive maintenance, anomaly detection, and health monitoring more efficient. TinyML brings several advantages:

  • Decentralization: Reduces dependency on central servers.
  • Low Latency: Enables faster responses as processing happens locally.
  • Reduced Bandwidth: Minimizes data transfer by performing inference on-device.

Why This Project?

Home Assistant (HA) is a powerful tool for home automation, typically relying on a centralized approach where devices report data to HA, which then performs any necessary processing or inference. However, this setup makes HA a single point of dependency. By moving the inference to the ESP32, this project aims to decentralize the automation system, making it more robust, autonomous, and less reliant on a centralized controller.

Tech Stack and Frameworks

The project leverages a combination of machine learning, networking, and IoT technologies:

  1. Edge Impulse

    • Used for data collection, preprocessing, training, and conversion to a TFLite Micro model.
    • Provides an easy-to-use interface for building ML models for edge devices.
  2. TensorFlow Lite Micro

    • Deployed on the ESP32 for real-time inference.
    • Lightweight, optimized library designed to run ML models on embedded systems.
  3. ESP32 Microcontroller

    • Performs on-device inference and communicates with other components in the system.
    • Acts as the main compute unit for executing the ML model.
  4. MQTT Protocol

    • Used to send data from ESP32 to Home Assistant.
    • Lightweight messaging protocol that ensures efficient communication.
  5. REST API

    • Logs data from ESP32 to MongoDB Atlas for cloud storage and backup.
    • Provides a reliable way to store historical data and facilitate remote access.
  6. MongoDB Atlas

    • Cloud-based NoSQL database that stores historical data.
    • Allows secure, scalable storage and retrieval of classified data and anomalies.
  7. Cloudflare Argo Tunnel

    • Exposes Home Assistant instance securely to the cloud.
    • Paired with a domain from GoDaddy, enabling secure remote access to Home Assistant.

Installation and Setup

To get this project up and running, follow these steps:

  1. Collect and Train Data:

    • Use Edge Impulse to collect and preprocess data for training.
    • Train the model and export it as a TensorFlow Lite Micro (.tflite) model.
  2. Deploy Model on ESP32:

    • Use Arduino IDE or PlatformIO to load the model onto the ESP32.
    • Include TensorFlow Lite Micro libraries to run the model.
  3. Set Up MQTT for Home Assistant:

    • Configure the MQTT broker within Home Assistant.
    • Program the ESP32 to send classified data to Home Assistant over MQTT.
  4. Configure REST API for MongoDB Atlas:

    • Set up a MongoDB Atlas account and create a collection to store data.
    • Write an HTTP POST function on the ESP32 to log data to MongoDB.
  5. Expose Home Assistant to the Cloud:

    • Use Cloudflare's Argo Tunnel to create a secure, accessible endpoint for Home Assistant.
    • Configure the domain obtained from GoDaddy for remote access.
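For step 4, the HTTP POST body has to match whatever REST layer fronts MongoDB Atlas. Assuming the Atlas Data API's `insertOne` action (the project may use a different endpoint), the ESP32 would build a body like the one below; the `Cluster0`, database, and collection names are placeholders to substitute with your own Atlas values.

```cpp
#include <cstdio>
#include <string>

// Builds the JSON body for an Atlas Data API insertOne call.
// dataSource/database/collection are placeholder names.
std::string buildInsertBody(const std::string& event, float score,
                            unsigned long ts) {
    char buf[256];
    std::snprintf(buf, sizeof(buf),
        "{\"dataSource\":\"Cluster0\",\"database\":\"edgeinferinator\","
        "\"collection\":\"events\",\"document\":{\"event\":\"%s\","
        "\"score\":%.2f,\"ts\":%lu}}",
        event.c_str(), score, ts);
    return std::string(buf);
}
```

On the device this string is sent with `Content-Type: application/json` and the Atlas API key in a header; keeping the body construction in one small function makes it easy to unit-test off-device.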

Usage

  1. Power up the ESP32 to initiate pattern classification and anomaly detection.
  2. ESP32 sends classification notifications to Home Assistant via MQTT for real-time monitoring.
  3. ESP32 logs classified patterns and anomalies to MongoDB Atlas, ensuring historical data is accessible for further analysis.
  4. Monitor and manage the setup remotely through Home Assistant, using the Cloudflare tunnel.
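The remote-access path in step 4 is driven by the cloudflared ingress configuration. A minimal sketch, assuming Home Assistant on its default port 8123 and a placeholder hostname (use the domain configured at GoDaddy/Cloudflare):

```yaml
# ~/.cloudflared/config.yml -- hostname and tunnel ID are placeholders.
tunnel: <tunnel-id>
credentials-file: /home/user/.cloudflared/<tunnel-id>.json
ingress:
  - hostname: ha.example.com
    service: http://localhost:8123
  - service: http_status:404  # catch-all for unmatched requests
```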

Future Work and Improvements

  • Expand Model Capabilities: Enhance the model to recognize more complex patterns.
  • Data Analysis: Implement a dashboard for managing the entire process under one roof.
  • Optimization: Optimize the model and MQTT configuration for minimal latency, improved power efficiency, and reduced model size.

Conclusion

This project exemplifies the power of TinyML in creating autonomous, decentralized, and resilient IoT systems. By leveraging Edge Impulse, TensorFlow Lite Micro, MQTT, and REST APIs, we enable the ESP32 to not only make intelligent decisions but also to integrate seamlessly with centralized platforms like Home Assistant and MongoDB Atlas for broader data management and automation.


Final note: the provided code contains all the API keys and other credentials; feel free to use them until I tear down the instances.
