Inspiration
The core inspiration was creating a completely private, secure, and self-contained communication and learning environment that works entirely without internet access. We aimed to leverage the power of Large Language Models (LLMs)—specifically the Chrome Built-in AI—by offloading the heavy computational task to a teacher's machine. This architecture makes it ideal for classrooms, field work, or emergency situations where connectivity is unreliable or privacy is paramount.
What it does
This project establishes a zero-internet, private local area network (LAN) chat system where an ESP32 microcontroller acts as the central hub, and a designated "teacher" computer processes LLM requests using its on-device AI.
Specifically, it offers:
- ESP32 Hotspot & Chat Server: The ESP32 creates its own Wi-Fi Access Point (AP) and runs a simple WebSocket server, allowing multiple client devices (students' phones/laptops) to connect locally.
- Teacher-WebSocket LLM Bridge: A specialized client application (teacher-websocket) running in the teacher's Chrome browser connects to the ESP32. This client is the only one authorized to communicate with the Chrome Built-in AI LLM.
- Offline LLM Integration: Student chat messages flagged as prompts are relayed by the ESP32 to the teacher's client, which uses the offline-capable Chrome Built-in AI to generate a response.
- Group Broadcast: The LLM's response is sent back to the ESP32 via the teacher's WebSocket connection and then broadcast to all connected clients in the chat, enabling real-time, group-wide AI assistance without touching the internet.
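The relay flow above can be sketched as a tiny JSON framing scheme. The field names (`type`, `sender`, `text`) and function names here are hypothetical, simplified for illustration; the actual firmware may frame messages differently:

```javascript
// Minimal sketch of a JSON message protocol for the chat (assumed
// field names -- the real project may use a different wire format).

// A student tags a message as an LLM prompt with a "prompt" type;
// ordinary messages use the "chat" type.
function myMakeFrame(sender, text, isPrompt) {
  return JSON.stringify({
    type: isPrompt ? "prompt" : "chat",
    sender: sender,
    text: text,
  });
}

// The relay decides where a frame goes: prompts are routed to the
// teacher's dedicated connection, plain chat is broadcast to everyone.
function myRouteFrame(rawFrame) {
  const frame = JSON.parse(rawFrame);
  return frame.type === "prompt" ? "teacher" : "broadcast";
}
```

Keeping the routing decision this small is what lets the ESP32 act purely as a lightweight relay while the teacher's machine does the heavy LLM work.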
How we built it
The project was constructed using a hybrid architecture, combining microcontroller firmware with simple client-side JavaScript and adhering to a preference for basic code:
- ESP32 (Arduino Framework): Programmed to function as the Wi-Fi Access Point (AP) and to host a robust, but simple, WebSocket server.
- Client Interface (HTML/JS): A simple single-page web application served by the ESP32, using basic JavaScript to establish the WebSocket connection. The JavaScript uses the async/await pattern for handling the connection.
- Teacher's Interface (HTML/JS): A separate, security-flagged HTML/JavaScript client runs in the Chrome browser. This application contains the logic to:
- Monitor all chat messages on its dedicated WebSocket connection.
- Use the Chrome Built-in AI API to process flagged prompts.
- Send the LLM-generated reply back to the ESP32 for broadcast.
- Code Simplicity: All JavaScript code uses descriptive, camelCase variables and function names starting with "my" (e.g., mySendMessage(), myProcessLLMPrompt()) to serve as clear, simple examples for teaching.
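The teacher-side logic described above can be sketched in the project's "my"-prefixed style. The helper names and message shape here are illustrative assumptions, and the Chrome Built-in AI call is shown only as hedged glue code, since the exact API surface varies across Chrome versions:

```javascript
// Pure helper: detect a frame that should be sent to the LLM
// (assumed frame shape with "type" and "text" fields).
function myIsFlaggedPrompt(frame) {
  return frame.type === "prompt" && typeof frame.text === "string";
}

// Pure helper: wrap the LLM's answer so the ESP32 can broadcast it
// back to the whole group chat.
function myFormatReply(promptFrame, replyText) {
  return JSON.stringify({
    type: "chat",
    sender: "LLM",
    text: replyText,
    inReplyTo: promptFrame.sender,
  });
}

// Browser-only glue (runs in Chrome, not testable here): answer a
// flagged prompt using a Chrome Built-in AI session, then send the
// reply back over the teacher's dedicated WebSocket.
async function myProcessLLMPrompt(socket, frame, session) {
  if (!myIsFlaggedPrompt(frame)) return;
  const replyText = await session.prompt(frame.text); // Built-in AI session call
  socket.send(myFormatReply(frame, replyText));
}
```

Splitting the pure helpers from the browser glue keeps the teaching examples simple and lets the routing logic be exercised without an ESP32 or Chrome on hand.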
Challenges we ran into
- Node.js Fallback: Node.js did not initially give us the offline WebSocket chat we wanted. It does work now, which is good: people without an ESP32 can still see what the Wi-Fi chat is like. Getting the Node WebSocket working takes a few particular steps, such as installing the ws package, loading teacher-soket.html from a local location, and entering the WebSocket URL slightly differently as ws://#.#.#.#:8080. The corresponding student connection needs http://#.#.#.#:8080, whereas for the ESP32 the teacher socket is ws://#.#.#.#/ws and the student connection is just http://#.#.#.#. A slight difference, but an important one.
- Resource Constraint Management: The primary challenge was ensuring reliable, low-latency communication through the limited CPU and memory resources of the ESP32 while managing multiple client connections simultaneously.
- Zero-Internet Dependency: Developing a system that is functional and robust even at boot-up, with zero reliance on any external network infrastructure.
- WebSocket Control Flow: Designing a simple, clear mechanism for the teacher's client to differentiate itself from student clients and manage the dedicated, non-chat stream of LLM prompts/replies to and from the ESP32.
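The endpoint differences between the Node.js fallback and the ESP32 (described in the first challenge above) can be captured in one small helper. This function is hypothetical, written only to make the two URL schemes explicit side by side:

```javascript
// Hypothetical helper encoding the endpoint differences between the
// ESP32 firmware and the Node.js fallback server.
function myBuildUrls(host, usingEsp32) {
  if (usingEsp32) {
    // ESP32: the teacher WebSocket lives at the /ws path on the
    // default port 80; students just load the root page.
    return { teacher: "ws://" + host + "/ws", student: "http://" + host };
  }
  // Node.js fallback: both the teacher socket and the student page
  // use port 8080, and there is no /ws path.
  return { teacher: "ws://" + host + ":8080", student: "http://" + host + ":8080" };
}
```

For example, with the ESP32's usual soft-AP address the teacher would connect to ws://192.168.4.1/ws while students browse to http://192.168.4.1.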
Accomplishments that we're proud of
- Zero-Internet LLM Chat: Successfully created a completely off-grid, private, and portable LLM-powered group chat system using only an ESP32 and a Chrome-enabled laptop.
- Seamless Hardware-Software Integration: The elegant integration of a tiny, low-cost microcontroller (ESP32) for networking with a powerful, desktop-class processor for local LLM inference (Chrome Built-in AI) is a major architectural achievement.
- Teaching Model: Created an entirely self-contained network and AI system that can be deployed instantly for educational workshops without worrying about Wi-Fi passwords or internet reliability.
What we learned
- Architectural Offloading: The critical importance of the architectural decision to offload the heavy LLM processing to the teacher's device, using the ESP32 solely for lightweight network relaying.
- ESP32 Network Optimization: Gained significant experience in optimizing ESP32 firmware for stability and connection management as a multi-client WebSocket server/Access Point.
- The Power of Local AI: Reinforced that powerful, real-time, AI-assisted learning can be facilitated entirely on the edge, prioritizing privacy and accessibility over cloud connectivity.
What's next for Esp32 WiFi WebSocket multi user chat to chrome LLM
- Alternative Transports: Trying multiple other formats, such as BLE or LoRa, to carry our LLM chat offline.
- Advanced Prompting/Filtering: Implement better UI tools in the teacher's teacher-websocket application to manage, queue, and filter student prompts before sending them to the LLM.
- Offline Asset Caching: Embed and cache more educational content (e.g., simple web pages, basic text resources) directly on the ESP32's file system for students to access while offline.
- Cross-Browser/LLM Support: Explore integrating other simple local LLM options (like those using WebGPU/Transformers.js) to broaden the teacher's hardware and browser compatibility.
- Client-Side Security: Add basic security checks within the JavaScript to prevent unauthorized use of the LLM prompt feature by non-teacher clients.
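The planned prompt queue and filter could start as something this simple, again in the project's "my"-prefixed teaching style. All names here are hypothetical sketches of the planned feature, not existing code:

```javascript
// Hypothetical FIFO prompt queue for the planned teacher-side UI:
// prompts wait in arrival order and can be filtered before they
// ever reach the LLM.
function myMakePromptQueue() {
  return { items: [] };
}

// Add a prompt; an optional filter function lets the teacher reject
// prompts (e.g. too long, off-topic) before they are queued.
function myEnqueuePrompt(queue, frame, filterFn) {
  if (filterFn && !filterFn(frame)) return false; // rejected by the filter
  queue.items.push(frame);
  return true;
}

// Take the oldest waiting prompt, or null when the queue is empty.
function myDequeuePrompt(queue) {
  return queue.items.shift() || null;
}
```

A queue like this would also pace requests to the LLM, so a burst of student prompts cannot overwhelm the teacher's machine.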
