Inspiration

RoomPulse was inspired by a common problem with in-person meetings: important issues get buried, quieter participants are often left out, and teams sometimes realize too late that the conversation has drifted away from the agenda.

We wanted to build something beyond a traditional AI notetaker. Instead of quietly generating a summary after the meeting, RoomPulse acts as a room-visible AI facilitator. It listens while the meeting is happening and surfaces timely, concise nudges on a shared display.

The core idea is a heartbeat loop: at regular intervals, RoomPulse checks the live transcript, agenda, participation state, and meeting context, then asks: “What does the room need to hear right now?”
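The heartbeat idea can be sketched in a few lines. This is a minimal illustration, not RoomPulse's actual implementation; the types and function names here are assumptions.

```typescript
// Illustrative sketch of the heartbeat loop; names are hypothetical.
type MeetingState = {
  transcript: string[];
  agenda: string[];
  agendaIndex: number;
  silentSpeakers: string[]; // expected voices not yet heard
};

// Decide what, if anything, the room needs to hear right now.
function facilitate(state: MeetingState): string | null {
  if (state.silentSpeakers.length > 0) {
    return `We haven't heard from ${state.silentSpeakers[0]} yet.`;
  }
  if (state.agendaIndex < state.agenda.length) {
    return `Current agenda item: ${state.agenda[state.agendaIndex]}`;
  }
  return null; // nothing worth interrupting for
}

// The heartbeat: wake at a fixed interval and run one facilitation pass.
function startHeartbeat(
  getState: () => MeetingState,
  show: (msg: string) => void,
  ms = 60_000,
) {
  return setInterval(() => {
    const nudge = facilitate(getState());
    if (nudge) show(nudge);
  }, ms);
}
```

The key property is that each pass is stateless in itself: it reads the current snapshot and either produces one room-visible nudge or stays silent.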

What it does

RoomPulse is a local-first web app for real-time meeting facilitation. Before the meeting starts, users provide context such as the meeting goal, agenda, expected participant count, and optional participant details.

During the meeting, RoomPulse displays the live transcript, tracks approximate speaker participation, monitors agenda progress, and surfaces room-facing reminders. For example, it can remind the group when someone has not spoken yet, when a decision is still unresolved, or when the discussion is drifting from the agenda.

At each configurable interval, the facilitator agent wakes up, reviews the latest meeting state, and updates the agenda, review document, participation status, and meeting reminders. If Pi authentication is unavailable, RoomPulse falls back to deterministic local facilitation so the demo remains reliable.
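The fallback behavior can be sketched roughly as follows; the interface and class names are illustrative assumptions, not RoomPulse's actual API.

```typescript
// Hypothetical sketch of the Pi-vs-fallback selection.
interface Facilitator {
  run(payload: { unresolvedDecisions?: string[] }): Promise<string | null>;
}

class LocalFallbackFacilitator implements Facilitator {
  // Deterministic rules so the demo works without any external service.
  async run(payload: { unresolvedDecisions?: string[] }): Promise<string | null> {
    if (payload.unresolvedDecisions?.length) {
      return `Open decision: ${payload.unresolvedDecisions[0]}`;
    }
    return null;
  }
}

// Use the Pi-backed agent when credentials exist; otherwise fall back.
function selectFacilitator(
  piConfigured: boolean,
  piAgent: Facilitator | null,
): Facilitator {
  return piConfigured && piAgent ? piAgent : new LocalFallbackFacilitator();
}
```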

How we built it

RoomPulse is built with Next.js, TypeScript, React, and local-first server routes. The frontend includes the meeting setup flow, shared room display, agenda controls, participation panel, live transcript, facilitator review document, and heartbeat controls.

For audio, the browser captures microphone input and streams 16 kHz mono PCM audio to a local Python WebSocket service. That service runs local Whisper transcription with faster-whisper and assigns finalized transcript segments to approximate Speaker N clusters using lightweight audio features.
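Two small conversions sit at the heart of a pipeline like this: resampling the browser's capture rate down to 16 kHz and converting float samples to 16-bit PCM. The sketch below shows one naive way to do both; it is illustrative, assuming nearest-sample decimation rather than whatever filtering RoomPulse actually uses.

```typescript
// Naively resample Web Audio float samples (e.g. 48 kHz) down to 16 kHz
// by picking the nearest input sample. A real pipeline would low-pass
// filter first to avoid aliasing.
function downsampleTo16k(samples: Float32Array, inputRate: number): Float32Array {
  const ratio = inputRate / 16000;
  const out = new Float32Array(Math.floor(samples.length / ratio));
  for (let i = 0; i < out.length; i++) {
    out[i] = samples[Math.floor(i * ratio)];
  }
  return out;
}

// Convert [-1, 1] float samples to signed 16-bit PCM for the WebSocket stream.
function floatTo16BitPCM(samples: Float32Array): Int16Array {
  const pcm = new Int16Array(samples.length);
  for (let i = 0; i < samples.length; i++) {
    const s = Math.max(-1, Math.min(1, samples[i])); // clamp
    pcm[i] = s < 0 ? s * 0x8000 : s * 0x7fff;        // scale to int16 range
  }
  return pcm;
}
```

The resulting `Int16Array` buffer is what would be sent over the WebSocket as a binary frame for the Python service to transcribe.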

The heartbeat loop is the core architecture:

meeting state + transcript delta + participation state
→ Pi facilitator agent
→ room-visible intervention

On every heartbeat, RoomPulse packages the current meeting context, transcript, agenda state, participation state, review history, and prior interventions. This payload passes through a clean Pi adapter boundary. When Pi is configured, RoomPulse uses the Codex-backed Pi facilitator agent. Otherwise, it uses a deterministic local fallback.
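A payload assembled for the adapter boundary might look like the sketch below. The field names and shape are assumptions for illustration; the one structural point taken from the description is that the transcript is passed as a delta, not the full log.

```typescript
// Hypothetical shape of the per-heartbeat payload; field names are illustrative.
interface HeartbeatPayload {
  context: { goal: string; agenda: string[] };
  transcriptDelta: string[];             // only segments since the last heartbeat
  participation: Record<string, number>; // speaker label -> seconds spoken
  priorInterventions: string[];          // avoid repeating earlier nudges
}

function buildPayload(
  goal: string,
  agenda: string[],
  transcript: string[],
  lastSeenIndex: number,
  participation: Record<string, number>,
  prior: string[],
): HeartbeatPayload {
  return {
    context: { goal, agenda },
    transcriptDelta: transcript.slice(lastSeenIndex), // incremental slice
    participation,
    priorInterventions: prior,
  };
}
```

Keeping the payload behind one typed boundary is what makes the Pi agent and the local fallback interchangeable.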

We also added tests for the facilitation logic, Pi adapter, speaker tracking, meeting storage, transcription utilities, and API routes.

Challenges we ran into

One major challenge was keeping the product focused. It would have been easy to build a generic meeting summary tool, but RoomPulse needed to be something more specific: a visible, real-time facilitator powered by heartbeat intervals.

Local audio was another challenge. Browser microphone permissions, WebSocket streaming, local transcription, and speaker clustering each introduced their own failure modes. We designed the system so the core demo still works even when transcription or Pi authentication is not fully configured.

Pi and Codex authentication also required careful handling because local Codex auth files and Pi auth storage use different formats. We bridged these credentials into the Pi adapter without storing secrets in the repository.

Speaker diarization was another limitation. For the MVP, we used lightweight local audio features and approximate Speaker N clustering. This is useful for participation nudges, but it is not biometric identification and can be affected by room noise, similar voices, microphone placement, or overlapping speech.
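To make the "approximate Speaker N" idea concrete, here is a toy nearest-centroid clustering over per-segment feature vectors. It is a simplified stand-in for illustration, not RoomPulse's actual diarization logic; the threshold and feature representation are assumptions.

```typescript
// Toy diarization: assign a segment's feature vector to the nearest existing
// centroid, or open a new "Speaker N" cluster if nothing is close enough.
function assignSpeaker(
  features: number[],
  centroids: number[][],
  threshold = 1.0, // illustrative distance cutoff
): { label: number; centroids: number[][] } {
  let best = -1;
  let bestDist = Infinity;
  centroids.forEach((c, i) => {
    const d = Math.sqrt(
      c.reduce((sum, v, j) => sum + (v - features[j]) ** 2, 0),
    );
    if (d < bestDist) {
      bestDist = d;
      best = i;
    }
  });
  if (best === -1 || bestDist > threshold) {
    centroids.push(features.slice()); // start a new Speaker N cluster
    return { label: centroids.length - 1, centroids };
  }
  return { label: best, centroids };
}
```

This kind of scheme explains both why it works for participation nudges (distinct voices land in distinct clusters) and why similar voices or noisy rooms can confuse it.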

Accomplishments that we’re proud of

We are proud that RoomPulse feels like a real meeting product rather than a prototype scaffold. It has a complete setup flow, polished shared display, live transcript support, agenda tracking, heartbeat status, participation nudges, and real-time facilitator interventions.

We are also proud of the heartbeat architecture. It makes the system predictable and demoable: at each interval, RoomPulse performs a fresh facilitation pass using the latest meeting state.

The local fallback is another important accomplishment. Even without Pi authentication or a production backend, RoomPulse can still demonstrate the core experience of real-time meeting facilitation.

What we learned

We learned that meeting AI behaves very differently when it is visible to the entire room. It cannot act like a private notetaker; it needs to be concise, timely, and socially aware.

We also learned that participation tracking does not require perfect identity recognition to be useful. Even approximate speaker clustering can help a group notice when expected voices have not contributed.

Most importantly, we learned that timing matters. The heartbeat loop gives the facilitator a rhythm. Instead of interrupting constantly or waiting until the meeting ends, RoomPulse checks in at intentional moments.

What’s next for RoomPulse

Next, we would improve the audio and diarization pipeline with better speaker calibration and optional mapping from Speaker N clusters to participant names.

We would also add stronger meeting memory, richer agenda automation, and more flexible heartbeat controls. Another next step is expanding the Pi facilitator tools so the agent can update agenda state, maintain a living review document, and surface more precise interventions.

Longer term, RoomPulse could become a lightweight meeting operating system for shared rooms: not just a recorder, not just a chatbot, but a visible facilitator that helps teams stay focused, inclusive, and action-oriented.
