Inspiration
The hackathon brief itself said it best: sovereign AI is taking off. OpenClaw runs locally, on browsers, on phones. People don't want their models phoning home. But here's the contradiction nobody talks about — the moment your local agent needs to do anything useful (read your email, book a meeting, push a commit), it has to talk to cloud APIs. And that means credentials.
Every "local AI" setup I've seen solves the compute problem and ignores the auth problem. The agent either gets a raw API key hardcoded somewhere, or the user pastes tokens into a config file and prays. That's not sovereign AI — that's sovereign compute with a credential leak waiting to happen.
VaultBridge is the missing piece: a secure MCP gateway that lets OpenClaw (or any locally-run agent) interact with Gmail, Google Calendar, and GitHub — without ever seeing a single OAuth token.
What It Does
VaultBridge sits between your local AI agent and the outside world as an MCP (Model Context Protocol) server. The agent calls tools like send_email, create_calendar_event, or open_github_issue. VaultBridge intercepts those calls, resolves the OAuth tokens via Auth0 Token Vault, executes the API call, and returns the result.
The local model is completely credential-blind. It just calls tools.
Key behaviours:
- Least-privilege scoping: each connected service gets only the OAuth scopes it needs
- Step-up authentication: destructive actions (sending email, merging a PR) trigger a real-time approval prompt before execution
- Async auth: the agent can be offline; Token Vault holds the delegated tokens safely until needed
- Revocation: users can cut off any OAuth grant instantly without touching the local agent
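The interception pattern behind these behaviours can be sketched in a few lines. This is an illustrative TypeScript sketch, not the real Auth0 Token Vault or MCP SDK APIs: `TokenVault`, `TOOLS`, and `handleToolCall` are hypothetical names, and the scope strings are placeholders.

```typescript
// Hypothetical interfaces -- not the real Auth0 or MCP SDK APIs.
type ToolCall = { tool: string; args: Record<string, unknown> };

interface TokenVault {
  // Resolves a short-lived access token for a service; the agent never sees it.
  resolve(service: string, scopes: string[]): Promise<string>;
}

// Each tool maps to its backing service, its minimum scopes, and whether it
// is a write (destructive) action that requires step-up approval.
const TOOLS: Record<string, { service: string; scopes: string[]; write: boolean }> = {
  send_email:            { service: "gmail",    scopes: ["gmail.send"],      write: true },
  create_calendar_event: { service: "calendar", scopes: ["calendar.events"], write: true },
  open_github_issue:     { service: "github",   scopes: ["repo"],            write: true },
};

async function handleToolCall(
  call: ToolCall,
  vault: TokenVault,
  approve: (c: ToolCall) => Promise<boolean>,
): Promise<{ status: string; usedScopes?: string[] }> {
  const spec = TOOLS[call.tool];
  if (!spec) throw new Error(`Unknown tool: ${call.tool}`);
  // Step-up authentication: destructive actions wait for human approval.
  if (spec.write && !(await approve(call))) return { status: "denied" };
  // The token exists only inside the gateway; only the API result is returned.
  const token = await vault.resolve(spec.service, spec.scopes);
  void token; // ...execute the upstream API call with `token` here...
  return { status: "ok", usedScopes: spec.scopes };
}
```

The key property is that `handleToolCall` never returns the token itself: the agent sees tool results, scopes, and approval status, and nothing else.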
Architecture
[OpenClaw — local / restricted mode]
↓ MCP tool call
[VaultBridge — Node.js MCP Server]
↓ Token Vault OAuth resolution
↓ Step-up auth (for writes)
[Gmail API / Google Calendar API / GitHub API]
The VaultBridge MCP server exposes a clean tool surface to the local agent. Auth0 for AI Agents owns the entire OAuth lifecycle. The user gets a dashboard showing active scopes, pending approvals, and a full audit trail.
How I Built It
- OpenClaw as the local agent runtime (restricted network mode)
- Node.js + MCP SDK for the VaultBridge gateway server
- Auth0 for AI Agents — Token Vault for OAuth token management, async auth, and step-up authentication
- Gmail API + Google Calendar API + GitHub API as the connected services
- React dashboard for consent visibility and step-up approval UI
Challenges
The hardest part was designing the trust boundary correctly. The MCP tool surface exposed to the local agent had to be expressive enough to be useful but narrow enough to enforce least-privilege at the tool level, not just the OAuth scope level. Mapping "agent intent" to "minimum required scopes" without losing flexibility took significant iteration.
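The intent-to-scope mapping ended up as a least-privilege resolution step: given the tools an agent session declares, compute the minimal union of OAuth scopes and request nothing more. A minimal sketch, assuming a hypothetical `TOOL_SCOPES` table with placeholder scope names (real Google and GitHub scope strings differ):

```typescript
// Placeholder scope names -- real Google/GitHub OAuth scopes differ.
const TOOL_SCOPES: Record<string, string[]> = {
  read_email:        ["gmail.readonly"],
  send_email:        ["gmail.send"],
  list_events:       ["calendar.readonly"],
  open_github_issue: ["repo"],
};

// Least privilege: grant only the union of scopes for the tools the agent
// actually declares for this session, never a blanket grant.
function minimumScopes(requestedTools: string[]): string[] {
  const scopes = new Set<string>();
  for (const tool of requestedTools) {
    for (const s of TOOL_SCOPES[tool] ?? []) scopes.add(s);
  }
  return [...scopes].sort();
}
```

An email-reading session then resolves to `gmail.readonly` alone; adding `send_email` to the declared tools widens the grant only by `gmail.send`.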
Step-up auth UX was also non-trivial — the local agent is async by nature, so blocking it for human approval required a clean pending-state queue that didn't break the agent's execution loop.
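The pending-state queue works roughly like this (an illustrative sketch, not the actual VaultBridge implementation): the gateway hands the agent a ticket immediately, the human decision settles a promise later, and the agent's loop keeps running in between.

```typescript
// Non-blocking approval queue: the gateway returns a ticket right away,
// and the decision arrives asynchronously from the dashboard.
type Pending = { id: number; action: string; resolve: (ok: boolean) => void };

class ApprovalQueue {
  private nextId = 1;
  private pending = new Map<number, Pending>();

  // Called by the gateway on a destructive tool call: returns a ticket id
  // plus a promise that settles only when a human decides.
  request(action: string): { id: number; decision: Promise<boolean> } {
    const id = this.nextId++;
    const decision = new Promise<boolean>((resolve) => {
      this.pending.set(id, { id, action, resolve });
    });
    return { id, decision };
  }

  // Called from the dashboard when the user approves or rejects.
  settle(id: number, approved: boolean): void {
    const p = this.pending.get(id);
    if (!p) throw new Error(`No pending approval ${id}`);
    this.pending.delete(id);
    p.resolve(approved);
  }

  // Feeds the dashboard's "pending approvals" view.
  list(): string[] {
    return [...this.pending.values()].map((p) => p.action);
  }
}
```

Because the decision is just a promise, the agent can park the tool call, continue with other work, and resume when the approval (or rejection) lands.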
📝 BONUS BLOG POST: The Credential Problem Nobody in Local AI is Talking About
This section is a standalone blog post submission for the Auth0 community.
Why Your "Local AI" Setup Still Has a Credential Problem
Everyone is excited about sovereign AI. Run your models locally. Keep your data off the cloud. Own your compute. It's a genuine and important trend — and tools like OpenClaw are making it accessible on Mac minis, Android phones, and even browsers.
But there's a problem that the local AI community is largely ignoring, and it's going to bite people hard.
The moment your local agent needs to do something useful, it needs credentials.
Read your Gmail? OAuth token. Book a calendar event? OAuth token. Open a GitHub issue? Personal access token. Post to Slack? Webhook secret. Every real-world integration requires some form of secret, and right now, most local AI setups handle this the same way developers have always handled it: dump it in a .env file, paste it in a config, or hardcode it somewhere.
This is not a hypothetical risk. Local agent frameworks are being built by users who are not security engineers. The threat model for a file on your laptop that contains your Gmail OAuth token is completely different from a secret in a cloud service with proper encryption, rotation, and revocation. One stolen laptop, one compromised Termux session, one malicious MCP server extension — and every credential your local agent uses is exposed.
Auth0 Token Vault changes this equation entirely.
Instead of your local agent holding credentials, Token Vault holds them. Your agent calls a tool. VaultBridge intercepts it. Token Vault resolves the appropriate OAuth token, makes the API call with it, and returns the result. The local model never sees the token. The .env file disappears. The attack surface collapses.
What's particularly elegant is how this maps to the principle of least privilege. Token Vault lets you scope OAuth grants precisely. Your email-reading agent gets gmail.readonly. The moment it needs to send an email, Token Vault can trigger step-up authentication — a real human approval before the action executes. This is the kind of control that enterprise security teams have had for years, now available to the indie developer running Ollama on a phone.
The pattern VaultBridge establishes is important beyond any single project: local compute + cloud auth = actual sovereign AI. The compute is yours. The credentials are safely delegated. The user is always in control.
As local AI agents become more capable — browsing the web, managing files, sending communications on your behalf — the auth layer becomes the most critical part of the stack. VaultBridge is a proof of concept for what that layer should look like: transparent, revocable, scoped, and human-approved for anything that matters.
Auth0 for AI Agents is solving exactly the right problem at exactly the right time. The ecosystem just needed someone to build the bridge.
Built for the Authorized to Act Hackathon. VaultBridge source code and setup guide available on GitHub.