Inspiration

I kept seeing developers paste logs and config files straight into ChatGPT to debug something. API keys, IPs, passwords — all of it. And it wasn’t careless people doing it, it was good engineers who just weren’t thinking about it in the moment because the habit feels so harmless. I couldn’t stop noticing it. So I built something about it.

What it does

You paste your text in. Privatiser finds all the sensitive stuff — API keys, IPs, emails, passwords, JWTs, SSNs, and other PII — and swaps it out for fake placeholders before you touch any AI tool. When the AI gives you a response, you paste it back and everything gets restored. The whole thing runs in your browser. Nothing hits a server.

The one thing I’m most happy with: the placeholders are consistent. If the same IP shows up six times, it becomes the same fake IP six times. So the AI can still reason about your data properly — it just never sees the real values.
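The idea is simple enough to sketch in a few lines. This is an illustrative toy, not the actual Privatiser code — the class name, placeholder format, and the single IPv4 pattern are all made up for the example — but it shows the consistent-mapping trick: each real value gets exactly one placeholder, and the same mapping runs in reverse to restore the AI's response.

```python
import re

class Redactor:
    """Toy sketch of consistent placeholder substitution (not the real Privatiser code)."""

    def __init__(self):
        self.mapping = {}   # real value -> placeholder
        self.counters = {}  # kind -> next index

    def _placeholder(self, kind, value):
        # Reuse the same placeholder for a value we've seen before,
        # so a repeated secret stays consistent across the whole text.
        if value not in self.mapping:
            n = self.counters.get(kind, 0) + 1
            self.counters[kind] = n
            self.mapping[value] = f"<{kind}_{n}>"
        return self.mapping[value]

    def redact(self, text):
        # One illustrative pattern: IPv4 addresses.
        return re.sub(r"\b(?:\d{1,3}\.){3}\d{1,3}\b",
                      lambda m: self._placeholder("IP", m.group(0)), text)

    def restore(self, text):
        # Swap every placeholder back to its original value.
        for real, fake in self.mapping.items():
            text = text.replace(fake, real)
        return text

r = Redactor()
out = r.redact("10.0.0.5 failed, retrying 10.0.0.5 then 10.0.0.6")
# out == "<IP_1> failed, retrying <IP_1> then <IP_2>"
# r.restore(out) gives back the original line.
```

Because the first IP appears twice, it becomes `<IP_1>` both times — the AI can still see that it's the same host.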

How I built it

Python for the CLI, vanilla JavaScript for the browser. No frameworks, nothing fancy — I wanted the code to be simple enough that anyone could read it and trust it. Detection is all regex, but smarter than it sounds. Credit cards, for example, don't just match a pattern — they get run through a Luhn checksum so I'm not flagging random 16-digit numbers. Every format needed its own approach.
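The Luhn check itself is tiny. This is the standard algorithm (double every second digit from the right, subtract 9 when the doubled digit exceeds 9, require the total to be divisible by 10) — a minimal Python version, not necessarily byte-for-byte what Privatiser ships:

```python
def luhn_valid(number: str) -> bool:
    """Luhn checksum: returns True only when the digit string passes,
    which filters out most random 16-digit numbers."""
    digits = [int(d) for d in number if d.isdigit()]
    if len(digits) < 13:  # shortest real card numbers are 13 digits
        return False
    total = 0
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:        # every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

luhn_valid("4111 1111 1111 1111")  # True  (well-known test card number)
luhn_valid("1234 5678 9012 3456")  # False (random digits fail the checksum)
```

A random 16-digit string only passes Luhn about 1 time in 10, so this one gate cuts false positives on credit cards by roughly 90%.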

Challenges I ran into

The sheer variety of secret formats nearly broke me. AWS keys look nothing like GitHub tokens. Slack tokens look nothing like JWTs. Connection strings embed credentials inside URLs. Some secrets have no standard format at all — just a variable name and some entropy. Every pattern was a negotiation between catching real secrets and not breaking legitimate text. Too loose and you’re flagging innocent strings. Too strict and someone pastes a real key that slips through and thinks they’re safe. I went through a lot of real-world examples to get it right. Still adding formats.
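To make that tradeoff concrete, here are two illustrative patterns side by side — these are assumptions for the example, not Privatiser's actual rule set. A structured secret like an AWS access key ID has a fixed shape you can match tightly; a "generic" secret has no shape at all, so the loose pattern needs a second signal (a suspicious variable name plus a Shannon-entropy gate) to avoid flagging innocent strings:

```python
import math
import re

# Tight pattern: AWS access key IDs have a fixed, recognisable shape.
AWS_KEY = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

# Loose pattern: generic secrets only betray themselves through a
# suspicious variable name followed by a longish value.
GENERIC = re.compile(
    r'(?i)\b(secret|token|password|api[_-]?key)\b\s*[:=]\s*[\'"]?([A-Za-z0-9+/=_-]{12,})')

def shannon_entropy(s: str) -> float:
    """Bits per character; random keys score high, repeated words score low."""
    freq = {c: s.count(c) / len(s) for c in set(s)}
    return -sum(p * math.log2(p) for p in freq.values())

def looks_like_secret(match: re.Match) -> bool:
    # The entropy gate stops GENERIC from flagging low-entropy values;
    # the 3.5-bit threshold is an illustrative choice, not a tuned constant.
    return shannon_entropy(match.group(2)) > 3.5

m = GENERIC.search('api_key = "q9X2mZ7vL4pR8sTw"')
looks_like_secret(m)   # True  -> 16 distinct characters, high entropy
m = GENERIC.search('password = "hunter2hunter2hunter2"')
looks_like_secret(m)   # False -> only 7 distinct characters, low entropy
```

The tight pattern almost never false-positives; the loose one is where all the negotiation happens.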

Accomplishments that I’m proud of

Honestly? When the test suite went green and I ran it against real data for the first time. Seeing it correctly catch everything — and restore everything — on actual logs and config files, not just made-up test cases, was the moment it felt real. All the edge cases I’d been wrestling with for days suddenly proved their worth.

And now I'm a regex god :)

What we learned

The best security tools don't feel like security tools. If it slows you down, you'll skip it. The whole point of Privatiser is that it fits into the workflow you already have. That, and secrets are weird. There are so many of them and none of them look alike.

What's next for Privatiser - The AI Privacy Filter

  • VS Code extension to redact directly in the editor.
  • Support for file uploads (.env, config files, log files).
  • Expanding the pattern library with community contributions.
  • A shareable “safe paste” link format for teams.
