Show HN: PasteGuard – Self-hosted privacy proxy for LLMs
2 points
7 hours ago
| 1 comment
| github.com
sgasser
7 hours ago
Using LLM APIs but worried about sending client data? Built a proxy for that.

It's an OpenAI-compatible proxy that masks personal data and secrets before requests reach your provider.

Mask Mode (default):

  You send:      "Email sarah.chen@hospital.org about meeting Dr. Miller"
  LLM receives:  "Email <EMAIL_1> about meeting <PERSON_1>"
  You get back:  Original names restored in response
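The mask/restore round trip above can be sketched in a few lines. This is my own illustrative version with a single email regex, not PasteGuard's actual implementation (which uses Presidio for detection):

```python
import re

# Illustrative sketch of the mask -> restore round trip.
# Real detection is done by Microsoft Presidio, not this regex.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def mask(text: str) -> tuple[str, dict[str, str]]:
    """Replace each email with a numbered placeholder; remember the mapping."""
    mapping: dict[str, str] = {}

    def _sub(m: re.Match) -> str:
        placeholder = f"<EMAIL_{len(mapping) + 1}>"
        mapping[placeholder] = m.group(0)
        return placeholder

    return EMAIL_RE.sub(_sub, text), mapping

def restore(text: str, mapping: dict[str, str]) -> str:
    """Put the original values back into the LLM's response."""
    for placeholder, original in mapping.items():
        text = text.replace(placeholder, original)
    return text

masked, mapping = mask("Email sarah.chen@hospital.org about the meeting")
# masked == "Email <EMAIL_1> about the meeting"
print(restore(masked, mapping))
```

The key point is that the placeholder-to-value mapping never leaves the proxy, so the provider only ever sees `<EMAIL_1>`.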
Route Mode (if you run a local LLM):

  Requests with PII  →  Local LLM
  Everything else    →  Cloud
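Route Mode boils down to a per-request base-URL decision. A hypothetical sketch (the endpoint URLs are assumptions, and a lone email regex stands in for Presidio):

```python
import re

# Hypothetical Route Mode decision: requests containing PII go to a local
# LLM, everything else to the cloud. Both URLs below are assumptions.
LOCAL_URL = "http://localhost:11434/v1"   # assumed local LLM endpoint
CLOUD_URL = "https://api.openai.com/v1"   # assumed cloud endpoint

# Stand-in detector; the real proxy uses Presidio's recognizers.
EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def contains_pii(text: str) -> bool:
    return bool(EMAIL_RE.search(text))

def route(prompt: str) -> str:
    """Pick the upstream base URL for this request."""
    return LOCAL_URL if contains_pii(prompt) else CLOUD_URL
```

With routing instead of masking, sensitive prompts never leave your network at all, at the cost of running a local model.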
What it catches:

  PII: Names, emails, phones, credit cards, IBANs, IPs, locations (24 languages)
  Secrets: Private keys, API keys (OpenAI, AWS, GitHub), JWT tokens
Uses Microsoft Presidio for PII detection. Runs in ~500 MB of RAM and adds 10-50 ms per request.
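Secret detection is a good fit for pattern matching, since key formats are fairly rigid. The patterns below are my own approximations of the common shapes, not PasteGuard's actual rule set:

```python
import re

# Illustrative secret patterns (approximations, not PasteGuard's rules).
# Real key formats vary by provider and change over time.
SECRET_PATTERNS = {
    "OPENAI_KEY":     re.compile(r"\bsk-[A-Za-z0-9_-]{20,}"),
    "AWS_ACCESS_KEY": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "GITHUB_TOKEN":   re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
    "JWT":            re.compile(r"\beyJ[\w-]+\.[\w-]+\.[\w-]+"),
}

def find_secrets(text: str) -> list[str]:
    """Return the labels of any secret types found in the text."""
    return [name for name, pat in SECRET_PATTERNS.items() if pat.search(text)]
```

Unlike PII, leaked secrets usually shouldn't be restored in the response, so masking them one-way is typically the right default.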

Works with Cursor, Open WebUI, LangChain, or any OpenAI-compatible client.

Docs: https://pasteguard.com/docs

Feedback on edge cases welcome.
