Every day, API keys, tokens, emails, and DB URLs slip into prompts, logs, or demos. Once they hit the LLM, they’re out of your control.
I built LLM-Sentinel, a privacy-first proxy that:
- Intercepts requests to OpenAI, Ollama, Claude, etc.
- Masks 50+ ...
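
The core idea is simple: scan the outbound prompt against a library of secret patterns and swap in labeled placeholders before the request ever leaves your network. Below is a minimal sketch of that masking step in Python; the pattern set, placeholder format, and the `mask_prompt` helper are illustrative assumptions, not LLM-Sentinel's actual code.

```python
import re

# Illustrative pattern library; a real proxy would carry many more entries.
PATTERNS = {
    "postgres_url":   re.compile(r"postgres(?:ql)?://\S+"),
    "openai_api_key": re.compile(r"sk-[A-Za-z0-9]{20,}"),
    "email":          re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+"),
}

def mask_prompt(text: str) -> str:
    """Replace each detected secret with a labeled placeholder."""
    for name, pattern in PATTERNS.items():
        text = pattern.sub(f"[{name.upper()}_MASKED]", text)
    return text

if __name__ == "__main__":
    prompt = ("Connect with postgres://admin:hunter2@db.internal/prod "
              "and key sk-abc123def456ghi789jkl012")
    print(mask_prompt(prompt))
    # Connect with [POSTGRES_URL_MASKED] and key [OPENAI_API_KEY_MASKED]
```

In the proxy itself, this kind of substitution would run on the request body before it is forwarded upstream; labeled placeholders keep the prompt readable for the model while the real values never leave your machine.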