Integration guide
Secure your AI application with one line of code. Change your API endpoint URL to point at Thorn Layer. Everything else stays the same.
Thorn Layer works with any LLM provider. The examples below show OpenAI, Anthropic, and Gemini — the same one-line change applies to any provider.
1. Sign up and get your API key
Create a free account in the dashboard, then generate an API key. The key is shown only once, so copy and store it immediately.
2. Change your endpoint URL
Replace your LLM provider's URL with the Thorn Layer equivalent. That's the only change.
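In code, the swap amounts to replacing the provider's host while keeping the scheme, path, and query intact. A minimal Python sketch of that idea; the helper name is illustrative, not part of any Thorn Layer SDK:

```python
from urllib.parse import urlsplit, urlunsplit

# The Thorn Layer host; everything else in the URL is left untouched.
THORN_HOST = "api.thornlayer.com"

def to_thorn_layer(provider_url: str) -> str:
    """Swap the provider's host for Thorn Layer's, keeping scheme, path, and query."""
    scheme, _host, path, query, fragment = urlsplit(provider_url)
    return urlunsplit((scheme, THORN_HOST, path, query, fragment))

print(to_thorn_layer("https://api.openai.com/v1/chat/completions"))
# → https://api.thornlayer.com/v1/chat/completions
```

The same call maps any provider's endpoint, which is why the change is identical across OpenAI, Anthropic, Gemini, and custom backends.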
OpenAI
Before: https://api.openai.com/v1/chat/completions
After:  https://api.thornlayer.com/v1/chat/completions

Anthropic
Before: https://api.anthropic.com/v1/messages
After:  https://api.thornlayer.com/v1/messages

Google Gemini
Before: https://generativelanguage.googleapis.com/v1/models/...
After:  https://api.thornlayer.com/v1/models/...

Any custom LLM
Before: https://your-llm-provider.com/v1/completions
After:  https://api.thornlayer.com/v1/completions

3. Add your authentication header
Include your Thorn Layer API key in the Authorization header:
Authorization: Bearer <your-thorn-layer-api-key>
Your Thorn Layer API key goes in the Authorization header. Your LLM provider key is configured separately in your dashboard — it is never exposed in client code.
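Putting steps 2 and 3 together, a request built with Python's standard library might look like the sketch below. The endpoint, the `THORN_LAYER_API_KEY` environment variable name, and the OpenAI-style JSON body are assumptions for illustration:

```python
import json
import os
import urllib.request

# Hypothetical: the OpenAI-compatible endpoint from step 2.
THORN_URL = "https://api.thornlayer.com/v1/chat/completions"

def build_request(payload: dict) -> urllib.request.Request:
    # Read the Thorn Layer key from the environment rather than hardcoding it.
    key = os.environ.get("THORN_LAYER_API_KEY", "<your-thorn-layer-api-key>")
    return urllib.request.Request(
        THORN_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            # Thorn Layer key only; the provider key stays in the dashboard.
            "Authorization": f"Bearer {key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_request({"model": "gpt-4o", "messages": [{"role": "user", "content": "Hi"}]})
# Sending it is one call: urllib.request.urlopen(req)
```

Note that only the Authorization header and the URL differ from a direct provider call; the request body is unchanged.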
4. You are protected
Every request now passes through Thorn Layer before reaching your LLM provider. Thorn Layer processes each request in memory only; no prompt content is stored or logged.