Integration guide

Secure your AI application with one line of code. Change your API endpoint URL to point at Thorn Layer. Everything else stays the same.

Thorn Layer works with any LLM provider. The examples below show OpenAI, Anthropic, and Gemini — the same one-line change applies to any provider.

1. Sign up and get your API key

Create a free account in the dashboard, then generate an API key. The key is shown only once, so copy it immediately.

2. Change your endpoint URL

Replace your LLM provider's URL with the Thorn Layer equivalent. That's the only change.

OpenAI

Before: https://api.openai.com/v1/chat/completions
After:  https://api.thornlayer.com/v1/chat/completions

Anthropic

Before: https://api.anthropic.com/v1/messages
After:  https://api.thornlayer.com/v1/messages

Google Gemini

Before: https://generativelanguage.googleapis.com/v1/models/...
After:  https://api.thornlayer.com/v1/models/...

Any custom LLM

Before: https://your-llm-provider.com/v1/completions
After:  https://api.thornlayer.com/v1/completions
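The swap above really is a single change to the host portion of the URL; the path and everything after it stay the same. A minimal sketch in Python (the OpenAI endpoint is used here purely as an example):

```python
# The only change is the host: swap the provider's domain for Thorn Layer's.
OPENAI_URL = "https://api.openai.com/v1/chat/completions"
THORN_URL = OPENAI_URL.replace("api.openai.com", "api.thornlayer.com")

print(THORN_URL)  # https://api.thornlayer.com/v1/chat/completions
```

The same substitution applies to any provider's base URL; request bodies, paths, and response formats are untouched.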

3. Add your authentication header

Include your Thorn Layer API key in the Authorization header:

Authorization: Bearer <your-thorn-layer-api-key>

Only your Thorn Layer API key appears in the request. Your LLM provider key is configured separately in the dashboard, so it is never exposed in client code.
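Putting steps 2 and 3 together, here is a sketch of a full request using only Python's standard library. The model name and key are placeholders, and the payload assumes an OpenAI-compatible chat completions format:

```python
import json
import urllib.request

THORN_API_KEY = "tl-example-key"  # placeholder; use your real Thorn Layer key

# Assumed OpenAI-compatible request body, for illustration only.
body = json.dumps({
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}],
}).encode("utf-8")

req = urllib.request.Request(
    "https://api.thornlayer.com/v1/chat/completions",  # Thorn Layer endpoint
    data=body,
    headers={
        "Authorization": f"Bearer {THORN_API_KEY}",  # Thorn Layer key only
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the request; omitted here.
```

Note that no provider API key appears anywhere in this code.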

4. You are protected

Every request now passes through Thorn Layer before reaching your LLM provider. Prompt content is never stored or logged; Thorn Layer processes each request in memory only.