
How to Give Your App AI Without Putting API Keys on Every Server

February 28, 2026 · 5 min read

Storing OpenAI or Anthropic API keys on customer machines is a security nightmare. Arcellite's token-metered AI proxy solves this — keys live on our cloud, your app sends a license token. Here's how it works under the hood.


If you're building a self-hosted product that uses AI, you face an immediate security problem: your customers run your software on their own hardware, but the AI APIs (OpenAI, Anthropic, Groq) require secret keys with real spending power attached.

You can't ship those keys inside your application. And you can't ask every customer to create their own API accounts — that's a massive onboarding barrier.

The standard solutions don't work

Bundling keys in your binary is reverse-engineerable. Setting environment variables per deployment means trusting every customer's ops team with your keys. Building your own proxy means standing up and securing a cloud service that was never on your roadmap.

How Arcellite solves it

Arcellite's Inside Out feature is a token-metered AI proxy. Your self-hosted application sends a POST request with a `license_token` — the same token that activates the customer's installation — and Arcellite's cloud handles the actual API call using keys that never leave our infrastructure.

```shell
curl -X POST https://api.arcellite.com/api/ai/proxy \
  -H "Content-Type: application/json" \
  -d '{
    "license_token": "arc_live_...",
    "provider": "openai",
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Summarize this document"}]
  }'
```

The response is a standard chat completion — your app never knows (or cares) which keys were used.
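In application code, the same call might look like the following Python sketch. The endpoint and field names mirror the curl example above; the helper functions are ours for illustration, not part of any official Arcellite SDK:

```python
import json
import urllib.request

# Endpoint from the curl example above
ARCELLITE_PROXY_URL = "https://api.arcellite.com/api/ai/proxy"

def build_proxy_payload(license_token, prompt, provider="openai", model="gpt-4o"):
    """Assemble the JSON body the proxy expects (same shape as the curl example)."""
    return {
        "license_token": license_token,
        "provider": provider,
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def extract_reply(completion):
    """Pull the assistant text out of a standard chat-completion response."""
    return completion["choices"][0]["message"]["content"]

def ask(license_token, prompt):
    """Send a prompt through the Arcellite proxy and return the reply text."""
    data = json.dumps(build_proxy_payload(license_token, prompt)).encode()
    req = urllib.request.Request(
        ARCELLITE_PROXY_URL,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return extract_reply(json.load(resp))
```

Note that the only secret the client ever holds is the `license_token` it already has for activation; revoking a license cuts off AI access in the same stroke.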

Token budgets per plan

Each Arcellite plan includes a monthly token quota: 500K for Startup, 2M for Growth, 10M for Enterprise. Usage is tracked per installation so you can audit exactly how much AI your customers are consuming.

The keys never leave our servers. Your customers get AI. You ship faster.
