# OpenAI Compatibility

Quota implements the OpenAI API spec. Change your `baseURL` and `apiKey`, and your existing code works with no other changes.
## What changes

| | Before (OpenAI direct) | After (Quota) |
|---|---|---|
| Base URL | `https://api.openai.com/v1` | `https://api.usequota.ai/v1` |
| API Key | `sk-...` (OpenAI key) | `sk-quota-...` (Quota key) |
| Everything else | Same models, same parameters, same response format | |
### curl

```bash
curl https://api.usequota.ai/v1/chat/completions \
  -H "Authorization: Bearer sk-quota-YOUR_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

### Node.js (OpenAI SDK)
```javascript
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.usequota.ai/v1", // <- only change
  apiKey: process.env.QUOTA_API_KEY,     // <- only change
});

// Existing code works as-is
const response = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(response.choices[0].message.content);
```

### Python (OpenAI SDK)
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.usequota.ai/v1",  # <- only change
    api_key="sk-quota-YOUR_KEY",  # <- only change
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

## Multi-provider models
Through the same API, you can also use Anthropic and Google models by prefixing the model name. No extra SDKs or API keys needed.
```javascript
// Anthropic Claude - same OpenAI SDK, same API
const claude = await client.chat.completions.create({
  model: "anthropic/claude-sonnet-4.5",
  messages: [{ role: "user", content: "Hello from Claude!" }],
});

// Google Gemini - same OpenAI SDK, same API
const gemini = await client.chat.completions.create({
  model: "google/gemini-3-flash",
  messages: [{ role: "user", content: "Hello from Gemini!" }],
});
```

## Supported providers
| Provider | Model prefix | Example |
|---|---|---|
| OpenAI | none (default) | `gpt-4o-mini` |
| Anthropic | `anthropic/` | `anthropic/claude-sonnet-4.5` |
| Google | `google/` | `google/gemini-3-flash` |
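As a quick illustration of the prefix convention in the table above, a client-side helper could derive the provider from a model string (the `providerFor` helper is hypothetical, not part of any Quota SDK; it just encodes the "no slash means OpenAI" default):

```javascript
// Map a Quota model string to its provider, per the prefix convention:
// "anthropic/..." -> Anthropic, "google/..." -> Google, no slash -> OpenAI.
function providerFor(model) {
  const slash = model.indexOf("/");
  return slash === -1 ? "openai" : model.slice(0, slash);
}

providerFor("gpt-4o-mini");                 // "openai"
providerFor("anthropic/claude-sonnet-4.5"); // "anthropic"
providerFor("google/gemini-3-flash");       // "google"
```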
## What you get
- One API key for OpenAI, Anthropic, and Google models
- Per-user billing so your users pay for their own usage
- Usage tracking with per-request credit costs in every response
- Streaming works identically to OpenAI (SSE format)
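The streaming bullet above says responses use OpenAI's SSE wire format. As a rough sketch of what that means (the chunk schema is assumed to mirror OpenAI's `chat.completion.chunk` objects), each `data:` line in the stream carries a JSON chunk whose text delta lives at `choices[0].delta.content`:

```javascript
// Parse one SSE "data:" line from a streaming response into its text delta.
// Returns null for the terminal "[DONE]" sentinel and for non-data lines.
// Chunk shape assumed to mirror OpenAI's chat.completion.chunk schema.
function deltaFromSSELine(line) {
  if (!line.startsWith("data: ")) return null;
  const payload = line.slice("data: ".length);
  if (payload === "[DONE]") return null;
  const chunk = JSON.parse(payload);
  return chunk.choices?.[0]?.delta?.content ?? null;
}
```

In practice the OpenAI SDK handles this parsing for you: pass `stream: true` to `chat.completions.create` and iterate the returned chunks with `for await`.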
## Next steps
- Quickstart — create an account and get your API key
- Chat API reference — full endpoint docs
- Billing modes — developer billing vs. user billing
- Core SDK — framework-agnostic client for any JS/TS runtime
- Next.js SDK — React hooks, components, and route handlers for Next.js apps