The integration path stays intentionally simple: use an OpenAI-compatible client, point it at The Claw Bay's base URL, send your bearer key, and keep the request shape your app already uses.
Send `Authorization: Bearer ca_v1...` with every request.
Pick from the live model list at runtime.
Built for backends, scripts, bots, and internal tools.
For most users, the flow is four moves: get a key from the dashboard, store it in an env var, point your SDK at the base URL, then call `/models` before your first real request.
Open the dashboard and copy your The Claw Bay key.
Store the key and base URL in your app config or runtime environment.
Call `/models` first so your app selects a live model instead of hard-coding one blindly.
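The "call `/models` first" step can be sketched as a small selection helper: prefer models you have tested against, but fall back to whatever is actually live. This is a hedged example; the model ids shown are placeholders, not a statement of what The Claw Bay serves today.

```python
def pick_model(available: list[str], preferred: list[str]) -> str:
    """Return the first preferred model that is actually live.

    Falls back to the first live model if no preference matches.
    The `preferred` list is entirely your choice.
    """
    live = set(available)
    for name in preferred:
        if name in live:
            return name
    if not available:
        raise RuntimeError("no models returned from /models")
    return available[0]

# Placeholder ids for illustration only:
ids = ["gpt-5.4", "gpt-5.4-mini"]
print(pick_model(ids, ["gpt-6", "gpt-5.4-mini"]))  # prints "gpt-5.4-mini"
```

Feeding this the `data[*].id` values from a live `/models` response keeps a hard-coded model name as a last resort instead of a silent assumption.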
```shell
export THECLAWBAY_API_KEY="ca_v1..."
export THECLAWBAY_BASE_URL="https://theclawbay.com/api/codex-auth/v1/proxy/v1"

curl "$THECLAWBAY_BASE_URL/models" \
  -H "Authorization: Bearer $THECLAWBAY_API_KEY"
```

You do not need a separate The Claw Bay SDK package. Use the official OpenAI SDKs with a custom base URL and your The Claw Bay key.
```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.THECLAWBAY_API_KEY,
  baseURL: "https://theclawbay.com/api/codex-auth/v1/proxy/v1",
});

const models = await client.models.list();
const model = models.data[0]?.id ?? "gpt-5.4";

const response = await client.responses.create({
  model,
  input: "Write a short launch note for a new SaaS feature.",
  reasoning: { effort: "medium" },
  text: {
    verbosity: "medium",
    format: { type: "json_object" },
  },
});

console.log(response.output_text);
```

```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["THECLAWBAY_API_KEY"],
    base_url="https://theclawbay.com/api/codex-auth/v1/proxy/v1",
)

models = client.models.list()
model = models.data[0].id if models.data else "gpt-5.4"

response = client.responses.create(
    model=model,
    input="Summarize why usage-based pricing can work for developers.",
    reasoning={"effort": "low"},
    text={
        "verbosity": "medium",
        "format": {"type": "json_object"},
    },
)

print(response.output_text)
```

```shell
curl "$THECLAWBAY_BASE_URL/responses" \
  -H "Authorization: Bearer $THECLAWBAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.4",
    "input": "Return a JSON object with keys headline and bullets.",
    "reasoning": { "effort": "medium" },
    "text": {
      "verbosity": "medium",
      "format": { "type": "json_object" }
    }
  }'
```

```typescript
const stream = await client.responses.create({
  model,
  input: "Stream a concise product update.",
  stream: true,
  reasoning: { effort: "low" },
});

for await (const event of stream) {
  if (event.type === "response.output_text.delta") {
    process.stdout.write(event.delta ?? "");
  }
}
```

```typescript
const completion = await client.chat.completions.create({
  model,
  messages: [
    { role: "user", content: "What is the weather in Boston?" },
  ],
  tools: [
    {
      type: "function",
      function: {
        name: "get_current_weather",
        description: "Get the current weather for a city.",
        parameters: {
          type: "object",
          properties: {
            location: { type: "string" },
            unit: { type: "string", enum: ["c", "f"] },
          },
          required: ["location"],
        },
      },
    },
  ],
  tool_choice: "auto",
});

console.log(completion.choices[0]?.message?.tool_calls ?? []);
```

These are the public routes exposed through The Claw Bay proxy today.
Use this first so your app chooses a model that is actually available right now.
A good default for most integrations. Supports model, reasoning, tools, text controls, include fields, and streaming.
A smaller response shape for lightweight text generation or simple UI flows.
Useful for existing chat-based apps and function-calling workflows.
Returns the same 5-hour and weekly usage budget telemetry shown in the dashboard for a regular The Claw Bay API key.
```shell
curl "https://theclawbay.com/api/codex-auth/v1/quota" \
  -H "Authorization: Bearer $THECLAWBAY_API_KEY"
```

Most standard parameters are passed through normally. A few OpenAI-compatible features are intentionally unsupported today, so handle those limits explicitly in your app.
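The quota telemetry can feed a simple guard before your app sends more traffic. Note the field names below (`used` and `limit` per window) are assumptions for illustration, not the documented response shape; inspect an actual `/quota` response with your own key before relying on them.

```python
def near_budget(quota: dict, threshold: float = 0.9) -> bool:
    """Return True if any usage window is above `threshold` of its budget.

    Assumes a shape like {"five_hour": {"used": 95, "limit": 100}, ...};
    the real /quota payload may differ, so verify before shipping.
    """
    for window in quota.values():
        limit = window.get("limit") or 0
        if limit and window.get("used", 0) / limit >= threshold:
            return True
    return False

# Sample payload with assumed field names:
sample = {
    "five_hour": {"used": 95, "limit": 100},
    "weekly": {"used": 10, "limit": 1000},
}
print(near_budget(sample))  # True: the 5-hour window is at 95%
```

A guard like this lets a backend shed low-priority work before the proxy starts rejecting requests outright.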
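One way to handle the intentionally unsupported parameters is to strip them before sending, rather than catching rejections downstream. The `UNSUPPORTED` set below is a placeholder, not the proxy's actual reject list; populate it from the errors you observe.

```python
# Placeholder names for illustration; replace with the fields the
# proxy actually rejects for your account.
UNSUPPORTED = {"logit_bias", "seed"}

def sanitize(payload: dict) -> dict:
    """Drop request fields known to be rejected, leaving the rest intact."""
    return {k: v for k, v in payload.items() if k not in UNSUPPORTED}

request = {"model": "gpt-5.4", "input": "hi", "seed": 7}
print(sorted(sanitize(request)))  # prints "['input', 'model']"
```

Centralizing this in one helper keeps the rest of your request-building code identical to a direct OpenAI integration.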
Open the dashboard to reveal your key, or go to setup if you want the local command flow.