API docs

Use your The Claw Bay key inside your own software.

The integration path stays intentionally simple: use an OpenAI-compatible client, point it at The Claw Bay's base URL, send your bearer key, and keep the request shape your app already uses.

Base URL

OpenAI-compatible endpoint

https://theclawbay.com/api/codex-auth/v1/proxy/v1
Auth
Bearer key

Send `Authorization: Bearer ca_v1...`

Start here
/models

Pick from the live model list at runtime.

Works for
Apps and automations

Backends, scripts, bots, and internal tools.

Quickstart

For most users, the flow is four moves: get a key from the dashboard, store it in an env var, point your SDK at the base URL, then call `/models` before your first real request.

Create or reveal your key

Open the dashboard and copy your The Claw Bay key.

Set env vars

Store the key and base URL in your app config or runtime environment.

Discover a model

Call `/models` first so your app selects a live model instead of hard-coding one blindly.

Environment and model discovery

```bash
export THECLAWBAY_API_KEY="ca_v1..."
export THECLAWBAY_BASE_URL="https://theclawbay.com/api/codex-auth/v1/proxy/v1"

curl "$THECLAWBAY_BASE_URL/models" \
  -H "Authorization: Bearer $THECLAWBAY_API_KEY"
```
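A client can wrap that discovery step in a small fallback helper so it prefers a known-good model but still works with whatever `/models` returns. This is a sketch; the model IDs and the preference list here are only examples.

```python
def pick_model(live_ids, preferred=("gpt-5.4",)):
    """Return the first preferred model that is live, else the first live model."""
    for name in preferred:
        if name in live_ids:
            return name
    if live_ids:
        return live_ids[0]
    raise RuntimeError("no live models returned by /models")

# Example with a hypothetical /models result:
print(pick_model(["gpt-5.4", "gpt-5.4-mini"]))  # → gpt-5.4
```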

SDK examples

You do not need a separate The Claw Bay SDK package. Use the official OpenAI SDKs with a custom base URL and your The Claw Bay key.

JavaScript / TypeScript

```js
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.THECLAWBAY_API_KEY,
  baseURL: "https://theclawbay.com/api/codex-auth/v1/proxy/v1",
});

const models = await client.models.list();
const model = models.data[0]?.id ?? "gpt-5.4";

const response = await client.responses.create({
  model,
  input: "Write a short launch note for a new SaaS feature.",
  reasoning: { effort: "medium" },
  text: {
    verbosity: "medium",
    format: { type: "json_object" },
  },
});

console.log(response.output_text);
```

Python

```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ["THECLAWBAY_API_KEY"],
    base_url="https://theclawbay.com/api/codex-auth/v1/proxy/v1",
)

models = client.models.list()
model = models.data[0].id if models.data else "gpt-5.4"

response = client.responses.create(
    model=model,
    input="Summarize why usage-based pricing can work for developers.",
    reasoning={"effort": "low"},
    text={
        "verbosity": "medium",
        "format": {"type": "json_object"},
    },
)

print(response.output_text)
```
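This page does not document the proxy's rate-limit behavior. If it returns OpenAI-style 429s, a generic backoff wrapper like this sketch can sit around either SDK call; pass the SDK's rate-limit exception type (for example `openai.RateLimitError`) as `transient`.

```python
import time

def with_backoff(call, retries=3, base_delay=1.0, transient=(Exception,)):
    """Retry `call` with exponential backoff on transient errors (e.g. HTTP 429)."""
    for attempt in range(retries):
        try:
            return call()
        except transient:
            if attempt == retries - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)

# Usage sketch, reusing the names from the SDK example above:
# response = with_backoff(lambda: client.responses.create(model=model, input="..."))
```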

Responses API

```bash
curl "$THECLAWBAY_BASE_URL/responses" \
  -H "Authorization: Bearer $THECLAWBAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.4",
    "input": "Return a JSON object with keys headline and bullets.",
    "reasoning": { "effort": "medium" },
    "text": {
      "verbosity": "medium",
      "format": { "type": "json_object" }
    }
  }'
```

Streaming responses

```js
const stream = await client.responses.create({
  model,
  input: "Stream a concise product update.",
  stream: true,
  reasoning: { effort: "low" },
});

for await (const event of stream) {
  if (event.type === "response.output_text.delta") {
    process.stdout.write(event.delta ?? "");
  }
}
```

Chat completions with tools

```js
const completion = await client.chat.completions.create({
  model,
  messages: [
    { role: "user", content: "What is the weather in Boston?" },
  ],
  tools: [
    {
      type: "function",
      function: {
        name: "get_current_weather",
        description: "Get the current weather for a city.",
        parameters: {
          type: "object",
          properties: {
            location: { type: "string" },
            unit: { type: "string", enum: ["c", "f"] }
          },
          required: ["location"]
        }
      }
    }
  ],
  tool_choice: "auto",
});

console.log(completion.choices[0]?.message?.tool_calls ?? []);
```
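To complete the function-calling loop, your app executes each returned tool call locally and sends the results back as `role: "tool"` messages. This sketch uses plain dicts for clarity (the SDKs return objects with the same field names, accessed as attributes), and `get_current_weather` is a hypothetical local implementation of the tool declared above.

```python
import json

# Hypothetical local implementation of the declared tool.
def get_current_weather(location, unit="c"):
    return {"location": location, "temperature": 21, "unit": unit}

TOOLS = {"get_current_weather": get_current_weather}

def run_tool_calls(tool_calls):
    """Execute each tool call locally and build role:"tool" follow-up messages."""
    messages = []
    for call in tool_calls:
        fn = TOOLS[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])
        messages.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(fn(**args)),
        })
    return messages

# Example with a hand-written tool-call payload:
calls = [{"id": "call_1", "function": {"name": "get_current_weather",
                                       "arguments": '{"location": "Boston"}'}}]
print(run_tool_calls(calls)[0]["content"])
```

You would append these messages to the conversation and call `/chat/completions` again so the model can produce its final answer.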

Supported endpoints

These are the public routes exposed through The Claw Bay's proxy today.

GET
/models

List live models

Use this first so your app chooses a model that is actually available right now.

POST
/responses

Responses API

A good default for most integrations. Supports model, reasoning, tools, text controls, include fields, and streaming.

POST
/responses/compact

Compact responses

A smaller response shape for lightweight text generation or simple UI flows.

POST
/chat/completions

Chat completions

Useful for existing chat-based apps and function-calling workflows.

GET
/api/codex-auth/v1/quota

Usage remaining

Returns the same 5-hour and weekly usage budget telemetry shown in the dashboard for a regular The Claw Bay API key.

Usage remaining

```bash
curl "https://theclawbay.com/api/codex-auth/v1/quota" \
  -H "Authorization: Bearer $THECLAWBAY_API_KEY"
```
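The exact quota payload shape is not specified on this page. Assuming it mirrors the dashboard's 5-hour and weekly windows with `used` and `limit` fields (those field names are an assumption), a client could compute its remaining headroom like this:

```python
def remaining_fraction(window):
    """Fraction of an assumed {'used': n, 'limit': n} budget window still available."""
    limit = window["limit"]
    return 0.0 if limit <= 0 else max(0.0, 1 - window["used"] / limit)

# Hypothetical payload mirroring the dashboard's two windows:
quota = {"five_hour": {"used": 30, "limit": 100},
         "weekly": {"used": 400, "limit": 1000}}
print(remaining_fraction(quota["five_hour"]))  # → 0.7
```

Check the real response once against the dashboard before relying on specific field names.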

Compatibility notes

Most standard parameters are passed through normally. A few OpenAI-compatible features are intentionally unsupported today, so handle those limits explicitly in your app.

Supported well today

  • Model selection with live discovery via `/models`.
  • Reasoning controls such as `reasoning.effort` and reasoning summaries.
  • Streaming responses and streaming chat completions.
  • Function-style tools in chat completions, including `tool_choice`.
  • JSON-style output and text verbosity controls.

Current limits

  • `store` must be `false`.
  • `previous_response_id` is not supported.
  • `truncation` is not supported.
  • Unsupported tool types include `file_search` and `code_interpreter`.
  • `file_id` inputs are not supported; send file bytes or URLs instead.
  • Only models in the GPT-5 / Codex catalog returned by `/models` are accepted.
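These limits can be enforced client-side before a request leaves your app. This sketch drops the parameters listed above and forces `store` off; the helper name is ours, and you would extend `UNSUPPORTED_KEYS` if the proxy's limits change.

```python
# Parameters the proxy rejects today (see "Current limits" above).
UNSUPPORTED_KEYS = {"previous_response_id", "truncation"}

def sanitize_request(payload):
    """Strip unsupported parameters and force store=False before sending."""
    clean = {k: v for k, v in payload.items() if k not in UNSUPPORTED_KEYS}
    clean["store"] = False
    return clean

print(sanitize_request({"model": "gpt-5.4", "input": "hi", "truncation": "auto"}))
```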

Need a key first?

Open the dashboard to reveal your key, or go to setup if you want the local command flow.