Developer Integration

OpenAI-Compatible API

Access Claude, GPT-5.3 Codex, Gemini, and DeepSeek through a single OpenAI-compatible endpoint. Swap in your CheapAI base_url and API key, and everything else just works.

Full API Docs | See Pricing

What does "OpenAI-compatible" mean?

CheapAI's proxy follows the same API contract as OpenAI's chat completions endpoint. That means any library, tool, or application built to talk to OpenAI can talk to CheapAI, including Claude, Gemini, and DeepSeek models that don't natively expose an OpenAI-style API.

In practice, you change two things:

  1. base_url — point it to your CheapAI proxy URL instead of api.openai.com
  2. api_key — use the CheapAI key you received at checkout

Everything else — model selection, message format, streaming, function calling, JSON mode — works exactly the same way.
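For instance, JSON mode is just the standard response_format parameter passed through unchanged. A minimal sketch of such a request (the helper name and default model ID are illustrative, not part of the API):

```python
def build_json_mode_request(prompt: str, model: str = "gpt-5.4") -> dict:
    """Kwargs for client.chat.completions.create(), exactly as you would
    send them to api.openai.com; the proxy forwards them unchanged."""
    return {
        "model": model,  # any proxied model ID works here
        "messages": [
            {"role": "system", "content": "Reply with a single JSON object."},
            {"role": "user", "content": prompt},
        ],
        "response_format": {"type": "json_object"},
    }

kwargs = build_json_mode_request("List three primes as {\"primes\": [...]}")
# response = client.chat.completions.create(**kwargs)  # needs a live key
```

Because the request shape is standard, the same kwargs work whether the model behind them is GPT, Claude, Gemini, or DeepSeek.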

Models accessible through the proxy

Provider    Models                                                                  Endpoint format
Anthropic   Claude Opus 4.6, Claude Sonnet 4.6, Claude Opus 4.5, Claude Haiku 4.5   /anthropic
OpenAI      GPT-5.4, GPT-5.3 Codex, GPT-5.2 Codex, GPT-5.3 Codex.2                  /openai
Google      Gemini 3 Pro, Gemini 3 Flash                                            /openai
DeepSeek    DeepSeek V3.2                                                           /openai

Full model IDs and parameters: Models directory | API docs
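Per the table, Anthropic models use a different path from the rest. If you build base URLs programmatically, a small helper can route on the model ID. This is only a sketch restating the table: the base URL is a placeholder, and since the Python example below sends a Claude model through /openai, some deployments may accept either path (check the API docs):

```python
PROXY_BASE = "https://your-proxy-url"  # placeholder, use your own proxy URL

# Model-ID prefix -> endpoint path, per the table above.
ENDPOINT_BY_PREFIX = {
    "claude": "/anthropic",
    "gpt": "/openai",
    "gemini": "/openai",
    "deepseek": "/openai",
}

def base_url_for(model_id: str) -> str:
    """Return the proxy base URL to use for a given model ID."""
    for prefix, path in ENDPOINT_BY_PREFIX.items():
        if model_id.lower().startswith(prefix):
            return PROXY_BASE + path
    raise ValueError(f"unrecognized model family: {model_id!r}")
```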

Code examples

Python (OpenAI SDK)

from openai import OpenAI

client = OpenAI(
    api_key="your-cheapai-key",
    base_url="https://your-proxy-url/openai"
)

# Use ANY model — Claude, GPT, Gemini, DeepSeek
response = client.chat.completions.create(
    model="claude-sonnet-4-6-20260217",  # or "gpt-5.4", "gemini-3-pro-preview", etc.
    messages=[{"role": "user", "content": "Explain quantum computing in simple terms."}],
    stream=True
)

for chunk in response:
    print(chunk.choices[0].delta.content or "", end="")
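Function calling passes through the proxy the same way. A sketch of a tool definition in the standard chat-completions "tools" format (the weather function and its schema are made up for illustration):

```python
# A minimal tool definition; get_weather is hypothetical.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Look up the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

# response = client.chat.completions.create(
#     model="claude-sonnet-4-6-20260217",  # same tools payload, any provider
#     messages=[{"role": "user", "content": "Weather in Paris?"}],
#     tools=[WEATHER_TOOL],
# )                                         # needs a live key
```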

JavaScript (Node.js)

import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: 'your-cheapai-key',
  baseURL: 'https://your-proxy-url/openai',
});

const response = await client.chat.completions.create({
  model: 'gpt-5.4',
  messages: [{ role: 'user', content: 'Hello from Node.js!' }],
});

console.log(response.choices[0].message.content);

cURL

curl https://your-proxy-url/openai/v1/chat/completions \
  -H "Authorization: Bearer your-cheapai-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-3-pro-preview",
    "messages": [{"role": "user", "content": "Hello Gemini!"}]
  }'
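The same call works without any SDK at all. A stdlib-only sketch that assembles the POST by hand (the URL, key, and response shape assume the standard chat-completions contract):

```python
import json
import urllib.request

def build_request(base_url: str, api_key: str,
                  model: str, prompt: str) -> urllib.request.Request:
    """Assemble the POST /v1/chat/completions request by hand."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,  # a Request with data defaults to POST
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_request("https://your-proxy-url/openai", "your-cheapai-key",
                    "gemini-3-pro-preview", "Hello Gemini!")
# with urllib.request.urlopen(req) as resp:            # needs a live key
#     print(json.load(resp)["choices"][0]["message"]["content"])
```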

Compatible tools

Cursor AI: IDE with AI assistance. Setup guide →
Claude Code: CLI coding agent. Setup guide →
LangChain: AI framework
Open WebUI: Chat interface
n8n / Make: Workflow automation
Any OpenAI SDK: Python, JS, Go, Rust…

One API, all frontier models, 65% cheaper

Get your API key and proxy URL. Works with everything in the OpenAI ecosystem.

Get Started