Tested, drop-in snippets for connecting to CheapAI using the OpenAI SDK format. No architectural changes needed — just swap your base URL and API key.
```
Base URL:       https://cheapai-netifly-app.up.railway.app/v1
Authorization:  Bearer YOUR_CHEAPAI_KEY
```
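Before wiring up an SDK, you can sanity-check a key by assembling the header above by hand. A minimal sketch (the `cheapai_headers` helper is hypothetical, not part of any CheapAI package):

```python
import os

# Hypothetical helper: builds the headers an OpenAI-compatible endpoint expects.
def cheapai_headers(api_key: str) -> dict:
    return {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }

# Read the key from the environment; falls back to a placeholder for illustration.
headers = cheapai_headers(os.environ.get("CHEAPAI_API_KEY", "YOUR_CHEAPAI_KEY"))
```

Pass these headers to any HTTP client, e.g. `requests.get("https://cheapai-netifly-app.up.railway.app/v1/models", headers=headers)`, to verify the key works before touching application code.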
Use the official openai package. Just pass the CheapAI base URL — everything else stays the same.
```python
import os

from openai import OpenAI

client = OpenAI(
    api_key=os.environ.get("CHEAPAI_API_KEY"),
    base_url="https://cheapai-netifly-app.up.railway.app/v1",
)

# Call Claude 4.6 Sonnet via the OpenAI SDK wrapper
response = client.chat.completions.create(
    model="claude-sonnet-4-6-20260217",
    messages=[{"role": "user", "content": "What is 17 * 43?"}],
)
print(response.choices[0].message.content)
```
This example uses the official npm `openai` package with streaming enabled, targeting the Gemini 3.1 Pro Preview model.
```javascript
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.CHEAPAI_API_KEY,
  baseURL: 'https://cheapai-netifly-app.up.railway.app/v1',
});

async function main() {
  const stream = await client.chat.completions.create({
    model: 'gemini-3.1-pro-preview',
    messages: [{ role: 'user', content: 'Explain HTTP streaming.' }],
    stream: true,
  });
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

main();
```
Override base_url on the ChatOpenAI wrapper. The rest of the LCEL toolchain works natively.
```python
import os

from langchain_openai import ChatOpenAI
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

llm = ChatOpenAI(
    api_key=os.environ.get("CHEAPAI_API_KEY"),
    base_url="https://cheapai-netifly-app.up.railway.app/v1",
    model="claude-sonnet-4-6-20260217",
)

prompt = ChatPromptTemplate.from_messages([
    ("system", "Translate to {target_language}."),
    ("user", "{text}"),
])

chain = prompt | llm | StrOutputParser()
print(chain.invoke({"target_language": "French", "text": "Hello, world!"}))
```
For raw terminal access or debugging API payloads. Works with DeepSeek, Claude, and Gemini model IDs.
```shell
curl https://cheapai-netifly-app.up.railway.app/v1/chat/completions \
  -H "Authorization: Bearer $CHEAPAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
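The same request body can be built in Python and serialized before sending, which is handy when debugging payloads. A sketch (constructs the payload only; no network call):

```python
import json

# Mirror of the curl payload above.
payload = {
    "model": "deepseek-chat",
    "messages": [{"role": "user", "content": "Hello!"}],
}

# json.dumps produces exactly the string curl's -d flag sends.
body = json.dumps(payload)
```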
Set these environment variables to route all OpenAI-compatible requests through CheapAI. The /v1/models endpoint returns the full model list, so available models appear automatically in the UI dropdown.
```shell
OPENAI_API_BASE_URL=https://cheapai-netifly-app.up.railway.app/v1
OPENAI_API_KEY=your_cheapai_key
```
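Responses from /v1/models follow the standard OpenAI list shape (`{"object": "list", "data": [...]}`). A sketch of parsing it — the entries below are illustrative, not a live listing:

```python
# Illustrative /v1/models response in the standard OpenAI list format.
models_response = {
    "object": "list",
    "data": [
        {"id": "claude-sonnet-4-6-20260217", "object": "model"},
        {"id": "gemini-3.1-pro-preview", "object": "model"},
        {"id": "deepseek-chat", "object": "model"},
    ],
}

# UI dropdowns are typically populated from the "id" field of each entry.
model_ids = [m["id"] for m in models_response["data"]]
```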