## Documentation Index

Fetch the complete documentation index at: https://docs.modelstack.cc/llms.txt

Use this file to discover all available pages before exploring further.
## Overview

ModelStack is fully compatible with the OpenAI SDK. Just change `base_url` and `api_key` — everything else works the same, including streaming, function calling, and all chat completion parameters.
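Because function calling goes through the standard `tools` parameter, a tool definition looks exactly as it does with the OpenAI API. A minimal sketch (the `get_weather` tool and its schema are made up for illustration):

```python
# A hypothetical tool definition in the standard OpenAI function-calling format
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

# Passed through unchanged:
# response = client.chat.completions.create(
#     model="claude-sonnet-4-5",
#     messages=[{"role": "user", "content": "Weather in Paris?"}],
#     tools=tools,
# )
```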
## Python

### Installation

```bash
pip install openai
```

### Setup

```python
from openai import OpenAI

client = OpenAI(
    api_key="your_api_key",
    base_url="https://api.modelstack.cc/v1",
)
```
### Chat Completion

```python
response = client.chat.completions.create(
    model="claude-sonnet-4-5",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain quantum computing."},
    ],
    temperature=0.7,
    max_tokens=500,
)

print(response.choices[0].message.content)
```
### Streaming

```python
stream = client.chat.completions.create(
    model="claude-sonnet-4-5",
    messages=[{"role": "user", "content": "Write a poem about coding."}],
    stream=True,
)

for chunk in stream:
    if chunk.choices[0].delta.content is not None:
        print(chunk.choices[0].delta.content, end="")
```
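If you need the complete text after streaming finishes, accumulate the deltas as they arrive. A small helper, shown here as a sketch rather than SDK functionality:

```python
def collect_stream(chunks):
    """Join the text deltas of a chat-completion stream into one string."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta is not None:  # some chunks (e.g. the final one) carry no text
            parts.append(delta)
    return "".join(parts)

# Usage: full_text = collect_stream(stream)
```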
### List Models

```python
models = client.models.list()

for model in models.data:
    print(f"{model.id} ({model.owned_by})")
```
### Using Different Models

```python
# Switch between providers by changing the model parameter
providers = {
    "Anthropic": "claude-sonnet-4-5",
    "OpenAI": "gpt-4o",
    "Google": "gemini-2.5-pro",
}

for provider, model in providers.items():
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": "Say hello!"}],
    )
    print(f"{provider}: {response.choices[0].message.content}")
```
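Each upstream provider enforces its own rate limits, so calls can fail transiently. A generic retry wrapper with exponential backoff, as a sketch rather than SDK functionality (in practice you would catch the SDK's `openai.RateLimitError` and `openai.APIConnectionError` instead of bare `Exception`):

```python
import random
import time

def with_retries(call, max_attempts=3, base_delay=1.0):
    """Call `call()` and retry with exponential backoff on failure."""
    for attempt in range(max_attempts):
        try:
            return call()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts, surface the error
            # back off: base_delay, 2x, 4x, ... plus a little jitter
            time.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))

# Usage:
# response = with_retries(lambda: client.chat.completions.create(
#     model="gpt-4o",
#     messages=[{"role": "user", "content": "Say hello!"}],
# ))
```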
### Environment Variables

Instead of hardcoding your API key, use environment variables:

```bash
export OPENAI_API_KEY="your_api_key"
export OPENAI_BASE_URL="https://api.modelstack.cc/v1"
```

```python
# No need to pass api_key or base_url — the SDK reads them from the environment
client = OpenAI()
```
## Node.js / TypeScript

### Installation

```bash
npm install openai
```

### Setup

```typescript
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: "your_api_key",
  baseURL: "https://api.modelstack.cc/v1",
});
```
### Chat Completion

```typescript
const response = await client.chat.completions.create({
  model: "claude-sonnet-4-5",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Explain quantum computing." },
  ],
  temperature: 0.7,
  max_tokens: 500,
});

console.log(response.choices[0].message.content);
```
### Streaming

```typescript
const stream = await client.chat.completions.create({
  model: "claude-sonnet-4-5",
  messages: [{ role: "user", content: "Write a poem about coding." }],
  stream: true,
});

for await (const chunk of stream) {
  const content = chunk.choices[0]?.delta?.content || "";
  process.stdout.write(content);
}
```
### Environment Variables

```bash
export OPENAI_API_KEY="your_api_key"
export OPENAI_BASE_URL="https://api.modelstack.cc/v1"
```

```typescript
// The SDK reads both variables from the environment automatically
const client = new OpenAI();
```