# Anthropic SDK Configuration Guide

The official Anthropic SDKs let you replace the API endpoint via the `base_url` parameter.

## Endpoint

The Anthropic SDK automatically appends `/v1` to the base URL, so set it to:

```
https://llm.starapp.net/api/llm
```
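To make the `/v1` behavior concrete, here is a minimal sketch of the client-side path joining, assuming the SDK concatenates the API path (e.g. `/v1/messages`) onto the configured base URL. The `full_endpoint` helper is purely illustrative and is not part of the `anthropic` package:

```python
def full_endpoint(base_url: str, path: str = "/v1/messages") -> str:
    """Join a configured base URL with an API path, tolerating a trailing slash."""
    return base_url.rstrip("/") + path

print(full_endpoint("https://llm.starapp.net/api/llm"))
# -> https://llm.starapp.net/api/llm/v1/messages
```

This is why the base URL must not already end in `/v1`: the request would otherwise go to `.../v1/v1/messages`.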
## Python

### Installation

```shell
pip install anthropic
```

### Example

```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://llm.starapp.net/api/llm",
    api_key="your-token-here",
)

message = client.messages.create(
    model="claude-sonnet-4-5",
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Hello, Claude!"}
    ],
)

print(message.content[0].text)
```
### Streaming

```python
with client.messages.stream(
    model="claude-sonnet-4-5",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a poem"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)
```
### Environment variables

```shell
export ANTHROPIC_BASE_URL=https://llm.starapp.net/api/llm
export ANTHROPIC_API_KEY=your-token-here
```

```python
import anthropic

# The base URL and API key are read from the environment automatically
client = anthropic.Anthropic()
```
## Node.js / TypeScript

### Installation

```shell
npm install @anthropic-ai/sdk
# or
yarn add @anthropic-ai/sdk
```

### Example

```typescript
import Anthropic from "@anthropic-ai/sdk";

const client = new Anthropic({
  baseURL: "https://llm.starapp.net/api/llm",
  apiKey: "your-token-here",
});

const message = await client.messages.create({
  model: "claude-sonnet-4-5",
  max_tokens: 1024,
  messages: [
    { role: "user", content: "Hello, Claude!" }
  ],
});

console.log(message.content[0].text);
```
### Streaming

```typescript
const stream = client.messages.stream({
  model: "claude-sonnet-4-5",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Write a poem" }],
});

for await (const event of stream) {
  if (event.type === "content_block_delta" && event.delta.type === "text_delta") {
    process.stdout.write(event.delta.text);
  }
}
```
## Testing with cURL

```shell
curl https://llm.starapp.net/api/llm/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: your-token-here" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-5",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello!"}
    ]
  }'
```
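The same request can be assembled in plain Python if you prefer to test without cURL. This is a sketch using only the standard library; `build_messages_request` is a hypothetical helper (not part of any SDK), and no network call is made here — substitute your real token and pass the result to `urllib.request` or any HTTP client:

```python
import json

def build_messages_request(token: str):
    """Build the URL, headers, and JSON body matching the cURL example above."""
    url = "https://llm.starapp.net/api/llm/v1/messages"
    headers = {
        "Content-Type": "application/json",
        "x-api-key": token,
        "anthropic-version": "2023-06-01",
    }
    body = json.dumps({
        "model": "claude-sonnet-4-5",
        "max_tokens": 1024,
        "messages": [{"role": "user", "content": "Hello!"}],
    })
    return url, headers, body

url, headers, body = build_messages_request("your-token-here")
print(url)
# -> https://llm.starapp.net/api/llm/v1/messages
```

Note the full `/v1/messages` path is spelled out here, unlike the SDK examples where the client appends `/v1` itself.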
## Notes

- Set `api_key` to your Inkess Token (`ik-...`).
- See the model list for available model IDs.
- Do not append `/v1` to the Anthropic SDK's `base_url`; the SDK adds it automatically.