API Documentation

AnyHop AI is compatible with the Anthropic / OpenAI API formats. Switch seamlessly and integrate instantly.

Introduction

AnyHop AI provides a unified API gateway: with a single API Key you can access Claude, GPT, Gemini, DeepSeek, and other mainstream models. The endpoints are fully compatible with the official Anthropic / OpenAI APIs, so existing code only needs its Base URL changed; no business logic has to be modified.

💡 Already using the Anthropic or OpenAI SDK? Just replace the `base_url` and `api_key` parameters to start using AnyHop AI; no other code changes are needed.

Authentication

All API requests must carry an API Key in an HTTP header for authentication. AnyHop AI supports the two formats below; choose the one that matches your SDK:

Method 1: Anthropic Native Format

For the Anthropic SDK, or when calling the Anthropic-compatible endpoints directly:

x-api-key: sk-anyhop-xxxxxxxxxxxx

Method 2: Bearer Token Format

For the OpenAI SDK or generic HTTP clients:

Authorization: Bearer sk-anyhop-xxxxxxxxxxxx

⚠️ Security: Never expose API Keys in client-side code, frontend bundles, or public repositories. Use them only in server-side environments. If a key is compromised, regenerate it immediately in the console.
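As an illustration (not an official client), either header format can be set with any HTTP library. A minimal sketch using Python's standard library, with a placeholder key:

```python
import urllib.request

API_KEY = "sk-anyhop-xxxx"  # placeholder; use your real key from the console

def anthropic_style_request(body: bytes) -> urllib.request.Request:
    # Method 1: key in the x-api-key header (Anthropic native format)
    return urllib.request.Request(
        "https://anyhop.ai/v1/messages",
        data=body,
        headers={
            "Content-Type": "application/json",
            "x-api-key": API_KEY,
            "anthropic-version": "2023-06-01",
        },
    )

def bearer_style_request(body: bytes) -> urllib.request.Request:
    # Method 2: key in the Authorization header (Bearer token format)
    return urllib.request.Request(
        "https://anyhop.ai/openai/v1/chat/completions",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
```

Sending either request with `urllib.request.urlopen` then works the same as the curl examples below.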

Base URL

Choose the Base URL that matches your SDK type:

| Format | Base URL | Use Case |
| --- | --- | --- |
| Anthropic | https://anyhop.ai | anthropic SDK / Claude Code / Cline / Aider |
| OpenAI | https://anyhop.ai/openai/v1 | openai SDK / Cursor / Windsurf / ChatBox |

Versioning

When using the Anthropic-compatible endpoints, specify the API version in the request header:

anthropic-version: 2023-06-01

The currently supported version is 2023-06-01. Always specify the version explicitly to keep API behavior consistent.

Messages API

The Messages API is the core endpoint: it sends messages and returns the model's reply. It is fully compatible with the Anthropic Messages API specification.

POST /v1/messages

Request Body Parameters

| Param | Type | Required | Description |
| --- | --- | --- | --- |
| model | string | Yes | Model ID, e.g. claude-sonnet-4-6 |
| messages | array | Yes | List of conversation messages, each with role and content |
| max_tokens | integer | Yes | Maximum number of output tokens |
| system | string \| array | No | System prompt |
| temperature | number | No | Sampling temperature (0-1), default 1 |
| stream | boolean | No | Enable streaming output, default false |
| top_p | number | No | Top-P nucleus sampling (0-1) |
| top_k | integer | No | Top-K sampling |
| stop_sequences | array | No | Custom stop sequences |
| metadata | object | No | Request metadata |

Basic Request Example

curl https://anyhop.ai/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: sk-anyhop-xxxx" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-6",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "What is an API gateway?"}
    ]
  }'

Response Format

{
  "id": "msg_01XFDUDYJgAACzvnptvVoYEL",
  "type": "message",
  "role": "assistant",
  "content": [
    {
      "type": "text",
      "text": "An API gateway is a server that acts as an intermediary..."
    }
  ],
  "model": "claude-sonnet-4-6",
  "stop_reason": "end_turn",
  "usage": {
    "input_tokens": 18,
    "output_tokens": 156
  }
}

stop_reason

| Value | Description |
| --- | --- |
| end_turn | The model finished its reply normally |
| max_tokens | Output was truncated at the max_tokens limit |
| stop_sequence | A custom stop sequence was matched |
| tool_use | The model requested a tool call |
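Callers typically branch on `stop_reason` before deciding what to do next. A minimal sketch (the function name and hint strings are ours, not part of the API):

```python
def next_action(response: dict) -> str:
    """Map a Messages API response's stop_reason to a caller-side hint."""
    reason = response.get("stop_reason")
    if reason == "end_turn":
        return "done"
    if reason == "max_tokens":
        return "truncated: raise max_tokens or ask the model to continue"
    if reason == "stop_sequence":
        return "stopped: a custom stop sequence was matched"
    if reason == "tool_use":
        return "run the requested tool and return a tool_result"
    return f"unknown stop_reason: {reason!r}"
```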

Multi-turn Conversations

Alternate messages with the `user` and `assistant` roles in the `messages` array to hold a multi-turn conversation.

{
  "model": "claude-sonnet-4-6",
  "max_tokens": 1024,
  "messages": [
    {"role": "user", "content": "I'm learning Python, can you help?"},
    {"role": "assistant", "content": "Of course! What topic would you like to start with?"},
    {"role": "user", "content": "How do list comprehensions work?"}
  ]
}

💡 Tips: The conversation must start with a `user` message, and `user`/`assistant` roles must alternate. Each turn consumes input tokens.
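In practice the client keeps the running history and appends each new user turn before calling the API. A small sketch of that bookkeeping, with the ordering rules above checked up front (the helper name is ours):

```python
def build_payload(history: list, user_text: str,
                  model: str = "claude-sonnet-4-6",
                  max_tokens: int = 1024) -> dict:
    """Append a new user turn to `history` and build a Messages API body.

    `history` is the alternating user/assistant message list from earlier
    turns (it may be empty).
    """
    messages = history + [{"role": "user", "content": user_text}]
    # Enforce the rules: first message is "user", roles strictly alternate.
    for i, msg in enumerate(messages):
        expected = "user" if i % 2 == 0 else "assistant"
        if msg["role"] != expected:
            raise ValueError(f"message {i} should have role {expected!r}")
    return {"model": model, "max_tokens": max_tokens, "messages": messages}
```

After each reply, append the assistant message to `history` so the next call carries the full context.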

Vision / Multimodal

Send images to models for visual understanding and analysis.

Base64

{
  "model": "claude-sonnet-4-6",
  "max_tokens": 1024,
  "messages": [
    {
      "role": "user",
      "content": [
        {
          "type": "image",
          "source": {
            "type": "base64",
            "media_type": "image/png",
            "data": "iVBORw0KGgo..."
          }
        },
        {
          "type": "text",
          "text": "Describe this image in detail"
        }
      ]
    }
  ]
}

| Item | Details |
| --- | --- |
| Supported formats | JPEG, PNG, GIF, WebP |
| Max size per image | 5MB |
| Multi-image | Up to 20 images per message |
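Building the base64 image block by hand is mechanical. A sketch that also enforces the format and size limits from the table above (the helper name is ours, not part of the API):

```python
import base64
import mimetypes

MAX_IMAGE_BYTES = 5 * 1024 * 1024  # 5MB per-image limit

def image_block(path: str) -> dict:
    """Read an image file and return a Messages API image content block."""
    media_type, _ = mimetypes.guess_type(path)
    if media_type not in {"image/jpeg", "image/png", "image/gif", "image/webp"}:
        raise ValueError(f"unsupported image type: {media_type}")
    with open(path, "rb") as f:
        data = f.read()
    if len(data) > MAX_IMAGE_BYTES:
        raise ValueError("image exceeds the 5MB per-image limit")
    return {
        "type": "image",
        "source": {
            "type": "base64",
            "media_type": media_type,
            "data": base64.b64encode(data).decode("ascii"),
        },
    }
```

The returned dict drops straight into a message's `content` array alongside a text block, as in the JSON example above.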

Streaming

Set `"stream": true` to enable SSE streaming and receive the model's output token by token in real time.

curl https://anyhop.ai/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: sk-anyhop-xxxx" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4-6",
    "max_tokens": 1024,
    "stream": true,
    "messages": [
      {"role": "user", "content": "Write a short poem about coding"}
    ]
  }'

SSE Event Types

| Event | Description |
| --- | --- |
| message_start | Message start, with metadata |
| content_block_delta | Incremental text content |
| message_delta | Carries stop_reason and usage |
| message_stop | Message complete |
| ping | Keepalive heartbeat |
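When consuming the stream without an SDK, a client reads the `data:` lines and accumulates the `text_delta` payloads from `content_block_delta` events. A minimal parsing sketch over already-decoded SSE lines:

```python
import json

def collect_text(sse_lines) -> str:
    """Accumulate streamed text from content_block_delta events.

    `sse_lines` is an iterable of decoded SSE lines, such as
    "event: content_block_delta" and "data: {...}".
    """
    parts = []
    for line in sse_lines:
        if not line.startswith("data: "):
            continue  # skip "event:" lines and blank separators
        payload = json.loads(line[len("data: "):])
        if payload.get("type") == "content_block_delta":
            delta = payload.get("delta", {})
            if delta.get("type") == "text_delta":
                parts.append(delta.get("text", ""))
    return "".join(parts)
```

A real client would also watch `message_delta` for the final `stop_reason` and `usage`.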

Models API

Query the list of currently available models.

GET /v1/models

OpenAI Compatible API

AnyHop AI also provides a fully OpenAI-compatible Chat Completions interface.

POST /openai/v1/chat/completions

from openai import OpenAI

client = OpenAI(
    api_key="sk-anyhop-xxxx",
    base_url="https://anyhop.ai/openai/v1"
)

response = client.chat.completions.create(
    model="gpt-5.4",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain microservices architecture"}
    ]
)
print(response.choices[0].message.content)

System Prompt

A system prompt defines the AI's role, behavior, and constraints.

{
  "model": "claude-sonnet-4-6",
  "max_tokens": 1024,
  "system": "You are AnyHop AI's support assistant. Answer questions about our product and pricing in a professional but friendly tone.",
  "messages": [
    {"role": "user", "content": "What models do you support?"}
  ]
}

Tool Use

Tool use lets the model call tools/functions you define to fetch external information or perform actions.

{
  "model": "claude-sonnet-4-6",
  "max_tokens": 1024,
  "tools": [
    {
      "name": "get_weather",
      "description": "Get current weather for a city",
      "input_schema": {
        "type": "object",
        "properties": {
          "city": {"type": "string", "description": "City name"},
          "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
        },
        "required": ["city"]
      }
    }
  ],
  "messages": [
    {"role": "user", "content": "What's the weather in Beijing?"}
  ]
}
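When the response comes back with `stop_reason: "tool_use"`, the client runs the requested tool and sends its output back as a `tool_result` block in a new `user` message. A sketch of that dispatch step (the `tools` registry and helper name are ours):

```python
import json

def tool_result_message(response: dict, tools: dict) -> dict:
    """Execute each tool_use block in `response` and build the follow-up
    user message. `tools` maps tool name -> Python callable."""
    results = []
    for block in response["content"]:
        if block["type"] != "tool_use":
            continue
        output = tools[block["name"]](**block["input"])  # run the tool
        results.append({
            "type": "tool_result",
            "tool_use_id": block["id"],  # ties the result to the request
            "content": json.dumps(output),
        })
    return {"role": "user", "content": results}
```

Append the returned message to `messages` and call the API again; the model then answers using the tool output.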

Prompt Caching

For Anthropic models, Prompt Caching lets you mark reusable system prompts as cacheable, cutting input token costs for the cached portion by roughly 90%.

{
  "model": "claude-sonnet-4-6",
  "max_tokens": 1024,
  "system": [
    {
      "type": "text",
      "text": "Your long system prompt here...",
      "cache_control": {"type": "ephemeral"}
    }
  ],
  "messages": [{"role": "user", "content": "Hello"}]
}

💡 The cache TTL is 5 minutes, and a prompt must be at least 1024 tokens to be cached. See the Models page for cache pricing.

Error Handling

The API uses standard HTTP status codes.

| Code | Error Type | Description |
| --- | --- | --- |
| 400 | invalid_request_error | Invalid request parameters |
| 401 | authentication_error | Invalid API Key |
| 403 | permission_error | Insufficient permissions |
| 404 | not_found_error | Model not found |
| 429 | rate_limit_error | Rate limit exceeded |
| 500 | api_error | Server error |
| 529 | overloaded_error | Upstream overloaded |

Rate Limits

Rate limits depend on your account tier; requests beyond the limit return a 429 status code.

| Header | Description |
| --- | --- |
| x-ratelimit-limit | Maximum number of requests |
| x-ratelimit-remaining | Remaining requests |
| x-ratelimit-reset | Reset timestamp |
| retry-after | Suggested wait in seconds |
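A common client-side pattern for 429s: honor the `retry-after` header when the server sends it, and otherwise fall back to exponential backoff with jitter. A minimal sketch (the helper name is ours):

```python
import random

def retry_delay(attempt: int, retry_after: str = None, cap: float = 60.0) -> float:
    """Seconds to wait before retrying a rate-limited request.

    `retry_after` is the raw retry-after header value, if present;
    `attempt` is the zero-based retry count.
    """
    if retry_after is not None:
        return float(retry_after)
    # exponential backoff (1s, 2s, 4s, ...) plus up to 1s of jitter
    return min(cap, (2 ** attempt) + random.random())
```

Call it in a retry loop: sleep for `retry_delay(attempt, resp.headers.get("retry-after"))`, then resend.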

SDK & Tool Integration

AnyHop AI is fully compatible with the official Anthropic and OpenAI SDKs, and works with all major AI coding tools and frameworks.

Official SDKs

| Language | SDK | Install | Base URL |
| --- | --- | --- | --- |
| Python | anthropic | pip install anthropic | https://anyhop.ai |
| Python | openai | pip install openai | https://anyhop.ai/openai/v1 |
| Node.js | @anthropic-ai/sdk | npm i @anthropic-ai/sdk | https://anyhop.ai |
| Node.js | openai | npm i openai | https://anyhop.ai/openai/v1 |
| Go | anthropic-sdk-go | go get github.com/anthropics/anthropic-sdk-go | https://anyhop.ai |

Claude Code

Anthropic's official AI coding tool, available as a CLI, desktop app (Mac/Windows), web app (claude.ai/code), and IDE extensions.

# Environment variable setup (applies to all clients)
export ANTHROPIC_API_KEY="sk-anyhop-xxxx"
export ANTHROPIC_BASE_URL="https://anyhop.ai"
claude

// VS Code settings.json
{
  "claude-code.apiKey": "sk-anyhop-xxxx",
  "claude-code.apiBaseUrl": "https://anyhop.ai"
}

Cursor

# Settings → Models → OpenAI API Key
API Key:  sk-anyhop-xxxx
Base URL: https://anyhop.ai/openai/v1

Cline (VS Code)

# Cline sidebar → Settings
API Provider: Anthropic
API Key:      sk-anyhop-xxxx
Base URL:     https://anyhop.ai
Model:        claude-sonnet-4-6

Continue (VS Code / JetBrains)

// ~/.continue/config.json
{
  "models": [{
    "title": "Claude via AnyHop",
    "provider": "anthropic",
    "model": "claude-sonnet-4-6",
    "apiKey": "sk-anyhop-xxxx",
    "apiBase": "https://anyhop.ai"
  }]
}

Aider

export ANTHROPIC_API_KEY="sk-anyhop-xxxx"
export ANTHROPIC_BASE_URL="https://anyhop.ai"
aider --model claude-sonnet-4-6

Windsurf

# Settings → AI Provider
Provider: OpenAI Compatible
API Key:  sk-anyhop-xxxx
Base URL: https://anyhop.ai/openai/v1

LangChain / LlamaIndex

from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(
    model="claude-sonnet-4-6",
    api_key="sk-anyhop-xxxx",
    base_url="https://anyhop.ai"
)

💡 General rule: for tools that support the Anthropic native format, set the Base URL to https://anyhop.ai; for tools that only support the OpenAI format, use https://anyhop.ai/openai/v1.

For detailed setup steps, see the Quick Start tutorial.