
Streaming

The Anthropic Messages API and the OpenAI-compatible endpoint both support streaming responses via Server-Sent Events (SSE). Set stream: true in the request to enable streaming.

Anthropic SSE events

When calling POST /v1/messages with stream: true, you receive the following event types:

Event                  Description
message_start          First event. Contains the message object with id, model, role, and initial usage.
content_block_start    Start of a content block.
content_block_delta    Incremental text content.
content_block_stop     End of a content block.
message_delta          Final update with stop_reason and output usage.
message_stop           The stream is finished.
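
The event sequence above can be folded into a complete message on the client side. A minimal sketch in plain Python (no SDK; event dicts are assumed to have the shapes shown in the sample stream below):

```python
def accumulate(events):
    """Fold a stream of Anthropic SSE event payloads into a final message dict."""
    message = None
    blocks = {}
    for ev in events:
        kind = ev["type"]
        if kind == "message_start":
            # Shallow copy of the initial message (id, model, role, usage).
            message = dict(ev["message"])
        elif kind == "content_block_start":
            blocks[ev["index"]] = dict(ev["content_block"])
        elif kind == "content_block_delta":
            delta = ev["delta"]
            if delta["type"] == "text_delta":
                blocks[ev["index"]]["text"] += delta["text"]
        elif kind == "message_delta":
            message.update(ev["delta"])           # e.g. stop_reason
            message["usage"].update(ev["usage"])  # e.g. output_tokens
        elif kind == "message_stop":
            message["content"] = [blocks[i] for i in sorted(blocks)]
    return message
```

Running this over the sample events from the next section yields a message whose first content block reads "Hello! How can I help?" with stop_reason "end_turn".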

Example SSE stream (Anthropic)

text
event: message_start
data: {"type":"message_start","message":{"id":"msg_01...","type":"message","role":"assistant","content":[],"model":"claude-sonnet-4.5","usage":{"input_tokens":12,"output_tokens":0}}}

event: content_block_start
data: {"type":"content_block_start","index":0,"content_block":{"type":"text","text":""}}

event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":"Hello"}}

event: content_block_delta
data: {"type":"content_block_delta","index":0,"delta":{"type":"text_delta","text":"! How can I help?"}}

event: content_block_stop
data: {"type":"content_block_stop","index":0}

event: message_delta
data: {"type":"message_delta","delta":{"stop_reason":"end_turn"},"usage":{"output_tokens":8}}

event: message_stop
data: {"type":"message_stop"}
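
If you consume the raw stream without an SDK, you need to split it into SSE events first. A minimal sketch of a line-oriented parser (handles only the `event:` and `data:` fields, which is all this stream uses; blank lines delimit events):

```python
import json


def parse_sse(raw):
    """Yield (event_name, payload_dict) pairs from raw SSE text."""
    event, data = None, []
    for line in raw.splitlines():
        if line.startswith("event:"):
            event = line[len("event:"):].strip()
        elif line.startswith("data:"):
            data.append(line[len("data:"):].strip())
        elif line == "" and data:
            # Blank line terminates the current event.
            yield event, json.loads("\n".join(data))
            event, data = None, []
    if data:  # flush a final event with no trailing blank line
        yield event, json.loads("\n".join(data))
```

In a real client you would feed this from an iterator over response lines rather than a complete string, but the field handling is the same.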

OpenAI SSE events

When calling POST /v1/chat/completions with stream: true, the events follow the OpenAI format:

text
data: {"id":"chatcmpl-...","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"role":"assistant","content":""},"finish_reason":null}]}

data: {"id":"chatcmpl-...","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"Hello"},"finish_reason":null}]}

data: {"id":"chatcmpl-...","object":"chat.completion.chunk","choices":[{"index":0,"delta":{"content":"! How can I help?"},"finish_reason":null}]}

data: {"id":"chatcmpl-...","object":"chat.completion.chunk","choices":[{"index":0,"delta":{},"finish_reason":"stop"}]}

data: [DONE]
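
Unlike the Anthropic format, these chunks carry no `event:` line, and the stream ends with the `data: [DONE]` sentinel rather than a JSON event. A minimal sketch of assembling the reply text from chunk lines (plain Python, line-oriented input assumed):

```python
import json


def collect_text(lines):
    """Concatenate delta content from OpenAI-style chunk lines, stopping at [DONE]."""
    parts = []
    for line in lines:
        if not line.startswith("data:"):
            continue  # skip blank separator / keep-alive lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if delta.get("content"):  # first chunk's delta may hold only the role
            parts.append(delta["content"])
    return "".join(parts)
```

Applied to the sample chunks above, this returns "Hello! How can I help?".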

cURL example

bash
curl https://api.claudexia.tech/v1/messages \
  --no-buffer \
  -H "Content-Type: application/json" \
  -H "x-api-key: sk_cdx_YOUR_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -d '{
    "model": "claude-sonnet-4.5",
    "max_tokens": 1024,
    "stream": true,
    "messages": [
      {"role": "user", "content": "Tell me a short joke."}
    ]
  }'

Python streaming example

python
from openai import OpenAI

client = OpenAI(
    api_key="sk_cdx_YOUR_KEY",
    base_url="https://api.claudexia.tech/v1",
)

stream = client.chat.completions.create(
    model="claude-sonnet-4.5",
    max_tokens=1024,
    stream=True,
    messages=[
        {"role": "user", "content": "Tell me a short joke."},
    ],
)

for chunk in stream:
    content = chunk.choices[0].delta.content
    if content:
        print(content, end="", flush=True)
print()