Documentation Index
Fetch the complete documentation index at: https://docs.myrouter.ai/llms.txt
Use this file to discover all available pages before exploring further.
Native Protocol Support
All Anthropic models on this platform can be accessed through the native /v1/messages protocol. For example, to enable the 1M-token context window, pass the context-1m-2025-08-07 beta header in the request:
curl https://api.myrouter.ai/anthropic/v1/messages \
  -H "x-api-key: $API_KEY" \
  -H "anthropic-beta: context-1m-2025-08-07" \
  -H "content-type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Process this large document..."}
    ]
  }'
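The same request can be prepared from Python's standard library; a minimal sketch, assuming the endpoint and header names shown in the curl example above (the request is built but not sent):

```python
import json
import os
import urllib.request

# Sketch: build the 1M-context request from Python. The endpoint and
# headers mirror the curl example above; API_KEY comes from the
# environment. The request is constructed but not sent here.
payload = {
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "messages": [
        {"role": "user", "content": "Process this large document..."}
    ],
}
req = urllib.request.Request(
    "https://api.myrouter.ai/anthropic/v1/messages",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "x-api-key": os.environ.get("API_KEY", "YOUR_KEY"),
        "anthropic-beta": "context-1m-2025-08-07",  # enables 1M-token context
        "content-type": "application/json",
    },
    method="POST",
)
# response = urllib.request.urlopen(req)  # uncomment to actually send
```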
Prompt caching
This platform supports prompt caching via both the Anthropic protocol and the OpenAI-compatible protocol.
For details, refer to Model Features - Prompt caching in the documentation.
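As a sketch of what a cached request could look like via the Anthropic protocol: a cache_control field of type "ephemeral" marks a prompt prefix as cacheable. The payload shape below follows Anthropic's prompt-caching API and is an assumption to verify against the linked documentation page:

```python
import json

# Sketch: mark a large system prompt as cacheable so that repeated
# requests can reuse the cached prefix. The cache_control field follows
# Anthropic's prompt-caching API; verify the exact shape against the
# Model Features - Prompt caching page linked above.
payload = {
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "system": [
        {
            "type": "text",
            "text": "You are a contract-review assistant. <long reference text>",
            "cache_control": {"type": "ephemeral"},  # cache this prefix
        }
    ],
    "messages": [
        {"role": "user", "content": "Summarize clause 4.2."}
    ],
}
body = json.dumps(payload)  # send as the -d body of the curl examples
```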
Extended thinking
Controlling the thinking process is currently supported only via the Anthropic protocol.
curl https://api.myrouter.ai/anthropic/v1/messages \
  -H "x-api-key: $API_KEY" \
  -H "content-type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 16000,
    "thinking": {
      "type": "enabled",
      "budget_tokens": 10000
    },
    "messages": [
      {
        "role": "user",
        "content": "Are there an infinite number of prime numbers such that n mod 4 == 3?"
      }
    ]
  }'
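When thinking is enabled, the response's content array interleaves "thinking" blocks with the final "text" blocks. A minimal sketch of separating the two (the sample response below is illustrative, not real API output):

```python
# Sketch: split a Messages API response into its reasoning and its answer.
# With thinking enabled, content blocks of type "thinking" precede the
# "text" blocks. The sample response here is illustrative only.
def split_thinking(response: dict) -> tuple[str, str]:
    thinking_parts = []
    text_parts = []
    for block in response.get("content", []):
        if block.get("type") == "thinking":
            thinking_parts.append(block.get("thinking", ""))
        elif block.get("type") == "text":
            text_parts.append(block.get("text", ""))
    return "\n".join(thinking_parts), "\n".join(text_parts)

sample = {
    "content": [
        {"type": "thinking", "thinking": "Consider primes of the form 4k+3..."},
        {"type": "text", "text": "Yes. A Euclid-style argument shows..."},
    ]
}
reasoning, answer = split_thinking(sample)
```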
Tool use
Currently, only the Bash and Text editor tools are supported; Computer use, Web fetch, Web search, and other tools are not yet supported.
Refer to the official documentation for usage instructions.
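A sketch of enabling the two supported tools in a request payload. The tool type version strings ("bash_20250124", "text_editor_20250124") are assumptions taken from Anthropic's tool-use documentation; check the official docs for the versions this platform expects:

```python
import json

# Sketch: request payload enabling the Bash and Text editor tools via
# the Anthropic protocol. The "type" version strings are assumptions
# from Anthropic's tool documentation, not confirmed by this platform.
payload = {
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "tools": [
        {"type": "bash_20250124", "name": "bash"},
        {"type": "text_editor_20250124", "name": "str_replace_editor"},
    ],
    "messages": [
        {"role": "user", "content": "List the files in the current directory."}
    ],
}
body = json.dumps(payload)  # send as the -d body of the curl examples
```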
Claude Code Usage
Refer to Third-Party Tool Configuration - Claude Code.