mirror of
https://github.com/instructkr/claw-code.git
synced 2026-04-09 01:24:49 +08:00
docs(providers): add honest provider/auth support matrix + improve MissingApiKey error copy
Problem:
- `rust/README.md` on `dev/rust` only shows `ANTHROPIC_API_KEY` under
Configuration and says nothing about other providers, so users
asking "does AWS Bedrock work? other API keys?" have no answer and
assume silent support.
- `ApiError::MissingApiKey` Display message says "Anthropic API" but
gives no signal to a user who exported `OPENAI_API_KEY` (or
`AWS_ACCESS_KEY_ID`) that their key is being ignored because this
branch has no code path for that provider.
- The multi-provider routing work (providers/anthropic.rs,
providers/openai_compat.rs, prefix routing for openai/, qwen/, etc.)
landed on `main` but has not yet merged into `dev/rust`, so the
support matrix actually differs between branches. Nothing in dev
docs communicates that.
Changes:
1. `rust/README.md`: new "Providers & Auth Support Matrix" section
between Configuration and Features, split into three sub-sections:
a. Supported on `dev/rust` (this branch) — just Anthropic. Explicit
call-out that OPENAI_API_KEY / XAI_API_KEY / DASHSCOPE_API_KEY
are ignored here because the `providers/` module does not exist
on this branch.
b. Additionally supported on `main` — xAI, OpenAI, DashScope, with
a note that model-name prefix routing (`openai/`, `gpt-`,
`qwen/`, `qwen-`) wins over env-var presence.
c. Not supported anywhere in this repo (yet) — AWS Bedrock, Google
Vertex AI, Azure OpenAI, Google Gemini, each with a one-line
"why it doesn't work today" pointing at the concrete code gap
(no SigV4 signer, no Google ADC path, api-version query params,
etc.) so users don't chase phantom config knobs. Proxy-based
escape hatch documented.
2. `rust/crates/api/src/error.rs`: `MissingApiKey` Display message now
keeps the grep-stable prefix ("ANTHROPIC_AUTH_TOKEN or
ANTHROPIC_API_KEY is not set") AND tells the user exactly which
other env vars are ignored on this branch plus which providers
aren't supported on any branch yet, with a pointer at the README
matrix section. Non-breaking change — the variant is still a unit
struct, no callers need to change.
3. New regression test
`missing_api_key_display_lists_supported_and_unsupported_providers_and_points_at_readme`
in `rust/crates/api/src/error.rs` asserts the grep-stable prefix is
preserved AND that OPENAI_API_KEY, XAI_API_KEY, DASHSCOPE_API_KEY,
Bedrock, Vertex, Azure, and rust/README.md all appear in the
rendered message, so future tweaks cannot silently drop the
user-facing signal without breaking CI.
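For reviewers, the intended shape of the new Display output and its regression guard can be sketched as follows. This is illustrative only: the exact copy lives in `rust/crates/api/src/error.rs`, and `MissingApiKey` below is a standalone stand-in for the real enum variant.

```rust
use std::fmt;

// Stand-in for ApiError::MissingApiKey; still a unit struct, so no callers change.
struct MissingApiKey;

impl fmt::Display for MissingApiKey {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        // Grep-stable prefix first, then the branch-specific guidance.
        write!(
            f,
            "ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set. \
             On this branch OPENAI_API_KEY, XAI_API_KEY, and DASHSCOPE_API_KEY \
             are ignored; Bedrock, Vertex, and Azure are not supported on any \
             branch yet. See the provider matrix in rust/README.md."
        )
    }
}

fn main() {
    let msg = MissingApiKey.to_string();
    // The prefix must survive future copy tweaks.
    assert!(msg.starts_with("ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set"));
    // Every user-facing signal the regression test pins down.
    for needle in ["OPENAI_API_KEY", "XAI_API_KEY", "DASHSCOPE_API_KEY",
                   "Bedrock", "Vertex", "Azure", "rust/README.md"] {
        assert!(msg.contains(needle), "missing: {needle}");
    }
    println!("display message ok");
}
```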
Verification:
- `cargo build --release -p api` clean
- `cargo test --release -p api` 26 unit + 6 integration = 32 passing
- New regression test passes
- `cargo fmt -p api` clean
- `cargo clippy --release -p api` clean
Note: workspace-wide `cargo test` shows 11 pre-existing
`rusty-claude-cli` failures on clean `dev/rust` HEAD caused by tests
reading `~/.claude/plugins/installed/sample-hooks-bundled/` from the
host home directory instead of an isolated test fixture. These are
environment-leak test brittleness, not caused by this PR (verified by
stashing changes and re-running — failures reproduce on unmodified
HEAD). Filing as a separate ROADMAP item.
Does not close any open issue (issues are disabled on the repo);
addresses the Clawhip dogfood nudge from 2026-04-08 about users asking
"other API keys? does AWS Bedrock work too?" without a clear matrix.
Co-authored-by: gaebal-gajae <gaebal-gajae@layofflabs.com>
@@ -35,6 +35,78 @@ Or authenticate via OAuth:
claw login
```

## Providers & Auth Support Matrix

Before anything else: **know which branch you're building.** Provider
support differs between `dev/rust` and `main`, and neither branch
currently supports AWS Bedrock, Google Vertex AI, or Azure OpenAI.

### Supported on `dev/rust` (this branch)

| Provider | Protocol | Auth env var(s) | Base URL env var | Default base URL |
|---|---|---|---|---|
| **Anthropic** (direct) | Anthropic Messages API | `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` or OAuth (`claw login`) | `ANTHROPIC_BASE_URL` | `https://api.anthropic.com` |

That's it. On `dev/rust`, the `api` crate has a single provider
backend (`rust/crates/api/src/client.rs`) wired directly to
Anthropic's Messages API. There is no `providers/` module, no
auto-routing by model prefix, and no OpenAI-compatible adapter. If you
export `OPENAI_API_KEY`, `XAI_API_KEY`, or `DASHSCOPE_API_KEY` on this
branch, claw will ignore them and still fail with `MissingApiKey`
because it only looks at `ANTHROPIC_*`.

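As a rough sketch of why those env vars are ignored here (function and variable names below are illustrative, not the actual `client.rs` code):

```rust
// dev/rust credential resolution, sketched: only the ANTHROPIC_* pair is
// ever consulted. OPENAI_API_KEY and friends are simply never read.
fn resolve_anthropic_key(
    get: impl Fn(&str) -> Option<String>,
) -> Result<String, &'static str> {
    // ANTHROPIC_AUTH_TOKEN and ANTHROPIC_API_KEY are interchangeable here.
    get("ANTHROPIC_AUTH_TOKEN")
        .or_else(|| get("ANTHROPIC_API_KEY"))
        .ok_or("ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set")
}

fn main() {
    // Simulated environment: only OPENAI_API_KEY is exported.
    let env = |k: &str| (k == "OPENAI_API_KEY").then(|| "sk-ignored".to_string());
    // Resolution still fails: the OpenAI key is never looked at on this branch.
    assert!(resolve_anthropic_key(env).is_err());
    println!("MissingApiKey: OPENAI_API_KEY was ignored");
}
```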
### Additionally supported on `main`

`main` has a multi-provider routing layer under
`rust/crates/api/src/providers/` that `dev/rust` does not yet carry.
If you need any of these, build from `main` and wait for the routing
work to land on `dev/rust`:

| Provider | Protocol | Auth env var | Default base URL |
|---|---|---|---|
| **xAI** (Grok) | OpenAI-compatible | `XAI_API_KEY` | `https://api.x.ai/v1` |
| **OpenAI** | OpenAI Chat Completions | `OPENAI_API_KEY` | `https://api.openai.com/v1` |
| **DashScope** (Alibaba Qwen) | OpenAI-compatible | `DASHSCOPE_API_KEY` | `https://dashscope.aliyuncs.com/compatible-mode/v1` |

Any service that speaks the OpenAI `/v1/chat/completions` wire format
also works by pointing `OPENAI_BASE_URL` at it (OpenRouter, Ollama,
local LLM proxies, etc.).

On `main`, the provider is selected automatically by model-name prefix
(`claude` → Anthropic, `grok` → xAI, `openai/` or `gpt-` → OpenAI,
`qwen/` or `qwen-` → DashScope) before falling through to whichever
credential is present. Prefix routing wins over env-var presence, so
setting `ANTHROPIC_API_KEY` will not silently hijack an
`openai/gpt-4.1-mini` request.

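A minimal sketch of that precedence, assuming an illustrative `route` function rather than the actual `providers/` code:

```rust
// Sketch of main-branch model-prefix routing. The prefix decides the
// provider before any credential env var is consulted.
#[derive(Debug, PartialEq)]
enum Provider {
    Anthropic,
    Xai,
    OpenAi,
    DashScope,
}

fn route(model: &str) -> Provider {
    if model.starts_with("openai/") || model.starts_with("gpt-") {
        Provider::OpenAi
    } else if model.starts_with("qwen/") || model.starts_with("qwen-") {
        Provider::DashScope
    } else if model.starts_with("grok") {
        Provider::Xai
    } else {
        // `claude` (and anything unrecognized, in this sketch) lands on
        // Anthropic; the real router may instead fall through to whichever
        // credential is present.
        Provider::Anthropic
    }
}

fn main() {
    // ANTHROPIC_API_KEY being set cannot hijack this request: the prefix wins.
    assert_eq!(route("openai/gpt-4.1-mini"), Provider::OpenAi);
    assert_eq!(route("qwen-max"), Provider::DashScope);
    assert_eq!(route("grok-3"), Provider::Xai);
    assert_eq!(route("claude-sonnet-4"), Provider::Anthropic);
    println!("prefix routing ok");
}
```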
### Not supported anywhere in this repo (yet)

These are the provider backends people reasonably expect to work but
which **do not have any code path** in either `dev/rust` or `main` as
of this commit. Setting the corresponding cloud SDK env vars will not
make them work — there is nothing to wire them into.

| Provider | Why it doesn't work today |
|---|---|
| **AWS Bedrock** | No SigV4 signer, no Bedrock-specific request adapter, no `AWS_*` credential path in the api crate. Bedrock's Claude endpoint uses a different request envelope than direct Anthropic and would need a dedicated backend. |
| **Google Vertex AI (Anthropic on Vertex)** | No Google auth (service account / ADC) path, no Vertex-specific base URL adapter. Vertex publishes Claude models under a `projects/<proj>/locations/<loc>/publishers/anthropic/models/<model>:streamRawPredict` URL shape that requires a separate route. |
| **Azure OpenAI** | OpenAI wire format, but uses `api-version` query params, an `api-key` header (not `Authorization: Bearer`), and deployment-name routing instead of model IDs. The `main`-branch OpenAI-compatible adapter assumes upstream OpenAI semantics and won't round-trip Azure deployments cleanly. |
| **Google AI Studio (Gemini)** | Different request shape entirely; not OpenAI-compatible and not Anthropic-compatible. Would need its own backend. |

If you need one of these, the honest answer today is: use a proxy that
speaks Anthropic or OpenAI on its public side and translates to
Bedrock/Vertex/Azure/Gemini internally. Pointing `ANTHROPIC_BASE_URL`
or (on `main`) `OPENAI_BASE_URL` at a translation proxy is the
supported escape hatch until first-class backends land.

### When auth fails

If you see `ANTHROPIC_AUTH_TOKEN or ANTHROPIC_API_KEY is not set` on
`dev/rust` after setting, say, `OPENAI_API_KEY`, that's not a bug —
it's this branch telling you honestly that it doesn't yet know how to
talk to OpenAI. Either build from `main`, or export
`ANTHROPIC_API_KEY`, or run `claw login` to use Anthropic OAuth.

## Features

| Feature | Status |