# Claw Code Usage
This guide covers the current Rust workspace under `rust/` and the `claw` CLI binary. If you are brand new, make the doctor health check your first run: start `claw`, then run `/doctor`.
## Quick-start health check

Run this before prompts, sessions, or automation:

```shell
cd rust
cargo build --workspace
./target/debug/claw
# first command inside the REPL
/doctor
```

`/doctor` is the built-in setup and preflight diagnostic. Once you have a saved session, you can rerun it with `./target/debug/claw --resume latest /doctor`.
## Prerequisites

- Rust toolchain with `cargo`
- One of:
  - `ANTHROPIC_API_KEY` for direct API access
  - `claw login` for OAuth-based auth
- Optional: `ANTHROPIC_BASE_URL` when targeting a proxy or local service
## Install / build the workspace

```shell
cd rust
cargo build --workspace
```

The CLI binary is available at `rust/target/debug/claw` after a debug build. Make the doctor check above your first post-build step.
## Quick start

### First-run doctor check

```shell
cd rust
./target/debug/claw
/doctor
```

### Interactive REPL

```shell
cd rust
./target/debug/claw
```

### One-shot prompt

```shell
cd rust
./target/debug/claw prompt "summarize this repository"
```

### Shorthand prompt mode

```shell
cd rust
./target/debug/claw "explain rust/crates/runtime/src/lib.rs"
```

### JSON output for scripting

```shell
cd rust
./target/debug/claw --output-format json prompt "status"
```

### Model and permission controls

```shell
cd rust
./target/debug/claw --model sonnet prompt "review this diff"
./target/debug/claw --permission-mode read-only prompt "summarize Cargo.toml"
./target/debug/claw --permission-mode workspace-write prompt "update README.md"
./target/debug/claw --allowedTools read,glob "inspect the runtime crate"
```
Supported permission modes:

- `read-only`
- `workspace-write`
- `danger-full-access`

Model aliases currently supported by the CLI:

- `opus` → `claude-opus-4-6`
- `sonnet` → `claude-sonnet-4-6`
- `haiku` → `claude-haiku-4-5-20251213`
## Authentication

### API key

```shell
export ANTHROPIC_API_KEY="sk-ant-..."
```

### OAuth

```shell
cd rust
./target/debug/claw login
./target/debug/claw logout
```
## Local Models

`claw` can talk to local servers and provider gateways through either Anthropic-compatible or OpenAI-compatible endpoints. Use `ANTHROPIC_BASE_URL` with `ANTHROPIC_AUTH_TOKEN` for Anthropic-compatible services, or `OPENAI_BASE_URL` with `OPENAI_API_KEY` for OpenAI-compatible services. OAuth is Anthropic-only, so when `OPENAI_BASE_URL` is set you should use API-key style auth instead of `claw login`.
### Anthropic-compatible endpoint

```shell
export ANTHROPIC_BASE_URL="http://127.0.0.1:8080"
export ANTHROPIC_AUTH_TOKEN="local-dev-token"
cd rust
./target/debug/claw --model "claude-sonnet-4-6" prompt "reply with the word ready"
```

### OpenAI-compatible endpoint

```shell
export OPENAI_BASE_URL="http://127.0.0.1:8000/v1"
export OPENAI_API_KEY="local-dev-token"
cd rust
./target/debug/claw --model "qwen2.5-coder" prompt "reply with the word ready"
```

### Ollama

```shell
export OPENAI_BASE_URL="http://127.0.0.1:11434/v1"
unset OPENAI_API_KEY
cd rust
./target/debug/claw --model "llama3.2" prompt "summarize this repository in one sentence"
```

### OpenRouter

```shell
export OPENAI_BASE_URL="https://openrouter.ai/api/v1"
export OPENAI_API_KEY="sk-or-v1-..."
cd rust
./target/debug/claw --model "openai/gpt-4.1-mini" prompt "summarize this repository in one sentence"
```
## Supported Providers & Models

`claw` has three built-in provider backends. The provider is selected automatically based on the model name, falling back to whichever credential is present in the environment.
### Provider matrix

| Provider | Protocol | Auth env var(s) | Base URL env var | Default base URL |
|---|---|---|---|---|
| Anthropic (direct) | Anthropic Messages API | `ANTHROPIC_API_KEY` or `ANTHROPIC_AUTH_TOKEN` or OAuth (`claw login`) | `ANTHROPIC_BASE_URL` | `https://api.anthropic.com` |
| xAI | OpenAI-compatible | `XAI_API_KEY` | `XAI_BASE_URL` | `https://api.x.ai/v1` |
| OpenAI-compatible | OpenAI Chat Completions | `OPENAI_API_KEY` | `OPENAI_BASE_URL` | `https://api.openai.com/v1` |

The OpenAI-compatible backend also serves as the gateway for OpenRouter, Ollama, and any other service that speaks the OpenAI `/v1/chat/completions` wire format: just point `OPENAI_BASE_URL` at the service.
### Tested models and aliases

These are the models registered in the built-in alias table with known token limits:

| Alias | Resolved model name | Provider | Max output tokens | Context window |
|---|---|---|---|---|
| `opus` | `claude-opus-4-6` | Anthropic | 32 000 | 200 000 |
| `sonnet` | `claude-sonnet-4-6` | Anthropic | 64 000 | 200 000 |
| `haiku` | `claude-haiku-4-5-20251213` | Anthropic | 64 000 | 200 000 |
| `grok` / `grok-3` | `grok-3` | xAI | 64 000 | 131 072 |
| `grok-mini` / `grok-3-mini` | `grok-3-mini` | xAI | 64 000 | 131 072 |
| `grok-2` | `grok-2` | xAI | — | — |

Any model name that does not match an alias is passed through verbatim. This is how you use OpenRouter model slugs (`openai/gpt-4.1-mini`), Ollama tags (`llama3.2`), or full Anthropic model IDs (`claude-sonnet-4-20250514`).
### User-defined aliases

You can add custom aliases in any settings file (`~/.claw/settings.json`, `.claw/settings.json`, or `.claw/settings.local.json`):

```json
{
  "aliases": {
    "fast": "claude-haiku-4-5-20251213",
    "smart": "claude-opus-4-6",
    "cheap": "grok-3-mini"
  }
}
```

Local project settings override user-level settings. Aliases resolve through the built-in table, so `"fast": "haiku"` also works.
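The two-stage lookup can be sketched in shell. This is an illustration of the documented behaviour, not the Rust implementation; the `fast` entry stands in for a hypothetical user-defined alias from a settings file.

```shell
# Two-stage alias resolution: user aliases first, then the built-in
# table, so "fast" -> "haiku" -> claude-haiku-4-5-20251213.
resolve_model() {
  name=$1
  case "$name" in            # user-defined aliases (hypothetical example)
    fast) name=haiku ;;
  esac
  case "$name" in            # built-in alias table
    opus)   name=claude-opus-4-6 ;;
    sonnet) name=claude-sonnet-4-6 ;;
    haiku)  name=claude-haiku-4-5-20251213 ;;
  esac
  printf '%s\n' "$name"      # unknown names pass through verbatim
}

resolve_model fast       # claude-haiku-4-5-20251213
resolve_model llama3.2   # llama3.2 (passed through verbatim)
```

Because the user-defined stage feeds back into the built-in table, an alias may target either a full model ID or another alias.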
### How provider detection works

- If the resolved model name starts with `claude` → Anthropic.
- If it starts with `grok` → xAI.
- Otherwise, `claw` checks which credential is set: `ANTHROPIC_API_KEY`/`ANTHROPIC_AUTH_TOKEN` first, then `OPENAI_API_KEY`, then `XAI_API_KEY`.
- If nothing matches, it defaults to Anthropic.
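The rules above can be sketched as a small shell function. This mirrors the documented order only; the actual logic lives in the Rust workspace.

```shell
# Documented detection order: model-name prefix first, then whichever
# credential is present in the environment, then the Anthropic default.
detect_provider() {
  case "$1" in
    claude*) echo anthropic; return ;;
    grok*)   echo xai;       return ;;
  esac
  if [ -n "${ANTHROPIC_API_KEY:-}" ] || [ -n "${ANTHROPIC_AUTH_TOKEN:-}" ]; then
    echo anthropic
  elif [ -n "${OPENAI_API_KEY:-}" ]; then
    echo openai
  elif [ -n "${XAI_API_KEY:-}" ]; then
    echo xai
  else
    echo anthropic   # default when nothing matches
  fi
}

unset ANTHROPIC_API_KEY ANTHROPIC_AUTH_TOKEN OPENAI_API_KEY XAI_API_KEY
detect_provider claude-sonnet-4-6   # anthropic (name prefix wins)
export OPENAI_API_KEY=dummy
detect_provider llama3.2            # openai (credential fallback)
```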
## FAQ

### What about Codex?

The name "codex" appears in the Claw Code ecosystem, but it does not refer to OpenAI Codex (the code-generation model). Here is what it means in this project:

- `oh-my-codex` (OmX) is the workflow and plugin layer that sits on top of `claw`. It provides planning modes, parallel multi-agent execution, notification routing, and other automation features. See PHILOSOPHY.md and the oh-my-codex repo.
- `.codex/` directories (e.g. `.codex/skills`, `.codex/agents`, `.codex/commands`) are legacy lookup paths that `claw` still scans alongside the primary `.claw/` directories.
- `CODEX_HOME` is an optional environment variable that points to a custom root for user-level skill and command lookups.

`claw` does not support OpenAI Codex sessions, the Codex CLI, or Codex session import/export. If you need to use OpenAI models (like GPT-4.1), configure the OpenAI-compatible provider as shown above in the OpenAI-compatible endpoint and OpenRouter sections.
## HTTP proxy support

`claw` honours the standard `HTTP_PROXY`, `HTTPS_PROXY`, and `NO_PROXY` environment variables (both upper- and lower-case spellings are accepted) when issuing outbound requests to Anthropic-, OpenAI-, and xAI-compatible endpoints. Set them before launching the CLI and the underlying `reqwest` client will be configured automatically.

```shell
export HTTPS_PROXY="http://proxy.corp.example:3128"
export HTTP_PROXY="http://proxy.corp.example:3128"
export NO_PROXY="localhost,127.0.0.1,.corp.example"
cd rust
./target/debug/claw prompt "hello via the corporate proxy"
```

Notes:

- When both `HTTPS_PROXY` and `HTTP_PROXY` are set, the secure proxy applies to `https://` URLs and the plain proxy applies to `http://` URLs.
- `NO_PROXY` accepts a comma-separated list of host suffixes (for example `.corp.example`) and IP literals.
- Empty values are treated as unset, so leaving `HTTPS_PROXY=""` in your shell will not enable a proxy.
- If a proxy URL cannot be parsed, `claw` falls back to a direct (no-proxy) client so existing workflows keep working; double-check the URL if you expected the request to be tunnelled.
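A rough sketch of the bypass decision, assuming the suffix-plus-literal semantics described above; `reqwest`'s real matching may differ at the edges.

```shell
# Returns success (0) when the host should bypass the proxy.
# Entries starting with "." match as host suffixes; others match exactly.
no_proxy_match() {
  host=$1; bypass=1
  oldifs=$IFS; IFS=','
  for entry in ${NO_PROXY:-}; do
    case "$entry" in
      .*) case "$host" in *"$entry") bypass=0 ;; esac ;;  # suffix entry
      *)  if [ "$host" = "$entry" ]; then bypass=0; fi ;; # exact host or IP
    esac
  done
  IFS=$oldifs
  return $bypass
}

NO_PROXY="localhost,127.0.0.1,.corp.example"
no_proxy_match git.corp.example  && echo "direct"    # matches .corp.example
no_proxy_match api.anthropic.com || echo "proxied"   # no entry matches
```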
## Common operational commands

```shell
cd rust
./target/debug/claw status
./target/debug/claw sandbox
./target/debug/claw agents
./target/debug/claw mcp
./target/debug/claw skills
./target/debug/claw system-prompt --cwd .. --date 2026-04-04
```
## Session management

REPL turns are persisted under `.claw/sessions/` in the current workspace.

```shell
cd rust
./target/debug/claw --resume latest
./target/debug/claw --resume latest /status /diff
```

Useful interactive commands include `/help`, `/status`, `/cost`, `/config`, `/session`, `/model`, `/permissions`, and `/export`.
## Config file resolution order

Runtime config is loaded in this order, with later entries overriding earlier ones:

1. `~/.claw.json`
2. `~/.config/claw/settings.json`
3. `<repo>/.claw.json`
4. `<repo>/.claw/settings.json`
5. `<repo>/.claw/settings.local.json`
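A toy demonstration of last-wins merging over that list, using temporary stand-in files. The real loader presumably merges full JSON documents; this sketch only tracks a single hypothetical `model` key.

```shell
# Last-wins resolution over the documented path order, with temp files
# standing in for the real locations.
tmp=$(mktemp -d)
mkdir -p "$tmp/home/.config/claw" "$tmp/repo/.claw"
echo '{"model": "haiku"}'  > "$tmp/home/.claw.json"           # user level
echo '{"model": "sonnet"}' > "$tmp/repo/.claw/settings.json"  # repo level

model=""
for f in "$tmp/home/.claw.json" \
         "$tmp/home/.config/claw/settings.json" \
         "$tmp/repo/.claw.json" \
         "$tmp/repo/.claw/settings.json" \
         "$tmp/repo/.claw/settings.local.json"; do
  if [ -f "$f" ]; then
    # naive "parse": grab the model value if the file defines one
    v=$(sed -n 's/.*"model": *"\([^"]*\)".*/\1/p' "$f")
    if [ -n "$v" ]; then model=$v; fi
  fi
done
echo "effective model: $model"   # sonnet: repo settings override user settings
rm -rf "$tmp"
```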
## Mock parity harness

The workspace includes a deterministic Anthropic-compatible mock service and parity harness.

```shell
cd rust
./scripts/run_mock_parity_harness.sh
```

Manual mock service startup:

```shell
cd rust
cargo run -p mock-anthropic-service -- --bind 127.0.0.1:0
```
## Verification

```shell
cd rust
cargo test --workspace
```
## Workspace overview

Current Rust crates:

- `api`
- `commands`
- `compat-harness`
- `mock-anthropic-service`
- `plugins`
- `runtime`
- `rusty-claude-cli`
- `telemetry`
- `tools`