LLM Provider Reference¶
Supported Providers¶
| Provider | Env Variable | Default Model | Status |
|---|---|---|---|
| Anthropic | `ANTHROPIC_API_KEY` | `claude-sonnet-4-20250514` | Primary |
| OpenRouter | `OPENROUTER_API_KEY` | `anthropic/claude-sonnet-4` | Full support |
| OpenAI | `OPENAI_API_KEY` | `gpt-4o` | Full support |
| ZAI | `ZAI_API_KEY` | `glm-5` | Basic support |
Configuration¶
Environment Variables¶
```shell
export ANTHROPIC_API_KEY=sk-ant-...
export OPENROUTER_API_KEY=sk-or-...
export OPENAI_API_KEY=sk-...
```
Config File (.attocode/config.json)¶
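A minimal sketch of what a project config might contain. The `provider` and `model` keys are assumptions for illustration, not confirmed schema; check the actual config handling in `config.py`:

```json
{
  "provider": "anthropic",
  "model": "claude-sonnet-4-20250514"
}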
CLI Flags¶
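The flag names below are assumptions for illustration (the real CLI may spell them differently):

```
attocode --provider openai --model gpt-4o
```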
Provider Priority¶
- CLI flags (highest)
- Environment variables
- Project config (`.attocode/config.json`)
- User config (`~/.attocode/config.json`)
- Defaults (lowest)
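The precedence above amounts to a first-match lookup. A minimal sketch, assuming a JSON config with a `"provider"` key; the function and the `ATTOCODE_PROVIDER` variable name are illustrative, not attocode's actual API:

```python
import json
import os
from pathlib import Path

def resolve_provider(cli_flag=None):
    """Return the first configured provider name, highest priority first.

    File paths match the docs; key and env-var names are assumptions.
    """
    def from_config(path):
        try:
            return json.loads(path.read_text()).get("provider")
        except (OSError, json.JSONDecodeError):
            return None  # missing or malformed config: fall through

    candidates = [
        cli_flag,                                            # 1. CLI flag
        os.environ.get("ATTOCODE_PROVIDER"),                 # 2. env var (hypothetical name)
        from_config(Path(".attocode/config.json")),          # 3. project config
        from_config(Path.home() / ".attocode/config.json"),  # 4. user config
    ]
    return next((c for c in candidates if c), "anthropic")   # 5. default (lowest)
```

The design choice is the usual one for layered configuration: each source either supplies a value or defers, and the first non-empty answer wins.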
Adding a Provider¶
- Create an adapter in `src/attocode/providers/adapters/`
- Implement the `LLMProvider` base class from `providers/base.py`
- Register it in `providers/registry.py`
- Add model defaults in `config.py`
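The steps above can be sketched as follows. `LLMProvider`'s real interface is not shown in this document, so the `complete` method, the registry dict, and `DEFAULT_MODELS` are all stand-in assumptions:

```python
from abc import ABC, abstractmethod

class LLMProvider(ABC):
    """Stand-in for the base class in providers/base.py (interface assumed)."""
    @abstractmethod
    def complete(self, prompt: str) -> str: ...

# Stand-in for providers/registry.py: map provider names to adapter classes.
REGISTRY = {}

def register(name):
    def decorator(cls):
        REGISTRY[name] = cls
        return cls
    return decorator

@register("echo")
class EchoProvider(LLMProvider):
    """Steps 1-3: an adapter implementing the base class, registered by name."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"

# Step 4: model defaults (hypothetical shape of what config.py might hold).
DEFAULT_MODELS = {"echo": "echo-1"}
```

With this shape, callers look up an adapter by name (`REGISTRY["echo"]`), instantiate it, and fall back to `DEFAULT_MODELS` when no model is specified.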
Model Context Windows¶
Context windows are fetched from the model cache at startup. Override with: