# Providers
SwarmClaw supports multiple LLM providers. Each agent can use a different provider.
## Built-in Providers
| Provider | Type | API Key | Notes |
|---|---|---|---|
| Claude CLI | Subprocess | No | Spawns `claude` with `--output-format stream-json`. Requires the Claude Code CLI to be installed. |
| Anthropic | API | Yes | Direct API calls to Anthropic. |
| OpenAI | API | Yes | GPT-4o, o3, o4-mini, etc. |
| Ollama | Local | No | Connects to `localhost:11434`. Run Ollama separately. |
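A subprocess provider like Claude CLI emits newline-delimited JSON events on stdout. The sketch below parses such a stream without spawning the CLI; the event shapes shown are illustrative, not the exact Claude CLI schema.

```python
import json

def parse_stream_json(lines):
    """Parse newline-delimited JSON events, skipping blank lines.

    The "type" field and payloads here are illustrative examples,
    not the Claude CLI's actual event schema.
    """
    events = []
    for line in lines:
        line = line.strip()
        if not line:
            continue
        events.append(json.loads(line))
    return events

# Example input shaped like a streaming CLI's stdout
sample = [
    '{"type": "text", "text": "Hello"}',
    '',
    '{"type": "result", "ok": true}',
]
events = parse_stream_json(sample)
```

In a real provider the same loop would iterate over the subprocess's stdout line by line instead of a list.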
## Custom Providers

Add any OpenAI-compatible endpoint as a custom provider. This works with:

- OpenRouter — https://openrouter.ai/api/v1
- Together AI — https://api.together.xyz/v1
- Groq — https://api.groq.com/openai/v1
- Local vLLM — http://localhost:8000/v1
- Any other OpenAI-compatible API
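"OpenAI-compatible" means the endpoint accepts a POST to `/chat/completions` under its base URL with the standard request body. A minimal sketch of building such a request (the model name is a placeholder):

```python
import json

def build_chat_request(base_url, model, messages, api_key=None):
    """Build the URL, headers, and body for an OpenAI-compatible
    chat completions call. Only the minimal common fields are shown."""
    url = base_url.rstrip("/") + "/chat/completions"
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({"model": model, "messages": messages})
    return url, headers, body

url, headers, body = build_chat_request(
    "https://openrouter.ai/api/v1",
    "some-model",  # placeholder, not a real model name
    [{"role": "user", "content": "Hello"}],
    api_key="sk-test",
)
```

The same function works unchanged against Together AI, Groq, or a local vLLM server, which is exactly why any of them can be plugged in as a custom provider.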
### Adding a Custom Provider

1. Navigate to Providers in the sidebar
2. Click New Provider
3. Enter a name, the base URL, and the model names
4. Link a credential (API key) if required
5. Save
The custom provider appears in agent and session dropdowns immediately.
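Conceptually, the saved record combines those fields; the JSON below is an illustrative shape, not SwarmClaw's actual schema:

```json
{
  "name": "openrouter",
  "baseUrl": "https://openrouter.ai/api/v1",
  "models": ["model-a", "model-b"],
  "credentialId": "cred_123"
}
```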
## Credentials
API keys are encrypted with AES-256-GCM before storage. Add credentials from the provider setup or from the credential manager in Settings.
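AES-256-GCM takes a 32-byte key and a unique nonce per encryption, and authenticates the ciphertext so tampering is detected on decrypt. A minimal sketch using the `cryptography` package; SwarmClaw's actual storage format (nonce layout, master-key derivation) is an assumption here:

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_key(master_key: bytes, plaintext: str) -> bytes:
    # AES-256-GCM: fresh 12-byte nonce per message, prepended to the
    # ciphertext so decryption can recover it (illustrative layout)
    nonce = os.urandom(12)
    ct = AESGCM(master_key).encrypt(nonce, plaintext.encode(), None)
    return nonce + ct

def decrypt_key(master_key: bytes, blob: bytes) -> str:
    nonce, ct = blob[:12], blob[12:]
    return AESGCM(master_key).decrypt(nonce, ct, None).decode()

master = AESGCM.generate_key(bit_length=256)  # 256-bit key
blob = encrypt_key(master, "sk-example-123")
```

Decrypting with the wrong key or a modified blob raises an exception rather than returning garbage, which is the point of using an authenticated mode like GCM.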
## Provider Limitations
- Claude CLI cannot be used as an orchestrator (no LangGraph support)
- Ollama requires the Ollama server running locally
- Custom providers must be OpenAI-compatible (chat completions endpoint)
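Because Ollama runs as a separate server, a quick reachability check lets an agent fail fast instead of timing out mid-session. The port is Ollama's default; the helper name is my own:

```python
import urllib.request
import urllib.error

def ollama_running(base="http://localhost:11434", timeout=1.0) -> bool:
    """Return True if a local Ollama server answers on its default port."""
    try:
        with urllib.request.urlopen(base, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```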