Providers

SwarmClaw supports multiple LLM providers. Each agent can use a different provider.

Built-in Providers

| Provider   | Type       | API Key | Notes |
|------------|------------|---------|-------|
| Claude CLI | Subprocess | No      | Spawns claude with --output-format stream-json. Requires the Claude Code CLI installed. |
| Anthropic  | API        | Yes     | Direct API calls to Anthropic. |
| OpenAI     | API        | Yes     | GPT-4o, o3, o4-mini, etc. |
| Ollama     | Local      | No      | Connects to localhost:11434. Run Ollama separately. |
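A Claude CLI provider works by running the CLI as a child process and reading its streamed JSON output line by line. The sketch below is illustrative rather than SwarmClaw's actual implementation; only --output-format stream-json is documented above, so the -p (non-interactive) and --verbose flags are assumptions about the Claude Code CLI invocation.

```ts
import { spawn } from "node:child_process";
import { createInterface } from "node:readline";

// Minimal sketch: run `claude` non-interactively and stream its JSON events.
// -p and --verbose are assumptions; --output-format stream-json is documented above.
function runClaude(prompt: string): void {
  const child = spawn("claude", [
    "-p", prompt,
    "--output-format", "stream-json",
    "--verbose",
  ]);

  const lines = createInterface({ input: child.stdout });
  lines.on("line", (line) => {
    if (!line.trim()) return;
    const event = JSON.parse(line); // one JSON event per line
    console.log(event.type, event);
  });

  child.on("close", (code) => console.log(`claude exited with code ${code}`));
}

runClaude("Summarize the README in one sentence.");
```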

Custom Providers

Add any OpenAI-compatible endpoint as a custom provider. This works with:

  • OpenRouter: https://openrouter.ai/api/v1
  • Together AI: https://api.together.xyz/v1
  • Groq: https://api.groq.com/openai/v1
  • Local vLLM: http://localhost:8000/v1
  • Any other OpenAI-compatible API
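"OpenAI-compatible" means the endpoint accepts the standard chat completions request shape at {base URL}/chat/completions. A rough illustration follows; the Groq base URL, model name, and environment variable are placeholders, not SwarmClaw configuration.

```ts
// Illustrative chat completions call against an OpenAI-compatible base URL.
// Substitute the base URL, model, and credential you enter for the custom provider.
const baseUrl = "https://api.groq.com/openai/v1";

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.GROQ_API_KEY}`,
    },
    body: JSON.stringify({
      model: "llama-3.1-8b-instant", // placeholder model name
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = (await res.json()) as { choices: { message: { content: string } }[] };
  return data.choices[0].message.content;
}

chat("Say hello").then(console.log);
```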

Adding a Custom Provider

  1. Navigate to Providers in the sidebar
  2. Click New Provider
  3. Enter: name, base URL, and model names
  4. Link a credential (API key) if required
  5. Save

The custom provider appears in agent and session dropdowns immediately.

Credentials

API keys are encrypted with AES-256-GCM before they are stored. Add credentials during provider setup or from the credential manager in Settings.
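For context, AES-256-GCM authenticated encryption in Node.js looks roughly like this. This is a generic sketch of the algorithm, not SwarmClaw's key-management code; key storage and derivation are out of scope here.

```ts
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

// Generic AES-256-GCM sketch: a 32-byte key, a random 12-byte IV per message,
// and the GCM auth tag stored alongside the ciphertext.
const key = randomBytes(32); // in practice, a persisted secret, not a throwaway key

function encrypt(plaintext: string): { iv: string; tag: string; data: string } {
  const iv = randomBytes(12);
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const data = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
    data: data.toString("base64"),
  };
}

function decrypt(box: { iv: string; tag: string; data: string }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(box.iv, "base64"));
  decipher.setAuthTag(Buffer.from(box.tag, "base64"));
  return Buffer.concat([
    decipher.update(Buffer.from(box.data, "base64")),
    decipher.final(),
  ]).toString("utf8");
}

console.log(decrypt(encrypt("sk-ant-example-key"))); // round-trips the secret
```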

Provider Limitations

  • Claude CLI cannot be used as an orchestrator (no LangGraph support)
  • Ollama requires the Ollama server to be running locally (a quick check is sketched below)
  • Custom providers must be OpenAI-compatible (chat completions endpoint)
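To confirm the Ollama server is reachable before selecting it, you can hit its local API; Ollama's /api/tags route lists installed models. This check is a convenience for troubleshooting, not something SwarmClaw requires.

```ts
// Quick reachability check for a local Ollama server on the default port.
async function ollamaIsRunning(): Promise<boolean> {
  try {
    const res = await fetch("http://localhost:11434/api/tags");
    return res.ok; // responds with the list of locally installed models
  } catch {
    return false; // connection refused: the server is not running
  }
}

ollamaIsRunning().then((up) => console.log(up ? "Ollama is running" : "Start Ollama first"));
```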