# Model Providers

The runtime is model-agnostic. Any LLM that supports tool use can power an agent. The `personality_model` field on each agent record maps to a Vercel AI SDK provider instance.
## Supported Models

### Anthropic
| Model ID | Label |
|----------|-------|
| claude-sonnet-4-20250514 | Claude Sonnet 4 (default) |
| claude-haiku-4-5 | Claude Haiku 4.5 |
| claude-opus-4-6 | Claude Opus 4.6 |
Env var: `ANTHROPIC_API_KEY`

### OpenAI
| Model ID | Label |
|----------|-------|
| gpt-4o | GPT-4o |
| gpt-4o-mini | GPT-4o Mini |
| gpt-4.1 | GPT-4.1 |
| gpt-4.1-mini | GPT-4.1 Mini |
| o3-mini | o3-mini |
Env var: `OPENAI_API_KEY`

### Google
| Model ID | Label |
|----------|-------|
| gemini-2.0-flash | Gemini 2.0 Flash |
| gemini-2.5-pro | Gemini 2.5 Pro |
| gemini-2.5-flash | Gemini 2.5 Flash |
Env var: `GOOGLE_GENERATIVE_AI_API_KEY`
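The Vercel AI SDK provider factories read these environment variables by default, so no key is passed in application code. As a minimal sketch, a startup check for which providers are configured could look like this (`configuredProviders` is a hypothetical helper, not part of the runtime):

```typescript
// Map each provider to the env var its @ai-sdk factory reads by default.
const REQUIRED_ENV: Record<string, string> = {
  anthropic: "ANTHROPIC_API_KEY",
  openai: "OPENAI_API_KEY",
  google: "GOOGLE_GENERATIVE_AI_API_KEY",
};

// Return the providers whose API key is present in the given environment.
function configuredProviders(env: Record<string, string | undefined>): string[] {
  return Object.entries(REQUIRED_ENV)
    .filter(([, key]) => !!env[key])
    .map(([provider]) => provider);
}
```

Passing `process.env` to such a helper at boot makes missing keys visible before the first agent run fails.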
## How Resolution Works
`resolveModel()` in `providers.ts`:

- Looks up the `personality_model` string in the model registry
- If found, returns the corresponding Vercel AI SDK model instance
- If the string contains `/` (e.g. `anthropic/claude-sonnet-4-20250514`), parses the provider prefix and creates the instance directly
- Falls back to Claude Sonnet 4 if the model is unknown
```typescript
// Registry lookup
resolveModel("gpt-4o")                    // → openai("gpt-4o")
resolveModel("claude-sonnet-4-20250514")  // → anthropic("claude-sonnet-4-20250514")
resolveModel("gemini-2.0-flash")          // → google("gemini-2.0-flash")

// Provider/model format
resolveModel("anthropic/claude-opus-4-6") // → anthropic("claude-opus-4-6")
resolveModel("openai/gpt-4.1")            // → openai("gpt-4.1")

// Unknown → fallback
resolveModel("llama-3-70b")               // → anthropic("claude-sonnet-4-20250514")
```
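The resolution steps can be sketched end to end. This is a minimal illustration, not the actual `providers.ts`: the `anthropic`, `openai`, and `google` functions below are placeholder factories standing in for the real ones exported by `@ai-sdk/anthropic`, `@ai-sdk/openai`, and `@ai-sdk/google`, and the registry shape is an assumption.

```typescript
// Placeholder provider factories; the real ones come from the @ai-sdk packages.
type ModelInstance = { provider: string; modelId: string };
const anthropic = (id: string): ModelInstance => ({ provider: "anthropic", modelId: id });
const openai = (id: string): ModelInstance => ({ provider: "openai", modelId: id });
const google = (id: string): ModelInstance => ({ provider: "google", modelId: id });

const FALLBACK_MODEL = "claude-sonnet-4-20250514";

// Registry mapping bare model IDs to their provider factory.
const MODEL_REGISTRY: Record<string, (id: string) => ModelInstance> = {
  "claude-sonnet-4-20250514": anthropic,
  "claude-haiku-4-5": anthropic,
  "claude-opus-4-6": anthropic,
  "gpt-4o": openai,
  "gpt-4o-mini": openai,
  "gpt-4.1": openai,
  "gpt-4.1-mini": openai,
  "o3-mini": openai,
  "gemini-2.0-flash": google,
  "gemini-2.5-pro": google,
  "gemini-2.5-flash": google,
};

// Provider prefixes recognized in the "provider/model" form.
const PROVIDERS: Record<string, (id: string) => ModelInstance> = {
  anthropic,
  openai,
  google,
};

function resolveModel(spec: string): ModelInstance {
  // 1. Bare model ID found in the registry.
  const factory = MODEL_REGISTRY[spec];
  if (factory) return factory(spec);

  // 2. "provider/model" form: parse the prefix and build directly.
  const slash = spec.indexOf("/");
  if (slash > 0) {
    const provider = PROVIDERS[spec.slice(0, slash)];
    if (provider) return provider(spec.slice(slash + 1));
  }

  // 3. Unknown spec → fall back to Claude Sonnet 4.
  return anthropic(FALLBACK_MODEL);
}
```

Checking the registry before parsing the slash keeps bare IDs cheap to resolve, while the prefix form lets callers use models that were never registered.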
## Adding a New Provider
- Install the Vercel AI SDK provider package (e.g. `@ai-sdk/mistral`)
- Add entries to the `MODEL_REGISTRY` in `providers.ts`
- Add a case to the `provider/model` parser
- Add entries to `listSupportedModels()` for UI dropdowns
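The steps above can be sketched for a hypothetical Mistral provider. Here `mistral` is a stand-in for the factory `@ai-sdk/mistral` would export, and the shapes of `MODEL_REGISTRY`, `PROVIDERS`, and `listSupportedModels()` are assumptions about how `providers.ts` is organized; the model ID and label are illustrative.

```typescript
// Stand-in for the factory exported by @ai-sdk/mistral (step 1).
type ModelInstance = { provider: string; modelId: string };
const mistral = (id: string): ModelInstance => ({ provider: "mistral", modelId: id });

// Step 2: register the new model IDs (assumed registry shape: ID → factory).
const MODEL_REGISTRY: Record<string, (id: string) => ModelInstance> = {
  "mistral-large-latest": mistral,
};

// Step 3: teach the "provider/model" parser about the new prefix.
const PROVIDERS: Record<string, (id: string) => ModelInstance> = {
  mistral,
};

// Step 4: surface the new entries to UI dropdowns (assumed shape).
function listSupportedModels(): { id: string; label: string }[] {
  return [{ id: "mistral-large-latest", label: "Mistral Large" }];
}
```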
## Per-Agent Model Selection
Each agent has a `personality_model` column in the `agents` table. This can be set at creation time or changed in agent settings. Different agents can use different models: you could have a fast Haiku agent for triage and an Opus agent for complex research.
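A minimal sketch of how per-agent selection plays out at run time. `AgentRecord` mirrors the `agents` table shape described above, and `resolveModel` is a stand-in for the real resolver in `providers.ts` (known IDs resolve, unknown ones fall back); the agent names are illustrative.

```typescript
type ModelInstance = { provider: string; modelId: string };

// Stand-in resolver: known Anthropic IDs resolve, anything else falls back.
const KNOWN_IDS = new Set([
  "claude-sonnet-4-20250514",
  "claude-haiku-4-5",
  "claude-opus-4-6",
]);

function resolveModel(spec: string): ModelInstance {
  const modelId = KNOWN_IDS.has(spec) ? spec : "claude-sonnet-4-20250514";
  return { provider: "anthropic", modelId };
}

// Mirrors the relevant part of the agents table.
interface AgentRecord {
  name: string;
  personality_model: string; // per-agent model column
}

// Two agents, two models: fast triage alongside deeper research.
const triage: AgentRecord = { name: "triage", personality_model: "claude-haiku-4-5" };
const research: AgentRecord = { name: "research", personality_model: "claude-opus-4-6" };

// Each agent run resolves its own column value, so models mix freely.
const modelFor = (agent: AgentRecord): ModelInstance =>
  resolveModel(agent.personality_model);
```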