# LLM Providers
Cryyer supports three LLM providers for generating email drafts. Set the `LLM_PROVIDER` environment variable to choose one; the default is `anthropic`.
| Provider | Default Model | Best for |
|---|---|---|
| Anthropic | `claude-sonnet-4-5-20250514` | Best writing quality |
| OpenAI | `gpt-4o` | Wide ecosystem |
| Gemini | `gemini-1.5-flash` | Speed and cost |
## Configuration

| Variable | Default | Description |
|---|---|---|
| `LLM_PROVIDER` | `anthropic` | `anthropic`, `openai`, or `gemini` |
| `LLM_MODEL` | Per-provider default | Override the default model |
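The fallback behavior described above (explicit `LLM_MODEL` wins, otherwise the provider's default applies) could be sketched roughly like this. The names `resolveLLMConfig` and `DEFAULT_MODELS`, and the Node-style env map, are illustrative assumptions, not Cryyer's actual code:

```typescript
// Per-provider default models, matching the table above.
const DEFAULT_MODELS: Record<string, string> = {
  anthropic: "claude-sonnet-4-5-20250514",
  openai: "gpt-4o",
  gemini: "gemini-1.5-flash",
};

// Resolve provider and model from environment variables,
// falling back to `anthropic` and the provider's default model.
function resolveLLMConfig(env: Record<string, string | undefined>) {
  const provider = env.LLM_PROVIDER ?? "anthropic";
  if (!(provider in DEFAULT_MODELS)) {
    throw new Error(`Unknown LLM_PROVIDER: ${provider}`);
  }
  return { provider, model: env.LLM_MODEL ?? DEFAULT_MODELS[provider] };
}
```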
## Model override

You can use any model from your chosen provider by setting `LLM_MODEL`:
```sh
export LLM_PROVIDER=anthropic
export LLM_MODEL=claude-sonnet-4-5-20250514  # Sonnet is the default
```

## How it works

Cryyer uses an adapter pattern: all providers implement the same `LLMProvider` interface. The draft prompt includes your product's voice configuration and the gathered GitHub activity, and the LLM returns a structured JSON response with `subject` and `body` fields.
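The adapter pattern could look roughly like the sketch below. The method name `generateDraft`, the `EmailDraft` type, and the stub response are all hypothetical; only the shared interface and the `subject`/`body` JSON shape come from the description above:

```typescript
// Structured response every provider must return.
interface EmailDraft {
  subject: string;
  body: string;
}

// The shared interface all provider adapters implement
// (method name is an assumption for illustration).
interface LLMProvider {
  generateDraft(prompt: string): Promise<EmailDraft>;
}

// A stub adapter: a real one would call its provider's API,
// then parse the structured JSON reply, as done here.
class StubProvider implements LLMProvider {
  async generateDraft(prompt: string): Promise<EmailDraft> {
    const raw = '{"subject":"Weekly product update","body":"Hi all, ..."}';
    return JSON.parse(raw) as EmailDraft;
  }
}
```

Because every adapter satisfies `LLMProvider`, the drafting code never branches on which provider is configured; it just calls the interface.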