LLM Providers

Lightflare requires exactly one configured LLM provider. Pick OpenAI, Ollama, or OpenRouter, then set that provider's specific properties.

Providers

OpenRouter is useful when you want model routing through OpenRouter while keeping Lightflare's provider interface, memory embeddings, tools, and workflow behavior unchanged.

Shared Properties

| Variable | Default | Description |
| --- | --- | --- |
| `LIGHTFLARE_LLM_PROVIDER` | `openai` | Active provider. Supported values are `openai`, `ollama`, and `openrouter`. |
| `LIGHTFLARE_LLM_MAX_OUTPUT_TOKENS` | `1024` | Maximum number of tokens the model may generate per response. |
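As a minimal sketch, the shared properties above can be set as environment variables before starting Lightflare. Only the two variable names come from the table; the chosen values and the assumption that Lightflare reads plain environment variables are illustrative:

```shell
# Select the active provider (one of: openai, ollama, openrouter).
export LIGHTFLARE_LLM_PROVIDER=openrouter

# Raise the output-token cap from its default of 1024 (value is illustrative).
export LIGHTFLARE_LLM_MAX_OUTPUT_TOKENS=2048

echo "$LIGHTFLARE_LLM_PROVIDER"
echo "$LIGHTFLARE_LLM_MAX_OUTPUT_TOKENS"
```

If your deployment uses a `.env` file or container environment instead, the same key-value pairs apply there.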