# Installation
Lightflare can be started in three ways:
- Run the published Docker image `shzlwio/lightflare:0.2`.
- Run the repository `compose.yaml`.
- Build and run from source.
All three paths need the same runtime pieces:
- A PostgreSQL database.
- The `vector` and `pg_trgm` PostgreSQL extensions.
- The Lightflare database schema created in that database.
- Required environment variables for the database connection, first admin login, and one LLM provider.
## Required Environment Variables
Database connection:
| Variable | Purpose |
|---|---|
| SPRING_DATASOURCE_URL | JDBC URL for the PostgreSQL database. |
| SPRING_DATASOURCE_USERNAME | PostgreSQL username used by the app. |
| SPRING_DATASOURCE_PASSWORD | PostgreSQL password used by the app. |
First-login bootstrap superadmin:
| Variable | Purpose |
|---|---|
| LIGHTFLARE_BOOTSTRAP_SUPERADMIN_USERNAME | Username for the first superadmin account. |
| LIGHTFLARE_BOOTSTRAP_SUPERADMIN_EMAIL | Email stored on the first superadmin account. |
| LIGHTFLARE_BOOTSTRAP_SUPERADMIN_PASSWORD | Password for the first superadmin login. Change it after signing in. |
LLM provider selection:
| Variable | Purpose |
|---|---|
| LIGHTFLARE_LLM_PROVIDER | Active LLM provider. Use one of openai, ollama, or openrouter. |
The bootstrap superadmin variables are only used while no users exist. After the first user is created, normal login uses the users stored in the database.
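Before the first start, the required variables above can be verified with a small preflight helper. This is a sketch, not something shipped with Lightflare; the variable names are the ones listed in this section:

```shell
# Sketch of a preflight check (not part of Lightflare): print the name of
# every required variable that is unset or empty, so typos are caught
# before the app starts.
check_required() {
  for var in SPRING_DATASOURCE_URL SPRING_DATASOURCE_USERNAME SPRING_DATASOURCE_PASSWORD \
             LIGHTFLARE_BOOTSTRAP_SUPERADMIN_USERNAME LIGHTFLARE_BOOTSTRAP_SUPERADMIN_EMAIL \
             LIGHTFLARE_BOOTSTRAP_SUPERADMIN_PASSWORD LIGHTFLARE_LLM_PROVIDER; do
    eval "val=\${$var:-}"
    [ -n "$val" ] || echo "missing: $var"
  done
}
```

If the function prints nothing, all listed variables are set.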
## Example Environment Variables
Use values like these for a local test setup using OpenAI as the model provider:
```
SPRING_DATASOURCE_URL=jdbc:postgresql://127.0.0.1:5432/lightflare
SPRING_DATASOURCE_USERNAME=lightflare
SPRING_DATASOURCE_PASSWORD=lightflare

# First-login bootstrap superadmin. Only used while no users exist.
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_USERNAME=admin
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_EMAIL=admin@localhost
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_PASSWORD=changeme

# Choose exactly one LLM provider for local testing.
LIGHTFLARE_LLM_PROVIDER=openai

# OpenAI local testing.
LIGHTFLARE_LLM_OPENAI_ENABLED=true
LIGHTFLARE_LLM_OPENAI_MODEL=gpt-5.4-nano
LIGHTFLARE_LLM_OPENAI_API_KEY=<openai-api-key>
```
When the app runs inside Docker and connects to a database on the host machine, use `host.docker.internal` instead of `127.0.0.1` in `SPRING_DATASOURCE_URL`.
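As a concrete illustration, the host-local URL from the example above can be rewritten for a containerized app like this (a sketch; it assumes the host URL uses `127.0.0.1`):

```shell
# Sketch: rewrite a host-local JDBC URL for use inside a Docker container,
# replacing 127.0.0.1 with host.docker.internal as described above.
HOST_URL='jdbc:postgresql://127.0.0.1:5432/lightflare'
CONTAINER_URL=$(printf '%s' "$HOST_URL" | sed 's/127\.0\.0\.1/host.docker.internal/')
echo "$CONTAINER_URL"   # jdbc:postgresql://host.docker.internal:5432/lightflare
```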
## Database Schema
The schema file is `postgres/init/001_schema.sql`. It creates the required extensions and tables:
```sql
CREATE EXTENSION IF NOT EXISTS vector;
CREATE EXTENSION IF NOT EXISTS pg_trgm;
```
For a new local database, run the schema file before starting the app:
```shell
psql -h 127.0.0.1 -U lightflare -d lightflare -f postgres/init/001_schema.sql
```
The schema file is intended for initializing a new database: it drops and recreates the Lightflare tables, so any existing Lightflare data is lost when it is rerun.
## Option 1: Run The Docker Image
Use this when you already have PostgreSQL running and initialized.
Prerequisites:
- PostgreSQL is reachable from the container.
- The `vector` and `pg_trgm` extensions are available.
- `postgres/init/001_schema.sql` has been applied to the database.
- Required environment variables are set.
Example:
```shell
docker run --rm \
  -p 8066:8066 \
  -e SPRING_DATASOURCE_URL='jdbc:postgresql://host.docker.internal:5432/lightflare' \
  -e SPRING_DATASOURCE_USERNAME='lightflare' \
  -e SPRING_DATASOURCE_PASSWORD='lightflare' \
  -e LIGHTFLARE_BOOTSTRAP_SUPERADMIN_USERNAME='admin' \
  -e LIGHTFLARE_BOOTSTRAP_SUPERADMIN_EMAIL='admin@localhost' \
  -e LIGHTFLARE_BOOTSTRAP_SUPERADMIN_PASSWORD='changeme' \
  -e LIGHTFLARE_LLM_PROVIDER='openai' \
  -e LIGHTFLARE_LLM_OPENAI_ENABLED='true' \
  -e LIGHTFLARE_LLM_OPENAI_MODEL='gpt-5.4-nano' \
  -e LIGHTFLARE_LLM_OPENAI_API_KEY='<openai-api-key>' \
  shzlwio/lightflare:0.2
```
Open the app:
http://localhost:8066
## Option 2: Run With Docker Compose
Use this when you want the repository to start both Lightflare and PostgreSQL.
Prerequisites:
- Run Compose from the repository root.
- The repository contains `postgres/init/001_schema.sql`.
- Docker can access the repository folder for the Postgres initialization bind mount.
- Required environment variables are set in your shell or a local `.env` file.
The app service uses the published image:
```yaml
image: shzlwio/lightflare:0.2
```
The Postgres service uses the `pgvector/pgvector:pg17` image, so the `vector` extension is available; `pg_trgm` ships with PostgreSQL itself. On first database initialization, Postgres runs the files under `./postgres/init`.
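For orientation, the relevant parts of `compose.yaml` likely look something like the sketch below. The service names, the in-network database hostname `postgres`, and the `depends_on` wiring are assumptions; the actual file in the repository is authoritative:

```yaml
# Sketch only: reconstructed from this section, not copied from the repository.
services:
  app:
    image: shzlwio/lightflare:0.2
    ports:
      - "8066:8066"
    environment:
      # "postgres" is the assumed in-network hostname of the database service.
      SPRING_DATASOURCE_URL: jdbc:postgresql://postgres:5432/lightflare
    depends_on:
      - postgres
  postgres:
    image: pgvector/pgvector:pg17
    volumes:
      # Standard Postgres image behavior: scripts here run on first init.
      - ./postgres/init:/docker-entrypoint-initdb.d
```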
Create a local `.env` file next to `compose.yaml`:

```
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_USERNAME=admin
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_EMAIL=admin@localhost
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_PASSWORD=changeme
LIGHTFLARE_LLM_PROVIDER=openai
LIGHTFLARE_LLM_OPENAI_ENABLED=true
LIGHTFLARE_LLM_OPENAI_MODEL=gpt-5.4-nano
LIGHTFLARE_LLM_OPENAI_API_KEY=<openai-api-key>
```
Start the stack:
```shell
docker compose up -d
```
Open the app:
http://localhost:8066
To reset the local database volume and rerun initialization:
```shell
docker compose down -v
docker compose up -d
```
On Docker Desktop for macOS, if you see a mount permission error, allow Docker to access the folder where the repository is checked out.
## Option 3: Build From Source
Use this when you want to validate local code changes or run the app outside Docker.
Prerequisites:
- Java 25.
- Maven 3.9 or newer.
- Node.js 25 or newer.
- PostgreSQL 17 with `vector` and `pg_trgm` available.
- `postgres/init/001_schema.sql` has been applied to the database.
- Required environment variables are set.
Build the web UI and server jar:
```shell
./build.sh
```
The build script:
- Installs web UI dependencies with `npm ci`.
- Builds the web UI.
- Places the web UI static assets into the Spring Boot app resources.
- Builds the server jar with Maven.
- Prints the backend build time.
The output jar is `server/lightflare-app/target/lightflare-app-<version>.jar`.
Start the local jar with environment variables set:
```shell
SPRING_DATASOURCE_URL='jdbc:postgresql://127.0.0.1:5432/lightflare' \
SPRING_DATASOURCE_USERNAME='lightflare' \
SPRING_DATASOURCE_PASSWORD='lightflare' \
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_USERNAME='admin' \
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_EMAIL='admin@localhost' \
LIGHTFLARE_BOOTSTRAP_SUPERADMIN_PASSWORD='changeme' \
LIGHTFLARE_LLM_PROVIDER='openai' \
LIGHTFLARE_LLM_OPENAI_ENABLED='true' \
LIGHTFLARE_LLM_OPENAI_MODEL='gpt-5.4-nano' \
LIGHTFLARE_LLM_OPENAI_API_KEY='<openai-api-key>' \
java -jar server/lightflare-app/target/lightflare-app-0.2.jar
```