Quickstart

Get Legible running locally in under 5 minutes.

Prerequisites

  • Docker with Docker Compose v2
  • At least 8 GB RAM available for containers
  • An LLM API key (Gemini, OpenAI, Anthropic, or Ollama for local models)
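Before continuing, you can sanity-check the first two prerequisites from a terminal. This is a rough sketch for Linux, not an official script:

```shell
# Quick prerequisite check (Linux; a rough sketch, not an official script)
# Compose v2 ships as the "docker compose" plugin, not "docker-compose"
command -v docker >/dev/null && docker compose version || echo "Docker Compose v2 not found"
# Total RAM in GB, read from /proc/meminfo
awk '/MemTotal/ {printf "Total RAM: %.1f GB\n", $2/1024/1024}' /proc/meminfo
```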

1. Clone the Repository

git clone https://github.com/kubeworkz/legible.git
cd legible

2. Configure Environment

cp docker/.env.example docker/.env
cp docker/config.example.yaml docker/config.yaml

Edit docker/.env and set your LLM API key:

# For Google Gemini
GEMINI_API_KEY=your-gemini-api-key

# Or for OpenAI
OPENAI_API_KEY=your-openai-api-key
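The prerequisites also mention Anthropic and Ollama. Assuming the example file follows the same pattern, the corresponding entries would look like the lines below — the variable names here are assumptions, so check docker/.env.example for the exact keys your version expects:

```
# Or for Anthropic (variable name assumed — verify against docker/.env.example)
ANTHROPIC_API_KEY=your-anthropic-api-key

# Or for a local Ollama server (variable name assumed; 11434 is Ollama's default port)
OLLAMA_BASE_URL=http://localhost:11434
```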

3. Build & Start

./start.sh

This builds all Docker images and starts the stack. On first run, this takes a few minutes.
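To see whether the stack has finished coming up, you can list the container state. A minimal sketch, assuming the stack runs under the current Compose project:

```shell
# After start.sh returns, list container state; each service should
# eventually report "healthy" (sketch — Compose project layout assumed)
docker compose ps 2>/dev/null || echo "Docker daemon not reachable"
```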

4. Open the UI

Once all services are healthy, open the Legible UI in your browser.
5. Connect Your Data

  1. Open the UI and create a new project
  2. Choose your data source (PostgreSQL, DuckDB, BigQuery, etc.)
  3. Enter connection credentials
  4. Legible will discover your tables and columns
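For step 3, a PostgreSQL connection typically needs a host, port, database name, user, and password. A standard PostgreSQL connection URI looks like the following — every value here is a placeholder, not a real credential:

```
postgresql://analytics_user:your-password@db.example.com:5432/analytics
```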

6. Ask a Question

Navigate to the Home thread and type a natural language question like:

What are the top 10 customers by total order amount?

Legible will generate SQL, execute it against your data source, and display the results.
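For the question above, the generated SQL might resemble the query below. This is an illustrative sketch against a hypothetical schema (customers and orders tables with the column names shown), not the exact output Legible produces:

```sql
-- Hypothetical schema: customers(id, name), orders(customer_id, amount)
SELECT c.name,
       SUM(o.amount) AS total_order_amount
FROM customers c
JOIN orders o ON o.customer_id = c.id
GROUP BY c.name
ORDER BY total_order_amount DESC
LIMIT 10;
```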

Next Steps

Sample Datasets

Legible ships with several sample datasets for testing and evaluation:

Dataset             Description
E-commerce          Orders, customers, products, and transactions
Human Resource      Employees, departments, salaries, and positions
Card Transaction    Credit card transactions with merchants and categories
Hotel Rating        International hotel booking analytics
Supply Chain        Supply chain operations and logistics

During project setup, choose "Use sample dataset" to load one of these datasets into a built-in DuckDB instance — no external database required.