
Real-Time Sales Dashboard

Build a live Metabase dashboard with auto-refresh, date filters, and scheduled email reports. Answer "What did we sell yesterday?" in seconds – no more running SQL by hand.



The Scenario

Imagine this…

You work at a retail company. Sales managers keep asking: "What did we sell yesterday?" "Which store is underperforming?" "Can I get a weekly summary in my inbox?" Right now you're running SQL by hand and sending screenshots. Your boss wants a dashboard they can open anytime, see live numbers, and drill from region → store → product.

Your job: build a live Metabase dashboard that connects to PostgreSQL, shows revenue and sales at a glance, refreshes automatically, and sends a weekly email report. That's exactly what we're going to build together, step by step.

What You'll Build

Data → Postgres → Metabase → Dashboard
  • PostgreSQL database: Tables for stores, products, and sales with ~500 sample rows generated by Python.
  • Metabase connection: Connect Metabase to Postgres as a data source.
  • Saved questions: "Total Revenue by Day," "Sales by Store," "Top 10 Products" – each with full SQL and date filters.
  • Live dashboard: KPI cards, charts, date filter, and drill-through (region → store).
  • Auto-refresh: Dashboard refreshes every 5 minutes so it feels "live."
  • Email subscription: Scheduled email report so stakeholders get the summary without opening Metabase.

In plain English

Think of it like a TV news ticker. Your database is the newsroom with all the data. Metabase is the studio that turns it into charts and numbers. The dashboard is the screen everyone watches. Auto-refresh means the ticker keeps updating. The email subscription is like a daily newspaper – same info, delivered to your inbox.

Prerequisites

Before you start, make sure you're comfortable with these. If not, we link to the exact courses below – finish those first, then come back.

  • SQL basics: SELECT, WHERE, JOIN, GROUP BY, ORDER BY, date functions. You should be able to write a query that aggregates sales by day. (PostgreSQL Course)
  • PostgreSQL: A running PostgreSQL instance (local or cloud). You'll create a database and tables. (PostgreSQL Course)
  • Metabase intro: What Metabase is, how to run a query. You'll learn the rest here. (Metabase Course)

Step-by-Step Plan

Here's the high-level roadmap. The Build It tab walks you through every step with copy-paste code, line-by-line explanations, and verification queries.

  1. Prepare the database. Create a retail_analytics database with tables for stores, products, and sales. Complete SQL CREATE TABLE statements included.
  2. Create sample data with Python. A full Python script using random/datetime to insert ~500 rows of realistic sales data into the tables. Copy, run, done.
  3. Install and connect Metabase. Docker command to run Metabase, how to add Postgres as a data source. Tips for non-Docker users.
  4. Create saved questions. "Total Revenue by Day," "Sales by Store," "Top 10 Products" – each with full SQL in code blocks and plain-English explanations.
  5. Build the dashboard. Create a new dashboard, add cards, add a date filter, enable drill-through so users can click region → store.
  6. Enable auto-refresh. Set the dashboard to refresh every 5 minutes (clock icon). It stays "live" without manual refresh.
  7. Set up email subscription. Create a subscription in Metabase so stakeholders receive the dashboard or a summary by email on a schedule.
Step 1: Prepare the Database

Create retail_analytics database and tables

What we're doing and why

Every dashboard needs data. We'll create a dedicated database called retail_analytics with three tables: stores (where we sell), products (what we sell), and sales (each transaction). This structure is standard in retail – you'll see it at real companies. Once we have the schema, Python will fill it with sample data.

1a. Create the database

Connect to PostgreSQL (psql, pgAdmin, or DBeaver) and run:

SQL
-- Create a dedicated database for retail sales analytics
CREATE DATABASE retail_analytics;

-- Connect to it (in psql: \c retail_analytics)

1b. Create the stores table

The stores table holds each physical (or virtual) store location. We need an ID, name, region, and city.

SQL
CREATE TABLE stores (
    store_id   SERIAL PRIMARY KEY,
    store_name VARCHAR(100) NOT NULL,
    region     VARCHAR(50)  NOT NULL,
    city       VARCHAR(80)  NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

Line-by-line explanation

  • SERIAL PRIMARY KEY: Auto-incrementing unique ID (1, 2, 3…) – PostgreSQL handles it.
  • VARCHAR(100): Text up to 100 characters.
  • NOT NULL: This column can't be empty.
  • TIMESTAMP DEFAULT CURRENT_TIMESTAMP: When a row is inserted, this column automatically gets the current date and time.

1c. Create the products table

SQL
CREATE TABLE products (
    product_id   SERIAL PRIMARY KEY,
    product_name VARCHAR(150) NOT NULL,
    category     VARCHAR(80)  NOT NULL,
    unit_price   DECIMAL(10, 2) NOT NULL,
    created_at   TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

DECIMAL(10, 2): An exact numeric type with up to 10 digits, 2 of them after the decimal point (e.g. 29.99). Never use FLOAT for money – binary floats can't store values like 0.10 exactly, and the rounding errors cause bugs.
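The rounding problem is easy to demonstrate in Python (a quick illustration, not part of the project code):

```python
from decimal import Decimal

# Binary floats can't represent 0.10 exactly, so repeated
# addition drifts away from the true total:
float_total = 0.10 + 0.10 + 0.10
print(float_total)  # slightly more than 0.3

# Decimal stores base-10 digits exactly, like DECIMAL(10, 2):
dec_total = Decimal("0.10") + Decimal("0.10") + Decimal("0.10")
print(dec_total)  # exactly 0.30
```

This is also why psycopg2 returns DECIMAL columns as Python Decimal objects rather than floats.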

1d. Create the sales table

Each row = one sale. We link to store and product, store quantity and amount, and the sale date.

SQL
CREATE TABLE sales (
    sale_id    SERIAL PRIMARY KEY,
    store_id   INTEGER NOT NULL REFERENCES stores(store_id),
    product_id INTEGER NOT NULL REFERENCES products(product_id),
    quantity   INTEGER NOT NULL CHECK (quantity > 0),
    amount     DECIMAL(10, 2) NOT NULL,
    sale_date  DATE NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

What do REFERENCES and CHECK mean?

REFERENCES stores(store_id): Every store_id in sales must exist in the stores table. If you try to insert a sale with store_id=99 but that store doesn't exist, PostgreSQL rejects it. This is referential integrity – no orphan records.

CHECK (quantity > 0): We can't sell negative or zero items. The database enforces this automatically.

1e. Create indexes for fast queries

Dashboards often filter by date and aggregate by store. Indexes make those queries fast.

SQL
CREATE INDEX idx_sales_sale_date ON sales(sale_date);
CREATE INDEX idx_sales_store_id  ON sales(store_id);
CREATE INDEX idx_sales_product_id ON sales(product_id);

1f. Verify

SQL
SELECT table_name
FROM information_schema.tables
WHERE table_schema = 'public'
  AND table_name IN ('stores', 'products', 'sales');

You should see three rows. If yes, step 1 is done!

Step 1 complete!

  • Database retail_analytics created
  • Tables: stores, products, sales
  • Foreign keys and constraints in place
  • Indexes created for date and store queries
Step 2: Generate Sample Data with Python

Insert ~500 sales rows using random/datetime

What we're doing and why

Empty tables = empty dashboard. We need data! Instead of typing 500 rows by hand, we'll use a Python script that uses random and datetime to generate realistic sales. The script inserts stores, products, and ~500 sales spread over the last 90 days. Run it once, and you're done.

2a. Install the PostgreSQL adapter

Terminal
pip install psycopg2-binary

psycopg2-binary lets Python talk to PostgreSQL. The -binary version includes pre-built libraries so you don't need to compile.

2b. Create the Python script

Save this as seed_retail_data.py in a folder on your computer. Update the connection parameters (host, user, password, dbname) to match your setup.

Python
import psycopg2
import random
from datetime import datetime, timedelta

# --- CONFIG: Change these to match your PostgreSQL setup ---
CONN_PARAMS = {
    'host': 'localhost',
    'port': 5432,
    'user': 'postgres',
    'password': 'your_password',
    'dbname': 'retail_analytics'
}

def main():
    conn = psycopg2.connect(**CONN_PARAMS)
    cur = conn.cursor()

    # 1. Insert stores (5 stores across 3 regions)
    stores_data = [
        (1, 'Downtown Store', 'North', 'New York'),
        (2, 'Mall Outlet', 'North', 'Boston'),
        (3, 'Southside Plaza', 'South', 'Miami'),
        (4, 'West Coast Hub', 'West', 'Los Angeles'),
        (5, 'Central Square', 'Central', 'Chicago'),
    ]
    for row in stores_data:
        cur.execute(
            "INSERT INTO stores (store_id, store_name, region, city) VALUES (%s, %s, %s, %s) ON CONFLICT (store_id) DO NOTHING",
            row
        )

    # 2. Insert products (12 products across 4 categories)
    products_data = [
        (1, 'Wireless Mouse', 'Electronics', 29.99),
        (2, 'USB-C Hub', 'Electronics', 49.99),
        (3, 'Desk Lamp', 'Home', 34.99),
        (4, 'Coffee Maker', 'Home', 79.99),
        (5, 'Cotton T-Shirt', 'Clothing', 19.99),
        (6, 'Running Shoes', 'Clothing', 89.99),
        (7, 'Bluetooth Speaker', 'Electronics', 59.99),
        (8, 'Yoga Mat', 'Home', 24.99),
        (9, 'Backpack', 'Clothing', 44.99),
        (10, 'Keyboard', 'Electronics', 89.99),
        (11, 'Throw Pillow', 'Home', 29.99),
        (12, 'Sunglasses', 'Clothing', 39.99),
    ]
    for row in products_data:
        cur.execute(
            "INSERT INTO products (product_id, product_name, category, unit_price) VALUES (%s, %s, %s, %s) ON CONFLICT (product_id) DO NOTHING",
            row
        )

    # 3. Generate ~500 sales over the last 90 days
    start_date = datetime.now().date() - timedelta(days=90)
    sale_id = 1
    for _ in range(500):
        store_id = random.randint(1, 5)
        product_id = random.randint(1, 12)
        quantity = random.randint(1, 5)
        cur.execute("SELECT unit_price FROM products WHERE product_id = %s", (product_id,))
        unit_price = cur.fetchone()[0]
        amount = round(quantity * unit_price, 2)
        days_offset = random.randint(0, 90)
        sale_date = start_date + timedelta(days=days_offset)
        cur.execute(
            "INSERT INTO sales (sale_id, store_id, product_id, quantity, amount, sale_date) VALUES (%s, %s, %s, %s, %s, %s)",
            (sale_id, store_id, product_id, quantity, amount, sale_date)
        )
        sale_id += 1

    conn.commit()
    cur.close()
    conn.close()
    print("Done! Inserted stores, products, and ~500 sales.")

if __name__ == '__main__':
    main()

Line-by-line explanation

  • psycopg2.connect: Opens a connection to your PostgreSQL database using the config.
  • cur = conn.cursor(): A cursor is how you run SQL from Python – think of it as a pointer to the next command.
  • INSERT ... ON CONFLICT DO NOTHING: If a store_id already exists (e.g. you ran the script twice), skip it instead of erroring. Safe for re-running.
  • random.randint(1, 5): Picks a random integer from 1 to 5 – we have 5 stores. Same for products (1–12). quantity = random.randint(1, 5): each sale is 1–5 units.
  • cur.execute("SELECT unit_price ..."): Looks up the product's price to compute amount = quantity × unit_price.
  • sale_date = start_date + timedelta(days=days_offset): Spreads sales randomly across the 90-day window.
  • conn.commit(): Saves all changes. Without this, nothing is written.
  • cur.close(); conn.close(): Clean up – always close connections.
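To see the row-building logic in isolation, here's the same math as a pure function (a sketch – the name make_sale is hypothetical; the real script does this inline and looks prices up via SQL):

```python
import random
from datetime import date, timedelta

def make_sale(rng, unit_prices, start_date, window_days=90):
    """Build one (product_id, quantity, amount, sale_date) tuple
    the same way the seed script does, minus the database."""
    product_id = rng.randint(1, len(unit_prices))
    quantity = rng.randint(1, 5)
    amount = round(quantity * unit_prices[product_id - 1], 2)
    sale_date = start_date + timedelta(days=rng.randint(0, window_days))
    return product_id, quantity, amount, sale_date

# Seeded RNG so the "random" data is reproducible while experimenting
rng = random.Random(7)
prices = [29.99, 49.99, 34.99]
sale = make_sale(rng, prices, date(2024, 1, 1))
```

Seeding the generator is handy while you're debugging: the same seed produces the same "random" sales every run.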

Tip: Re-running the script

The stores and products inserts use ON CONFLICT ... DO NOTHING, so re-running skips rows that already exist. The sales insert does not: it assigns sale_id explicitly, starting at 1, so a second run fails with a duplicate-key error. Either wipe and reseed first with TRUNCATE stores, products, sales RESTART IDENTITY CASCADE; and re-run the script, or remove sale_id from the sales INSERT and let the SERIAL column assign IDs.

2c. Run the script

Terminal
python seed_retail_data.py

2d. Verify

SQL
SELECT 'stores' AS tbl, COUNT(*) AS cnt FROM stores
UNION ALL SELECT 'products', COUNT(*) FROM products
UNION ALL SELECT 'sales', COUNT(*) FROM sales;

Expected: stores=5, products=12, sales=500. If your counts match, step 2 is done!

Step 2 complete!

  • Python script created and executed
  • 5 stores, 12 products inserted
  • ~500 sales spread over 90 days
  • Database ready for Metabase
Step 3: Install and Connect Metabase

Run Metabase via Docker and add Postgres as data source

What we're doing and why

Metabase is an open-source BI tool that connects to your database and turns SQL into charts and dashboards. You run it as a web app (like a mini "Tableau" or "Power BI"). Docker is the easiest way – one command, and Metabase is running. Then we add our retail_analytics Postgres database as a "data source" so Metabase can query it.

3a. Run Metabase with Docker

If you have Docker installed, run:

Terminal
docker run -d -p 3000:3000 --name metabase metabase/metabase

What each flag does

  • -d: Run in the background (detached).
  • -p 3000:3000: Map port 3000 on your machine to Metabase's port 3000.
  • --name metabase: Name the container so you can stop it with docker stop metabase.
  • metabase/metabase: The official Metabase image from Docker Hub.

Open http://localhost:3000 in your browser. On first load, Metabase will ask you to create an admin account – do that.

3b. Add PostgreSQL as a data source

In Metabase: Settings (gear icon) → Admin Settings → Databases → Add database.

  • Database type: PostgreSQL
  • Display name: Retail Analytics
  • Host: If Metabase runs in Docker on your machine, use host.docker.internal (Mac/Windows) or your machine's IP (Linux). For local Metabase + local Postgres, use localhost.
  • Port: 5432
  • Database name: retail_analytics
  • Username / Password: Your Postgres credentials

Click Save. Metabase will test the connection. If it's green, you're connected!

Tip: Metabase in Docker, Postgres on host

Docker containers can't reach localhost on your machine – inside a container, localhost means the container itself. Use host.docker.internal (Mac/Windows) to point to your host. On Linux, you may need --add-host=host.docker.internal:host-gateway when running the Docker command.

Don't have Docker?

You can run Metabase as a JAR: download it from metabase.com/start/oss, then run java -jar metabase.jar. Or use Metabase Cloud – they host it for you, and you just add your database connection details.

3c. Test the connection

Go to Browse data → select your database → click the sales table. You should see rows. Then run a quick New question → SQL query with:

SQL
SELECT COUNT(*) FROM sales;

You should see 500 (or however many you inserted).

Step 3 complete!

  • Metabase running (Docker or JAR)
  • PostgreSQL added as data source
  • Connection tested – can query sales
Step 4: Create Saved Questions

Total Revenue by Day, Sales by Store, Top 10 Products

What we're doing and why

In Metabase, a "question" is a saved query (SQL or point-and-click). We create three questions that our dashboard will use: (1) total revenue by day (line chart), (2) sales by store (bar chart), (3) top 10 products (table). Each uses a date filter so the dashboard can show "last 7 days" or "this month." We'll use Metabase's SQL variables {{ start_date }} and {{ end_date }} for that.

4a. Total Revenue by Day

Create a new question → Native query (SQL). Paste this and add two variables: start_date (type: Date) and end_date (type: Date).

SQL
SELECT
    sale_date AS date,
    SUM(amount) AS total_revenue,
    COUNT(*) AS order_count
FROM sales
WHERE sale_date BETWEEN {{ start_date }} AND {{ end_date }}
GROUP BY sale_date
ORDER BY sale_date

What this does

We group sales by sale_date and sum the amounts. The WHERE clause filters by the date range the user picks. In Metabase, {{ start_date }} and {{ end_date }} are placeholders – when you add them as variables, Metabase shows date pickers and substitutes the values. Save this question as "Total Revenue by Day."
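If GROUP BY is new to you, here's the same aggregation written out in plain Python (illustrative only – Metabase runs the SQL version; the sample rows are made up):

```python
from collections import defaultdict
from datetime import date

# A few sales rows: (sale_date, amount)
sales = [
    (date(2024, 3, 1), 29.99),
    (date(2024, 3, 1), 49.99),
    (date(2024, 3, 2), 19.99),
]

# GROUP BY sale_date, with SUM(amount) and COUNT(*)
revenue = defaultdict(lambda: [0.0, 0])
for sale_date, amount in sales:
    revenue[sale_date][0] += amount   # total_revenue
    revenue[sale_date][1] += 1        # order_count

# ORDER BY sale_date
rows = sorted((d, round(total, 2), n) for d, (total, n) in revenue.items())
```

Each output row matches one row of the SQL result: date, total revenue, order count.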

4b. Sales by Store

SQL
SELECT
    s.store_name,
    s.region,
    SUM(f.amount) AS total_revenue,
    COUNT(f.sale_id) AS order_count
FROM sales f
JOIN stores s ON f.store_id = s.store_id
WHERE f.sale_date BETWEEN {{ start_date }} AND {{ end_date }}
GROUP BY s.store_id, s.store_name, s.region
ORDER BY total_revenue DESC

We join sales to stores to get store name and region. One row per store with total revenue and order count. Save as "Sales by Store." Add the same start_date and end_date variables.

4c. Top 10 Products

SQL
SELECT
    p.product_name,
    p.category,
    SUM(f.amount) AS total_revenue,
    SUM(f.quantity) AS units_sold,
    COUNT(f.sale_id) AS times_ordered
FROM sales f
JOIN products p ON f.product_id = p.product_id
WHERE f.sale_date BETWEEN {{ start_date }} AND {{ end_date }}
GROUP BY p.product_id, p.product_name, p.category
ORDER BY total_revenue DESC
LIMIT 10

Joins sales to products, aggregates by product, sorts by revenue, and limits to 10. Save as "Top 10 Products." Add the same date variables.
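ORDER BY ... DESC LIMIT 10 is the classic "top N" pattern. In Python terms it's just a sort plus a slice (made-up numbers, only to show the shape):

```python
# Revenue per product after grouping (hypothetical figures)
totals = {
    "Keyboard": 1349.85,
    "Running Shoes": 989.89,
    "Desk Lamp": 244.93,
    "Yoga Mat": 99.96,
}

# ORDER BY total_revenue DESC LIMIT 2
top2 = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)[:2]
```

The database does this far more efficiently than pulling all rows and sorting client-side, which is why we put the LIMIT in the SQL.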

Tip: Default values for variables

In the variable settings, set default values: start_date = 30 days ago, end_date = today. That way the dashboard loads with "last 30 days" by default instead of requiring the user to pick dates first.

Step 4 complete!

  • "Total Revenue by Day" – line or bar chart over time
  • "Sales by Store" – bar chart by store
  • "Top 10 Products" – table of best sellers
  • All three support date filters for the dashboard
Step 5: Build the Dashboard

Create dashboard, add cards, date filter, drill-through

What we're doing and why

A dashboard is a page that holds multiple "cards" (each card = one saved question visualized). We'll create a new dashboard, add our three questions as cards, add a dashboard-level date filter so one filter controls all cards, and enable drill-through so users can click a region to see stores in that region.

5a. Create a new dashboard

Click + New → Dashboard. Name it "Real-Time Sales Dashboard."

5b. Add cards

Click Add a saved question. Add "Total Revenue by Day," "Sales by Store," and "Top 10 Products." Arrange them: put "Total Revenue by Day" at the top (full width), then "Sales by Store" and "Top 10 Products" side by side below.

5c. Add a date filter

Click Add a filter (filter icon in the top right). Choose "Time" → "All Options" or "Date Range." Name it "Date range." Connect it to each card: when you add a filter, Metabase asks which cards it should apply to – link it to all three, and map the filter to the start_date and end_date variables in each card's SQL.

How the filter connects

Each of our questions has {{ start_date }} and {{ end_date }}. When you add a dashboard date filter, Metabase lets you map "Filter 1" to those variables. So when the user picks "Last 7 days," Metabase passes that range into every question. One filter, all cards update.

5d. Enable drill-through

For "Sales by Store": click the card → three dots → "Edit question" → in the visualization, enable "Click behavior" or "Drill-through." If you have a "Sales by Region" card, you can set it so clicking "North" filters the store card to show only North stores. Metabase's drill-through lets you pass a column value (e.g. region) to another card as a filter.

Simpler drill-through

If your Metabase version has "Dashboard click behavior," you can set: when a user clicks "North" on the region chart, filter the "Sales by Store" card to show only the North region. The exact UI varies by version – look under card settings → "Click behavior" or "Drill-through."

Step 5 complete!

  • Dashboard created with three cards
  • Date filter added and connected to all cards
  • Layout arranged for readability
  • Drill-through enabled where supported
Step 6: Enable Auto-Refresh

Set dashboard to refresh every 5 minutes

What we're doing and why

By default, a Metabase dashboard loads once and stays static. For a "real-time" feel, we want it to refresh automatically – say, every 5 minutes. That way, if new sales are inserted (by your Python script or a live system), the dashboard shows updated numbers without the user hitting refresh.

6a. Turn on auto-refresh

Open your dashboard. Look for the clock icon or "Auto-refresh" in the top-right area. Click it. Choose Every 5 minutes (or 1 minute, 15 minutes – whatever fits your use case).

Once enabled, the dashboard will re-run all card queries at that interval. The page updates automatically. Users can leave it open on a TV or second monitor β€” it stays "live."

Tip

Don't set the refresh too aggressively (e.g. every 10 seconds) – every refresh re-runs every card's query against your database and can slow things down. For "yesterday's sales" reporting, 5–15 minutes is usually enough.
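To actually watch auto-refresh do something, you need new rows arriving. A small variation on the step-2 script can drip-feed sales dated today (a sketch under those assumptions – the helper names new_sale_params and drip_feed are mine, and the cursor comes from the step-2 connection):

```python
import random
import time
from datetime import date

def new_sale_params(rng, prices):
    """Build one sales row dated today, so the dashboard's numbers move."""
    product_id = rng.randint(1, len(prices))
    quantity = rng.randint(1, 5)
    amount = round(quantity * prices[product_id - 1], 2)
    return (product_id, quantity, amount, date.today())

def drip_feed(cur, prices, interval_sec=60):
    """Insert one sale per interval; leave running while the dashboard is open.
    `cur` is a psycopg2 cursor on the retail_analytics database.

    Note: step 2 inserted explicit sale_id values, so the SERIAL sequence
    is still at 1. Bump it once before running this:
      SELECT setval('sales_sale_id_seq', (SELECT MAX(sale_id) FROM sales));
    """
    rng = random.Random()
    while True:
        cur.execute(
            "INSERT INTO sales (store_id, product_id, quantity, amount, sale_date)"
            " VALUES (%s, %s, %s, %s, %s)",
            (rng.randint(1, 5),) + new_sale_params(rng, prices),
        )
        cur.connection.commit()
        time.sleep(interval_sec)
```

With this running and auto-refresh on, the KPI cards tick upward every few minutes without anyone touching the page.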

Step 6 complete!

  • Auto-refresh enabled (clock icon)
  • Dashboard refreshes every 5 minutes
  • No manual refresh needed
Step 7: Set Up Email Subscription

Email the dashboard to stakeholders on a schedule

What we're doing and why

Managers don't always open Metabase. They want the report in their inbox: "Every Monday at 8 AM, send me the Sales Dashboard." Metabase has a subscription feature: you pick a dashboard (or an individual question), add email recipients, and set a schedule (daily, weekly, etc.). Metabase sends a PDF or a link. No more manual screenshots!

7a. Create a subscription

Open your dashboard. Click the Subscribe button (bell or envelope icon, usually in the top right). In the subscription dialog:

  • Recipients: Add email addresses (yourself for testing, or stakeholders).
  • Schedule: Choose "Daily" (e.g. 8:00 AM) or "Weekly" (e.g. Monday 8:00 AM).
  • Format: Attach PDF snapshot of the dashboard, or send a link. PDF is often preferred for email.

Save. Metabase will send the dashboard on that schedule. The date filter will use "yesterday" or "last 7 days" depending on how you configured the filter – you can set filter defaults for subscriptions.

Email must be configured

Metabase needs to be able to send email. In Admin Settings → Email, configure SMTP (Gmail, SendGrid, your company server, etc.). Without it, subscriptions won't send. Check the Metabase email docs for setup.

Step 7 complete!

  • Subscription created for the dashboard
  • Recipients and schedule configured
  • Stakeholders get the report by email automatically

Project complete!

You just built a real-time sales dashboard end-to-end. Here's what you can put on your resume:

  • Designed and implemented a PostgreSQL schema for retail analytics (stores, products, sales)
  • Generated 500+ rows of sample data with Python (random, datetime, psycopg2)
  • Connected Metabase to PostgreSQL and built saved questions with parameterized date filters
  • Created a live dashboard with KPI cards, charts, and drill-through
  • Enabled auto-refresh and email subscriptions for stakeholder reporting
