
Installation & Setup

Get Dataface installed and configured in your environment.


Prerequisites

Before installing Dataface, ensure you have:

  • Python 3.10+ installed
  • dbt-core installed (version 1.5+ recommended)
  • dbt Semantic Layer configured (MetricFlow)
  • A dbt project with Semantic Layer metrics and dimensions defined
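The first two prerequisites can be checked with a short Python snippet (a sketch using only the standard library; verifying the Semantic Layer setup still requires running dbt itself):

```python
# Sketch: check the first two prerequisites from the list above.
# Uses only the standard library; "dbt" may or may not be on PATH yet.
import shutil
import sys

def check_prerequisites():
    """Return a dict mapping prerequisite name -> whether it is satisfied."""
    return {
        "python_3_10_plus": sys.version_info >= (3, 10),
        "dbt_on_path": shutil.which("dbt") is not None,
    }

for name, ok in check_prerequisites().items():
    print(f"{name}: {'OK' if ok else 'MISSING'}")
```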

Installation

Install Dataface

pip install dataface

Verify Installation

dft --version

You should see the Dataface version number.


Configuration

dbt Project Setup

Dataface works with your existing dbt project. Ensure you have:

  1. dbt project initialized:

    dbt init my_project
    

  2. Semantic models and metrics defined in your project's YAML files (e.g. models/semantic_models.yml):

    semantic_models:
      - name: orders
        # ... semantic model definition
    
    metrics:
      - name: total_revenue
        # ... metric definition
    

  3. dbt profiles configured in ~/.dbt/profiles.yml:

    my_project:
      outputs:
        dev:
          type: snowflake  # or your database adapter
          # ... connection details
      target: dev
    

Create Faces Directory

Create a directory for your dashboards (called "faces" in Dataface):

mkdir faces

Place your dashboard YAML files in this directory. The faces/ directory is the canonical location for all Dataface dashboard files. The CLI defaults to this directory for validate, serve, and render commands.

Project Configuration (Optional)

Dataface supports project-wide configuration via a dataface.yml file in your project root. This allows you to set default values for dashboard rendering.

Create dataface.yml in your project root (same level as dbt_project.yml):

# Default board rendering settings
board:
  width: 1200.0      # Board width in pixels
  padding: 20.0      # Padding around boards
  row_gap: 20.0      # Gap between rows

Configuration Discovery:

  • Dataface automatically searches for dataface.yml starting from the current working directory
  • It walks up the directory tree until it finds the config file or reaches the filesystem root
  • If no config file is found, sensible defaults are used (1200px width, 20px padding)
  • If you're working in a dbt project, Dataface will also detect dbt_project.yml as a project root indicator
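The discovery walk can be sketched in a few lines of Python (find_config here is a hypothetical helper for illustration, not part of the Dataface API):

```python
# Sketch of the config discovery described above; find_config is a
# hypothetical helper, not the actual Dataface implementation.
from pathlib import Path
from typing import Optional

def find_config(start_dir: str) -> Optional[Path]:
    """Walk up from start_dir until dataface.yml is found or the root is hit."""
    current = Path(start_dir).resolve()
    for directory in (current, *current.parents):
        candidate = directory / "dataface.yml"
        if candidate.is_file():
            return candidate
    return None  # caller falls back to the defaults (1200px width, 20px padding)
```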

Settings:

  • board.width: Default board width in pixels (default: 1200.0)
  • board.padding: Default padding around boards in pixels (default: 20.0)
  • board.row_gap: Default gap between rows in pixels (default: 20.0)
  • board.col_gap: Default gap between columns in pixels (default: 20.0)

These settings apply to all dashboards in your project unless overridden in individual dashboard YAML files.
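For example, an individual dashboard could override the project-wide width (assuming the same board keys are accepted at the dashboard level; the file name and values here are hypothetical):

```yaml
# faces/wide_dashboard.yml -- overrides the project-wide board settings
title: "Wide Dashboard"
board:
  width: 1600.0   # overrides the 1200.0 default from dataface.yml
```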


Adding Dataface to an Existing dbt Project

If you already have a dbt project, adding Dataface takes just a few steps:

cd my-dbt-project

# 1. Install Dataface
pip install dataface

# 2. Create the faces directory (where dashboard YAML files live)
mkdir -p faces

# 3. Create your first dashboard
cat > faces/hello.yml << 'EOF'
title: "Hello World"
rows:
  - content: |
      # Welcome to Dataface
      Your first dashboard is live.
EOF

# 4. Preview it
dft serve

Your dbt project should now look like this:

my-dbt-project/
├── dbt_project.yml          # dbt config (existing)
├── models/                  # dbt models (existing)
├── profiles.yml             # dbt profiles (existing)
├── dataface.yml             # Optional: Dataface config (board width, padding, etc.)
├── faces/                   # Dataface dashboards
│   ├── hello.yml
│   ├── sales_dashboard.yml
│   └── partials/            # Reusable dashboard fragments (prefixed with _)
│       └── _header.yml
└── assets/                  # Optional: images, CSV data files
    ├── images/
    └── data/

Key conventions:

  • faces/ is the canonical directory for all dashboard YAML files. The dft CLI defaults to this directory.
  • Partials live in faces/partials/ and are prefixed with _ (e.g., _header.yml). They're reusable fragments imported by other dashboards.
  • Subdirectories are fine: faces/sales/overview.yml maps to the URL /faces/sales/overview/ when served.
  • dataface.yml is optional; it sets project-wide defaults (board width, padding). Place it next to dbt_project.yml.

Dataface reads your dbt project automatically. When you run dft serve or dft validate inside a dbt project directory, Dataface discovers dbt_project.yml, connects to your database via profiles.yml, and makes your Semantic Layer metrics and dimensions available in dashboard queries.


Verification

Test Installation

  1. Create a simple dashboard file faces/test.yml:

    title: "Test Dashboard"
    
    queries:
      test:
        metrics: [total_revenue]
        dimensions: [month]
    
    charts:
      test_chart:
        title: "Revenue by Month"
        query: test
        type: bar
        x: month
        y: total_revenue
    
    rows:
      - test_chart
    

  2. Validate the dashboard:

    dft validate faces/test.yml
    

  3. Preview the dashboard:

    dft serve faces/test.yml
    

  4. Open your browser to http://localhost:8080

If you see the dashboard, installation is successful!


Troubleshooting Common Issues

dbt Not Found

Error: dbt: command not found

Solution: Install dbt-core:

pip install dbt-core

MetricFlow Not Available

Error: MetricFlow not found or semantic layer errors

Solution: Install the dbt MetricFlow integration:

pip install dbt-metricflow

Or install separately:

pip install metricflow

Database Connection Issues

Error: Cannot connect to database

Solution:

  • Check your profiles.yml configuration
  • Verify database credentials
  • Test the dbt connection: dbt debug

YAML Syntax Errors

Error: YAML parsing errors

Solution:

  • Check YAML indentation (use spaces, not tabs)
  • Validate YAML syntax with a YAML validator
  • Use dft validate to check for errors


AI / MCP Setup for IDEs

Dataface includes an MCP (Model Context Protocol) server that gives AI coding assistants access to your data catalog, queries, and dashboard tools. To configure your IDE:

# Auto-detect installed AI clients and configure all of them
dft mcp init

# Or configure a specific client
dft mcp init cursor     # Cursor
dft mcp init vscode     # VS Code / GitHub Copilot
dft mcp init claude     # Claude Desktop
dft mcp init codex      # OpenAI Codex CLI
dft mcp init claude-code # Claude Code (.mcp.json)

This writes the appropriate MCP config file for each client (e.g., .cursor/mcp.json, .vscode/mcp.json). After running this, your AI assistant can:

  • Browse your data catalog (catalog tool)
  • Execute queries against your database (execute_query tool)
  • Render dashboards from YAML (render_dashboard tool)
  • Search existing dashboards (search_dashboards tool)
  • Save new dashboards to your project (save_dashboard tool)

Tip: Run dft mcp init after cloning any repo that uses Dataface — it auto-detects which AI clients you have installed.

Manual Setup

If you prefer to configure manually, add the Dataface MCP server to your client's config:

{
  "mcpServers": {
    "dataface": {
      "command": "dft",
      "args": ["mcp", "serve"]
    }
  }
}

Config file locations:

  • Cursor: .cursor/mcp.json
  • VS Code / Copilot: .vscode/mcp.json (use "servers" instead of "mcpServers")
  • Claude Desktop: ~/.config/claude/config.json
  • Claude Code: .mcp.json (project root)
  • Codex CLI: .codex/mcp.json
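For example, the VS Code variant of the config above would look like this (same server entry, but under the top-level "servers" key that VS Code expects):

```json
{
  "servers": {
    "dataface": {
      "command": "dft",
      "args": ["mcp", "serve"]
    }
  }
}
```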

Starting the MCP Server Manually

dft mcp serve

This starts the MCP server in stdio mode (for IDE integration). It also starts an embedded HTTP server on port 8765 for dashboard preview rendering.



Getting Help

If you encounter issues:

  1. Check the Troubleshooting Guide
  2. Validate your dashboard: dft validate
  3. Check dbt configuration: dbt debug
  4. Review the Field Reference