dbt (data build tool) handles the "T" in ELT: it runs SQL transformations inside your data warehouse. You write SQL in models, use ref() to declare dependencies between them, and dbt builds a DAG from those references, handling run order, testing, and documentation. dbt does not move data from source systems into the warehouse; a separate ingestion tool or orchestrated pipeline (e.g. Airflow) does that. dbt executes in the warehouse itself (Snowflake, BigQuery, Redshift, etc.) and produces the tables and views that analytics and BI tools consume.
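As a minimal sketch of how ref() wires up the DAG (the model and column names here are hypothetical, not from any real project):

```sql
-- models/staging/stg_orders.sql
-- A staging model; dbt materializes this per its configured materialization.
select
    order_id,
    customer_id,
    order_total
from raw.orders

-- models/marts/customer_revenue.sql
-- ref() tells dbt this model depends on stg_orders,
-- so dbt runs stg_orders first and substitutes its warehouse relation here.
select
    customer_id,
    sum(order_total) as lifetime_revenue
from {{ ref('stg_orders') }}
group by customer_id
```

At compile time, dbt replaces {{ ref('stg_orders') }} with the fully qualified name of the built relation, which is how it knows the run order without any hand-written scheduling.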
- View: the model is built as a view — no physical table, always fresh, but can be slower to query.
- Table: a full rebuild on each run; good for smaller datasets.
- Incremental: only new or changed rows are processed after the first run; good for large fact tables.
- Ephemeral: not materialized in the warehouse at all; compiled into a CTE inside dependent models.

Choose based on data volume and how often you need the data refreshed.
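A sketch of an incremental model using dbt's standard config() and is_incremental() macros (the model name, unique key, and timestamp column are hypothetical):

```sql
-- models/marts/fct_orders.sql
{{ config(
    materialized='incremental',
    unique_key='order_id'
) }}

select
    order_id,
    customer_id,
    order_total,
    updated_at
from {{ ref('stg_orders') }}

{% if is_incremental() %}
-- On incremental runs, only pull rows newer than what is already
-- in the target table; {{ this }} refers to the existing relation.
where updated_at > (select max(updated_at) from {{ this }})
{% endif %}
```

On the first run (or with --full-refresh) the is_incremental() block is skipped and the table is built from scratch; on subsequent runs only the filtered rows are merged in, keyed on unique_key.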