Make your DAGs smart with Jinja templates, macros, and dynamic variables!
Imagine you have a daily ETL job that loads data for "today." Without templating, you'd have to hardcode the date in your DAG — and every single day, you'd need to change it manually. That's a recipe for disaster!
Hardcoded values are like writing your birthdate on a form and handing it to everyone in your class. "Born on March 15, 2010" — that only works for one person! Templating is like having a blank line where each person fills in their own date. In Airflow, the "blank" gets filled automatically with the execution date — and it changes every single run!
bash_command='echo "2024-01-15"' — Always prints the same date. Useless for a daily job!
bash_command='echo "{{ ds }}"' — Prints the execution date. Jan 15, Jan 16, Jan 17... automatically!
A retail company runs a daily pipeline to load sales data from s3://bucket/sales/2024-01-15.csv. With templating: s3://bucket/sales/{{ ds }}.csv — Airflow substitutes the correct date for each run. No manual edits, no mistakes, no "wrong date" bugs!
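Here's what that looks like in DAG code (a minimal sketch; the DAG id and bucket name are made up for illustration):

```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

with DAG(
    dag_id="daily_sales_load",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Hardcoded (bad): the date is frozen at authoring time
    # bash_command='echo "s3://bucket/sales/2024-01-15.csv"'

    # Templated (good): {{ ds }} is rendered to each run's execution date
    load_sales = BashOperator(
        task_id="load_sales",
        bash_command='echo "Loading s3://bucket/sales/{{ ds }}.csv"',
    )
```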
Jinja2 is a template engine for Python, and it's the engine Airflow uses. It lets you embed placeholders like {{ variable }} in strings. When the task runs, Airflow replaces those placeholders with actual values.
Jinja is like a fill-in-the-blank worksheet. You write: "Hello, _____! Today is _____." The teacher (Airflow) fills in the blanks: "Hello, Alice! Today is Monday." Next time: "Hello, Bob! Today is Tuesday." Same template, different fill-ins!
Remember mail merge? You create one letter: "Dear {{ first_name }}, your order {{ order_id }} will arrive on {{ delivery_date }}." Then you merge with a spreadsheet, and Word generates 100 personalized letters. Airflow does the same thing for your DAGs — one template, infinite personalized executions!
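The same mechanism works in plain Python with the jinja2 package. A quick sketch, independent of Airflow:

```python
from jinja2 import Template

# One template, many renderings - the same idea Airflow applies to DAG fields
letter = Template(
    "Dear {{ first_name }}, your order {{ order_id }} arrives on {{ delivery_date }}."
)

print(letter.render(first_name="Alice", order_id=42, delivery_date="2024-01-15"))
print(letter.render(first_name="Bob", order_id=43, delivery_date="2024-01-16"))
```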
Airflow provides a set of built-in template variables you can use inside any templated field. These are available automatically when a task runs.
| Variable | Description | Example Output |
|---|---|---|
| {{ ds }} | Execution date as a string (YYYY-MM-DD) | 2024-01-15 |
| {{ ds_nodash }} | Same date, no dashes (handy for file paths) | 20240115 |
| {{ data_interval_start }} | Start of the data interval | 2024-01-15 00:00:00+00:00 |
| {{ data_interval_end }} | End of the data interval | 2024-01-16 00:00:00+00:00 |
| {{ ts }} | Timestamp (ISO format) | 2024-01-15T00:00:00+00:00 |
| {{ params.key }} | User-defined DAG params | Whatever you passed in params |
ds_nodash is perfect for S3 paths, e.g. s3://bucket/data/{{ ds_nodash }}/ → s3://bucket/data/20240115/. Some systems don't like dashes in paths!
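A quick sketch that echoes several of these variables from one task so you can compare them in the logs (the DAG id is made up):

```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

with DAG(
    dag_id="template_variables_demo",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    show_vars = BashOperator(
        task_id="show_vars",
        bash_command=(
            'echo "ds={{ ds }} ds_nodash={{ ds_nodash }} ts={{ ts }}" && '
            'echo "interval: {{ data_interval_start }} -> {{ data_interval_end }}"'
        ),
    )
```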
Not every operator argument supports templating! Only arguments that are in the operator's template_fields can use {{ ... }}.
Each operator defines which of its arguments get rendered with Jinja. Check the operator docs or the source code for template_fields.
- BashOperator: bash_command is templated. You can use {{ ds }} directly in the command!
- PythonOperator: templates_dict is templated — a dict whose values are rendered with Jinja before being passed to the callable. The callable itself receives context via **kwargs or ti (TaskInstance).
- Alternatively, pass op_kwargs with pre-computed values, or access kwargs['ds'], kwargs['logical_date'], etc. inside your callable.
- Bottom line: if an argument is in template_fields, you can put {{ ds }} and friends in it. If not, templating won't work — you'll see the literal {{ ds }} in your output! (You can check an operator's template_fields yourself, as shown below.)
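A quick check from a Python shell; the exact tuple contents vary by Airflow version:

```python
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

# Which arguments get rendered with Jinja? (contents vary by Airflow version)
print(BashOperator.template_fields)    # e.g. ('bash_command', 'env', ...)
print(PythonOperator.template_fields)  # e.g. ('templates_dict', 'op_args', 'op_kwargs')
```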
The PythonOperator is special: its python_callable runs Python code, so you get execution context directly. You don't need Jinja for everything!
You can pass templated values via op_kwargs: on PythonOperator, op_kwargs is itself a templated field, so string values in it are rendered. More commonly, though, you access the context and pull values inside the function.
Your Python callable can accept **kwargs or named args like ti (TaskInstance), execution_date, logical_date, etc. Airflow injects these automatically!
provide_context=True (old style): explicitly asked Airflow to pass the context. This argument was removed in Airflow 2.x; you no longer need it.
New style: Just use **kwargs in your callable. Airflow always passes context. Access kwargs['ds'], kwargs['logical_date'], kwargs['params'], etc.
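In Airflow 2.x, your callable can also declare just the context variables it needs as named parameters, and Airflow matches them by name. A small sketch (the DAG id and params are made up):

```python
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime

def report(ds, logical_date, params, **kwargs):
    # Airflow matches these parameter names against the context,
    # so you only spell out the variables you actually use
    print(f"ds={ds}, logical_date={logical_date}, env={params.get('env')}")

with DAG(
    dag_id="named_context_args",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    params={"env": "staging"},
) as dag:
    PythonOperator(task_id="report", python_callable=report)
```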
Airflow provides macros — pre-built functions you can use inside Jinja templates. They live in the macros namespace.
- macros.ds_add(ds, days) — Add days to a date string. Example: {{ macros.ds_add(ds, -1) }} gives yesterday.
- macros.ds_format(ds, input_fmt, output_fmt) — Reformats a date. Example: {{ macros.ds_format(ds, '%Y-%m-%d', '%Y/%m/%d') }}
- You can add your own macros when defining the DAG: user_defined_macros={'my_func': my_func}. Then use {{ my_func(ds) }} in templates!
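For example, macros.ds_format can turn the execution date into a slash-separated path segment. A small sketch (the DAG id and bucket are made up):

```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

with DAG(
    dag_id="ds_format_demo",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # "2024-01-15" -> "2024/01/15", handy for partitioned storage layouts
    echo_path = BashOperator(
        task_id="echo_path",
        bash_command="echo \"Path: s3://bucket/data/{{ macros.ds_format(ds, '%Y-%m-%d', '%Y/%m/%d') }}/\"",
    )
```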
Jinja filters transform values. Use the pipe | followed by the filter name.
- {{ value | default('fallback') }} — Use 'fallback' if value is undefined.
- {{ value | int }} — Convert to integer.
- {{ value | upper }} — Uppercase string.
- {{ value | truncate(10) }} — Truncate to 10 characters.
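A small sketch combining filters inside a templated field (the DAG id and params are made up):

```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

with DAG(
    dag_id="jinja_filters_demo",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    params={"env": "staging"},
) as dag:
    # Filters chain left to right: apply the default first, then uppercase
    shout_env = BashOperator(
        task_id="shout_env",
        bash_command="echo \"ENV={{ params.env | default('dev') | upper }}\"",
    )
```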
After a task runs (or when you hover in the UI), you can see the rendered template — what the command actually looked like after substitution. This is incredibly useful for debugging!

In the Airflow UI: open a task instance → click the "Rendered" tab (or "Task Instance Details" and look for the rendered template). You'll see the final bash_command or other templated fields with all {{ ... }} replaced by real values.
A few pitfalls to avoid:
- Use single quotes for the outer bash_command string so {{ ds }} isn't interpreted by the shell. Use double quotes for inner strings if needed.
- Nesting templates like {{ "{{ ds }}" }} won't work the way you expect. Keep templates simple.
- Watch for typos in variable names ({{ ds }} vs {{ date }}): an undefined variable will make rendering fail.

How a template flows from your DAG code to the actual executed command:
DAG code → Jinja rendering → variable substitution → execution: placeholders become actual values at runtime.
Where to see the rendered result in the UI:
- Task Instance → Rendered tab shows the final rendered template.
- Graph View → hover over a task → the tooltip may show key info.
- Grid View → click task → Task Instance Details → Rendered tab.
```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

with DAG(
    dag_id="templating_bash_dag",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # bash_command is a template_field - use {{ ds }} directly!
    echo_date = BashOperator(
        task_id="echo_date",
        bash_command='echo "Execution date: {{ ds }}"',
    )

    # Build a dynamic path (e.g. for S3 or local files)
    list_files = BashOperator(
        task_id="list_files",
        bash_command='echo "Path: s3://bucket/data/{{ ds_nodash }}/"',
    )
```
```python
from airflow import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime

def print_context(**kwargs):
    # Airflow passes the full execution context as keyword arguments
    ds = kwargs.get("ds")
    logical_date = kwargs.get("logical_date")
    ti = kwargs.get("ti")
    params = kwargs.get("params", {})
    print(f"Execution date: {ds}")
    print(f"Logical date: {logical_date}")
    print(f"Task instance: {ti}")
    print(f"Params: {params}")

with DAG(
    dag_id="templating_python_dag",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    params={"env": "production"},
) as dag:
    python_task = PythonOperator(
        task_id="print_context",
        python_callable=print_context,
    )
```
```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

with DAG(
    dag_id="macros_dag",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # macros.ds_add: yesterday = ds + (-1) days
    echo_yesterday = BashOperator(
        task_id="echo_yesterday",
        bash_command='echo "Yesterday: {{ macros.ds_add(ds, -1) }}"',
    )
```
```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from datetime import datetime

def custom_format(ds):
    # "2024-01-15" -> "2024/01/15"
    return ds.replace("-", "/")

with DAG(
    dag_id="custom_macros_dag",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    user_defined_macros={"my_format": custom_format},
    params={"bucket": "my-bucket"},
) as dag:
    task = BashOperator(
        task_id="task",
        bash_command='echo "Path: s3://{{ params.bucket }}/{{ my_format(ds) }}"',
    )
```
Write a BashOperator that echoes the execution date in the format: "Processing data for YYYY-MM-DD". Use the {{ ds }} variable. Trigger the DAG and verify the Rendered tab shows the correct date.
Use single quotes for the outer bash_command string so the shell doesn't interpret {{ ds }}. Example: bash_command='echo "Processing data for {{ ds }}"'
Create a PythonOperator callable that accepts **kwargs, extracts ds and data_interval_end, and prints them. Verify they match the execution date and interval.
Write a BashOperator that prints "Yesterday: YYYY-MM-DD" using {{ macros.ds_add(ds, -1) }}. Then try macros.ds_add(ds, 7) to get "One week from execution date".
Define a DAG with params={'bucket': 'my-data-bucket', 'prefix': 'raw'}. In a BashOperator, build and echo the full path: s3://{{ params.bucket }}/{{ params.prefix }}/{{ ds_nodash }}/.
Test your understanding! Click on the answer you think is correct.