Enable Snowflake cost metrics on Observe

This feature is available only on Astro Hosted.

Observe can track aggregate Snowflake cost metrics, which appear on the Data Pipelines Health Overview dashboard, and per-asset cost, which appears in Asset metrics. This lets you correlate cost with your pipeline health and performance, and see how operational changes affect your spend.

Reporting Snowflake costs on Observe requires running a DAG in each Astro Deployment that you want to collect information about. The DAG pulls data only for the specific Snowflake connection you configure for it, so if two Deployments use different Snowflake accounts and you want metrics for both in Observe, you need to run a copy of the DAG in each Deployment.

At-a-glance view of the Snowflake cost.

Prerequisites

  • An Astro Deployment running the pipelines you want cost data for
  • An Astro project and the Astro CLI
  • A Snowflake account, with a role that can query the ACCOUNT_USAGE schema
  • An Organization API token created by an Organization Owner

Step 1: Upload the cost attribution DAG

  1. Download the cost_attribution.py DAG from the Observe Cost Attribution GitHub Repo.

  2. Add the cost_attribution.py DAG to your Astro project's dags folder.

  3. Deploy your DAG to Astro.

astro deploy

Step 2: Configure Astro environment variables

  1. Open the Astro Environment manager by selecting a specific Deployment in the Astro UI, then select Environment > Environment Variables.

  2. Set the following environment variables:

Step 3: Configure Airflow environment variables

  1. Open the Astro Environment manager by selecting a specific Deployment in the Astro UI, then select Environment > Airflow Variables.

  2. Set the following Airflow variable:

  • AIRFLOW_VAR_AUTH_TOKEN: Your Organization API token. This API token must be created by an Organization Owner, and provides the credentials the DAG uses to fetch query metadata from Astronomer's API.
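Airflow exposes any environment variable prefixed with AIRFLOW_VAR_ as an Airflow Variable named after the suffix, lowercased, so AIRFLOW_VAR_AUTH_TOKEN becomes the Variable auth_token. The following minimal sketch illustrates that naming convention with the standard library only; the helper function is hypothetical and not part of the cost_attribution DAG.

```python
import os

# Airflow maps environment variables prefixed with AIRFLOW_VAR_ to Airflow
# Variables: AIRFLOW_VAR_AUTH_TOKEN is readable in a DAG as
# airflow.models.Variable.get("auth_token").

def variable_env_name(variable_name: str) -> str:
    """Return the environment variable Airflow checks for a Variable name."""
    return f"AIRFLOW_VAR_{variable_name.upper()}"

# Simulate the lookup without a running Airflow instance:
os.environ[variable_env_name("auth_token")] = "example-org-api-token"
token = os.environ[variable_env_name("auth_token")]
print(token)  # example-org-api-token
```

In the Deployment itself you don't set this variable in code; the Astro Environment manager injects it for you.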

Step 4: Create and verify your Snowflake connection

  1. Open the Astro Environment manager by selecting a specific Deployment in the Astro UI, then select Environment > Connections.

  2. Create a Snowflake connection with your Snowflake account information.

    • The Connection ID you configure in Astro must match the conn_id value in the cost_attribution DAG.
    • The connection Role must be ACCOUNTADMIN, or a role with privileges to query the account_usage.query_attribution_history and account_usage.query_history views in your Snowflake account.
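To check that the connection's role has the required access, you can run the same kind of ACCOUNT_USAGE query the DAG depends on. The helper below is illustrative only: the column list and exact query in cost_attribution.py may differ.

```python
# Sketch of an ACCOUNT_USAGE query the DAG's Snowflake role must be able to
# run. QUERY_ID, WAREHOUSE_NAME, and CREDITS_ATTRIBUTED_COMPUTE are columns of
# the QUERY_ATTRIBUTION_HISTORY view; the helper itself is hypothetical.

def attribution_query(start_time: str, end_time: str) -> str:
    """Build a query against SNOWFLAKE.ACCOUNT_USAGE.QUERY_ATTRIBUTION_HISTORY."""
    return (
        "SELECT query_id, warehouse_name, credits_attributed_compute "
        "FROM snowflake.account_usage.query_attribution_history "
        f"WHERE start_time >= '{start_time}' AND start_time < '{end_time}'"
    )

sql = attribution_query("2024-01-01", "2024-01-02")
print(sql)
```

If this query fails with a permissions error when run as the connection's role, the DAG will fail for the same reason.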
Tip: You can find detailed information about Snowflake connections, and additional instructions for configuring them in Airflow, in Create a Snowflake connection in Airflow.

Step 5: Test your DAG

Trigger the DAG manually in the Astro or Airflow UI. When the DAG runs, it:

  • Fetches query metadata from Astronomer’s API
  • Queries Snowflake for cost metrics
  • Posts cost and query attribution data to the Observe Cost Metrics section of the Health dashboard and the Cost section of Asset metrics
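The three steps above can be sketched as a simple pipeline. Everything here is a hypothetical stub: the function names, payload shapes, and return values are illustrative and do not reflect the actual cost_attribution.py implementation.

```python
# Illustrative stubs for the DAG's run: fetch metadata, enrich with Snowflake
# cost data, then post the result to Observe.

def fetch_query_metadata() -> list[dict]:
    # In the real DAG, this calls Astronomer's API using the auth_token Variable.
    return [{"query_id": "q1", "dag_id": "example_dag"}]

def fetch_snowflake_costs(metadata: list[dict]) -> list[dict]:
    # In the real DAG, this queries ACCOUNT_USAGE via the Snowflake connection.
    return [{**record, "credits": 0.25} for record in metadata]

def post_to_observe(costs: list[dict]) -> int:
    # In the real DAG, this posts cost and query attribution data to Observe.
    return len(costs)

posted = post_to_observe(fetch_snowflake_costs(fetch_query_metadata()))
print(posted)  # 1
```

If the run succeeds, cost data appears in the Observe Cost Metrics section of the Health dashboard and the Cost section of Asset metrics.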
