astronomer/cosmos-ebook-companion
Cosmos companion repository

This repository serves as a companion to the eBook Practical Guide: Orchestrating dbt with Apache Airflow® with Cosmos.

How to run this repository

  1. Fork this repository

  2. Clone the forked repository to your local machine

  3. Make sure you have the Astro CLI installed, at version 1.34.0 or later, to be able to run this Airflow 3-based project.

  4. Create a copy of the .env_example file and name it .env; this file contains environment variables. If you'd like to run dags against backends other than Postgres or Spark, you will need to update the connections in this file with your own values, for example to connect to a Snowflake instance.

  5. In the root of the project, run astro dev start to start the project locally. This command spins up 8 containers on your machine, using Docker or Podman. 5 containers run Airflow:

    • Postgres: Airflow's metadata database.
    • API server: The Airflow component responsible for rendering the Airflow UI and serving 3 APIs, one of which is needed for task code to interact with the Airflow metadata database.
    • Scheduler: The Airflow component responsible for monitoring and triggering tasks.
    • Dag processor: The Airflow component responsible for parsing dags.
    • Triggerer: The Airflow component responsible for running deferred tasks.

    As well as 3 additional containers, defined in the docker-compose.override.yml file, to test the dags against:

    • Spark Master and Worker: Spark containers to run the example_DbtDag_spark and example_DbtTaskGroup_spark dags.
    • Postgres: A Postgres database that most of the dag examples can be tested against locally.

    The connections to the Spark and Postgres containers are listed in the .env_example file and will be set up automatically if you created a .env file with the same content.

  6. You can now access the Airflow UI at http://localhost:8080 and run the dags. All dags tagged out-of-the-box are ready to run without further setup.
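The connections in the .env file rely on the fact that Airflow picks up any environment variable named AIRFLOW_CONN_&lt;CONN_ID&gt; as a connection, commonly encoded as a URI. A minimal sketch of how such a URI maps to connection fields (the conn id and credential values below are hypothetical, not copied from .env_example):

```python
from urllib.parse import unquote, urlsplit

# Hypothetical value of an AIRFLOW_CONN_* environment variable,
# e.g. AIRFLOW_CONN_POSTGRES_DEFAULT, in URI form:
uri = "postgres://postgres:postgres@postgres_dbt:5432/postgres"

parts = urlsplit(uri)
conn = {
    "conn_type": parts.scheme,                 # connection type, e.g. postgres
    "login": unquote(parts.username or ""),    # user name
    "password": unquote(parts.password or ""), # password
    "host": parts.hostname,                    # host, here a container name
    "port": parts.port,                        # port as an integer
    "schema": parts.path.lstrip("/"),          # database / schema
}
print(conn)
```

Because the Astro project loads .env into every Airflow container, these connections exist as soon as the containers start, without any setup in the Airflow UI.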

Contents

This repository contains 26 dags.

All dbt projects are located in the include/dbt folder.
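A dag in this repository typically points Cosmos at one of those dbt projects. As a configuration sketch only (the dag id, profile name, project folder, and connection id below are hypothetical placeholders, not names taken from this repository), a DbtDag can be wired to a project in include/dbt roughly like this:

```python
from datetime import datetime

from cosmos import DbtDag, ProfileConfig, ProjectConfig
from cosmos.profiles import PostgresUserPasswordProfileMapping

# Build a dbt profile from an Airflow connection; the conn_id is assumed
# to match one of the connections defined in the .env file.
profile_config = ProfileConfig(
    profile_name="my_profile",   # hypothetical profile name
    target_name="dev",
    profile_mapping=PostgresUserPasswordProfileMapping(
        conn_id="postgres_default",          # hypothetical connection id
        profile_args={"schema": "public"},
    ),
)

# Render the dbt project as an Airflow dag; the path assumes the Astro
# container layout, where the project is mounted at /usr/local/airflow.
example_dbt_dag = DbtDag(
    dag_id="example_dbt_dag",    # hypothetical dag id
    project_config=ProjectConfig("/usr/local/airflow/include/dbt/my_project"),
    profile_config=profile_config,
    schedule=None,
    start_date=datetime(2025, 1, 1),
    catchup=False,
)
```

Each dbt model in the project becomes its own task (or task group) in the rendered dag, which is what the example dags in this repository demonstrate against the local Spark and Postgres containers.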

About

Companion repository to the Practical Guide: Orchestrating dbt with Apache Airflow® with Cosmos
