

Get started with Apache Airflow, Part 1: Write and run your first DAG

Getting started with Apache Airflow locally is easy with the Astro CLI. This tutorial is for people who are new to Apache Airflow and want to run it locally with open source tools.

After you complete this tutorial, you'll be able to:

- Run a local Airflow environment using the Astro CLI.

This tutorial takes approximately 1 hour to complete.

To get the most out of this tutorial, make sure you have:

- A terminal that accepts bash commands. This is pre-installed on most operating systems.
- An integrated development environment (IDE) for Python development, such as VSCode.
- A local installation of Python 3 to improve your Python developer experience.

To run data pipelines on Astro, you first need to create an Astro project, which contains the set of files necessary to run Airflow locally.

The Astro project is built to run Airflow with Docker. Docker is a service that runs software in virtualized containers within a machine. When you run Airflow on your machine with the Astro CLI, Docker creates a container for each Airflow component that is required to run DAGs. For this tutorial, you don't need an in-depth knowledge of Docker. All you need to know is that Airflow runs on the compute resources of your machine, and that all files necessary for running Airflow are included in your Astro project.

The default Astro project structure includes a collection of folders and files that you can use to run and customize Airflow. For this tutorial, you only need to know the following files and folders (a sketch of each appears after the setup commands below):

- dags: This folder holds your Airflow DAGs. Each Astro project includes two example DAGs: example-dag-basic and example-dag-advanced. For more information on DAGs, see Introduction to Airflow DAGs.
- Dockerfile: This is where you specify your version of Astro Runtime, runtime software based on Apache Airflow that is built and maintained by Astronomer.

Create a new directory for your Astro project:
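A minimal sketch of this step, assuming the Astro CLI is already installed (the directory name is hypothetical):

```bash
mkdir astro-tutorial && cd astro-tutorial   # hypothetical project directory
astro dev init    # scaffold the Astro project files described above
astro dev start   # build the Docker images and start Airflow locally
```

Once the containers are running, the Airflow UI is available at http://localhost:8080 by default.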
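Of the generated files, the Dockerfile is typically a single line. The version tag below is illustrative; astro dev init pins the current Astro Runtime version for you:

```dockerfile
# Illustrative version tag — your generated Dockerfile may pin a
# different Astro Runtime release.
FROM quay.io/astronomer/astro-runtime:9.1.0
```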
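To give a concrete sense of what lives in the dags folder, here is a minimal, hypothetical DAG sketch. It is not the contents of example-dag-basic; the DAG name, schedule, and task are invented for illustration, assuming a recent Airflow 2.x as shipped with Astro Runtime:

```python
from pendulum import datetime

from airflow.decorators import dag, task


@dag(
    start_date=datetime(2024, 1, 1),  # hypothetical start date
    schedule="@daily",                # run once per day
    catchup=False,                    # skip backfilling missed runs
)
def my_first_dag():
    @task
    def say_hello() -> str:
        # A task is a plain Python function decorated with @task.
        return "hello from Airflow"

    say_hello()


my_first_dag()  # instantiating the DAG registers it with Airflow
```

Saving a file like this in the dags folder makes the DAG appear in the Airflow UI the next time the scheduler parses the folder.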
