Airflow Docker Compose environment
Download the docker-compose.yaml file:
curl -LfO --url https://airflow.apache.org/docs/apache-airflow/2.5.1/docker-compose.yaml
Create the directory structure in the project (working) directory:
mkdir -p ./dags ./logs ./plugins
On Linux, set the host user id so files created in the mounted folders are owned by your user:
echo -e "AIRFLOW_UID=$(id -u)" > .env
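The one-liner above just records your numeric user id in a `.env` file that Docker Compose reads. A Python sketch of the same step (written to a temp directory here so it is side-effect free; `os.getuid` is POSIX-only):

```python
import os
import tempfile
from pathlib import Path

# Equivalent of: echo -e "AIRFLOW_UID=$(id -u)" > .env
# os.getuid() returns the numeric user id of the current user.
workdir = Path(tempfile.mkdtemp())
env_file = workdir / ".env"
env_file.write_text(f"AIRFLOW_UID={os.getuid()}\n")
```

In the real setup the file must be named `.env` and live next to docker-compose.yaml.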
Run the following to initialise the database:
docker compose up airflow-init
Then start all services (add -d to run in the background):
docker compose up
Access the web interface at http://localhost:8080 (default login: airflow / airflow)
To clean up completely (stop containers, delete volumes with the database data, and remove downloaded images), run:
docker compose down --volumes --rmi all
To reset the environment without removing images:
docker compose down --volumes --remove-orphans
Then remove the entire working directory and start again by downloading the docker-compose.yaml file.
https://airflow.apache.org/docs/apache-airflow/stable/howto/docker-compose/index.html
In the Docker Compose environment the REST API is available at http://localhost:8080. Set ENDPOINT_URL to that address so the examples below can use it:
ENDPOINT_URL="http://localhost:8080"
(https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html)
curl -X GET --user "airflow:airflow" "${ENDPOINT_URL}/health"
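The /health endpoint returns a small JSON document reporting the status of the metadatabase and the scheduler. A sketch of checking it from Python, using a canned sample response of the documented shape (an assumption here, not live output from a running stack):

```python
import json

# Canned sample of what GET ${ENDPOINT_URL}/health typically returns.
sample = """
{
  "metadatabase": {"status": "healthy"},
  "scheduler": {
    "status": "healthy",
    "latest_scheduler_heartbeat": "2023-03-14T14:15:22.000000+00:00"
  }
}
"""

health = json.loads(sample)
# Both components must report "healthy" for the environment to be usable.
all_healthy = all(
    component["status"] == "healthy"
    for component in (health["metadatabase"], health["scheduler"])
)
print("all healthy" if all_healthy else "degraded")
```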
curl -X GET --user "airflow:airflow" "${ENDPOINT_URL}/api/v1/pools"
curl -X GET --user "airflow:airflow" "${ENDPOINT_URL}/api/v1/dags"
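curl's --user flag is plain HTTP Basic auth, so the same GET requests can be made from Python's standard library. A minimal sketch that only builds the request (actually sending it assumes the compose stack is up at localhost:8080):

```python
import base64
import urllib.request

ENDPOINT_URL = "http://localhost:8080"   # as in the curl examples above
USER, PASSWORD = "airflow", "airflow"    # default credentials of the compose setup

# --user "airflow:airflow" in curl becomes an Authorization: Basic header.
token = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
req = urllib.request.Request(
    f"{ENDPOINT_URL}/api/v1/dags",
    headers={"Authorization": f"Basic {token}"},
)

# To actually send it (requires the stack to be running):
# with urllib.request.urlopen(req) as resp:
#     print(resp.read().decode())
```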
curl -X POST "${ENDPOINT_URL}/api/v1/dags/tutorial_dag/dagRuns" \
  --user "airflow:airflow" \
  -H 'Content-Type: application/json' \
  -d '{
    "dag_run_id": "test_run",
    "logical_date": "2023-03-14T14:15:22Z",
    "conf": { }
  }'
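The trigger request above is a plain JSON POST. A sketch of building the same request from Python (again only constructing it, since sending assumes a running stack with a tutorial_dag loaded):

```python
import json
import urllib.request

ENDPOINT_URL = "http://localhost:8080"

# Same body as the curl -d example above.
payload = {
    "dag_run_id": "test_run",
    "logical_date": "2023-03-14T14:15:22Z",
    "conf": {},
}
body = json.dumps(payload).encode()

req = urllib.request.Request(
    f"{ENDPOINT_URL}/api/v1/dags/tutorial_dag/dagRuns",
    data=body,
    method="POST",
    headers={"Content-Type": "application/json"},
)
# The Basic auth header would be added exactly as in the GET example.
```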
CLI commands can also be run inside a container, e.g.:
docker compose run airflow-worker airflow info
- download the bash shortcut script
curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.5.1/airflow.sh'
chmod +x airflow.sh
- try the following commands
./airflow.sh info
./airflow.sh bash
./airflow.sh python
To view a DAG definition, you need its dag_id: find it in the web UI, or, for your own DAGs, in the Python definition file in the dags directory. Then run:
airflow dags show <dag_id>