Running Apache Airflow Using Docker Compose

As DevOps engineers, we put automation and orchestration at the core of what we do. Recently, I set up Apache Airflow using Docker Compose, and I thought I'd document the process to help fellow learners and engineers.

What is Apache Airflow?

Apache Airflow is an open-source platform to programmatically author, schedule, and monitor workflows. It’s widely used in data engineering and ETL pipelines to automate complex tasks.

But installing and managing Airflow manually can be overwhelming. That’s where Docker Compose helps—making it easy to spin up all required services with a single command.


🐳 Why Use Docker Compose for Airflow?

  • No manual configuration hassle

  • Isolated containerized setup

  • Easy to scale and manage

  • Replicable across environments


🧱 Docker Compose Setup for Airflow

Here’s a minimal docker-compose.yml that runs the Airflow webserver and scheduler with PostgreSQL as the metadata database, plus a one-off init service that migrates the database and creates the admin user:

services:
  postgres:
    image: postgres:16                 # pinned for reproducible setups
    container_name: postgres
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

  # One-off job: migrates the metadata database and creates the admin user,
  # then exits. The webserver and scheduler simply restart until it has run.
  airflow-init:
    image: apache/airflow:2.9.3
    container_name: airflow-init
    depends_on:
      - postgres
    environment:
      AIRFLOW__CORE__EXECUTOR: LocalExecutor
      AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
      AIRFLOW__CORE__FERNET_KEY: 'e55K3HQpCQA_MhXo96Ch-CIIC2LqRWdCZkrq9va76cw='   # sample key; generate your own (see below)
      AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
      _AIRFLOW_DB_MIGRATE: 'true'          # run the schema migrations on start
      _AIRFLOW_WWW_USER_CREATE: 'true'     # create the UI user defined below
      _AIRFLOW_WWW_USER_USERNAME: airflow
      _AIRFLOW_WWW_USER_PASSWORD: airflow
    command: version

  airflow-webserver:
    image: apache/airflow:2.9.3          # pinned to 2.x; "latest" may pull Airflow 3, which replaces the webserver command
    container_name: airflow-webserver
    restart: always                      # retried automatically if it comes up before airflow-init has finished
    depends_on:
      - postgres
      - airflow-init
    environment:
      AIRFLOW__CORE__EXECUTOR: LocalExecutor
      AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres:5432/airflow   # Airflow 2.3+ reads this from the [database] section
      AIRFLOW__CORE__FERNET_KEY: 'e55K3HQpCQA_MhXo96Ch-CIIC2LqRWdCZkrq9va76cw='
      AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    ports:
      - "8080:8080"
    command: webserver
    volumes:
      - airflow_dags:/opt/airflow/dags
      - airflow_logs:/opt/airflow/logs
      - airflow_config:/opt/airflow/config

  airflow-scheduler:
    image: apache/airflow:2.9.3
    container_name: airflow-scheduler
    restart: always
    depends_on:
      - postgres
      - airflow-init
    environment:
      AIRFLOW__CORE__EXECUTOR: LocalExecutor
      AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres:5432/airflow
      AIRFLOW__CORE__FERNET_KEY: 'e55K3HQpCQA_MhXo96Ch-CIIC2LqRWdCZkrq9va76cw='
      AIRFLOW__CORE__LOAD_EXAMPLES: 'false'
    command: scheduler
    volumes:
      - airflow_dags:/opt/airflow/dags
      - airflow_logs:/opt/airflow/logs
      - airflow_config:/opt/airflow/config

volumes:
  postgres_data:
  airflow_dags:
  airflow_logs:
  airflow_config:
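A note on the Fernet key: the value above is only a sample for local experiments. For anything you care about, generate your own key and set it in every Airflow service in the file. One way to do that with the pinned image (which ships the cryptography package Airflow depends on) is:

docker run --rm apache/airflow:2.9.3 python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"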

⚙️ How to Run It

  • Save the above content as docker-compose.yml

  • Run the following command in the same directory (with the Compose V2 plugin, the equivalent is docker compose up -d); a quick log check follows this list:

docker-compose up -d

  • Access the Airflow UI at:
    http://localhost:8080

  • Log in with the credentials created by the airflow-init service (unless you changed them):

    • Username: airflow

    • Password: airflow
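Before logging in, it helps to confirm that the one-off airflow-init job finished and the long-running services are up. A minimal check, assuming the service names from the compose file above:

docker-compose logs airflow-init   # migration and user-creation output; this container exits when done
docker-compose ps                  # airflow-webserver and airflow-scheduler should show as Up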

Verify Airflow Is Running

Use the command:

docker ps

You should see three running containers: postgres, airflow-webserver, and airflow-scheduler (airflow-init exits once it has finished preparing the database, so it will not appear in the list).
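For a deeper check than docker ps, the webserver exposes a health endpoint that reports the status of both the metadata database and the scheduler. Assuming the default 8080 port mapping from the compose file:

curl http://localhost:8080/health   # returns JSON with "metadatabase" and "scheduler" status fields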

✅ Conclusion

Running Airflow using Docker Compose is a clean and scalable approach, ideal for local testing, learning, or even lightweight production environments. You can easily version control your docker-compose.yml and share it across teams.

This setup is a stepping stone to building powerful ETL workflows, automating data pipelines, and integrating with cloud storage or APIs.
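For example, because the DAG folder is a shared named volume (airflow_dags) in this setup, a quick way to try your first pipeline is to copy a DAG file into one of the Airflow containers; my_dag.py below is just a placeholder for your own file:

docker cp my_dag.py airflow-scheduler:/opt/airflow/dags/
docker exec airflow-scheduler airflow dags list   # the CLI parses the DAG folder and should list the new DAG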

Connect and Follow Me on Social Networks

LINKEDIN | GITHUB | TWITTER

