How I Solved a 2-Day Airflow DAG Import Error in Docker

No module named ‘airflow.providers.postgres.operators’ — and why using SQLExecuteQueryOperator saved the day

The Frustrating Error That Blocked My DAGs

If you’ve ever set up Apache Airflow in Docker and tried to use the Postgres operator in your DAGs, you may have hit this roadblock:

from airflow.providers.postgres.operators.postgres import PostgresOperator

Seems correct, right? But when I ran my project, this was the error I kept seeing in the UI under “DAG Import Errors”:

ModuleNotFoundError: No module named 'airflow.providers.postgres.operators'

This blocked all DAGs from running. I spent 2 full days trying everything from installing Postgres manually inside the container to tweaking import paths, but nothing worked.
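For context, even a minimal DAG along these lines (the dag_id, task, and table here are just illustrative) was enough to trigger it:

from airflow import DAG
from airflow.providers.postgres.operators.postgres import PostgresOperator  # this import fails

with DAG(dag_id="example_postgres_dag", schedule=None) as dag:
    PostgresOperator(
        task_id="create_table",
        postgres_conn_id="postgres_default",
        sql="CREATE TABLE IF NOT EXISTS users (id SERIAL PRIMARY KEY);",
    )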

What Didn’t Work (So You Don’t Waste Time Like I Did)

* Trying to install PostgreSQL binaries locally
* Running pip install apache-airflow-providers-postgres inside a running container (not persistent; the sketch below shows why)
* Changing import paths back and forth
* Modifying the DAG with fallback imports

Despite these efforts, the error persisted.
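An aside on the second attempt: anything pip-installed into a running container lives only in that container's writable layer, so it vanishes the next time Compose recreates the container from the image. Roughly (the container name here is illustrative and depends on your compose file):

docker exec -it airflow-scheduler bash         # open a shell in the running container
pip install apache-airflow-providers-postgres  # installs into this container only
exit
docker-compose up -d --force-recreate          # containers rebuilt from the image: the package is gone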

The Actual Fix: Use the Common SQL Operator

The breakthrough came when I stopped trying to force PostgresOperator to work and instead used the unified operator from the common-sql provider:

from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

This operator abstracts over SQL-based providers such as Postgres, MySQL, MSSQL, and Snowflake. It works with any of them, as long as the matching Airflow connection is defined and passed as conn_id.
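Here's a minimal sketch of the fix inside a DAG (the dag_id and users table are illustrative; postgres_default is Airflow's stock Postgres connection ID):

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

# The task itself is engine-agnostic: only conn_id ties it to Postgres.
with DAG(dag_id="common_sql_example", schedule=None, catchup=False) as dag:
    SQLExecuteQueryOperator(
        task_id="run_query",
        conn_id="postgres_default",
        sql="SELECT * FROM users;",
    )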

My Dockerfile

The providers have to be baked into the image itself, not installed into a live container, so the Dockerfile does the heavy lifting. One note: on the Debian-based Airflow images, the MySQL client headers ship as default-libmysqlclient-dev, not libmysqlclient-dev.

FROM apache/airflow:3.0.1

# System packages: build tools and client libraries the SQL providers compile against
USER root
RUN apt-get update && apt-get install -y \
  build-essential \
  default-libmysqlclient-dev \
  libpq-dev \
  libsasl2-dev \
  unixodbc-dev \
  libffi-dev \
  libssl-dev \
  gcc \
  curl \
  git \
  && apt-get clean

# Drop back to the airflow user before installing Python packages
USER airflow
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt

My requirements.txt

This file made the magic happen:

apache-airflow==3.0.1

# Leave common-sql unpinned so pip resolves a version compatible with the providers below
apache-airflow-providers-common-sql

# Specific providers (these packages don't define self-named extras, so none are listed)
apache-airflow-providers-postgres==5.10.1
apache-airflow-providers-mysql==5.6.1
apache-airflow-providers-microsoft-mssql==3.6.1
apache-airflow-providers-google==10.14.0
apache-airflow-providers-amazon==8.13.0
apache-airflow-providers-snowflake==5.6.0

Final Steps to Rebuild and Launch

After updating the Dockerfile and requirements:

docker-compose build
docker-compose up airflow-init
docker-compose up

Then to verify everything:

docker exec -it airflow-webserver bash
pip freeze | grep airflow

All provider packages were installed and the DAG loaded without errors.
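You can also ask Airflow itself which providers it has registered (swap in your actual container name if it differs):

docker exec -it airflow-webserver airflow providers list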

Summary

Here’s the simplified takeaway for anyone dealing with ModuleNotFoundError for PostgresOperator in Airflow 3:

  • Don’t rely on PostgresOperator directly. Use:

from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

  • Include the correct provider packages in requirements.txt
  • Rebuild your Docker image (not just docker-compose up)
  • Confirm the packages are installed inside your container

If you want your DAGs to support multiple SQL engines without changing code, the SQLExecuteQueryOperator is your best friend. Simply set the conn_id appropriately:

from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

SQLExecuteQueryOperator(
    task_id="run_query",
    conn_id="postgres_default",
    sql="SELECT * FROM users;",
)

Change conn_id to mysql_default or mssql_default and you’re good to go!
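Taking that one step further, one pattern is to build a task per engine in a loop inside your DAG (the connection IDs here are illustrative; use whichever connections you've defined in Airflow):

# One task per engine; the operator and the SQL stay identical.
for conn_id in ["postgres_default", "mysql_default", "mssql_default"]:
    SQLExecuteQueryOperator(
        task_id=f"run_query_{conn_id}",
        conn_id=conn_id,
        sql="SELECT * FROM users;",
    )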

Thanks for Reading

I wrote this because I spent 2 full days searching for this exact fix and couldn't find it clearly explained anywhere.
