
I run Airflow using the official Docker Compose file.

I would like to run a simple command with the Airflow BashOperator to clean my logs:

    clean_database = BashOperator(
        task_id="clean-database",
        bash_command=f'airflow db clean --clean-before-timestamp {clean_before.isoformat()} --yes',
    )

But I got the following error:

    sqlalchemy.exc.ArgumentError: Could not parse SQLAlchemy URL from string 'airflow-db-not-allowed:///'

This looks like an explicit prohibition, even though the command used to work fine in Airflow 2.x.

However, if I issue the command by hand inside the worker container, it works:

    docker exec -it airflow-airflow-worker-1 bash
    airflow db clean --clean-before-timestamp 20250930 --yes

I have cross-checked: the BashOperator runs within the same container, airflow-airflow-worker-1 (they have the same ID).

So I am wondering what is wrong with this BashOperator?

Update

I found the following claim by googling (probably AI-generated), which seems to indicate that something has changed in Airflow 3+:

The error 'airflow-db-not-allowed:///' typically arises in Apache Airflow 3.0+, where direct database access via the ORM (Object Relational Mapper) is no longer permitted. This restriction enforces better practices by requiring users to interact with the metadata database through REST APIs or the Python client instead of direct queries.

Still, it does not explain why I can issue the CLI command by hand yet cannot reproduce that with a BashOperator within the same container.

  • So how do you know that the BashOperator is running where your Airflow is running? It very much depends on the executor. I would do ssh or kubectl exec or nomad exec or docker exec into the context where my Airflow runs, and then run the airflow db commands. Commented Sep 30 at 20:53
  • @KamilCuk, I have updated the context of the command execution. I expected the BashOperator to have the same context as the Airflow Worker, but it seems it is not the case. Commented Oct 1 at 5:56
  • @KamilCuk I have checked: the BashOperator does run in the worker container, which is properly configured, as I can perform a docker exec by hand. Commented Oct 1 at 7:14

1 Answer


It seems to be an intentional limitation of Airflow 3+. Details are available in this discussion.

Here is the recommended baseline as of October 2025:

Recommended Approach

For production, the safest and most maintainable approach is:

  • Keep airflow db clean outside DAGs, running it via cron or as a one-off container job (see the shell sketch below).
  • If you need to automate it from a DAG, use DockerOperator or KubernetesPodOperator with an explicit DB connection, not BashOperator (see the DockerOperator sketch below).
  • Avoid passing destructive commands directly inside DAGs; Airflow 3 intentionally prevents this for safety.
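For the first option, a one-off job against the same Compose stack could look like the sketch below. The service name airflow-worker and the cut-off date are assumptions based on the stock docker-compose.yaml; adapt them to your setup, or put the same line in a host crontab.

    # One-off cleanup outside any DAG, run in a throwaway container of the
    # worker service (service name and date are placeholders).
    docker compose run --rm airflow-worker \
        airflow db clean --clean-before-timestamp 2025-09-01 --yes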
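For the second option, here is a minimal DockerOperator sketch, not a drop-in solution: it assumes the apache-airflow-providers-docker package is installed, the Docker socket is mounted into the worker container, and the image tag, Compose network name and connection string are placeholders that you must replace with your own values.

    from datetime import datetime, timedelta, timezone

    from airflow import DAG
    from airflow.providers.docker.operators.docker import DockerOperator

    # Retention cut-off; note this is evaluated at DAG parse time, which is
    # acceptable for a sketch.
    clean_before = datetime.now(timezone.utc) - timedelta(days=30)

    with DAG(
        dag_id="db_cleanup",
        start_date=datetime(2025, 1, 1),
        schedule="@weekly",
        catchup=False,
    ):
        clean_database = DockerOperator(
            task_id="clean-database",
            image="apache/airflow:3.0.0",  # placeholder image tag
            command=(
                "airflow db clean "
                f"--clean-before-timestamp {clean_before.isoformat()} --yes"
            ),
            environment={
                # Explicit metadata DB connection for the spawned container,
                # which is not subject to the task-level
                # 'airflow-db-not-allowed' restriction.
                # Placeholder credentials from the stock compose file.
                "AIRFLOW__DATABASE__SQL_ALCHEMY_CONN": (
                    "postgresql+psycopg2://airflow:airflow@postgres/airflow"
                ),
            },
            network_mode="airflow_default",  # placeholder compose network name
            docker_url="unix://var/run/docker.sock",
        )

A KubernetesPodOperator version follows the same pattern: run the airflow db clean CLI in its own pod and inject the metadata DB connection explicitly through the environment.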
