I run Airflow using the official Docker Compose file.
I would like to run a simple command with Airflow's BashOperator to clean my logs:
from datetime import datetime, timedelta, timezone
from airflow.providers.standard.operators.bash import BashOperator  # Airflow 3 import path

clean_before = datetime.now(timezone.utc) - timedelta(days=30)  # illustration: keep 30 days

clean_database = BashOperator(
    task_id="clean-database",
    bash_command=f"airflow db clean --clean-before-timestamp {clean_before.isoformat()} --yes",
)
But I got the following error:
sqlalchemy.exc.ArgumentError: Could not parse SQLAlchemy URL from string 'airflow-db-not-allowed:///'
This seems to be explicitly forbidden now, while the same command used to work nicely in Airflow 2.x.
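As an aside, that placeholder URL can never parse: hyphens are not valid in a SQLAlchemy dialect name, so anything that tries to build an engine from it fails up front. A minimal reproduction (assuming any recent SQLAlchemy):

from sqlalchemy import create_engine

# Raises sqlalchemy.exc.ArgumentError:
#   Could not parse SQLAlchemy URL from string 'airflow-db-not-allowed:///'
create_engine("airflow-db-not-allowed:///")

So the traceback in my question is just SQLAlchemy refusing the sentinel value that something substituted for the real connection string.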
Anyway, if I issue the command by hand inside the worker container, it works:
docker exec -it airflow-airflow-worker-1 bash
airflow db clean --clean-before-timestamp 20250930 --yes
I have cross-checked: the BashOperator runs within the same container, airflow-airflow-worker-1 (they have the same id).
So I am wondering: what is wrong with this BashOperator?
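One thing I might try (a sketch; the env_check task and the assumption that the restriction is delivered through the subprocess environment are mine) is to echo the connection string the BashOperator's shell actually sees:

env_check = BashOperator(
    task_id="env-check",
    # If this prints airflow-db-not-allowed:/// while the docker exec shell
    # prints the real connection string, the override is per task, not per container.
    bash_command="echo ${AIRFLOW__DATABASE__SQL_ALCHEMY_CONN:-<unset>}",
)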
Update
While googling, I found the following claim (probably AI generated), which seems to point out that something has changed in Airflow 3+:
The error 'airflow-db-not-allowed:///' typically arises in Apache Airflow 3.0+, where direct database access via the ORM (Object Relational Mapper) is no longer permitted. This restriction enforces better practices by requiring users to interact with the metadata database through REST APIs or the Python client instead of direct queries.
Anyway, it does not explain why I can issue the CLI command by hand but cannot reproduce that with a BashOperator within the same container.
As for the suggestion to ssh, kubectl exec, nomad exec, or docker exec into the context with my Airflow and then run the airflow db commands: the BashOperator does run in the worker container, which is properly configured, as I can perform a docker exec by hand.
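To compare the two contexts directly, the same variable could be inspected from the interactive shell (the variable name follows the standard Airflow config-to-environment mapping; if the compose file sets the connection some other way, this may print nothing):

docker exec -it airflow-airflow-worker-1 bash -c 'env | grep -i sql_alchemy_conn'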