How to import Databricks operators into the Airflow container?

I am trying to run a simple DAG with only a DummyOperator and a Databricks operator (DatabricksRunNowOperator), just to test it. I uploaded the DAG into the Airflow container, but the Databricks operator is not part of the core Airflow package. I installed it locally with pip install apache-airflow-providers-databricks. However, the package is not present in the container, so an import error occurs.

Does anyone know how I can provide the mentioned package to the Airflow container?
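For reference, a minimal DAG of the kind described might look like the sketch below. The DAG id, connection id, and job id are placeholders, not values from the question:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.providers.databricks.operators.databricks import DatabricksRunNowOperator

# Hypothetical DAG id, connection id, and job id -- replace with your own values.
with DAG(
    dag_id="databricks_test_dag",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    start = DummyOperator(task_id="start")

    run_job = DatabricksRunNowOperator(
        task_id="run_databricks_job",
        databricks_conn_id="databricks_default",
        job_id=12345,
    )

    start >> run_job
```

A DAG like this fails to import inside the container unless apache-airflow-providers-databricks is installed there, which is exactly the error described above.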



Solution 1:[1]

If you use Docker Compose, as recommended by the official Airflow documentation on Docker setup, you can specify additional dependencies with the _PIP_ADDITIONAL_REQUIREMENTS environment variable (which can also be put into an .env file in the same folder). For example, I have the following in my testing environment:

_PIP_ADDITIONAL_REQUIREMENTS="apache-airflow-providers-databricks==2.4.0rc1"
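Assuming the standard docker-compose.yaml from the Airflow documentation, one way to apply this is to write the variable into an .env file next to it and recreate the containers (the version pin below is just an example):

```shell
# Write the extra requirement into .env next to docker-compose.yaml
echo '_PIP_ADDITIONAL_REQUIREMENTS=apache-airflow-providers-databricks==2.4.0' >> .env

# Recreate the containers so the variable is picked up
docker compose up -d
```

Note that the Airflow documentation recommends _PIP_ADDITIONAL_REQUIREMENTS only for quick testing, since the packages are reinstalled on every container start; for production use, building a custom image with the provider baked in is the suggested approach.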

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Alex Ott