Airflow DockerOperator workaround for Docker-in-Docker Setup

I deployed Airflow (2.2.4) using Docker and I would like to use the DockerOperator in some of the DAG tasks. I also want to mount a volume into the containers started by those DockerOperator tasks, but it seems that the mounts feature does not work with this Docker-in-Docker setup.

I am trying to think of an elegant workaround, since I really need to get the created data (log files and possibly data quality reports produced by great_expectations) out of this container.

So far I've considered using scp with an SSH key passed in as an environment variable, appended as an extra command to the DockerOperator (roughly as sketched below). I also looked at the XCom feature, but I don't think it fits this purpose. Does anyone else have an idea?
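
To make the scp idea concrete, this is an untested sketch of what I have in mind. Everything specific in it is a placeholder: the SSH_PRIVATE_KEY environment variable name, the deploy@airflow-host destination, and the key path; it also assumes the image ships an OpenSSH client.

# Rough, untested sketch of the scp idea: the key comes in through the
# task's `environment` dict, gets written to a file inside the container,
# and the /logs directory is copied out once the script has finished.
# Host name, paths, and the env var name are placeholders.
scp_suffix = (
    ' && echo "$SSH_PRIVATE_KEY" > /tmp/id_rsa'
    ' && chmod 600 /tmp/id_rsa'
    ' && scp -i /tmp/id_rsa -o StrictHostKeyChecking=no'
    ' -r /logs deploy@airflow-host:/data/airflow/dags/sample-logs'
)

command = f"/bin/bash -c 'python /app/src/main.py connection_check{scp_suffix}'"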

Sample task:

from airflow.providers.docker.operators.docker import DockerOperator
from docker.types import Mount

etl_db_connections_check = DockerOperator(
    task_id="etl_db_connections_check",
    image='sample_image:latest',
    api_version='auto',
    auto_remove=True,
    environment=environment,
    command="/bin/bash -c 'python /app/src/main.py connection_check'",
    docker_url='unix://var/run/docker.sock',
    network_mode='bridge',
    docker_conn_id='private_registry',
    xcom_all=True,
    privileged=True,
    mount_tmp_dir=False,
    mounts=[
        # bind-mount the log directory into the container at /logs, read-write
        Mount(
            source='/data/airflow/dags/sample-logs',
            target='/logs',
            type='bind',
            read_only=False,
        )
    ],
)
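
For context, docker_url points at the host's Docker socket, so my understanding is that the daemon resolves the bind source path on the host itself rather than inside the Airflow container, which is presumably why the mount does not behave as expected here.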

