I have a deployment of Airflow running in a Kubernetes cluster. I deployed it there using the official Helm chart (described here). I manage DAGs using the reco
I'm trying to configure Secret Manager for my Composer (version 1.16, Airflow 1.10), but I'm running into a weird situation, described below. In my Composer, I've used a variable.json fi
I'm trying to compile a DAG from a link to the official Airflow documentation, but got an error on the Airflow dashboard: Broken DAG: [/ua/sv/airflow/dags/example_mysql.py] Traceback (mos
I'm trying to understand the architecture of Airflow on Kubernetes. Using the Helm chart and the Kubernetes executor, the installation mounts three pods called Trigger, WebServer, a
I have a task that returns a tuple. Passing one element of that tuple to another task is not working. I can pass the entire tuple, but not an element from the ret
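A common workaround in the Airflow 2 TaskFlow API: indexing the returned `XComArg` by position pulls by XCom *key*, not tuple index, so it comes back empty. Returning a dict with `multiple_outputs=True` pushes each key as its own XCom, and a single element can then be passed downstream. A minimal sketch (the DAG id, task names, and values are all illustrative):

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule_interval=None, start_date=datetime(2023, 1, 1), catchup=False)
def tuple_xcom_example():
    # multiple_outputs=True pushes "first" and "second" as separate XComs,
    # so a downstream task can receive just one of them.
    @task(multiple_outputs=True)
    def produce():
        return {"first": 1, "second": 2}

    @task
    def consume(value):
        print(value)

    result = produce()
    consume(result["first"])  # passes only the "first" element


tuple_xcom_example()
```

Alternatively, pass the whole tuple and unpack it inside the downstream task.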
I've been noticing that some of the DAG runs for an hourly DAG are being skipped. I checked the log for the DAG run before it started skipping and noticed it h
So I want to use Airflow to display my model training. I created a model in a Python function, and now I want to pass it to another function which will train it.
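XCom values must be serializable (and are JSON-serialized by default in Airflow 2), so pushing a live model object between tasks tends to fail. The usual pattern is to persist the model to storage shared by the workers and pass only the path. A sketch of the two callables, which could be wired up with `PythonOperator` or the TaskFlow API (the function names and the dict standing in for a real model are hypothetical):

```python
import os
import pickle
import tempfile


def create_model(**context):
    """Build the model and persist it; return only the file path via XCom."""
    model = {"kind": "linear", "weights": [0.1, 0.2, 0.3]}  # stand-in model object
    path = os.path.join(tempfile.gettempdir(), "model.pkl")
    with open(path, "wb") as f:
        pickle.dump(model, f)
    return path  # a small string: safe to push as an XCom


def train_model(model_path, **context):
    """Load the persisted model, 'train' it (stand-in logic), and save it back."""
    with open(model_path, "rb") as f:
        model = pickle.load(f)
    model["trained"] = True
    with open(model_path, "wb") as f:
        pickle.dump(model, f)
    return model_path
```

On a multi-node deployment the temp directory would need to be replaced by storage all workers can reach (NFS, S3, GCS, etc.).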
I am trying to make the Airflow KubernetesPodOperator work with minikube, but unfortunately the operator does not find the Kubernetes cluster. The DAG returned
I am using Airflow 1.10. I would like to add images to the email sent by the EmailOperator. I saw that I can attach files, and that I can send HTML content o
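The EmailOperator's `files` parameter attaches images but does not set a Content-ID, so they won't render inline in the HTML body. One approach is to build the message yourself with the standard-library `email.mime` classes inside a PythonOperator callable and send it via `smtplib`: a `multipart/related` message whose HTML references the image through a `cid:` URL. A sketch of just the message construction (the `chart1` id and helper name are illustrative):

```python
from email.mime.image import MIMEImage
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText


def build_email_with_inline_image(image_bytes):
    """Build a multipart/related message whose HTML body references the
    image through a Content-ID, so it renders inline rather than as an
    attachment."""
    msg = MIMEMultipart("related")

    html = '<html><body><p>Report:</p><img src="cid:chart1"></body></html>'
    msg.attach(MIMEText(html, "html"))

    img = MIMEImage(image_bytes, _subtype="png")
    img.add_header("Content-ID", "<chart1>")       # matches the cid: reference
    img.add_header("Content-Disposition", "inline")
    msg.attach(img)
    return msg
```

The resulting message can be sent with `smtplib.SMTP(...).send_message(msg)` after setting `From`, `To`, and `Subject` headers.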
I am running Airflow v1.10.15 on Cloud Composer v1.16.16. My DAG looks like this: from datetime import datetime, timedelta # imports from airflow import DAG fr
When I create a dummy DAG following the Apache guide to running Airflow in Docker and run docker-compose up, the webserver container repeatedly fails and restarts with
I am trying to run a few DAGs using Airflow 2.0.2, and I want to install all requirements from this file (https://github.com/aws/aws-mwaa-local-runner/blob/main/d
Let's imagine the response from the server { "overriding_parameters": { "jar_params": [ "{\"aggregationType\":\"Type1\",\"startDate\":\"
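The `jar_params` entries above are themselves JSON-encoded strings inside the outer JSON document, so they need a second pass through `json.loads`. A sketch with the truncated response reconstructed (the `startDate` value is hypothetical, filled in only to make the example parse):

```python
import json

# Outer document is JSON; each jar_params entry is a JSON string in turn.
# The startDate value is hypothetical -- the original is truncated.
response_text = """
{
  "overriding_parameters": {
    "jar_params": [
      "{\\"aggregationType\\":\\"Type1\\",\\"startDate\\":\\"2022-01-01\\"}"
    ]
  }
}
"""

response = json.loads(response_text)                       # first pass: outer JSON
inner = [
    json.loads(p)                                          # second pass: each entry
    for p in response["overriding_parameters"]["jar_params"]
]
print(inner[0]["aggregationType"])
```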
I need help in passing parameters (XComs pushed from a previous task) to a SQL query in a .sql file. However, I am unable to do so using the "parameters" option,
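Since `sql` is a templated field and `.sql` is a templated file extension for the SQL operators, the XCom pull can live inside the file itself rather than going through `parameters` (which is not a templated field on every operator/version). A sketch assuming the Postgres provider, with hypothetical task ids, file names, and connection id:

```python
# dags/sql/insert_stats.sql -- rendered through Jinja before execution:
#
#   INSERT INTO stats (run_date, row_count)
#   VALUES ('{{ ds }}', {{ ti.xcom_pull(task_ids="count_rows") }});

from airflow.providers.postgres.operators.postgres import PostgresOperator

load_stats = PostgresOperator(
    task_id="load_stats",
    postgres_conn_id="my_postgres",  # assumed connection id
    sql="sql/insert_stats.sql",      # path relative to the DAG folder
)
```

String interpolation via Jinja is fine for trusted values like these; for untrusted input, bound `parameters` (where the operator supports them) are safer.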
Error - *** Failed to verify remote log exists s3://airflow_test/airflow-logs/demo/task1/2022-05-13T18:20:45.561269+00:00/1.log. An error occurred (403) when ca
Can someone provide a YAML file for the setup mentioned above? I need it for a project. I am trying to execute my tasks in parallel on each core of the workers, as
Is it possible to pass an XCom to an operator parameter without using a Jinja template? I have a dict stored in an XCom and I need to pass it to an Operator tha
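In Airflow 2 this is what `XComArg` is for: an operator's `.output` attribute can be passed directly as a parameter, and Airflow resolves it to the pushed value at runtime, with no `{{ ti.xcom_pull(...) }}` template string and no JSON round-trip through a rendered string. A sketch with hypothetical DAG/task names:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def produce(**context):
    return {"user_id": 42, "active": True}


def consume(payload, **context):
    print(payload["user_id"])  # payload arrives as a real dict


with DAG(
    "xcom_without_jinja",
    start_date=datetime(2023, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    produce_task = PythonOperator(task_id="produce", python_callable=produce)

    # .output is an XComArg; passing it also wires the dependency
    # produce_task >> consume_task automatically.
    consume_task = PythonOperator(
        task_id="consume",
        python_callable=consume,
        op_args=[produce_task.output],
    )
```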
Let's say I have the following dummy DAG defined as below: @dag(default_args=default_args, schedule_interval=None, start_date=days_ago(2)) def airflow_ta
I have a DAG that should check whether a file has been uploaded to Azure Data Lake in a specific directory. If so, it should allow other DAGs to run. I thought about using a
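One way to structure this, sketched for Airflow 2: poll with a `PythonSensor` and fan out with `TriggerDagRunOperator`. The `check_file_exists` helper below is hypothetical; it would wrap whichever Azure Data Lake hook matches the installed provider version, and the DAG ids are placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.trigger_dagrun import TriggerDagRunOperator
from airflow.sensors.python import PythonSensor


def check_file_exists() -> bool:
    # Hypothetical helper: replace with the appropriate Azure Data Lake
    # hook call for your provider version; return True once the file exists.
    return False


with DAG(
    "wait_for_datalake_file",
    start_date=datetime(2023, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    wait_for_file = PythonSensor(
        task_id="wait_for_file",
        python_callable=check_file_exists,
        poke_interval=60,      # re-check every minute
        timeout=60 * 60,       # give up after an hour
        mode="reschedule",     # free the worker slot between pokes
    )

    trigger_downstream = TriggerDagRunOperator(
        task_id="trigger_downstream",
        trigger_dag_id="downstream_dag",  # placeholder DAG id
    )

    wait_for_file >> trigger_downstream
```

`mode="reschedule"` matters for long waits: the sensor releases its worker slot between pokes instead of blocking it.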
I have the following SimpleHttpOperator inside my dag: extracting_user = SimpleHttpOperator( task_id='extracting_user', http_conn_id='user_api',