Category "airflow"

How to automatically refresh DAGs through the Docker image in a Helm-chart-deployed Airflow

I have a deployment of Airflow running in a Kubernetes cluster. I deployed it there using the official Helm chart (described here). I manage DAGs using the reco

GCP-Cloud Composer: Secret Manager access variable.json

I'm trying to configure Secret Manager for my Composer (v1.16, Airflow 1.10), but I'm running into a weird situation, described below. In my Composer, I've used a variable.json fi

Broken DAG when trying to import MySqlOperator

Trying to compile the DAG from the linked official Airflow documentation, I got this error on the Airflow dashboard: Broken DAG: [/ua/sv/airflow/dags/example_mysql.py] Traceback (mos
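
The usual cause, assuming Airflow 2.x: MySqlOperator moved out of Airflow core into the MySQL provider package, so it must be installed separately (pip install apache-airflow-providers-mysql) and imported from the provider path. A minimal sketch; the connection id is an assumption:

from datetime import datetime

from airflow import DAG
from airflow.providers.mysql.operators.mysql import MySqlOperator  # provider import, not airflow.operators

with DAG(
    dag_id="example_mysql",
    start_date=datetime(2022, 1, 1),
    schedule_interval=None,
    catchup=False,
) as dag:
    drop_table = MySqlOperator(
        task_id="drop_table",
        mysql_conn_id="mysql_default",  # assumption: a MySQL connection defined in Airflow
        sql="DROP TABLE IF EXISTS table_name;",
    )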

Airflow Kubernetes architecture understanding

I'm trying to understand the architecture of Airflow on Kubernetes. Using the Helm chart and the Kubernetes executor, the installation creates three pods: triggerer, webserver, a

Issue with passing the return value from one task as an argument to another task

I have a task that returns a tuple. Passing one element of that tuple to another task is not working. I can pass the entire tuple, but not an element from the ret
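
The safest workaround I know of, assuming the Airflow 2 TaskFlow API: indexing a task's output directly (e.g. make_pair()[0]) appears to change the XCom key being pulled rather than indexing into the returned tuple, so unpack the element inside a small intermediate task instead. All task names here are hypothetical:

from airflow.decorators import dag, task
from airflow.utils.dates import days_ago

@dag(schedule_interval=None, start_date=days_ago(2))
def unpack_tuple_example():
    @task
    def make_pair():
        return ("model-v1", 0.93)  # the whole tuple serializes to XCom fine

    @task
    def first_element(pair):
        return pair[0]  # index inside a task, where pair is a real value

    @task
    def consume(value):
        print(f"got {value}")

    consume(first_element(make_pair()))

unpack_tuple_dag = unpack_tuple_example()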

Airflow 2.0.2 - Hourly DAG getting stuck, seeing "Refreshing TaskInstance" repeatedly

I've been noticing that some of the DAG runs for an hourly DAG are being skipped. I checked the log for the DAG run before it started skipping and noticed it h

Passing a trained model to another function in Airflow

So I want to use Airflow to display my model training. I created a model in a Python function, and now I want to pass it to another function which will train it.
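
XCom lives in the metadata database, so handing a whole model object from task to task is fragile (and each task may run in a separate process or pod). A common pattern is to persist the model to storage every worker can reach and pass only the path through XCom. A hedged sketch; the path and the scikit-learn model are assumptions:

import pickle

from airflow.decorators import dag, task
from airflow.utils.dates import days_ago

MODEL_PATH = "/tmp/model.pkl"  # assumption: storage visible to all workers

@dag(schedule_interval=None, start_date=days_ago(2))
def model_handoff():
    @task
    def build_model() -> str:
        from sklearn.linear_model import LogisticRegression  # assumption: sklearn is installed
        model = LogisticRegression()
        with open(MODEL_PATH, "wb") as f:
            pickle.dump(model, f)
        return MODEL_PATH  # only the small path string goes through XCom

    @task
    def train_model(path: str) -> str:
        with open(path, "rb") as f:
            model = pickle.load(f)
        # model.fit(X, y) would go here
        return path

    train_model(build_model())

model_handoff_dag = model_handoff()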

Accessing minikube via the KubernetesPodOperator in Airflow

I am trying to make the Airflow KubernetesPodOperator work with minikube, but unfortunately the operator does not find the Kubernetes cluster. The DAG returned
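
One plausible fix, assuming the cncf.kubernetes provider: when Airflow runs outside the cluster (e.g. on the host next to minikube), the operator has to be told explicitly where the kubeconfig is and which context to use. The kubeconfig path below is an assumption:

from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import (
    KubernetesPodOperator,  # import path varies a little across provider versions
)

run_in_minikube = KubernetesPodOperator(
    task_id="run_in_minikube",
    name="hello-pod",
    namespace="default",
    image="python:3.9-slim",
    cmds=["python", "-c", "print('hello from minikube')"],
    in_cluster=False,                       # Airflow itself is not inside the cluster
    config_file="/home/user/.kube/config",  # assumption: local kubeconfig path
    cluster_context="minikube",             # the context created by `minikube start`
    is_delete_operator_pod=True,            # clean the pod up afterwards
)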

Adding images to the email operator in Airflow

I am using Airflow 1.10. I would like to add images to the emails sent from the email operator. I saw that I can attach files and that I can send HTML content o
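
If I recall correctly, Airflow's SMTP email backend sets a Content-ID header (the file's basename) on each attachment, so an attached image can be displayed inline from the HTML body via a cid: reference. A sketch under that assumption; the address and file path are hypothetical:

from airflow.operators.email_operator import EmailOperator  # Airflow 1.10 import path

notify = EmailOperator(
    task_id="notify",
    to="team@example.com",  # assumption
    subject="Report with chart",
    # Reference the attachment by its basename through a cid: URL; this
    # relies on the backend adding a Content-ID per attached file.
    html_content='<p>Latest chart:</p><img src="cid:chart.png">',
    files=["/tmp/chart.png"],  # assumption: path readable by the worker
)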

Airflow DAG fails in a PythonOperator with error "Negsignal.SIGKILL"

I am running Airflow v1.10.15 on Cloud Composer v1.16.16. My DAG looks like this: from datetime import datetime, timedelta # imports from airflow import DAG fr

Why can't my Airflow webserver initialize properly?

When I create a dummy DAG following the Apache guide to Airflow in Docker and run docker-compose up, the webserver container repeatedly fails and restarts with

Cannot install alembic==1.5.8 because these package versions have conflicting dependencies

I am trying to run a few DAGs using Airflow 2.0.2, and I want to install all requirements from this file (https://github.com/aws/aws-mwaa-local-runner/blob/main/d

Valid JSON gives "JSONDecodeError: Expecting ',' delimiter" when loaded from a variable but not from a raw string

Let's imagine the response from the server: { "overriding_parameters": { "jar_params": [ "{\"aggregationType\":\"Type1\",\"startDate\":\"
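
The excerpt shows JSON-within-JSON: each jar_params entry is itself a JSON document encoded as a string, so it needs its own second json.loads pass, and losing one level of escaping when the text is written to an Airflow Variable is the usual source of this "Expecting ',' delimiter" error. A minimal reproduction:

import json

# Same shape as the excerpt: the inner document is a string with escaped quotes.
raw = '{"overriding_parameters": {"jar_params": ["{\\"aggregationType\\":\\"Type1\\"}"]}}'

outer = json.loads(raw)  # first parse: the envelope
inner = json.loads(outer["overriding_parameters"]["jar_params"][0])  # second parse
print(inner["aggregationType"])  # -> Type1

# If the same text is stored in an Airflow Variable, Variable.get(...,
# deserialize_json=True) only performs the first parse; the inner strings
# still need json.loads, and an escape level dropped at write time will
# surface as the JSONDecodeError above.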

How to render a .sql file with parameters in MySqlOperator in Airflow?

I need help passing parameters (an XCom pushed from a previous task) to a SQL query in a .sql file. However, I am unable to do so using the "parameters" option,
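
A sketch of the usual approach, assuming Airflow 2 with the MySQL provider: .sql is in MySqlOperator's template_ext, so the file's contents are rendered by Jinja, and XComs can be pulled inside the file itself. The parameters option, by contrast, does driver-side value binding and never goes through Jinja, which would explain why the XCom doesn't appear there. File name and task ids are hypothetical:

from airflow.providers.mysql.operators.mysql import MySqlOperator

# dags/sql/insert_stats.sql (Jinja is rendered inside the file):
#   INSERT INTO stats (run_date, row_count)
#   VALUES ('{{ ds }}', {{ ti.xcom_pull(task_ids='count_rows') }});

insert_stats = MySqlOperator(
    task_id="insert_stats",
    mysql_conn_id="mysql_default",  # assumption
    sql="sql/insert_stats.sql",     # resolved relative to the DAG folder / template_searchpath
)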

An error occurred (403) when calling the HeadObject operation: Forbidden in airflow (2.0.0)+

Error - *** Failed to verify remote log exists s3://airflow_test/airflow-logs/demo/task1/2022-05-13T18:20:45.561269+00:00/1.log. An error occurred (403) when ca

Docker Compose file for Airflow with DaskExecutor

Can someone provide a YAML file for the setup mentioned above? I need it for a project. I am trying to execute my tasks in parallel on each core of the workers, as

Is it possible to use an XCom from an operator parameter without using Jinja?

Is it possible to pass an XCom to an operator parameter without using a Jinja template? I have a dict stored in an XCom and I need to pass it to an Operator tha
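
One non-Jinja route is to pull the value programmatically inside a python_callable, where it arrives as the deserialized dict rather than a rendered string. A minimal sketch with a hypothetical upstream task id:

from airflow.operators.python import PythonOperator

def consume(ti):
    # Pull the dict straight off XCom; 'produce_dict' is a hypothetical
    # upstream task id for this sketch.
    config = ti.xcom_pull(task_ids="produce_dict")
    print(config["some_key"])  # a real dict, no Jinja rendering involved

consume_task = PythonOperator(
    task_id="consume",
    python_callable=consume,
)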

Defining complex workflow dependencies with the Airflow 2.0 TaskFlow API

Let's say I have the following dummy DAG, defined as below: @dag(default_args=default_args, schedule_interval=None, start_date=days_ago(2)) def airflow_ta
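
For reference, a small TaskFlow sketch showing the two styles of dependency side by side: data dependencies implied by passing task outputs, plus an explicit >> for tasks that share no data. All task names are made up:

from airflow.decorators import dag, task
from airflow.utils.dates import days_ago

@dag(schedule_interval=None, start_date=days_ago(2))
def complex_deps():
    @task
    def extract():
        return {"rows": 10}

    @task
    def transform(payload):
        return payload["rows"] * 2

    @task
    def audit():
        print("independent check")

    @task
    def load(value):
        print(f"loading {value}")

    loaded = load(transform(extract()))  # implicit: extract -> transform -> load
    audit() >> loaded                    # explicit ordering, no data passed

complex_deps_dag = complex_deps()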

What is the best way to check if a file exists on Azure Data Lake using Apache Airflow?

I have a DAG that should check whether a file has been uploaded to Azure Data Lake in a specific directory. If so, it allows other DAGs to run. I thought about using a
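
One option is a PythonSensor wrapping AzureDataLakeHook.check_for_file from the Microsoft Azure provider. A hedged sketch: the connection id, the path, and the hook's module location (which has moved between provider versions) are all assumptions here:

from airflow.sensors.python import PythonSensor

def adls_file_exists() -> bool:
    from airflow.providers.microsoft.azure.hooks.data_lake import (
        AzureDataLakeHook,  # assumption: module path differs in older provider versions
    )
    hook = AzureDataLakeHook(azure_data_lake_conn_id="azure_data_lake_default")
    return hook.check_for_file("landing/input/data.csv")  # hypothetical path

wait_for_file = PythonSensor(
    task_id="wait_for_file",
    python_callable=adls_file_exists,
    poke_interval=60,    # seconds between checks
    timeout=60 * 60,     # give up after an hour
    mode="reschedule",   # free the worker slot between pokes
)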

Airflow SimpleHttpOperator is not pushing to XCom

I have the following SimpleHttpOperator inside my DAG: extracting_user = SimpleHttpOperator( task_id='extracting_user', http_conn_id='user_api',
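
For comparison, a sketch of a SimpleHttpOperator that should push its response to XCom, assuming Airflow 2 with the HTTP provider: the response body is pushed only when do_xcom_push is true (the default, but worth setting explicitly), and response_filter can shape what gets stored. The endpoint is an assumption, since the excerpt is truncated:

from airflow.providers.http.operators.http import SimpleHttpOperator

extracting_user = SimpleHttpOperator(
    task_id="extracting_user",
    http_conn_id="user_api",
    endpoint="api/",  # assumption: the excerpt cuts off here
    method="GET",
    do_xcom_push=True,  # the return value lands in XCom under key 'return_value'
    response_filter=lambda response: response.json(),  # store parsed JSON, not raw text
    log_response=True,
)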