I've been noticing that some of the DAG runs for an hourly DAG are being skipped. I checked the log for the DAG run before it started skipping and noticed it h
So I want to use Airflow to display my model training. I created a model in a Python function and now I want to pass it to another function, which will train it.
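By default XCom only carries small JSON-serializable values, so a minimal TaskFlow-style sketch of one way to hand the model over is to serialize it to a file in the build task and pass only the path; the task names, the /tmp path, and the scikit-learn model are illustrative assumptions.

    import pickle
    from datetime import datetime

    from airflow.decorators import dag, task


    @dag(schedule_interval=None, start_date=datetime(2022, 1, 1), catchup=False)
    def model_training_pipeline():

        @task
        def build_model() -> str:
            from sklearn.linear_model import LogisticRegression  # illustrative model
            model = LogisticRegression()
            path = "/tmp/model.pkl"  # hypothetical location reachable by both tasks
            with open(path, "wb") as f:
                pickle.dump(model, f)
            return path  # only the JSON-serializable path travels through XCom

        @task
        def train_model(path: str):
            with open(path, "rb") as f:
                model = pickle.load(f)
            model.fit([[0.0], [1.0]], [0, 1])  # dummy data just to show the training call

        train_model(build_model())


    model_training_pipeline()

Passing the raw model object through XCom directly would require enabling XCom pickling or a custom XCom backend, which is why the sketch exchanges a path instead.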
I am trying to make the Airflow KubernetesPodOperator work with minikube, but unfortunately the operator does not find the Kubernetes cluster. The DAG returned
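A minimal sketch of pointing the operator at a local minikube cluster by giving it an explicit kubeconfig instead of relying on in-cluster discovery; the kubeconfig path is an assumption and the task is meant to sit inside a DAG definition.

    from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator

    run_in_minikube = KubernetesPodOperator(
        task_id="run_in_minikube",
        name="hello-pod",
        namespace="default",
        image="python:3.9-slim",
        cmds=["python", "-c", "print('hello from minikube')"],
        in_cluster=False,                          # Airflow itself runs outside the cluster
        config_file="/home/airflow/.kube/config",  # hypothetical path to the minikube kubeconfig
        cluster_context="minikube",                # context name created by `minikube start`
        get_logs=True,
        is_delete_operator_pod=True,
    )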
I am using Airflow 1.10. I would like to add images to the email sent from the EmailOperator. I saw that I can attach files, and that I can send HTML content o
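A minimal sketch, assuming Airflow 1.10 with SMTP already configured: the image file is attached via files and the HTML body references a hosted copy of the same image; whether attachments render inline depends on the recipient's mail client, and the path and URL are placeholders.

    from airflow.operators.email_operator import EmailOperator  # Airflow 1.10 import path

    send_report = EmailOperator(
        task_id="send_report",
        to="team@example.com",
        subject="Daily report",
        html_content='<p>Latest chart:</p><img src="https://example.com/chart.png">',
        files=["/tmp/chart.png"],  # hypothetical local file, delivered as an attachment
    )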
I am running Airflow v1.10.15 on Cloud Composer v1.16.16. My DAG looks like this: from datetime import datetime, timedelta # imports from airflow import DAG fr
When I create a dummy DAG following the Apache guide to Airflow in Docker and run docker-compose up, the webserver container repeatedly fails and restarts with
I am trying to run a few days using Airflow 2.0.2 and I want to install all requirements from this file (https://github.com/aws/aws-mwaa-local-runner/blob/main/d
Let's imagine the response from the server is { "overriding_parameters": { "jar_params": [ "{\"aggregationType\":\"Type1\",\"startDate\":\"
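Since jar_params carries a JSON document encoded as a string inside the outer JSON, decoding it takes two passes; a minimal sketch, with the truncated startDate value filled in purely for illustration:

    import json

    response_text = (
        '{"overriding_parameters": {"jar_params": '
        '["{\\"aggregationType\\":\\"Type1\\",\\"startDate\\":\\"2022-01-01\\"}"]}}'
    )

    outer = json.loads(response_text)  # first pass: the server response itself
    inner = json.loads(outer["overriding_parameters"]["jar_params"][0])  # second pass: the embedded string
    print(inner["aggregationType"], inner["startDate"])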
I need help passing parameters (XCom pushed from a previous task) to a SQL query in a .sql file. However, I am unable to do so using the "parameters" option,
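For reference, a minimal sketch of a common alternative, assuming Airflow 2, the Postgres provider, and an upstream task called extract_ids (all illustrative): .sql files are rendered by Jinja, so the XCom can be pulled inside the file itself instead of going through parameters.

    from airflow.providers.postgres.operators.postgres import PostgresOperator

    # dags/sql/load.sql (Jinja-rendered before execution):
    #   INSERT INTO target_table
    #   SELECT * FROM staging
    #   WHERE batch_id = '{{ ti.xcom_pull(task_ids="extract_ids") }}';

    load = PostgresOperator(
        task_id="load",
        postgres_conn_id="my_postgres",  # hypothetical connection id
        sql="sql/load.sql",              # path resolved relative to the DAG file / template_searchpath
    )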
Error - *** Failed to verify remote log exists s3://airflow_test/airflow-logs/demo/task1/2022-05-13T18:20:45.561269+00:00/1.log. An error occurred (403) when ca
Can someone provide a YAML file for the setup mentioned above? I need it for a project. I am trying to execute my tasks in parallel on each core of the workers, as
Is it possible to pass an XCom to an operator parameter without using a Jinja template? I have a dict stored in an XCom and I need to pass it to an Operator tha
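In Airflow 2, one way to skip the Jinja string entirely is to pass the upstream task's .output, an XComArg that is resolved to the stored dict at runtime; a minimal sketch under that assumption, with illustrative task names:

    from airflow.operators.python import PythonOperator

    def produce_config():
        return {"retries": 3, "region": "us-east-1"}  # stored in XCom as return_value

    def consume_config(config):
        print(config["region"])  # receives the actual dict, not a template string

    produce = PythonOperator(task_id="produce", python_callable=produce_config)

    consume = PythonOperator(
        task_id="consume",
        python_callable=consume_config,
        op_args=[produce.output],  # XComArg: injected at runtime, no Jinja template needed
    )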
Let's say I have the following dummy DAG defined as below: @dag(default_args=default_args, schedule_interval=None, start_date=days_ago(2)) def airflow_ta
I have a DAG that should check whether a file has been uploaded to a specific directory in Azure Data Lake. If so, it allows other DAGs to run. I thought about using a
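A minimal sketch of one way this could look, assuming an ADLS Gen2 account (which is blob-compatible) watched with the Azure provider's WasbBlobSensor and a downstream DAG kicked off with TriggerDagRunOperator; the container, blob, connection, and DAG ids are placeholders.

    from airflow.operators.trigger_dagrun import TriggerDagRunOperator
    from airflow.providers.microsoft.azure.sensors.wasb import WasbBlobSensor

    wait_for_file = WasbBlobSensor(
        task_id="wait_for_file",
        wasb_conn_id="azure_datalake",  # hypothetical connection
        container_name="landing",       # placeholder container / filesystem
        blob_name="incoming/data.csv",  # placeholder path being watched
        poke_interval=300,
        timeout=6 * 60 * 60,
    )

    trigger_downstream = TriggerDagRunOperator(
        task_id="trigger_downstream",
        trigger_dag_id="process_uploaded_file",  # placeholder downstream DAG id
    )

    wait_for_file >> trigger_downstream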
I have the following SimpleHttpOperator inside my DAG: extracting_user = SimpleHttpOperator( task_id='extracting_user', http_conn_id='user_api',
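For context, a fully specified version of such a task often looks like the sketch below; the endpoint, method, and response handling are assumptions rather than the code from the question.

    import json

    from airflow.providers.http.operators.http import SimpleHttpOperator

    extracting_user = SimpleHttpOperator(
        task_id="extracting_user",
        http_conn_id="user_api",
        endpoint="api/",                                              # placeholder endpoint
        method="GET",
        response_filter=lambda response: json.loads(response.text),  # push parsed JSON to XCom
        log_response=True,
    )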
My Windows 10 machine has Airflow 1.10.11 installed within WSL 2 (Ubuntu-20.04). I have a BashOperator task that calls an .EXE on Windows (via /mnt/c/... or vi
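A minimal sketch of the two invocation styles commonly tried from inside WSL, with placeholder paths (the executable from the question is not reproduced); whether either works depends on WSL's Windows interop being available to the Airflow worker's shell.

    from airflow.operators.bash_operator import BashOperator  # Airflow 1.10 import path

    # Invoke the executable through its /mnt/c mount (placeholder path).
    run_exe_direct = BashOperator(
        task_id="run_exe_direct",
        bash_command="/mnt/c/Tools/myapp.exe",
    )

    # Or go through the Windows shell; cmd.exe must be reachable on the worker's PATH.
    run_exe_via_cmd = BashOperator(
        task_id="run_exe_via_cmd",
        bash_command='cmd.exe /c "C:\\Tools\\myapp.exe"',
    )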
Hi all, so my DAG actually runs fine and all the outputs are working, but Airflow's UI does not change to success and fails due to the following issue. Reading online
I am working with Airflow 2.2.3 in GCP (Composer) and I am seeing inconsistent behavior that I can't explain when trying to use templated values. When I referen
I have a DAG that I want to run multiple times, say 30. But Airflow can execute only 16 DAG runs in parallel at a time. Suppose one DAG run takes a longer time to execut
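For reference, the per-DAG ceiling on simultaneous runs can be raised with max_active_runs; a minimal sketch with illustrative ids and values, keeping in mind that the global parallelism and max_active_runs_per_dag settings in airflow.cfg still cap the total.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.dummy import DummyOperator

    with DAG(
        dag_id="bulk_run_dag",      # illustrative id
        start_date=datetime(2022, 1, 1),
        schedule_interval=None,
        catchup=False,
        max_active_runs=30,         # allow up to 30 concurrent runs of this DAG
    ) as dag:
        DummyOperator(task_id="work")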
I have two tasks inside a TaskGroup that need to pull XCom values to supply the job_flow_id and step_id. Here's the code: with TaskGroup('execute_my_steps') a
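For comparison, a common shape for this pattern (sketched with assumed upstream task ids and the Amazon provider's EMR operators; import paths vary with the provider version) pulls both values through templated fields, remembering that task ids inside a TaskGroup are prefixed with the group id.

    from airflow.providers.amazon.aws.operators.emr import EmrAddStepsOperator
    from airflow.providers.amazon.aws.sensors.emr import EmrStepSensor
    from airflow.utils.task_group import TaskGroup

    SPARK_STEPS = [{  # illustrative step definition
        "Name": "example-step",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {"Jar": "command-runner.jar", "Args": ["echo", "hello"]},
    }]

    with TaskGroup("execute_my_steps") as execute_my_steps:
        add_steps = EmrAddStepsOperator(
            task_id="add_steps",
            # job flow id pushed by an assumed upstream create_cluster task
            job_flow_id="{{ ti.xcom_pull(task_ids='create_cluster', key='return_value') }}",
            steps=SPARK_STEPS,
        )

        watch_step = EmrStepSensor(
            task_id="watch_step",
            job_flow_id="{{ ti.xcom_pull(task_ids='create_cluster', key='return_value') }}",
            # inside the group the full task id is 'execute_my_steps.add_steps'
            step_id="{{ ti.xcom_pull(task_ids='execute_my_steps.add_steps', key='return_value')[0] }}",
        )

        add_steps >> watch_step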