Airflow: trigger a DAG dynamically multiple times
I have a DAG where data elements are generated and stored in a Redis database. When all elements have been stored, I want to trigger a further processing DAG. Right now, I take the top entry from Redis and pass its key as an argument to my secondary DAG. The secondary DAG then processes this entry and triggers itself again if there are further entries in Redis.
trigger_local_sync = TriggerDagRunOperator(
    task_id="trigger_local_sync",
    trigger_dag_id="local_sync",
    conf={
        "entry_id": "{{ task_instance.xcom_pull(key='entry_id') }}"
    },
)
parse_term_entries >> get_top_entry_id >> trigger_local_sync
However, is it also possible to dynamically trigger the local_sync DAG once for each entry in the Redis database, so that the runs can execute in parallel?
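One way to achieve this on recent Airflow releases is dynamic task mapping (available from Airflow 2.3; note that some earlier 2.x releases rejected a mapped conf on TriggerDagRunOperator). A task collects every pending entry from Redis, and a mapped TriggerDagRunOperator fans out one local_sync run per entry. Below is a minimal sketch, assuming the redis-py client, a hypothetical entry:* key pattern, and local connection details; adapt these to your setup.

from datetime import datetime

from airflow.decorators import dag, task
from airflow.operators.trigger_dagrun import TriggerDagRunOperator


@dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
def fan_out_local_sync():
    @task
    def get_all_entry_ids():
        # Hypothetical: build one conf dict per pending Redis entry.
        import redis

        r = redis.Redis(host="localhost", port=6379)  # assumed connection details
        return [{"entry_id": key.decode()} for key in r.keys("entry:*")]

    # .partial() fixes the static arguments; .expand() maps the operator over
    # the list of conf dicts, creating one trigger task per entry at runtime.
    TriggerDagRunOperator.partial(
        task_id="trigger_local_sync",
        trigger_dag_id="local_sync",
    ).expand(conf=get_all_entry_ids())


fan_out_local_sync()

Each mapped instance triggers its own local_sync run with its own entry_id, so the runs are independent and can execute in parallel, subject to the scheduler's parallelism and pool limits.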
