Is there an alternative to "dag_dir_list_interval" in Airflow to load DAGs from storage into the scheduler?

Hope you are all doing well.

I am using an Airflow instance deployed on Kubernetes with the Helm chart. I set up my DAG folder inside a Rook NFS storage volume, and I need new DAGs to be picked up almost instantly by the Airflow scheduler. Airflow provides a configuration option for this, namely "dag_dir_list_interval". In my configuration I set it to 1, which means the scheduler checks every second whether there is a new DAG file inside the DAG folder.
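For context, here is roughly how I set this in the Helm chart's values file. The exact key layout is an assumption based on the official apache-airflow chart, where entries under "config" are mapped to airflow.cfg sections (and thus to AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL), so treat this as a sketch rather than my literal values file:

```yaml
# values.yaml (sketch) -- in the official apache-airflow Helm chart,
# the "config" section maps to airflow.cfg sections, so this entry
# becomes AIRFLOW__SCHEDULER__DAG_DIR_LIST_INTERVAL in the pods.
config:
  scheduler:
    # Seconds between scans of the DAGs folder for new files.
    # The default is 300; 1 makes the scheduler rescan every second.
    dag_dir_list_interval: 1
```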

It works, but as you can imagine it is not very efficient: it costs a lot in terms of CPU usage.

I wanted to know if there is an alternative to this setting, for example an API call that would let me tell the scheduler "hey, there is a new DAG to be processed" without it checking the NFS storage every second for new files.

Thank you for your suggestions.



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
