How to fix Airflow logging?

I have S3 remote logging enabled, and Airflow is installed on an EC2 instance. My DAGs run, but they don't always create a log, and the task then fails. The error is as follows:

*** Falling back to local log

*** Log file does not exist: /home/ec2-user/airflow/logs/REMOVED/REMOVED/2022-03-07T07:00:00+00:00/2.log

*** Fetching from: http://ip-10-105-32-92.eu-west-1.compute.internal:8793/log/REMOVED/REMOVED/2022-03-07T07:00:00+00:00/2.log

*** Failed to fetch log file from worker. Client error '404 NOT FOUND' for url 'http://ip-10-105-32-92.eu-west-1.compute.internal:8793/log/REMOVED/REMOVED/2022-03-07T07:00:00+00:00/2.log' For more information check: https://httpstatuses.com/404
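For reference, S3 remote logging in Airflow 2.2 is controlled by the `[logging]` section of airflow.cfg. This is a minimal sketch; the bucket name and connection id below are placeholders, not values from the original post:

```ini
[logging]
# Enable shipping task logs to remote storage
remote_logging = True
# Placeholder bucket; replace with your own S3 path
remote_base_log_folder = s3://my-airflow-logs
# Airflow connection id with permissions to write to the bucket
remote_log_conn_id = aws_default
```

If `remote_logging` is enabled but the task log was never written locally (as in the 404 above), there is nothing to upload, so the webserver falls back to fetching from the worker and fails.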

After a few attempts (3-5), the log fetch eventually does end up working.

I have even disabled remote logging in an attempt to debug, but it still doesn't work. Any suggestions?

Airflow version: Apache Airflow 2.2.4

We use a DescribeStacks API call to get the latest ECS task definition, and I've noticed we get lots of these errors:

An error occurred (Throttling) when calling the DescribeStacks operation (reached max retries: 4): Rate exceeded
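The "Rate exceeded" error means the CloudFormation DescribeStacks call is being throttled and exhausting the client's retry budget (4 retries here). One common mitigation is to wrap the call in exponential backoff with jitter. As a sketch, with the wrapper name, backoff parameters, and the `flaky` demo function being my own illustrations rather than anything from the original post:

```python
import random
import time


def call_with_backoff(fn, max_attempts=5, base_delay=1.0):
    """Retry fn() with exponential backoff plus jitter.

    Intended for rate-limited AWS calls such as DescribeStacks,
    which raise a Throttling error under load. In real code you
    would catch botocore's ClientError rather than bare Exception.
    """
    for attempt in range(max_attempts):
        try:
            return fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts; surface the error
            # Sleep base_delay * 2^attempt seconds, plus random jitter
            # so concurrent callers don't retry in lockstep.
            time.sleep(base_delay * (2 ** attempt) + random.uniform(0, 0.5))


# Demo: a call that fails twice with "Rate exceeded", then succeeds.
calls = []


def flaky():
    calls.append(1)
    if len(calls) < 3:
        raise RuntimeError("Rate exceeded")
    return "ok"


print(call_with_backoff(flaky, base_delay=0.01))  # -> ok
```

Alternatively, boto3 clients accept a retry configuration (`botocore.config.Config(retries=...)`) that raises the retry limit and enables adaptive rate limiting, which avoids hand-rolling the loop.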



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
