Spark Submit ExecutorAllocationManager Warning
I'm running a Spark job on an EMR cluster, but I keep getting this warning in the logs.
Is this an important warning, and how can I fix it? I think it has to do with cluster scaling.
Also, after looking in the Spark job history, I found that executors were removed before the job finished.
I run the job with:
spark-submit --master yarn --deploy-mode client --executor-cores 4 --num-executors 7 myJob.py
Also, the job takes over an hour. Is that normal? The job reads a CSV file (~1 GB), fills some empty fields, and writes out a new CSV file.
Source: Stack Overflow, licensed under CC BY-SA 3.0.