How does spark.python.worker.memory relate to spark.executor.memory?
This diagram is quite clear on the relationship between the different YARN and Spark memory-related settings, except when it comes to spark.python.worker.memory.
How does spark.python.worker.memory fit into this memory model?
Are the Python processes governed by spark.executor.memory or yarn.nodemanager.resource.memory-mb?
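For concreteness, here is a hedged sketch of how both settings are typically passed to a PySpark job on YARN (the script name `my_job.py` and the specific values are illustrative, not from the original post):

```shell
# Illustrative spark-submit invocation setting both properties.
# spark.executor.memory sizes the executor JVM heap;
# spark.python.worker.memory is the per-Python-worker threshold
# before aggregation spills to disk.
spark-submit \
  --master yarn \
  --conf spark.executor.memory=4g \
  --conf spark.python.worker.memory=512m \
  my_job.py
```

What is unclear is whether the memory used by those Python worker processes counts against spark.executor.memory, against the YARN container limit, or neither.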
Update
This question explains what the setting does, but it doesn't answer which limit governs the Python workers' memory, or how the setting relates to the other memory settings.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
