Apache Spark: Completed application history deleted after restarting

When I restart the Spark cluster, the history of all completed applications shown in the web UI is deleted. How can I preserve this history across restarts?



Solution 1:[1]

Spark does not persist application history by default. To keep it, enable event logging with spark.eventLog.enabled and point spark.eventLog.dir at a durable location (for example, HDFS):

./bin/spark-submit --class org.apache.spark.examples.SparkPi \
    --master spark://10.129.6.11:7077 \
    --conf spark.eventLog.enabled=true \
    --conf spark.eventLog.dir="hdfs://your path" \
    /home/spark/spark-3.2.1-bin-hadoop3.2/examples/jars/spark-examples_2.12-3.2.1.jar 8
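Note that event logs alone only record the history; to view it after the cluster (and its master UI) restarts, Spark's standalone History Server can read the same directory. A minimal sketch of the relevant settings in conf/spark-defaults.conf, reusing the log directory from the command above (the HDFS path is a placeholder you must fill in):

# conf/spark-defaults.conf -- write event logs and let the History Server read them
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://your path
spark.history.fs.logDirectory    hdfs://your path

Then start the server with ./sbin/start-history-server.sh and browse to port 18080 (the default) on that host. Because the history lives in HDFS rather than in the master's memory, it survives cluster restarts.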

Solution 2:[2]

Don't restart the Spark master at all. Keep the cluster running long-lived and submit queries to it, the way tools such as Zeppelin do, so the in-memory application history is never lost.

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1 Utkarsh I.
Solution 2 Juhong Jung