Getting an error while running the log4j logger in PySpark
I'm trying to set up log4j for PySpark on my local PC.
Spark Binaries Version: spark-3.1.2-bin-hadoop3.2
Python : 3.7
I pass the parameters through C:\spark-3.1.2-bin-hadoop3.2\conf\spark-defaults.conf:
spark.executor.extraJavaOptions -Dlog4j.configuration=file:log4j.properties -Dspark.yarn.app.container.log.dir=app-logs -Dlogfile.name=hello-spark
I also have a log4j.properties file in my project folder and create the logger objects from it, but I get the error below. Where is my configuration wrong? Any help would be appreciated.
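For reference, a minimal log4j.properties that would match the -D options above could look like this. This is a sketch for log4j 1.x (the version bundled with Spark 3.1); the appender and logger names are assumptions, not taken from the original post:

```properties
# Root logger: warnings and above go to the console
log4j.rootCategory=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n

# Application logger: file location is built from the -D system properties
# passed via extraJavaOptions (logger name "hello.spark" is hypothetical)
log4j.logger.hello.spark=INFO, file
log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.File=${spark.yarn.app.container.log.dir}/${logfile.name}.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c: %m%n
```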
22/02/01 10:44:30 ERROR SparkContext: Error initializing SparkContext.
java.lang.Exception: spark.executor.extraJavaOptions is not allowed to set Spark options
(was '-Dlog4j.configuration=file:log4j.properties -Dspark.yarn.app.container.log.dir=app-logs -Dlogfile.name=hello-spark').
Set them directly on a SparkConf or in a properties file when using ./bin/spark-submit.
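The message is triggered because one of the -D flags, -Dspark.yarn.app.container.log.dir, begins with "spark.", and Spark refuses to accept spark.* options smuggled in through extraJavaOptions. One possible fix, assuming the property is only used for substitution inside log4j.properties, is to rename it so it no longer starts with "spark." (the name app.log.dir below is hypothetical) and reference the new name in log4j.properties as well:

```properties
# spark-defaults.conf sketch: only non-Spark -D flags in extraJavaOptions;
# the log-dir property is renamed so it no longer begins with "spark."
spark.driver.extraJavaOptions   -Dlog4j.configuration=file:log4j.properties -Dapp.log.dir=app-logs -Dlogfile.name=hello-spark
spark.executor.extraJavaOptions -Dlog4j.configuration=file:log4j.properties -Dapp.log.dir=app-logs -Dlogfile.name=hello-spark
```

Any occurrence of ${spark.yarn.app.container.log.dir} in log4j.properties would then need to become ${app.log.dir} to match.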
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow