Migrating from log4j 1.x to log4j 2.x

I'm trying to run a Spark job on Spark 3.2.0, and I am migrating from log4j 1.x to log4j 2.x, as my Spark 3.2.0 build uses log4j 2.x instead of log4j 1.x.

Below are the contents of my log4j.properties file:


log4j.appender.stdout.layout.extractFieldsFromMessage=false
log4j.appender.stdout.layout.mdcKeys=*
log4j.appender.stdout.layout=<some-package>
log4j.appender.stdout.Target=System.out
log4j.appender.stdout=org.apache.log4j.ConsoleAppender


log4j.logger.org.apache.parquet=ERROR
log4j.logger.org.apache.spark.executor.CoarseGrainedExecutorBackend=WARN
log4j.logger.org.apache.spark.executor.Executor=WARN
log4j.logger.org.apache.spark.network.client.TransportClientFactory=WARN
log4j.logger.org.apache.spark.scheduler.DAGScheduler=WARN
log4j.logger.org.apache.spark.scheduler.TaskSetManager=WARN
log4j.logger.org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport=WARN
log4j.logger.org.apache.spark.sql.execution.streaming.state=WARN
log4j.logger.org.apache.spark.storage=WARN
log4j.logger.pie.spark.orchestra.driver.rest.controllers.EventLogController=WARN

How can I rewrite the above properties in log4j 2.x syntax?
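From the log4j 2.x documentation, the general mapping seems to be that each log4j 1.x `log4j.logger.<name> = <LEVEL>` line becomes a pair of `logger.<id>.name` / `logger.<id>.level` entries, where `<id>` is an arbitrary label of your choosing. For example:

```properties
# log4j 1.x:
#   log4j.logger.org.apache.parquet = ERROR
# log4j 2.x properties equivalent ("parquet" is just an arbitrary id):
logger.parquet.name = org.apache.parquet
logger.parquet.level = ERROR
```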

I used some references and tried converting to the syntax below, but my Spark job failed with the error that follows.

New syntax used:

rootLogger.level = INFO
logging.packages=<some-package>

rootLogger.appenderRef.stdout.ref = stdout

appender.stdout.type = Console
appender.stdout.name = stdout
appender.stdout.target = SYSTEM_OUT
appender.stdout.layout = <some-package>
appender.stdout.layout.mdcKeys = *
appender.stdout.layout.extractFieldsFromMessage = false

logger.parquet.name = org.apache.parquet
logger.parquet.level = WARN
logger.spark_executor_CoarseGrainedExecutorBackend.name = org.apache.spark.executor.CoarseGrainedExecutorBackend
logger.spark_executor_CoarseGrainedExecutorBackend.level = WARN
logger.spark_executor_Executor.name = org.apache.spark.executor.Executor
logger.spark_executor_Executor.level = WARN
logger.spark_network.name = org.apache.spark.network.client.TransportClientFactory
logger.spark_network.level = WARN
logger.spark_scheduler_DAGScheduler.name = org.apache.spark.scheduler.DAGScheduler
logger.spark_scheduler_DAGScheduler.level = WARN
logger.spark_scheduler_TaskSetManager.name = org.apache.spark.scheduler.TaskSetManager
logger.spark_scheduler_TaskSetManager.level = WARN
logger.spark_sql_execution_datasources.name = org.apache.spark.sql.execution.datasources.parquet.ParquetWriteSupport
logger.spark_sql_execution_datasources.name.level = WARN
logger.spark_sql_execution_streaming.name = org.apache.spark.sql.execution.streaming.state
logger.spark_sql_execution_streaming.level = WARN
logger.spark_storage.name = org.apache.spark.storage
logger.spark_storage.level = WARN
logger.spark_orchestra.name = pie.spark.orchestra.driver.rest.controllers.EventLogController
logger.spark_orchestra.level = WARN

Spark Error:

{"lastObservedDriverPodSummary":{"containerStates":
[{"exitCode":1,"message":"nfigurationBuilder.createAppender(PropertiesConfigurationBuilder.java:222)\n\tat
org.apache.logging.log4j.core.config.properties.PropertiesConfigurationBuilder.build(PropertiesConfigurationBuilder.java:158)\n\tat 
org.apache.logging.log4j.core.config.properties.PropertiesConfigurationFactory.getConfiguration(PropertiesConfigurationFactory.java:56)\n\tat 
org.apache.logging.log4j.core.config.properties.PropertiesConfigurationFactory.getConfiguration(PropertiesConfigurationFactory.java:35)\n\tat 
org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:523)\n\tat 
org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:498)\n\tat 
org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:422)\n\tat 
org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:323)\n\tat 
org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:695)\n\tat 
org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:716)\n\tat 
org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:270)\n\tat 
org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:155)\n\tat 
org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:47)\n\tat org.apache.logging.log4j.LogManager.getContext(LogManager.java:196)\n\tat 
org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(Abst","name":"spark-driver",
"reason":"Error"}],"phase":"Running"}}


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
