Dataproc job Java API method setLoggingConfig has no effect

I'm using a Groovy script with the dependency com.google.cloud:google-cloud-dataproc:2.3.2 and am trying to set the logging config with code like this:

    import com.google.cloud.dataproc.v1.*
    ...
    final def LOGGING_LEVELS = ['com.example.Myclass': LoggingConfig.Level.DEBUG, 'org.apache.spark': LoggingConfig.Level.WARN]
    final def args = 'programArg1 programArg2'
    
    def sparkJob = SparkJob
            .newBuilder()
            .addJarFileUris(jarLocation)
            .setMainClass(className)
            .addAllArgs(args.split(" ") as Iterable<String>)
            .setLoggingConfig(LoggingConfig.newBuilder().putAllDriverLogLevels(LOGGING_LEVELS).build())
            .build()
    def job = Job.newBuilder().setPlacement(jobPlacement).setSparkJob(sparkJob).build()
    ...

but it has no effect. However, when I submit the job via the gcloud utility, it works fine:

    gcloud dataproc jobs submit spark \
        --driver-log-levels com.example.Myclass=DEBUG,org.apache.spark=WARN \
        --project=my-project \
        --cluster=my-cluster \
        --region=us-central1 \
        --class=Myclass \
        --jars=gs://mybucket/Myclass-0.1-SNAPSHOT.jar \
        -- programArg1 programArg2
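One way to double-check that the map built in the script matches what gcloud is sending is to render it in the same `class=LEVEL,...` format that `--driver-log-levels` expects. The helper below is a hypothetical sketch (the method name and the plain-string map are my own; the real API uses `LoggingConfig.Level` enum values), not part of the Dataproc client library:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.stream.Collectors;

public class DriverLogLevels {
    // Hypothetical helper: joins a driver-log-levels map into the
    // "class=LEVEL,class=LEVEL" string that gcloud's
    // --driver-log-levels flag takes, so both submission paths can be
    // compared side by side.
    static String toFlagValue(Map<String, String> levels) {
        return levels.entrySet().stream()
                .map(e -> e.getKey() + "=" + e.getValue())
                .collect(Collectors.joining(","));
    }

    public static void main(String[] args) {
        // Same levels as in the Groovy script above.
        Map<String, String> levels = new LinkedHashMap<>();
        levels.put("com.example.Myclass", "DEBUG");
        levels.put("org.apache.spark", "WARN");
        System.out.println(toFlagValue(levels));
        // prints: com.example.Myclass=DEBUG,org.apache.spark=WARN
    }
}
```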


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
