Object Mapper in Java Spark Application not working as expected

I am new to Spark and have been developing an application using the Spark Java API. In it, I create a Jackson ObjectMapper with certain SerializationInclusion and SerializationFeature settings.

When I run the application locally, it works perfectly. But when I spark-submit it to a cluster in client mode with master set to yarn, it does not work as expected.

import com.fasterxml.jackson.annotation.JsonAutoDetect;
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;

ObjectMapper objectMapper = new ObjectMapper()
  .setSerializationInclusion(JsonInclude.Include.NON_NULL)
  // note: this second call replaces the first; NON_EMPTY also excludes nulls
  .setSerializationInclusion(JsonInclude.Include.NON_EMPTY)
  .configure(DeserializationFeature.FAIL_ON_UNKNOWN_PROPERTIES, false)
  .configure(SerializationFeature.WRITE_EMPTY_JSON_ARRAYS, false)
  .configure(SerializationFeature.INDENT_OUTPUT, false);

objectMapper.setVisibilityChecker(objectMapper.getSerializationConfig()
  .getDefaultVisibilityChecker()
  .withFieldVisibility(JsonAutoDetect.Visibility.ANY)
  .withGetterVisibility(JsonAutoDetect.Visibility.NONE)
  .withSetterVisibility(JsonAutoDetect.Visibility.NONE)
  .withCreatorVisibility(JsonAutoDetect.Visibility.NONE));

I have also tried declaring the mapper inside dataset.map(new MapFunction<T, U>(){}), but with no luck.

Expected: {"id":"1"}    vs.    Actual: {"id":"1","key1":null,"key2":[]}
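For context, the expected output is reproducible locally with plain Jackson alone; here is a minimal sketch (the Record class and its field names are made-up stand-ins for my actual bean):

```java
import com.fasterxml.jackson.annotation.JsonInclude;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.util.Collections;
import java.util.List;

public class InclusionRepro {
    // hypothetical bean mirroring the shape of the JSON above
    public static class Record {
        public String id = "1";
        public String key1 = null;                          // should be omitted
        public List<String> key2 = Collections.emptyList(); // should be omitted
    }

    public static void main(String[] args) throws Exception {
        // NON_EMPTY excludes both null values and empty collections
        ObjectMapper mapper = new ObjectMapper()
                .setSerializationInclusion(JsonInclude.Include.NON_EMPTY);
        System.out.println(mapper.writeValueAsString(new Record()));
        // prints: {"id":"1"}
    }
}
```

Running this on my machine produces the expected {"id":"1"}; only on the cluster do the null and empty fields come back.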

Spark version: 2.3.0, Java: 8, Scala: 2.11



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
