Spring Batch - define reader/writer step scope programmatically without annotation

FIRST VERSION


I would like to define multiple Spring Batch readers/writers programmatically, in particular with step scope.

It is common to define a reader with the @StepScope annotation, but that does not fit my scenario.

AS-IS:

@Bean
public Job exporterJobForLUS149A() {
    Job jobWithStep = buildJobWithStep(LUS149A.class, readerForLUS149A(), writerForLUS149A());
    return jobWithStep;
}

@StepScope
@Bean
public EntityMongoItemReader<LUS149A> readerForLUS149A() {
    final EntityMongoItemReader<LUS149A> reader = reader(LUS149A.class);
    return reader;
}

@StepScope
@Bean
public CommonEntityToFlatFileItemWriter<LUS149A> writerForLUS149A() {
    final CommonEntityToFlatFileItemWriter<LUS149A> writerForLUS149A = writer(LUS149A.class);
    return writerForLUS149A;
}

But I have many jobs (each with its reader and writer) to define (~20), and more could arrive... In general, I would rather not modify the code for each new job/reader/writer I have to provide.

So, the idea, TO-BE (pseudo-code):

for (String entityName : entities) {
    // this resolves the entity class from its name, in some way
    Class<? extends AccountSessionAware> entityClass = buildEntityClass(entitiesDefaultPackage, entityName,
            output.getEntityPackage());
    
    // for readers
    EntityMongoItemReader<? extends AccountSessionAware> reader = buildReader(entityClass);
    GenericBeanDefinition readerBeanDefinition = new GenericBeanDefinition();
    readerBeanDefinition.setBeanClassName(EntityMongoItemReader.class.getName());
    readerBeanDefinition.setInstanceSupplier(() -> reader);
    // setScope accepts the standard BeanDefinition scopes (singleton, prototype, etc.), but I would like to use step scope
    readerBeanDefinition.setScope(STEP_SCOPE_I_HAVE_NOT);
    registry.registerBeanDefinition("readerFor" + entityName, readerBeanDefinition);
    log.info("registered: " + "readerFor" + entityName);

    // for writers
    CommonEntityToFlatFileItemWriter<? extends AccountSessionAware> writer = buildWriter(entityClass);
    GenericBeanDefinition writerBeanDefinition = new GenericBeanDefinition();
    writerBeanDefinition.setBeanClassName(CommonEntityToFlatFileItemWriter.class.getName());
    writerBeanDefinition.setInstanceSupplier(() -> writer);
    // same problem as reader above
    writerBeanDefinition.setScope(STEP_SCOPE_I_HAVE_NOT);
    registry.registerBeanDefinition("writerFor" + entityName, writerBeanDefinition);
    log.info("registered: " + "writerFor" + entityName);


    Job jobWithStep = buildJobWithStep(entityClass, reader, writer);
    GenericBeanDefinition jobBeanDefinition = new GenericBeanDefinition();
    jobBeanDefinition.setBeanClassName(Job.class.getName());
    jobBeanDefinition.setInstanceSupplier(() -> jobWithStep);
    // for Job we have singleton scope 
    jobBeanDefinition.setScope(BeanDefinition.SCOPE_SINGLETON);
    registry.registerBeanDefinition("jobFor" + entityName, jobBeanDefinition);
    log.info("registered: " + "jobFor" + entityName);
}

Below are the methods used to build the reader and writer, common to both the AS-IS and TO-BE scenarios:

private <eASAx extends AccountSessionAware> EntityMongoItemReader<eASAx> reader(final Class<eASAx> entityToReadClass) {
    LinkedHashMap<String, SortType> commonSort = outputFromDomain.getSort().getCommon();
    final LinkedHashMap<String, SortType> furtherSort = csvExporterType.getOutputs().get(entityToReadClass.getSimpleName()).getAdditionalSort();
    final EntityMongoItemReader<eASAx> reader = new EntityMongoItemReader<>(entityToReadClass, readingPageSize, commonSort, furtherSort, mongoTemplate);
    return reader;
}

private <eASAx extends AccountSessionAware> CommonEntityToFlatFileItemWriter<eASAx> writer(final Class<eASAx> eASAxxxClass) {
    final String[] headerColumnNames = csvExporterType.getOutputs().get(eASAxxxClass.getSimpleName()).getColumns().split(ExporterConstants.COMMA);
    final String outputFileName = eASAxxxClass.getSimpleName();

    final EntityToFlatFileItemWriter<eASAx> writer = new EntityToFlatFileItemWriter<>(eASAxxxClass, headerColumnNames, outputDir, outputFileName, fieldsSeparator, writingChunkSize);

    // if required, create multifile writer instance
    if (resourceLimit > 0) {
        final EntityToMultiResourceFlatFileItemWriter<eASAx> entityToMultiResourceFlatFileItemWriter = new EntityToMultiResourceFlatFileItemWriter<>(writer, resourceLimit);
        return entityToMultiResourceFlatFileItemWriter;
    }

    return writer;
}

Unfortunately, I am not able to find a way to specify the step scope on the reader/writer GenericBeanDefinition. I also found the org.springframework.batch.core.scope.StepScope class, which seems to be responsible for applying that scope to @StepScope-annotated beans declared in a @Configuration class that also carries the @EnableBatchProcessing annotation. Unfortunately, I was not able to find a way to use that StepScope either. And I need step scope for the reader/writer because they need some parameters from the job context, such as:

@Value("#{jobParameters['session']}")
private String session;
@Value("#{jobParameters['sort']}")
private boolean toSort;

Without step scope on the reader/writer, those injections will of course not work and I will get null values; likewise, passing data between steps (via the StepExecutionContext) will not work.
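For reference, late binding of job parameters also works when the value is injected as a @Bean method parameter instead of a field. A minimal sketch of that annotation-based variant (the names follow the AS-IS code above; how the values are passed into the reader is up to its constructor):

// Sketch: late binding via method parameters on a step-scoped bean.
// This only works in the "step" scope, because jobParameters exist
// only while a step is running.
@Bean
@StepScope
public EntityMongoItemReader<LUS149A> readerForLUS149A(
        @Value("#{jobParameters['session']}") final String session,
        @Value("#{jobParameters['sort']}") final boolean toSort) {
    // session and toSort are resolved per step execution
    final EntityMongoItemReader<LUS149A> reader = reader(LUS149A.class);
    return reader;
}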

So, any suggestions? A different approach than mine is welcome, of course.


SECOND VERSION (UPDATED)

Now the code looks something like this:

@EnableBatchProcessing
@Configuration
public class ExporterBatchConfigDynamic extends DefaultBatchConfigurer {

  @Autowired
  private GenericApplicationContext genericApplicationContext;

[...]

  @PostConstruct
  public void init() {
    for (String entityName : entities) {

      GenericEntityMongoItemReader reader = reader(entityClass);                 
      genericApplicationContext.registerBean("readerFor" + entityName, GenericEntityMongoItemReader.class, () -> reader, Customizers::stepScoped);

      
      CommonEntityToFlatFileItemWriter<AccountSessionAware> writer = writer(entityClass);
      genericApplicationContext.registerBean("writerFor" + entityName, CommonEntityToFlatFileItemWriter.class, () -> writer, Customizers::stepScoped);

      Job jobWithStep = buildJobWithStep(entityClass, reader, writer, namesToJobsMapping());
      genericApplicationContext.registerBean("jobFor" + entityName, Job.class, () -> jobWithStep, Customizers::jobScoped);
      log.info(entityName + ":: registered: " + "readerFor" + entityName + ", writerFor" + entityName + ", jobFor" + entityName);       
    }
  }
  
  [...]

  public static class Customizers {
    public static void stepScoped(final BeanDefinition bd) {
      bd.setScope(SCOPE_STEP);
    }

    public static void jobScoped(final BeanDefinition bd) {
      bd.setScope(SCOPE_JOB);
    }
  }

  // after this, also the ScopeConfiguration is declared, including Step and Job from comment below
}

Now the readers and writers are (apparently) step-scoped. The application works, but only on the first run: on the second run, the reader reads 0 entries and passes 0 chunks to the writer, which of course writes 0. I know this happens when readers/writers are not step-scoped, so either my Customizer used for bean registration is not working properly, or that registration approach is wrong in general.
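One possible cause, independent of Spring itself: the supplier passed to registerBean captures a single pre-built reader instance, so even if the step scope re-invokes the supplier for each step execution, it always hands back the same already-exhausted object. A plain-Java illustration of that effect (StatefulReader and drain are hypothetical stand-ins for the reader and a step execution):

```java
import java.util.Iterator;
import java.util.List;
import java.util.function.Supplier;

// Hypothetical stand-in for a stateful ItemReader: once exhausted, it stays exhausted.
class StatefulReader {
    private final Iterator<String> items = List.of("a", "b").iterator();

    String read() {
        return items.hasNext() ? items.next() : null;
    }
}

public class SupplierCaptureDemo {

    // Simulates one step execution: ask the "scope" (the supplier) for the
    // reader bean, then read until exhaustion.
    static int drain(final Supplier<StatefulReader> scopedSupplier) {
        final StatefulReader reader = scopedSupplier.get();
        int count = 0;
        while (reader.read() != null) {
            count++;
        }
        return count;
    }

    public static void main(final String[] args) {
        // Supplier capturing one pre-built instance: the second "execution"
        // gets back the same exhausted object and reads 0 items.
        final StatefulReader captured = new StatefulReader();
        System.out.println(drain(() -> captured)); // 2
        System.out.println(drain(() -> captured)); // 0

        // Supplier that builds a fresh instance per request: both read 2.
        System.out.println(drain(StatefulReader::new)); // 2
        System.out.println(drain(StatefulReader::new)); // 2
    }
}
```

If this is the cause, the fix would be to construct a fresh instance inside the supplier, e.g. registering with `() -> reader(entityClass)` instead of capturing the result of `reader(entityClass)` in a local variable first.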

Any further suggestion?

Thanks



Solution 1:[1]

You can use GenericBeanDefinition#setScope(String). So in your case it could be something like:

readerBeanDefinition.setScope("step");

Note that the scope "step" should be registered in the application context as it is not registered by default (it is a custom scope from Spring Batch).
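One way to register it, assuming @EnableBatchProcessing (or the ScopeConfiguration mentioned in the question) has not already done so, is to declare Spring Batch's StepScope as a bean; since StepScope is a BeanFactoryPostProcessor, the @Bean method should be static so it runs before the dynamically registered beans are created. A sketch:

// Sketch: register Spring Batch's "step" scope explicitly.
@Bean
public static org.springframework.batch.core.scope.StepScope stepScope() {
    final org.springframework.batch.core.scope.StepScope stepScope =
            new org.springframework.batch.core.scope.StepScope();
    stepScope.setAutoProxy(true);
    return stepScope;
}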

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1: Mahmoud Ben Hassine