Using parallel stream inside a Spring Batch processor
Hello. We are using Spring Batch to read from a table (Message) and write the modifications to another table (Site). The functionality works correctly, but when a single Message requires more than 1000 Sites to be modified, the process takes too long.
To reduce the processing time, I used multithreading inside the batch processor.
But now I am getting a detached-entity error.
- the batch processor (updateSite changes the values of the Site based on the Message data, but it does not save the entities):

```java
public DataWrapper process(MessageZbusOras messageZbusOras) {
    List<Site> sites = siteService.getSites(messageZbusOras);
    sites.parallelStream().forEach(site -> updateSite(messageZbusOras, site));
    return new DataWrapper(messageZbusOras);
}
```

- the batch writer:

```java
public void write(List<? extends DataWrapper> items) throws Exception {
    List<MessageZbusOras> messageZbusOras = new ArrayList<>();
    for (DataWrapper wrapper : items) {
        messageZbusOras.add(wrapper.getMessageZbusOras());
    }
    messageZbusOrasRepository.saveAll(messageZbusOras);
    messageZbusOrasRepository.flush();
}
```
Apparently, the error occurs when a Site is handled by another thread (one of the parallel stream's worker threads): it is automatically detached from the Hibernate session, so I can't flush the modifications in the writer.
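For comparison, below is a minimal sketch of what Spring Batch's built-in chunk-level multithreading (a TaskExecutor on the step) could look like, where each chunk runs in its own transaction and Hibernate session instead of on the parallel stream's fork/join workers. This is not our actual configuration: the bean names, chunk size, and throttle limit are placeholders, and this setup requires a thread-safe reader.

```java
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.SimpleAsyncTaskExecutor;

@Configuration
public class SiteBatchConfig {

    // Chunk-level multithreading: each chunk (read/process/write) runs on a
    // TaskExecutor thread with its own transaction and Hibernate session,
    // unlike parallelStream(), whose worker threads have no session bound.
    @Bean
    public Step updateSitesStep(StepBuilderFactory stepBuilderFactory,
                                ItemReader<MessageZbusOras> reader,
                                ItemProcessor<MessageZbusOras, DataWrapper> processor,
                                ItemWriter<DataWrapper> writer) {
        return stepBuilderFactory.get("updateSitesStep")
                .<MessageZbusOras, DataWrapper>chunk(10)              // example chunk size
                .reader(reader)
                .processor(processor)
                .writer(writer)
                .taskExecutor(new SimpleAsyncTaskExecutor("site-batch-"))
                .throttleLimit(4)                                     // cap concurrent chunk threads
                .build();
    }
}
```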
RefEvtLibelleCommentaire is an entity linked to Site through a lazily fetched (lazy initialization) association.
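For clarity, the mapping presumably looks something like the sketch below; the exact annotations and association type are an assumption, since only the lazy association is mentioned above.

```java
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.Id;
import javax.persistence.ManyToOne;

@Entity
public class Site {

    @Id
    private Long id;

    // Lazy association: RefEvtLibelleCommentaire is only loaded when accessed,
    // which requires an open Hibernate session on the accessing thread.
    @ManyToOne(fetch = FetchType.LAZY)
    private RefEvtLibelleCommentaire refEvtLibelleCommentaire;

    // getters and setters omitted
}
```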
If you need more clarification, please ask me.
Sources
Source: Stack Overflow, licensed under CC BY-SA 3.0.

