Streaming database result sets to the client response chunk by chunk in an async way

My goal is to fetch the data in chunks, pass each chunk to the response, and keep appending data until all records have been read from the database, using async calls, so that the client keeps receiving results until every record is read from the DB.

I have an API which takes an id as input and fetches data from the database; there are more than 500,000 records for this id. After reading, I process each record to extract the desired fields/data and then return the data in a ResponseEntity. (Current implementation uses Spring Boot, REST API, PostgreSQL.)

Can we do this in an asynchronous way?

API:

@GetMapping(value = "/v1/batch/{batchId}")
public ResponseEntity<?> getBatchById(@PathVariable("batchId") long batchId) {
    log.info("received request for getting batch: {}", batchId);
    try {
        var resp = discountService.getBatchById(batchId);
        var fileInputStream = new InputStreamResource((InputStream) resp.data());
        DateFormat dateFormatter = new SimpleDateFormat("yyyy-MM-dd_HH-mm-ss");
        String currentDateTime = dateFormatter.format(new Date());

        return ResponseEntity.ok()
                             .header(HttpHeaders.CONTENT_DISPOSITION,
                                     "attachment; filename=" + batchId + "_" + currentDateTime + ".csv")
                             .contentType(new MediaType("application", "csv", StandardCharsets.UTF_8))
                             .body(fileInputStream);
    } catch (Exception e) {
        log.error("caught exception while getting batch by id: ", e);
        return ResponseEntity.ok(new Resp<>(new Err("9009", "GENERAL EXCEPTION")));
    }
}
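For reference, Spring MVC can stream a response while rows are still being read by returning a `StreamingResponseBody`: the controller method returns immediately and Spring writes the body on a separate thread, so chunks reach the client as they are produced. Below is a minimal sketch, assuming `spring-webmvc` is on the classpath; the commented-out `discountService.streamBatchTo(...)` call and the `csvLine` helper are hypothetical names, not part of the original code.

```java
import java.io.OutputStream;
import org.springframework.http.HttpHeaders;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody;

@RestController
public class BatchStreamController {

    @GetMapping("/v1/batch/{batchId}/stream")
    public ResponseEntity<StreamingResponseBody> streamBatchById(
            @PathVariable("batchId") long batchId) {
        // The lambda runs on a Spring-managed thread AFTER the response has
        // started, so rows can be written (and flushed) chunk by chunk.
        StreamingResponseBody body = (OutputStream out) -> {
            // Hypothetical service method that writes CSV rows to the stream
            // as it iterates the ResultSet:
            // discountService.streamBatchTo(batchId, out);
        };
        return ResponseEntity.ok()
                .header(HttpHeaders.CONTENT_DISPOSITION,
                        "attachment; filename=" + batchId + ".csv")
                .contentType(new MediaType("text", "csv"))
                .body(body);
    }

    // Pure helper: join fields into one CSV line, quoting fields that
    // contain commas or quotes (RFC 4180-style doubling of quotes).
    static String csvLine(String... fields) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < fields.length; i++) {
            if (i > 0) sb.append(',');
            String f = fields[i];
            if (f.contains(",") || f.contains("\"")) {
                sb.append('"').append(f.replace("\"", "\"\"")).append('"');
            } else {
                sb.append(f);
            }
        }
        return sb.toString();
    }
}
```

Note that the connection and statement must stay open inside the lambda for the whole write, so resource handling moves from the controller into the streaming callback.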

Method which gets the data from the database (PostgreSQL):

public Data getBatchById(long batchId) throws SQLException {
    try (final var pstmt = conn.prepareStatement(SELECT_BATCH_BY_ID)) {
        pstmt.setLong(1, batchId);
        var rs = pstmt.executeQuery();  // execution time for this line is around 60 sec
        if (!rs.next()) {
            log.warn("no batch: {} found", batchId);
            return null;
        }
        return processresult(rs);
    }
}
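A likely cause of the slow `executeQuery()` is that, by default, the PostgreSQL JDBC driver fetches the entire result set into memory before returning. The driver only reads rows through a server-side cursor, in chunks, when autocommit is off and a fetch size is set on the statement. A minimal sketch of chunked reading, assuming the query string and per-row processing are supplied by the caller:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class BatchReader {

    // Reads the batch in chunks of `fetchSize` rows instead of buffering
    // all rows up front. Returns the number of rows seen.
    public static int streamBatch(Connection conn, String sql,
                                  long batchId, int fetchSize) throws SQLException {
        conn.setAutoCommit(false); // required for cursor-based fetching in the pg driver
        int rows = 0;
        try (PreparedStatement pstmt = conn.prepareStatement(sql)) {
            pstmt.setFetchSize(fetchSize); // e.g. 5000: fetch 5000 rows per round trip
            pstmt.setLong(1, batchId);
            try (ResultSet rs = pstmt.executeQuery()) { // returns after the first chunk
                while (rs.next()) {
                    // process one row here, e.g. write a CSV line to the response stream
                    rows++;
                }
            }
        }
        return rows;
    }

    // Pure helper: how many driver round trips a full read takes (ceiling division).
    static int chunkCount(int totalRows, int fetchSize) {
        return (totalRows + fetchSize - 1) / fetchSize;
    }
}
```

With a fetch size of 5,000, reading 500,000 rows takes 100 round trips, but memory stays bounded and the first rows are available almost immediately, which pairs naturally with a streaming response body.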

  • SELECT_BATCH_BY_ID is a simple select query joining 2 tables.
  • I have analyzed the query; it takes hardly 150ns to execute in the query console.
  • But the same query from the Java method call, using a prepared statement to get the result set, takes almost 50-60 sec. (I have 500,000 records in total.)


Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow
