Batch job performance testing - what parameters to measure?
I know about performance testing for an "online" application or API, where we measure response time, throughput, and CPU/memory utilization.
My question is: what are the parameters to measure when performance testing a "batch" job? The job I am talking about is a nightly process that reads a file and updates the database (RDBMS) with new records. What are the criteria for performance testing such batch processes?
Solution 1:[1]
In a batch scenario, the most important performance testing attributes, in my opinion, are the throughput of your tasks (i.e. per worker/thread) and the endurance of the system.
In terms of throughput, you want to identify how much throughput an individual worker can sustain, so that you can accurately size your production batch jobs based on batch size. Also, if the throughput is not at an acceptable level, the system has room for improvement in its logic or I/O (e.g. query performance, indexes, connection pools, and so forth).
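As a rough illustration, here is a minimal sketch of a single-worker throughput measurement in Python. It uses an in-memory SQLite database as a stand-in for the real RDBMS and a synthetic record generator in place of the nightly file; the schema, batch size, and record count are all hypothetical and only there to make the sketch runnable.

```python
import sqlite3
import time

BATCH_SIZE = 1_000          # records per executemany() call (hypothetical tuning knob)
TOTAL_RECORDS = 100_000     # stand-in for the size of the nightly input file

def make_records(n):
    """Stand-in for parsing the nightly input file into (id, name) tuples."""
    return [(i, f"name-{i}") for i in range(n)]

# SQLite in-memory DB as a stand-in for the production RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")

records = make_records(TOTAL_RECORDS)
start = time.perf_counter()
for offset in range(0, TOTAL_RECORDS, BATCH_SIZE):
    batch = records[offset:offset + BATCH_SIZE]
    conn.executemany("INSERT INTO records (id, name) VALUES (?, ?)", batch)
    conn.commit()           # commit per batch; commit frequency is itself a tunable
elapsed = time.perf_counter() - start

print(f"Inserted {TOTAL_RECORDS} records in {elapsed:.2f}s "
      f"({TOTAL_RECORDS / elapsed:,.0f} records/sec)")
conn.close()
```

The records/sec figure this prints is the per-worker throughput you would feed into capacity planning (e.g. "one worker sustains N records/sec, so the nightly file of M records needs roughly M / N seconds, or K workers to finish in the window").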
In terms of endurance, you want to ensure that the batch jobs can run for extended periods with consistent throughput, whatever the batch size. If performance degrades as the batch size grows, that means there are bottlenecks you need to fix before you feed large batches to your system.
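A simple way to check for this kind of degradation is to feed successively larger batches through the same pipeline and log the throughput of each interval. The sketch below reuses the same hypothetical SQLite setup; the batch-size ladder is arbitrary, and in a real test you would use your actual job and production-like data volumes.

```python
import sqlite3
import time

def run_batch(conn, start_id, n):
    """Insert n synthetic records and return the observed records/sec."""
    rows = [(start_id + i, f"name-{start_id + i}") for i in range(n)]
    t0 = time.perf_counter()
    conn.executemany("INSERT INTO records (id, name) VALUES (?, ?)", rows)
    conn.commit()
    return n / (time.perf_counter() - t0)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, name TEXT)")

# Feed successively larger batches; throughput should stay roughly flat.
# A steady drop as the table grows hints at index or I/O bottlenecks.
next_id = 0
for batch_size in (10_000, 50_000, 100_000, 200_000):
    rate = run_batch(conn, next_id, batch_size)
    next_id += batch_size
    print(f"batch of {batch_size:>7,}: {rate:,.0f} records/sec")
conn.close()
```

If the printed rate falls noticeably from one rung of the ladder to the next, that is the degradation signal described above: investigate indexes, query plans, and connection/IO limits before scaling up the batch size in production.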
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow
| Solution | Source |
|---|---|
| Solution 1 | Chinthaka Dharmasiri |
