Query Millions of Rows | PostgreSQL

I'm querying millions of rows of data, and the Tomcat server just drops. The workaround I found was to fetch 1000 rows from the database at a time so as not to overload Tomcat. I am aware of the setFetchSize() method, which I am using, and setAutoCommit() is set to false.

However, it doesn't seem to be working.

Perhaps you can help me with more ideas for retrieving that many rows without hurting performance?
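For context, a minimal sketch of how setFetchSize() is typically combined with setAutoCommit(false) for PostgreSQL is below. The PostgreSQL JDBC driver only uses a cursor-based (batched) fetch when autocommit is off and the result set is forward-only; otherwise it buffers the entire result in memory. The connection URL, credentials, table, and column names here are hypothetical placeholders:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class StreamRows {
    public static void main(String[] args) throws Exception {
        // Hypothetical connection details for illustration.
        String url = "jdbc:postgresql://localhost:5432/mydb";
        try (Connection conn = DriverManager.getConnection(url, "user", "password")) {
            // The PostgreSQL driver only honors the fetch size when
            // autocommit is off: it then opens a server-side cursor and
            // streams rows in batches instead of materializing them all.
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "SELECT id, payload FROM big_table",          // hypothetical table
                    ResultSet.TYPE_FORWARD_ONLY,                   // required for cursor mode
                    ResultSet.CONCUR_READ_ONLY)) {
                ps.setFetchSize(1000); // fetch 1000 rows per round trip
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        process(rs.getLong("id"), rs.getString("payload"));
                    }
                }
            }
            conn.commit(); // close the cursor's transaction cleanly
        }
    }

    private static void process(long id, String payload) {
        // Placeholder for per-row work; keep it cheap and avoid
        // accumulating rows in memory, or the batching gains nothing.
    }
}
```

If memory still spikes with this setup, the usual culprit is application code collecting all rows into a list before processing, which defeats the batched fetch.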



Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
