Hi, I am trying to fetch data from the database into a ResultSet using "executeQuery()". But since the data in that table is large, I am receiving a "java.lang.OutOfMemoryError". To resolve that, I used "setMaxRows()" on my Statement object. This resolved the error, but I don't receive the entire data set. If I call "executeQuery()" again, I receive the same rows. I don't even know of a filtering criterion by which I could fetch different data on each "executeQuery()". How can I resolve this problem?
You are selecting a huge amount of data in one go, so the exception is expected. But will you be using all of it? You could resort to some sort of pagination, if your application permits, and fetch fewer records as and when needed.
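One way to paginate, sketched below under assumptions not stated in the thread (a table with a unique, ordered `id` column, and a database that supports `LIMIT`): remember the last key you saw and ask only for rows beyond it on the next call. The JDBC query in the comment is illustrative; the loop itself is written against a plain function so it can run without a database.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

public class KeysetPagination {
    // With JDBC, each page fetch could look like this (table and column
    // names are made up for illustration):
    //
    //   PreparedStatement ps = conn.prepareStatement(
    //       "SELECT id FROM big_table WHERE id > ? ORDER BY id LIMIT ?");
    //   ps.setLong(1, lastSeenId);
    //   ps.setInt(2, pageSize);
    //   ResultSet rs = ps.executeQuery();
    //
    // Here the page fetch is abstracted as (lastSeenId, pageSize) -> rows.
    static int processAll(BiFunction<Long, Integer, List<Long>> fetchPage,
                          int pageSize) {
        int processed = 0;
        long lastSeenId = 0;                 // assumes ids start above 0
        while (true) {
            List<Long> page = fetchPage.apply(lastSeenId, pageSize);
            if (page.isEmpty()) break;       // no more rows
            for (long id : page) {
                processed++;                 // process the row, then let it go
                lastSeenId = id;             // remember the highest key seen
            }
            if (page.size() < pageSize) break; // short page means last page
        }
        return processed;
    }

    public static void main(String[] args) {
        // Fake "table" of 25 rows with ids 1..25, served 10 at a time.
        List<Long> table = new ArrayList<>();
        for (long i = 1; i <= 25; i++) table.add(i);

        int total = processAll((lastId, size) -> {
            List<Long> page = new ArrayList<>();
            for (long id : table)
                if (id > lastId && page.size() < size) page.add(id);
            return page;
        }, 10);

        System.out.println(total); // prints 25
    }
}
```

Keying the page on the last id seen (rather than `setMaxRows()` alone) is what makes each "executeQuery()" return the next slice instead of the same one.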
I iterate through hundreds of thousands of rows without any problems, even when my code was very...ummm...inefficient (new to programming). Iterating through a ResultSet is not, by itself, grounds for out-of-memory errors. The problem is when you accidentally hold the records in memory. You can avoid that by using a NON-scrollable ResultSet, not adding each row to a collection, and not using a RowSet. If you are holding the records in memory (intentionally or unintentionally), then you can expect your computer or JVM to run out of memory at some point. Jamie
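The advice above can be sketched as an aggregate-as-you-go loop. The JDBC part in the comment shows the forward-only, read-only statement configuration it describes (table and column names are assumptions, and `setFetchSize()` is only a hint whose streaming behavior depends on the driver); the runnable part applies the same pattern to any forward-only source, never storing rows in a collection.

```java
import java.util.Iterator;
import java.util.List;

public class ForwardOnlySum {
    // With JDBC this would look roughly like (illustrative names):
    //
    //   Statement st = conn.createStatement(
    //       ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
    //   st.setFetchSize(500);          // hint the driver to stream in chunks
    //   ResultSet rs = st.executeQuery("SELECT amount FROM big_table");
    //   long sum = 0;
    //   while (rs.next()) sum += rs.getLong(1);  // row never kept in a List
    //
    // The same pattern over any forward-only cursor:
    static long sum(Iterator<Long> rows) {
        long total = 0;
        while (rows.hasNext()) {
            total += rows.next();   // use the row, then discard it
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sum(List.of(1L, 2L, 3L).iterator())); // prints 6
    }
}
```

Because only the running total survives each iteration, memory use stays constant no matter how many rows the query returns.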
subject: java.lang.OutOfMemory error while fetching data from large tables