Retrieving huge ResultSet in chunks

Jaikiran Pai
Marshal

Joined: Jul 20, 2005
Posts: 9924
    

Hi,
We have a J2EE application deployed on JBoss. We have a requirement wherein we have to execute a query that retrieves a huge ResultSet, which can run into millions of records. Naturally, we have already seen the OutOfMemoryError. Is there any way (API) through which I can retrieve the ResultSet in chunks?


[My Blog] [JavaRanch Journal]
Jaikiran Pai
Marshal

Joined: Jul 20, 2005
Posts: 9924
    

While thinking about various approaches to handling this, one idea that came to mind was converting the table rows into an XML file. Is it possible to do this (i.e., without using the ResultSet object, to avoid the memory-related problems)?

Any other suggestions are also welcome.
Thank you
Roger Chung-Wee
Ranch Hand

Joined: Sep 29, 2002
Posts: 1683
How can your client process millions of records? I suspect it can't, so you do not need to retrieve that many. One option is to filter the client's request.


SCJP 1.4, SCWCD 1.3, SCBCD 1.3
Jaikiran Pai
Marshal

Joined: Jul 20, 2005
Posts: 9924
    

Actually, this is an asynchronous activity and not visible to the end user (on the UI). After retrieving this data, we need to write it out to a file.
stu derby
Ranch Hand

Joined: Dec 15, 2005
Posts: 333


Use a forward-only, read-only cursor and set a reasonable fetch size, so that the driver streams the rows instead of caching them all. There are probably some drivers that will buffer everything anyway, but that should work on all the major databases (Oracle, SQL Server, PostgreSQL, MySQL, DB2, etc.).

If by "retrieve the ResultSet in chunks" you mean some way to explicitly limit the ResultSet to a fixed size and repeat the execution to get back a ResultSet that gives you different rows each time, then no.

[ March 22, 2006: Message edited by: stu derby ]
R. M. Menon
Author
Ranch Hand

Joined: Mar 15, 2006
Posts: 50
Originally posted by jaikiran pai:
Hi,
We have a J2EE application deployed on JBoss. We have a requirement wherein we have to execute a query that retrieves a huge ResultSet, which can run into millions of records. Naturally, we have already seen the OutOfMemoryError. Is there any way (API) through which I can retrieve the ResultSet in chunks?


Which DB? In Oracle, you could set the fetch size and it won't run out of memory (assuming you use the Oracle-supplied JDBC drivers).
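
In standard JDBC that is the Statement.setFetchSize() hint; with Oracle's thin driver there is also, I believe, a connection-wide default that can be supplied through the Oracle-specific defaultRowPrefetch connection property. A rough sketch, with the URL and credentials purely hypothetical:

import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class OracleFetchSizeDefault {
    public static Connection open() throws Exception {
        Class.forName("oracle.jdbc.OracleDriver");   // load the driver (needed before JDBC 4)
        Properties props = new Properties();
        props.setProperty("user", "scott");          // hypothetical credentials
        props.setProperty("password", "tiger");
        // Oracle-specific: default number of rows fetched per round trip for every
        // statement on this connection; individual statements can still override it
        // with Statement.setFetchSize().
        props.setProperty("defaultRowPrefetch", "500");
        return DriverManager.getConnection("jdbc:oracle:thin:@dbhost:1521:ORCL", props);
    }
}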
Maximilian Xavier Stocker
Ranch Hand

Joined: Sep 20, 2005
Posts: 381
As Stu Derby alludes, set up your cursor as type forward-only. Many driver implementations of scrollable cursors do the scrolling by loading all the rows into memory.

So don't do that. Before playing with the fetch size or any other settings for large result sets, make sure your cursor is forward-only first.
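
Putting the two suggestions together, here is a minimal sketch of that kind of loop: a forward-only, read-only statement with a modest fetch size, writing each row out to a file as it is read (which also covers the XML-file idea from earlier in the thread). The table, column names, and file name are all made up for illustration:

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class ResultSetToXmlFile {
    // Streams every row of a (hypothetical) BIG_TABLE into rows.xml, writing each row
    // as soon as it is read, so only one fetch block of rows is in memory at a time.
    public static void export(Connection con) throws Exception {
        Statement stmt = con.createStatement(
                ResultSet.TYPE_FORWARD_ONLY,  // no scrolling, so the driver need not cache rows
                ResultSet.CONCUR_READ_ONLY);  // no updatable-result bookkeeping
        stmt.setFetchSize(500);               // hint: roughly 500 rows per round trip
        ResultSet rs = stmt.executeQuery("SELECT ID, NAME FROM BIG_TABLE");
        BufferedWriter out = new BufferedWriter(new FileWriter("rows.xml"));
        try {
            out.write("<rows>\n");
            while (rs.next()) {
                out.write("  <row id=\"" + rs.getLong("ID") + "\">"
                        + escape(rs.getString("NAME")) + "</row>\n");
            }
            out.write("</rows>\n");
        } finally {
            out.close();
            rs.close();
            stmt.close();
        }
    }

    // Minimal escaping for XML element text.
    private static String escape(String s) {
        if (s == null) return "";
        return s.replace("&", "&amp;").replace("<", "&lt;").replace(">", "&gt;");
    }
}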
Jaikiran Pai
Marshal

Joined: Jul 20, 2005
Posts: 9924
    

Which DB?

We are using Teradata.

First of all, thank you, Stu Derby, for proposing a solution.
If by "retrieve the ResultSet in chunks" you mean some way to explicitly limit the ResultSet to a fixed size and repeat the execution to get back a ResultSet that gives you different rows each time, then no.


Well, that's exactly what I wanted. Suppose the query had 10,000 rows and I set a fetch size of 500. On the first invocation, I need to get rows 1-500, then on the second invocation rows 501-1000, and so on, which I think is not possible with the approach mentioned by Stu Derby.
R. M. Menon
Author
Ranch Hand

Joined: Mar 15, 2006
Posts: 50
Originally posted by jaikiran pai:


Well, that's exactly what I wanted. Suppose the query had 10,000 rows and I set a fetch size of 500. On the first invocation, I need to get rows 1-500, then on the second invocation rows 501-1000, and so on, which I think is not possible with the approach mentioned by Stu Derby.


That is what the fetch size does: it fetches in chunks of the size that you specify. I don't see where the problem is.
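
If explicit, repeatable chunks are still wanted (say, so that each run of the asynchronous job can pick up where the previous one stopped), the usual alternative is to page at the SQL level rather than rely on the driver. A rough sketch of keyset paging, assuming the table has a monotonically increasing key column (ID here, made up) and that the Teradata release in use supports TOP n with ORDER BY:

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ChunkedReader {
    // Reads the (hypothetical) BIG_TABLE in chunks of 500 rows, remembering the last
    // key seen so that each query picks up where the previous one left off.
    public static void readInChunks(Connection con) throws Exception {
        long lastId = 0;
        boolean more = true;
        while (more) {
            PreparedStatement ps = con.prepareStatement(
                    "SELECT TOP 500 ID, NAME FROM BIG_TABLE WHERE ID > ? ORDER BY ID");
            ps.setLong(1, lastId);
            ResultSet rs = ps.executeQuery();
            more = false;
            try {
                while (rs.next()) {
                    lastId = rs.getLong("ID");
                    more = true;
                    // process rs.getString("NAME") here, e.g. append it to the output file
                }
            } finally {
                rs.close();
                ps.close();
            }
        }
    }
}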
 