
Retrieving huge ResultSet in chunks

 
Jaikiran Pai (Marshal)
Hi,
We have a J2EE application deployed on JBoss. We have a requirement wherein we have to execute a query that retrieves a huge ResultSet; this can run into millions of records. Naturally, we have already seen the OutOfMemoryError. Is there any way (API) through which I can retrieve the ResultSet in chunks?
 
Jaikiran Pai (Marshal)
I was thinking about various approaches for handling this, and one thing that came to my mind was converting the table rows into an XML file. Is it possible to do this (i.e. without using the ResultSet object, to avoid memory-related problems)?

Any other suggestions are also welcome.
Thank you
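One way the XML idea can sidestep the memory problem is to stream each row to the file as it is read, rather than building the whole document in memory, e.g. with the JDK's StAX XMLStreamWriter. The sketch below is hypothetical (the `id`/`name` columns and element names are made up) and reads from an in-memory list so it can run standalone; in the real application each row would come from `rs.next()` on a forward-only ResultSet, and the StringWriter would be a FileWriter.

```java
import java.io.StringWriter;
import java.util.List;
import javax.xml.stream.XMLOutputFactory;
import javax.xml.stream.XMLStreamWriter;

public class RowsToXml {
    // Streams rows out one at a time, so memory use stays flat no matter
    // how many rows there are. In the real application each "row" would
    // come from rs.next() instead of an in-memory list.
    static String toXml(List<String[]> rows) throws Exception {
        StringWriter out = new StringWriter(); // a FileWriter in practice
        XMLStreamWriter xml =
                XMLOutputFactory.newInstance().createXMLStreamWriter(out);
        xml.writeStartDocument();
        xml.writeStartElement("rows");
        for (String[] row : rows) {           // would be: while (rs.next())
            xml.writeStartElement("row");
            xml.writeStartElement("id");
            xml.writeCharacters(row[0]);      // would be: rs.getString(1)
            xml.writeEndElement();
            xml.writeStartElement("name");
            xml.writeCharacters(row[1]);      // would be: rs.getString(2)
            xml.writeEndElement();
            xml.writeEndElement();
        }
        xml.writeEndElement();
        xml.writeEndDocument();
        xml.close();
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(toXml(List.of(
                new String[]{"1", "first"},
                new String[]{"2", "second"})));
    }
}
```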
 
Roger Chung-Wee (Ranch Hand)
How can your client process millions of records? I suspect it can't, so you probably don't need to retrieve that many. One option is to filter the client's request.
 
Jaikiran Pai (Marshal)
Actually, this is an asynchronous activity and not visible to the end user (on the UI). After retrieving the data, we need to write it out to a file.
 
stu derby (Ranch Hand)
There are probably some drivers that will buffer everything anyway, but setting the fetch size on the statement should work on all the major databases (Oracle, SQL Server, PostgreSQL, MySQL, DB2, etc.).

If by "retrieve the ResultSet in chunks", you mean some way to explicitly limit the size of ResultSet to a fixed size and repeat execution to get back a ResultSet that will give you different rows each time, then no.

[ March 22, 2006: Message edited by: stu derby ]
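Stu Derby's suggestion, as the later posts spell out (a forward-only, read-only statement with a fetch size set), might look like the sketch below. This is only indicative: the connection URL, credentials, table, and column names are placeholders, and fetch size is a hint the driver may adjust.

```java
import java.sql.*;

public class ChunkedRead {
    public static void main(String[] args) throws SQLException {
        // Placeholder URL/credentials -- substitute real connection details.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:teradata://dbhost/DATABASE=mydb", "user", "password");
             // Forward-only, read-only: scrollable result sets may be
             // buffered entirely in memory by some drivers.
             Statement stmt = conn.createStatement(
                     ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY)) {
            stmt.setFetchSize(500); // hint: fetch ~500 rows per round trip
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT id, name FROM big_table")) {
                while (rs.next()) {
                    // Process one row at a time; only the current fetch
                    // buffer (roughly 500 rows) is held in memory.
                    process(rs.getLong(1), rs.getString(2));
                }
            }
        }
    }

    static void process(long id, String name) {
        // write the row out to the file here
    }
}
```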
 
R. M. Menon (Author, Ranch Hand)
Originally posted by jaikiran pai:
Hi,
We have a J2EE application deployed on Jboss. We have a requirement wherein, we have to execute a query to retrieve a huge ResultSet. This can run into millions of records. Naturally, we have already seen the OutOfMemory error. Is there any way(API) through which i can retrieve the ResultSet in chunks?


Which DB? In Oracle, you could set the fetch size and it won't run out of memory (assuming you use the Oracle-supplied JDBC drivers).
 
Maximilian Xavier Stocker (Ranch Hand)
As Stu Derby alludes, set up your cursor as type forward-only. Many driver implementations of scrollable cursors do the scrolling by loading all the rows into memory.

So don't do that. Before playing with fetch size or any other settings for large result sets, make sure your cursor is forward-only first.
 
Jaikiran Pai (Marshal)
Which DB?

We are using Teradata.

First of all, thank you "Stu Derby" for proposing a solution.
If by "retrieve the ResultSet in chunks", you mean some way to explicitly limit the size of ResultSet to a fixed size and repeat execution to get back a ResultSet that will give you different rows each time, then no.


Well, that's exactly what I wanted. Suppose the query returns 10000 rows and I set a fetch size of 500. On the first invocation, I need to get rows 1-500, then on the second invocation rows 501-1000, and so on, which I think is not possible with the approach mentioned by Stu Derby.
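If explicit re-execution per chunk really is wanted, it can be emulated at the SQL level even though (as Stu Derby notes) the JDBC API itself offers no such call. A hypothetical sketch, assuming `big_table` has a unique, ordered `id` column and using placeholder connection details:

```java
import java.sql.*;

public class ExplicitChunks {
    public static void main(String[] args) throws SQLException {
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:teradata://dbhost/DATABASE=mydb", "user", "password");
             PreparedStatement ps = conn.prepareStatement(
                     "SELECT id, name FROM big_table WHERE id > ? ORDER BY id")) {
            ps.setMaxRows(500);          // at most 500 rows per execution
            long lastId = 0;             // key just before the first row
            boolean more = true;
            while (more) {
                more = false;
                ps.setLong(1, lastId);   // start after the previous chunk
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        lastId = rs.getLong(1);
                        more = true;     // saw a row, so try another chunk
                        // write rs.getString(2) out to the file here
                    }
                }
            }
        }
    }
}
```

Each execution returns the next 500 rows keyed off the last id seen, so no single ResultSet ever holds more than one chunk.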
 
R. M. Menon (Author, Ranch Hand)
Originally posted by jaikiran pai:


Well, thats exactly, what i wanted. Suppose it had 10000 rows and i set a fetch size of 500. One first invocation, i need to get the 1-500 rows then on second invocation 501-1000 rows and so on, which i think is not possible with the approach mentioned by "Stu Derby".


That is what fetch size does: it fetches in chunks of the size you specify. I don't see where the problem is.
 