JavaRanch » Java Forums » Databases » JDBC

resultset processing gives java.lang.OutOfMemoryError

Shamu Somasundaram
Ranch Hand

Joined: Aug 25, 2004
Posts: 41
Hi,

I am facing a problem with java.lang.OutOfMemoryError.

I have a result set which will have huge data.

It returns nearly 30,000 records with 300 columns.

I am processing it like this:

while (rs.next())
{
    List temp = new ArrayList();
    for (int i = 1; i <= 300; i++) // JDBC column indexes start at 1
    {
        temp.add(rs.getString(i));
    }
    report.add(temp);
}
and I will return this report list to my other Java program.
The while loop never finishes. I added System.out.println statements before and after the loops.
While the result set is being processed, I get Servlet Error: java.lang.OutOfMemoryError.

Can anyone please help me with how to recover from this problem?
In general, how can this much data be handled without memory issues?

Thanks,
Shanmugavel.
Steven Bell
Ranch Hand

Joined: Dec 29, 2004
Posts: 1071
Just process one line at a time.
Shamu Somasundaram
Ranch Hand

Joined: Aug 25, 2004
Posts: 41
Originally posted by Steven Bell:
Just process one line at a time.


Thanks Steven, but how do you want me to do that? My result set will have more than 25,000 rows with many columns. I'm processing a single row and putting it in an ArrayList. Once one row is completed, I add that ArrayList to another list.
This way, my outer ArrayList will have 25,000 ArrayLists. This is what is giving problems.
Can anyone suggest an alternative to this?
I tried with a String instead of creating the inner ArrayList: I just append all the columns in a row to a String and put that String in the outer ArrayList. That helps only a little. It works for some more records, say 17,500, but beyond that this approach also gives an out of memory error.


Thanks,
Shanmugavel.
Steven Bell
Ranch Hand

Joined: Dec 29, 2004
Posts: 1071
My point is that you can't hold all that information in memory. You need to read one line, process it (I don't know what you're doing with it), drop that line from memory, and repeat until no more lines are available.

It really depends on what 'process it' means, though.
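The one-row-at-a-time idea can be sketched as a small helper that hands each row to a callback and then lets it be garbage-collected, instead of accumulating 30,000 rows in one big list. This is only a sketch: the sink callback is a placeholder for whatever "process it" means in your case (write to a file, aggregate totals, etc.), and with a real query you would also call Statement.setFetchSize() before executing, so the driver buffers rows in small batches rather than all at once.

```java
import java.sql.ResultSet;
import java.sql.SQLException;
import java.util.function.Consumer;

public class RowStreamer {
    // Read the result set row by row, passing each row to the sink and
    // then discarding it, so memory use stays bounded by one row.
    public static int stream(ResultSet rs, int columns, Consumer<String[]> sink)
            throws SQLException {
        int count = 0;
        while (rs.next()) {
            String[] row = new String[columns];
            for (int i = 1; i <= columns; i++) { // JDBC columns are 1-based
                row[i - 1] = rs.getString(i);
            }
            sink.accept(row); // process it here -- do NOT add it to a growing list
            count++;
        }
        return count;
    }
}
```

The key design point is that nothing in this loop retains a reference to the row after the sink returns, so the garbage collector can reclaim each row before the next one is read.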
Shailesh Chandra
Ranch Hand

Joined: Aug 13, 2004
Posts: 1081

Do you really need 30,000 records with 300 columns at a time?

Try to distribute your logic, and if it really is mandatory, then increase the heap memory of your JVM.
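Raising the heap is done with the standard -Xmx JVM option. The value and the jar name below are just placeholders; in a servlet container you would typically set this through JAVA_OPTS or the container's startup script rather than launching java directly. Note this only buys headroom: 30,000 rows x 300 columns of strings can still exhaust any reasonable heap, so streaming or chunking is the real fix.

```shell
# Raise the maximum heap to 512 MB (value and jar name are placeholders).
java -Xmx512m -jar myapp.jar
```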

Shailesh


Gravitation cannot be held responsible for people falling in love ~ Albert Einstein
vu lee
Ranch Hand

Joined: Apr 19, 2005
Posts: 189
The problem is at the place where you store a huge number of records before processing them. Since your app processes too many records, I would recommend a divide and conquer approach: query 1,000 records at a time, process them, close the result set, and query the next 1,000 records...
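The chunked querying described above can be sketched as below. The LIMIT/OFFSET paging syntax is an assumption (it is MySQL/PostgreSQL-style; other databases such as Oracle use different constructs like ROWNUM), and the connection setup and per-row processing are placeholders.

```java
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class ChunkedReader {
    // Append a MySQL/PostgreSQL-style paging clause to a base query.
    // (Assumption: your database supports LIMIT/OFFSET.)
    public static String pagedQuery(String baseSql, int offset, int limit) {
        return baseSql + " LIMIT " + limit + " OFFSET " + offset;
    }

    // Sketch of the chunked loop: fetch 1,000 rows, process them, close the
    // result set, and repeat until a chunk comes back empty.
    public static void readInChunks(Connection con, String baseSql) throws SQLException {
        int offset = 0;
        boolean more = true;
        while (more) {
            more = false;
            try (Statement st = con.createStatement();
                 ResultSet rs = st.executeQuery(pagedQuery(baseSql, offset, 1000))) {
                while (rs.next()) {
                    more = true;
                    // process one row here, then discard it
                }
            }
            offset += 1000;
        }
    }
}
```

One caveat with this approach: if other transactions insert or delete rows between chunks, OFFSET-based pages can skip or repeat rows, so an ORDER BY on a stable key is usually advisable.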

For better performance, you can create multiple threads to handle the workload. To reduce resource utilization, you can use multiple result sets over a single connection.

It's probably better to model a row as a Map instead of an array list.
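The Map-per-row idea looks roughly like this. With a live ResultSet you would read the column names from ResultSetMetaData; here they are passed in as an array to keep the sketch self-contained, and the column names in the usage note are invented examples.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RowAsMap {
    // Turn one row into a name -> value map, so callers can write
    // row.get("CUSTOMER_ID") instead of remembering which of the 300
    // positional indexes holds that column.
    public static Map<String, String> toMap(String[] columnNames, String[] values) {
        Map<String, String> row = new LinkedHashMap<>(); // preserves column order
        for (int i = 0; i < columnNames.length; i++) {
            row.put(columnNames[i], values[i]);
        }
        return row;
    }
}
```

This doesn't reduce memory use by itself (a map per row actually costs a bit more than a list), but it makes downstream code far less fragile when the column order of the query changes.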

Hope it helps
 