
resultset processing gives java.lang.OutOfMemoryError

 
Shamu Somasundaram
Ranch Hand
Posts: 41
Hi,

I am facing a problem with java.lang.OutOfMemoryError.

I have a result set that holds a huge amount of data.

It returns nearly 30,000 records with 300 columns.

I am processing it like this:

while (rs.next())
{
    List temp = new ArrayList();
    // JDBC column indexes start at 1, not 0
    for (int i = 1; i <= 300; i++)
    {
        temp.add(rs.getString(i));
    }
    report.add(temp); // add the completed row inside the loop
}
I then return this report list to my other Java program.
The while loop never finishes. I added SOPs (System.out.println statements) before and after the loops.
While the result set is being processed, I get Servlet Error: java.lang.OutOfMemoryError.

Can anyone please help me with how to recover from this problem?
Generally, how can this much data be handled without memory issues?

Thanks,
Shanmugavel.
 
Steven Bell
Ranch Hand
Posts: 1071
Just process one line at a time.
 
Shamu Somasundaram
Ranch Hand
Posts: 41
Originally posted by Steven Bell:
Just process one line at a time.


Thanks Steven, but how would I do that? My result set will have more than 25,000 rows with many columns. I'm processing a single row and putting it into an ArrayList; once one row is completed, I add that ArrayList to another list.
This way, my outer ArrayList ends up holding 25,000 ArrayLists, and that is what is causing the problem.
Can anyone suggest an alternative to this?
I tried a String instead of creating the inner ArrayList: I just append all the columns in a row to one string and put that string in the outer ArrayList. That helps only a little bit. It works for somewhat more records, say 17,500, but beyond that this approach also gives an out-of-memory error.


Thanks,
Shanmugavel.
 
Steven Bell
Ranch Hand
Posts: 1071
My point is that you can't hold all that information in memory. You need to read one line, process it (I don't know what you're doing with it), drop that line from memory, and repeat till no more lines are available.

It really depends on what 'process it' means, though.
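
Something like this minimal sketch, where the query and the processRow() method are just placeholders for whatever your real code does (writing to a file, another table, the response, etc.):

Statement stmt = conn.createStatement();
ResultSet rs = stmt.executeQuery("SELECT * FROM my_table"); // placeholder query
while (rs.next())
{
    List row = new ArrayList();
    for (int i = 1; i <= 300; i++)
    {
        row.add(rs.getString(i));
    }
    processRow(row); // do the work now, then let the row become garbage
}
rs.close();
stmt.close();

That way only one row is ever held in memory at a time.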
 
Shailesh Chandra
Ranch Hand
Posts: 1082
Java Oracle Spring
Do you really need 30,000 records with 300 columns at a time?

Try to distribute your logic, and if holding it all really is mandatory, then increase the heap memory of your JVM.
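
For example, the maximum heap size can be raised with the standard -Xmx flag when starting the JVM (the 512m value is only illustrative; for a servlet container you would put this in the container's startup options instead):

java -Xmx512m MyApp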

Shailesh
 
vu lee
Ranch Hand
Posts: 206
The problem is that you store a huge number of records before processing them. Since your app processes so many records, I would recommend a divide-and-conquer approach: query 1,000 records at a time, process those records, close the result set, and then query the next 1,000 records, as in the sketch below.
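
A rough sketch of that batching, assuming a database that supports LIMIT/OFFSET (e.g. MySQL or PostgreSQL; Oracle would need ROWNUM instead). The table name and batch size are only illustrative:

int batchSize = 1000;
int offset = 0;
while (true)
{
    PreparedStatement ps = conn.prepareStatement(
            "SELECT * FROM my_table LIMIT ? OFFSET ?");
    ps.setInt(1, batchSize);
    ps.setInt(2, offset);
    ResultSet rs = ps.executeQuery();
    int rowsInBatch = 0;
    while (rs.next())
    {
        rowsInBatch++;
        // process this row here
    }
    rs.close();
    ps.close();
    if (rowsInBatch < batchSize) break; // last batch reached
    offset += batchSize;
}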

For better performance, you can create multiple threads to handle the workload. To reduce resource utilization, you can use multiple result sets on a single connection.

It's probably better to model a row as a Map instead of an ArrayList.
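
For example, a row could be built as a Map keyed by column name, using ResultSetMetaData to discover the names at runtime:

ResultSetMetaData meta = rs.getMetaData();
int columnCount = meta.getColumnCount();
Map row = new HashMap();
for (int i = 1; i <= columnCount; i++)
{
    row.put(meta.getColumnName(i), rs.getString(i));
}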

Hope it helps
 