java.lang.OutOfMemoryError

 
vivek ja
Ranch Hand
I am trying to retrieve data from an Oracle table and convert it to XML.
I am using the Oracle utility called the XML SQL Utility (XSU).
Everything goes smoothly until I reach a point where large amounts of data are accessed from the database.

At that point it gives me the following error:
"Exception in thread "main" java.lang.OutOfMemoryError: Java heap space"

When I extract smaller volumes of data from the same table, I do not get this error and the XML is created fine.
Can anyone help me?

Thank you.
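
For reference, a minimal XSU call along these lines (a reconstruction with illustrative connection details and table name, not the poster's actual code) would look like:

import java.sql.Connection;
import java.sql.DriverManager;

import oracle.xml.sql.query.OracleXMLQuery;

public class WholeTableExport {
    public static void main(String[] args) throws Exception {
        // Illustrative connection details and table name
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");

        OracleXMLQuery query = new OracleXMLQuery(con, "SELECT * FROM my_table");

        // getXMLString() materializes the entire result set as one String,
        // which is what exhausts the heap on large tables
        String xml = query.getXMLString();
        System.out.println(xml);

        query.close();
        con.close();
    }
}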
 
David O'Meara
Rancher
Have you tried giving the application more memory using the java -Xmx option?
 
vivek ja
Ranch Hand
I tried running the command "java -Xmx512m <Classname>", but it still gives the same error.

Is there anything else I can do about this?
 
David O'Meara
Rancher
It depends on the amount of information you're reading. You may want to look at paging your result set, i.e. doing a query to find how many results there are, then loading rows 1-1000, then 1001-2000, and so on.

There are other mechanisms available, but the best may be to work out some way to reduce the amount of information sent by the database.

Or it may be a memory leak...
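
A rough sketch of the ROWNUM-based paging described above (Oracle syntax; the table name, page size, and connection details are illustrative):

import java.sql.*;

public class PagedExtract {
    private static final int PAGE_SIZE = 1000;

    public static void main(String[] args) throws SQLException {
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");

        // First, find how many results there are...
        Statement st = con.createStatement();
        ResultSet count = st.executeQuery("SELECT COUNT(*) FROM my_table");
        count.next();
        int total = count.getInt(1);
        count.close();
        st.close();

        // ...then load 1-1000, 1001-2000, and so on
        PreparedStatement ps = con.prepareStatement(
                "SELECT * FROM (SELECT t.*, ROWNUM rn FROM my_table t) "
                + "WHERE rn BETWEEN ? AND ?");
        for (int from = 1; from <= total; from += PAGE_SIZE) {
            ps.setInt(1, from);
            ps.setInt(2, from + PAGE_SIZE - 1);
            ResultSet rs = ps.executeQuery();
            while (rs.next()) {
                // process one page's worth of rows here
            }
            rs.close();
        }
        ps.close();
        con.close();
    }
}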
 
vivek ja
Ranch Hand
I am indeed trying to read a huge amount of data; that's why it is failing.
The approach I am taking now is to loop through the database, extracting the data and appending it to the file a few rows at a time.

It no longer gives the error, but it takes a very long time to write the file.

Is this a correct approach?
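
Something along these lines would implement that loop, assuming XSU's OracleXMLQuery and its setSkipRows/setMaxRows methods (check the XSU docs for your version; the file and table names here are illustrative):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import oracle.xml.sql.query.OracleXMLQuery;

public class ChunkedXmlExport {
    private static final int CHUNK = 500;

    public static void main(String[] args) throws Exception {
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");

        // Count the rows so we know when to stop
        Statement st = con.createStatement();
        ResultSet count = st.executeQuery("SELECT COUNT(*) FROM my_table");
        count.next();
        int total = count.getInt(1);
        count.close();
        st.close();

        BufferedWriter out = new BufferedWriter(
                new FileWriter("export.xml", true)); // append mode
        for (int skip = 0; skip < total; skip += CHUNK) {
            OracleXMLQuery q = new OracleXMLQuery(con, "SELECT * FROM my_table");
            q.setSkipRows(skip);
            q.setMaxRows(CHUNK);
            // Note: each chunk is a complete document with its own header and
            // ROWSET wrapper, which need stripping when stitching into one file
            out.write(q.getXMLString());
            q.close();
        }
        out.close();
        con.close();
    }
}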
 
Jeanne Boyarsky
author & internet detective
Marshal
Vivek,
Yes! Limiting the amount of data in memory at a time is definitely a good thing.

Now you are in a position to tune if it is taking too long. I would start by logging the time for the query vs. the network transfer vs. writing to the file, to see which is taking the longest.
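
A minimal way to get those timings (placeholder query and output; note that JDBC drivers typically fetch rows lazily, so most of the network transfer time shows up while iterating the result set, not in executeQuery itself):

import java.io.PrintWriter;
import java.sql.*;

public class TimedExport {
    public static void main(String[] args) throws Exception {
        Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "scott", "tiger");
        Statement stmt = con.createStatement();

        long t0 = System.currentTimeMillis();
        ResultSet rs = stmt.executeQuery("SELECT * FROM my_table"); // query
        long t1 = System.currentTimeMillis();

        PrintWriter out = new PrintWriter("export.xml");
        while (rs.next()) {             // fetch over the network + write
            out.println("<ROW>" + rs.getString(1) + "</ROW>");
        }
        out.close();
        long t2 = System.currentTimeMillis();

        System.out.println("query: " + (t1 - t0) + " ms, "
                + "fetch+write: " + (t2 - t1) + " ms");
    }
}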
 
Sandeep D Karkera
Greenhorn
Hi,

Try increasing "MaxPermSize" and enabling "compactgc" and "UseParallelGC".

MaxPermSize defaults to 64 MB, which might not be enough for a large enterprise application; try setting it to 200-300 MB or higher, based on your application.
compactgc (an IBM JVM option) will help if memory fragmentation is happening, though it is a little expensive on CPU and might slow your app a bit.
UseParallelGC might help increase GC performance if you have an SMP server.

e.g.: -XX:MaxPermSize=100m -Xcompactgc -XX:+UseParallelGC

Regards,
Sandeep DK
 