java.lang.OutOfMemoryError

vivek ja
Ranch Hand

Joined: Feb 24, 2005
Posts: 80
I am trying to retrieve data from an Oracle table and convert it to XML.
I am using the Oracle utility called XML SQL Utility (XSU).
Everything goes smoothly until I get to a point where large amounts of data are accessed from the database.

It gives me the following error:
"Exception in thread "main" java.lang.OutOfMemoryError: Java heap space"

When I extract smaller volumes of data from the same table, I am not getting this error and the XML gets created fine.
Can anyone help me?

Thank you.
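
For reference, a minimal sketch of the usual XSU call pattern (the connection string and table name here are assumptions, not taken from the post). Note that getXMLString() builds the entire XML document in memory, which is why a large result set can exhaust the heap:

    // Minimal XSU sketch; connection details and table name are assumptions.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import oracle.xml.sql.query.OracleXMLQuery;

    public class XsuExample {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:ORCL", "scott", "tiger");
            OracleXMLQuery qry = new OracleXMLQuery(conn, "SELECT * FROM emp");
            // getXMLString() materializes the whole document in memory,
            // so a very large result set can exhaust the Java heap.
            String xml = qry.getXMLString();
            System.out.println(xml);
            qry.close();
            conn.close();
        }
    }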
David O'Meara
Rancher

Joined: Mar 06, 2001
Posts: 13459

Have you tried giving the application more memory using the java -Xmx option?
vivek ja
Ranch Hand

Joined: Feb 24, 2005
Posts: 80
I tried giving this command: "java -Xmx512m <Classname>"
It's still giving the same error.

Is there anything I can do about this?
David O'Meara
Rancher

Joined: Mar 06, 2001
Posts: 13459

It depends on the amount of information you're reading. You may want to look at paging your result set, i.e. doing a query to find how many results there are, then loading rows 1-1000, 1001-2000, etc.

There are other mechanisms available, but the best may be to work out some way to reduce the information sent by the database.

Or it may be a memory leak...
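
A minimal sketch of that paging idea, using Oracle's classic ROWNUM pattern (the table, ordering column, and batch size are hypothetical):

    // Sketch of result-set paging with the Oracle ROWNUM idiom.
    // Table name, ordering column, and batch size are hypothetical.
    import java.sql.*;

    public class PagedExtract {
        public static void main(String[] args) throws SQLException {
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:ORCL", "scott", "tiger");
            // First find how many rows there are...
            int total;
            try (Statement st = conn.createStatement();
                 ResultSet rs = st.executeQuery("SELECT COUNT(*) FROM emp")) {
                rs.next();
                total = rs.getInt(1);
            }
            // ...then load them one batch at a time.
            String paged =
                "SELECT * FROM ("
              + " SELECT t.*, ROWNUM rn FROM (SELECT * FROM emp ORDER BY empno) t"
              + " WHERE ROWNUM <= ?) WHERE rn > ?";
            int batch = 1000;
            for (int lower = 0; lower < total; lower += batch) {
                try (PreparedStatement ps = conn.prepareStatement(paged)) {
                    ps.setInt(1, lower + batch);
                    ps.setInt(2, lower);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            // convert this batch of rows to XML here
                        }
                    }
                }
            }
            conn.close();
        }
    }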
vivek ja
Ranch Hand

Joined: Feb 24, 2005
Posts: 80
Actually, I am trying to read huge amounts of data.
That's why it is failing.
The approach I am taking now is to loop through the database, extracting the data and appending it to the file a few rows at a time.

Now it is not giving the error, but it is taking a very long time to write the file.

Is this a correct approach?
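
One way to keep that loop from holding everything in memory while avoiding many repeated queries is to stream a single result set with a modest fetch size and append each row to the file as it arrives. A hedged sketch; the column list and the toXml() helper are hypothetical:

    // Sketch of the append-as-you-go approach with one streamed query.
    // setFetchSize() controls how many rows the driver buffers per network
    // round trip; toXml() is a hypothetical row-formatting helper.
    import java.io.BufferedWriter;
    import java.io.FileWriter;
    import java.sql.*;

    public class StreamingExtract {
        public static void main(String[] args) throws Exception {
            Connection conn = DriverManager.getConnection(
                    "jdbc:oracle:thin:@localhost:1521:ORCL", "scott", "tiger");
            try (Statement st = conn.createStatement();
                 BufferedWriter out = new BufferedWriter(new FileWriter("emp.xml"))) {
                st.setFetchSize(500);   // rows buffered per round trip
                out.write("<ROWSET>\n");
                try (ResultSet rs = st.executeQuery("SELECT empno, ename FROM emp")) {
                    while (rs.next()) {
                        out.write(toXml(rs.getInt(1), rs.getString(2)));
                    }
                }
                out.write("</ROWSET>\n");
            }
            conn.close();
        }

        // Hypothetical helper: format one row as an XSU-style <ROW> element.
        static String toXml(int empno, String ename) {
            return "  <ROW><EMPNO>" + empno + "</EMPNO><ENAME>"
                    + ename + "</ENAME></ROW>\n";
        }
    }

The BufferedWriter keeps file I/O from dominating the run, which may also help with the slowness mentioned above.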
Jeanne Boyarsky
internet detective
Marshal

Joined: May 26, 2003
Posts: 30516
    
Vivek,
Yes! Limiting the amount of data in memory at a time is definitely a good thing.

Now you are in a position to tune if it is taking too long. I would start by logging the time for the query vs network transfer vs writing to file to see what is taking the longest.
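
A minimal sketch of that kind of phase timing (the statement and SQL are assumed to come from the surrounding code; the write phase is a placeholder):

    // Coarse timing of the three phases mentioned above: query, network
    // transfer, and file write. stmt and sql are assumptions.
    import java.sql.*;

    public class PhaseTiming {
        static void timedRun(Statement stmt, String sql) throws SQLException {
            long t0 = System.currentTimeMillis();
            ResultSet rs = stmt.executeQuery(sql);             // query phase
            long t1 = System.currentTimeMillis();
            while (rs.next()) { /* transfer: pull rows over the network */ }
            long t2 = System.currentTimeMillis();
            // ... write the XML file here (write phase) ...
            long t3 = System.currentTimeMillis();
            System.out.println("query=" + (t1 - t0) + "ms"
                    + " transfer=" + (t2 - t1) + "ms"
                    + " write=" + (t3 - t2) + "ms");
        }
    }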


[Blog] [JavaRanch FAQ] [How To Ask Questions The Smart Way] [Book Promos]
Blogging on Certs: SCEA Part 1, Part 2 & 3, Core Spring 3, OCAJP, OCPJP beta, TOGAF part 1 and part 2
Sandeep D Karkera
Greenhorn

Joined: Apr 29, 2005
Posts: 1
Hi,

Try increasing "MaxPermSize" and forcing "compactgc" and "UseParallelGC".

Since MaxPermSize defaults to 64 MB, it might not be enough for a large enterprise application; try setting it to 200-300 MB or higher, based on your application.
compactgc will help you if memory fragmentation is happening, though it is a little expensive on CPU and might slow your app a bit.
UseParallelGC might help to increase GC performance if you have an SMP server.

e.g.: -XX:MaxPermSize=100m -Xcompactgc -XX:+UseParallelGC

Regards,
Sandeep DK
 