java.lang.OutOfMemoryError

 
vivek ja
Ranch Hand
Posts: 80
I am trying to retrieve data from an Oracle table and convert it to XML.
I am using the Oracle utility called XML SQL Utility (XSU).
Everything goes smoothly until I get to a point where large amounts of data are accessed from the database.

It gives me the following error:
"Exception in thread "main" java.lang.OutOfMemoryError: Java heap space"

When I extract smaller volumes of data from the same table, I do not get this error and the XML gets created fine.
Can anyone help me?

Thank you.
 
David O'Meara
Rancher
Posts: 13459
Have you tried giving the application more memory using the -Xmx option?
 
vivek ja
Ranch Hand
Posts: 80
I tried the command "java -Xmx512m <Classname>".
It's still giving the same error.

Is there anything I can do about this?
 
David O'Meara
Rancher
Posts: 13459
It depends on the amount of information you're reading. You may want to look at paging your resultset - i.e. do a query to find out how many results there are, then load rows 1-1000, 1001-2000, and so on.

There are other mechanisms available, but the best may be to work out some way to reduce the amount of information sent by the database.

Or it may be a memory leak...
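
For example, one way to page an Oracle query with ROWNUM looks roughly like this (just a sketch - the table, columns, and page size are made up, so adapt them to your query):

import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class PagedExtract {
    // Reads rows in fixed-size ranges so only one page is in memory at a time.
    static void extract(Connection conn) throws Exception {
        String paged =
            "SELECT * FROM ("
          + "  SELECT t.*, ROWNUM rn FROM ("
          + "    SELECT emp_id, emp_name FROM emp ORDER BY emp_id"
          + "  ) t WHERE ROWNUM <= ?"
          + ") WHERE rn > ?";
        int pageSize = 1000;
        for (int offset = 0; ; offset += pageSize) {
            try (PreparedStatement ps = conn.prepareStatement(paged)) {
                ps.setInt(1, offset + pageSize); // upper bound (inclusive)
                ps.setInt(2, offset);            // lower bound (exclusive)
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) {
                        return;                  // no more rows - done
                    }
                    do {
                        // convert this row to XML and append it to the output file
                    } while (rs.next());
                }
            }
        }
    }
}

The ORDER BY in the innermost query matters - without a stable ordering the pages can overlap or skip rows.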
 
vivek ja
Ranch Hand
Posts: 80
Actually I am trying to read huge amounts of data; that's why it is failing.
The approach I am taking now is to loop through the resultset and append the extracted data to the file a few rows at a time.

Now it is not giving the error, but it is taking a very long time to write the file.

Is this a correct approach?
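
In outline the loop is roughly the following (just a sketch - the real query, chunk size, and file name are placeholders):

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class ChunkedXmlWriter {
    // Streams rows straight to the file instead of building the whole XML document in memory.
    static void export(Connection conn) throws Exception {
        try (BufferedWriter out = new BufferedWriter(new FileWriter("emp.xml"));
             Statement stmt = conn.createStatement()) {
            stmt.setFetchSize(500); // fetch a few hundred rows per round trip
            out.write("<ROWSET>\n");
            try (ResultSet rs = stmt.executeQuery("SELECT emp_id, emp_name FROM emp")) {
                while (rs.next()) {
                    // real code should escape XML special characters in the values
                    out.write("  <ROW><EMP_ID>" + rs.getInt("emp_id") + "</EMP_ID>"
                            + "<EMP_NAME>" + rs.getString("emp_name") + "</EMP_NAME></ROW>\n");
                }
            }
            out.write("</ROWSET>\n");
        }
    }
}

(The file is opened once and written through a BufferedWriter rather than reopened for every chunk, which keeps the write time down.)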
 
author & internet detective
Posts: 39789
Vivek,
Yes! Limiting the amount of data in memory at a time is definitely a good thing.

Now you are in a position to tune it if it is taking too long. I would start by logging the time for the query vs. network transfer vs. writing to the file, to see which part is taking the longest.
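
For example, something along these lines (a rough sketch - runQuery, buildXmlChunk, and appendToFile are hypothetical stand-ins for whatever your code actually does):

long t0 = System.currentTimeMillis();
ResultSet rs = runQuery(conn);          // hypothetical: executes the SELECT
long t1 = System.currentTimeMillis();
String xml = buildXmlChunk(rs);         // hypothetical: fetches the rows and converts them to XML
long t2 = System.currentTimeMillis();
appendToFile(xml, "emp.xml");           // hypothetical: writes the chunk to disk
long t3 = System.currentTimeMillis();

System.out.println("query=" + (t1 - t0) + " ms, fetch/convert=" + (t2 - t1)
        + " ms, write=" + (t3 - t2) + " ms");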
 
Greenhorn
Posts: 1
Hi,

Try increasing "MaxPermSize" and enabling "compactgc" and "UseParallelGC".

MaxPermSize defaults to 64 MB, which might not be enough for a large enterprise application; try setting it to 200-300 MB or higher, depending on your application.
compactgc will help if memory fragmentation is happening, though it is a little expensive on CPU and might slow your app a bit.
UseParallelGC might help increase GC performance if you have an SMP server.

e.g.: -XX:MaxPermSize=100m -Xcompactgc -XX:+UseParallelGC
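
Combined with the heap setting mentioned earlier in the thread, the launch command would look roughly like this (a sketch - YourXmlExtractor is a placeholder, and -Xcompactgc may not be recognised by every JVM, so check your JVM's documentation before adding it):

java -Xmx512m -XX:MaxPermSize=256m -XX:+UseParallelGC YourXmlExtractor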

Regards
Sandeep DK
 