I am trying to retrieve data from an Oracle table and convert it to XML. I am using the Oracle utility called XML SQL Utility (XSU). Everything goes smoothly until I get to a point where large amounts of data are accessed from the database.
It depends on the amount of information you're reading. You may want to look at paging your result set, i.e. doing a query to find how many results there are, then loading rows 1 to 1000, then 1001 to 2000, and so on.
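As a rough sketch of the paging arithmetic described above: given a page number and a page size, compute the first and last row to request on each query. The class and method names here are illustrative, not part of XSU (though, if I remember correctly, XSU's `OracleXMLQuery` exposes `setSkipRows`/`setMaxRows` methods that accept exactly this kind of value — check the XSU docs for your version).

```java
// Illustrative paging arithmetic: which rows belong to page N?
// (1-based, inclusive bounds, as in "rows 1 to 1000, then 1001 to 2000".)
public class Paging {

    /** Returns {firstRow, lastRow} for the given page (pages start at 1). */
    static int[] pageBounds(int page, int pageSize) {
        int first = (page - 1) * pageSize + 1;
        int last = page * pageSize;
        return new int[] { first, last };
    }

    public static void main(String[] args) {
        int[] b = pageBounds(2, 1000);
        // Page 2 with a page size of 1000 covers rows 1001 to 2000.
        System.out.println(b[0] + "-" + b[1]);
    }
}
```

Each page is then fetched with its own query, so only one page's worth of rows is ever in memory at once.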
There are other mechanisms available, but the best may be to work out some way to reduce the information sent by the database.
Or it may be a memory leak...
Actually, I am trying to read huge amounts of data; that's why it is failing. The approach I am taking now is to loop through the database, extracting the data and appending it to the file a few rows at a time.
Now it is not giving the error, but it is taking a very long time to write the file.
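The chunk-and-append loop might look something like the sketch below. `fetchChunk` here is a hypothetical stand-in for the real XSU/JDBC query (it just serves rows from an in-memory array); in the real code it would issue the paged query against the database.

```java
import java.io.BufferedWriter;
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;
import java.util.List;

public class ChunkedExport {
    static final int CHUNK = 3; // tiny for the example; use e.g. 1000 in practice

    // Hypothetical stand-in for a database fetch:
    // returns up to `size` rows starting at `offset`, empty when rows run out.
    static List<String> fetchChunk(int offset, int size) {
        String[] all = { "<row>1</row>", "<row>2</row>", "<row>3</row>",
                         "<row>4</row>", "<row>5</row>" };
        if (offset >= all.length) return List.of();
        return List.of(all).subList(offset, Math.min(offset + size, all.length));
    }

    static void export(Writer out) throws IOException {
        BufferedWriter w = new BufferedWriter(out);
        w.write("<ROWSET>");
        int offset = 0;
        List<String> rows;
        while (!(rows = fetchChunk(offset, CHUNK)).isEmpty()) {
            for (String r : rows) w.write(r); // append this chunk to the file
            w.flush();                        // keeps memory use flat per chunk
            offset += rows.size();
        }
        w.write("</ROWSET>");
        w.flush();
    }

    public static void main(String[] args) throws IOException {
        StringWriter sw = new StringWriter();
        export(sw);
        System.out.println(sw);
    }
}
```

One likely reason it is slow: each chunk costs a full query round trip, so a very small chunk size means thousands of queries. Raising the chunk size (as large as memory comfortably allows) usually recovers most of the speed while keeping the out-of-memory problem away.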
Try increasing "MaxPermSize", and consider forcing "compactgc" and "UseParallelGC".
The default MaxPermSize of 64 MB might not be enough for a large enterprise application; try setting it to 200-300 MB or higher, based on your application. compactgc will help you if memory fragmentation is happening, though it is a little expensive on CPU and might slow your app a bit. UseParallelGC might help increase GC performance if you have an SMP server.
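For reference, the flags above are vendor-specific: `-XX:MaxPermSize` and `-XX:+UseParallelGC` are Sun HotSpot options, while GC compaction (`-Xcompactgc`) is an IBM JVM option, so check which JVM you are running before copying them. A hedged example launch line (the class name and heap sizes are placeholders for your own):

```
java -Xms512m -Xmx1024m \
     -XX:MaxPermSize=256m \
     -XX:+UseParallelGC \
     com.example.XmlExport
```

Note that MaxPermSize governs the permanent generation (loaded classes), so it only helps if the OutOfMemoryError mentions "PermGen space"; for plain heap exhaustion, raising `-Xmx` is what matters.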