Hi all, I need a bit of advice on an application I have that reads in an XML file and, based on its content, generates a text file which is used by another application. I wrote my application to handle the XML file using JDOM, so when the XML file is read in it creates a JDOM Document, which I then use to access the Elements I need to create the text file.

However, I have now run into some memory problems and am questioning the wisdom of using JDOM. The XML files I have to handle are much bigger than I had anticipated (they are around 1MB), and as a result I was getting OutOfMemory exceptions when executing my application. I have had to increase the Java heap size to 400MB to get my application to execute successfully. I have watched the memory usage of the application while it is running and it hits 393MB. This seems a bit crazy. Should I definitely migrate my code to use the SAX API?

Also, as an aside: if I had to deal with an even larger XML file which required me to increase the heap size to greater than 400MB, what's the max size I could set the heap to, given that my machine has 522MB of RAM? Thanks for your help... John
> Should I definitely migrate my code to use the SAX API?
I would say yes. I faced the same problem on a WebLogic 6.0 server: my parser code using DOM worked fine as a standalone program, but when I ran it inside the WebLogic server the parser crashed, telling me to reduce the file size. I had to migrate to SAX. DOM (and tree-based APIs like JDOM) work fine for smaller XML documents, because the entire document tree is built in memory; as the document gets bigger, the trouble starts. SAX never builds a tree at all -- it fires callbacks as it streams through the file, so memory use stays roughly constant regardless of file size.
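For what it's worth, here's a rough sketch of what the streaming approach can look like with a plain SAX `DefaultHandler`. The `record` element name and the input string are made-up placeholders -- your real code would swap in your own element names and parse from a `FileInputStream` rather than a `String` -- but the shape is the same: collect the text of one element at a time, write out the corresponding line of the text file, and let the data go.

```java
import java.io.StringReader;
import javax.xml.parsers.SAXParser;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;

public class SaxSketch {

    // Streams the XML and builds the text output one element at a time,
    // so memory use stays proportional to a single record, not the whole file.
    static String convert(String xml) throws Exception {
        SAXParser parser = SAXParserFactory.newInstance().newSAXParser();
        StringBuilder out = new StringBuilder();
        parser.parse(new InputSource(new StringReader(xml)), new DefaultHandler() {
            private final StringBuilder text = new StringBuilder();

            @Override
            public void startElement(String uri, String localName,
                                     String qName, Attributes attrs) {
                if (qName.equals("record")) {   // hypothetical element name
                    text.setLength(0);          // start collecting a fresh record
                }
            }

            @Override
            public void characters(char[] ch, int start, int length) {
                text.append(ch, start, length);
            }

            @Override
            public void endElement(String uri, String localName, String qName) {
                if (qName.equals("record")) {
                    // emit one line of the text file, then forget the record
                    out.append(text).append('\n');
                }
            }
        });
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Placeholder input standing in for the real 1MB file
        String xml = "<records><record id=\"1\">alpha</record>"
                   + "<record id=\"2\">beta</record></records>";
        System.out.print(convert(xml));
    }
}
```

If your output lines depend on several nested elements, you keep a little state in fields of the handler (a flag or the current element name) instead of walking a tree. That bookkeeping is the price you pay for the flat memory profile.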