I am trying to develop an application that processes large XML files, 50 to 100 MB each on average. The application receives a ZIP file containing multiple XML files of that size; the number of XML files per ZIP varies from 1 to 25.
I am using the Tibco BW tool (which uses a DOM parser for validating and parsing). The operation after parsing is creating flat files based on some rules.
My question is: when multiple ZIP files arrive, each containing multiple large XML files, the application consumes a lot of memory and processing takes a long time.
Would StAX be a better alternative to DOM/SAX in such cases?
An event-oriented parser such as SAX or StAX will certainly use less memory and run faster than building and manipulating a DOM.
The feasibility really depends on the kind of data manipulation you are doing. Can you imagine doing it by hand, reading a tape that shows you one XML element at a time while you make notes on a small pad of paper? That's SAX/StAX.
Or would the "by hand" manipulation be more like reading a book, bouncing back and forth between the index, table of contents, and multiple chapters to assemble your output? That's DOM.
"Streaming pull parsing refers to a programming model in which a client application calls methods on an XML parsing library when it needs to interact with an XML infoset--that is, the client only gets (pulls) XML data when it explicitly asks for it. "
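To make the pull model concrete, here is a minimal sketch using the standard `javax.xml.stream.XMLStreamReader` API. The element name and input are made up for illustration; the point is that the loop pulls one event at a time, so only the current event is in memory rather than the whole document:

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;

public class StaxCount {
    // Count occurrences of an element by pulling events one at a time;
    // no tree is built, so memory use stays flat regardless of file size.
    static int countElements(String xml, String name) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader = factory.createXMLStreamReader(new StringReader(xml));
        int count = 0;
        while (reader.hasNext()) {
            // next() returns the next event only when we ask for it (pull).
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && reader.getLocalName().equals(name)) {
                count++;
            }
        }
        reader.close();
        return count;
    }
}
```

For real 50-100 MB files you would pass a `FileInputStream` instead of a `StringReader`; the loop itself is unchanged.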
Thanks for the help.
Another question: is it possible to validate XML using StAX? With DOM, we can validate XML against an XSD using the validator API. Is that possible with StAX? If so, I think memory will again be a problem, since the entire file needs to be brought into memory.