Hi, I have a question regarding the SAX and DOM parsers. DOM is mainly used for inserting and deleting elements; its drawback is that it loads the entire XML file into memory before it starts processing. SAX saves memory and executes faster.
My question is this: suppose we have to parse a 512 MB XML file and insert and delete nodes, but the computer has only 256 MB of RAM.
In that situation, which parser is best, and how can I arrive at a correct solution to the problem?
Thanks, Viswanath Sigamala
Author and all-around good cowpoke
posted 9 years ago
Suppose we have to parse a 512 MB XML file and insert and delete nodes, but the computer has only 256 MB of RAM.
I think the answer depends on the logic you have to follow in inserting and deleting nodes.
Typically people think of SAX processing as a "pipeline" in which a new document output stream is produced as the original document is read and parsed.
Keeping in mind that SAX can only move forward through an XML document, you can insert the text of a new node or delete a node at any point, but you cannot make the parser back up.
Therefore the first question is: can your problem be handled by this sort of straight-through processing?
Bill (try a Google search for "xml pipeline" for some possible inspiration). Added: the smallx project looks interesting for this problem. [May 31, 2006: Message edited by: William Brogden]
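The streaming approach Bill describes can be sketched with the standard `org.xml.sax.helpers.XMLFilterImpl` class: the filter sits between the parser and a serializer, suppressing events for elements you want to delete and emitting extra events for elements you want to insert, so only a small window of the document is ever in memory. The element names (`record`, `obsolete`, `flag`) below are invented for illustration; they are not from the original question.

```java
import java.io.StringReader;
import java.io.StringWriter;

import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;

import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.XMLReader;
import org.xml.sax.helpers.AttributesImpl;
import org.xml.sax.helpers.XMLFilterImpl;

// A SAX filter that deletes every <obsolete> element (with its subtree)
// and inserts a <flag>seen</flag> child at the start of each <record>.
public class EditFilter extends XMLFilterImpl {
    private int skipDepth = 0; // > 0 while inside an element being deleted

    public EditFilter(XMLReader parent) { super(parent); }

    @Override
    public void startElement(String uri, String local, String qName, Attributes atts)
            throws SAXException {
        if (skipDepth > 0 || "obsolete".equals(qName)) {
            skipDepth++;               // suppress this element and everything inside it
            return;
        }
        super.startElement(uri, local, qName, atts);
        if ("record".equals(qName)) {
            // Insert a new child node on the fly by synthesizing SAX events
            super.startElement("", "flag", "flag", new AttributesImpl());
            char[] text = "seen".toCharArray();
            super.characters(text, 0, text.length);
            super.endElement("", "flag", "flag");
        }
    }

    @Override
    public void endElement(String uri, String local, String qName) throws SAXException {
        if (skipDepth > 0) { skipDepth--; return; }
        super.endElement(uri, local, qName);
    }

    @Override
    public void characters(char[] ch, int start, int len) throws SAXException {
        if (skipDepth == 0) super.characters(ch, start, len);
    }

    public static void main(String[] args) throws Exception {
        XMLReader reader = SAXParserFactory.newInstance().newSAXParser().getXMLReader();
        EditFilter filter = new EditFilter(reader);
        String in = "<records><record><obsolete>x</obsolete><name>a</name></record></records>";
        StringWriter out = new StringWriter();
        // An identity transform serializes the filtered event stream back to XML
        TransformerFactory.newInstance().newTransformer().transform(
                new SAXSource(filter, new InputSource(new StringReader(in))),
                new StreamResult(out));
        System.out.println(out);
    }
}
```

For a real 512 MB file you would replace the `StringReader`/`StringWriter` with buffered file streams; memory use then depends only on the deepest nesting, not on the document size, which is exactly why the pipeline approach fits a 256 MB machine.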