I have a huge XML file containing millions of records, which I receive on a daily basis from the customer. My requirement is to parse the file and store all of its records in a database. Because of the size of the file, loading it whole can run into memory issues. Is there a way to parse the file in chunks and store the records in the DB without running into memory problems? I believe any XML technique like XPointer, XQuery, or XPath will hold the whole document in a DOM, and that will be a problem.
Please let me know if anyone has implemented something like this in their work.
Like Paul Clapham said, you want SAX or StAX, or any event-based XML parsing library (xpp3, etc.). Trying to load the document into a tree-based XML API will almost certainly throw an OutOfMemoryError; you'll try playing with the heap size and get nowhere. Event-based parsing is less convenient, depending on the XML document's structure and complexity, but at least you'll be able to process the file: the parser hands you one event at a time, so memory usage stays roughly constant no matter how large the file is.
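Here's a minimal StAX sketch of the idea. It assumes a file named `records.xml` whose records are repeating `<record>` elements, and `insertBatch` is a hypothetical placeholder for your real DB code (e.g. JDBC batch inserts); adjust names and batch size to your actual schema.

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;
import java.io.FileInputStream;
import java.util.ArrayList;
import java.util.List;

public class RecordStreamer {

    private static final int BATCH_SIZE = 1000; // arbitrary; tune for your DB

    public static void main(String[] args) throws Exception {
        XMLInputFactory factory = XMLInputFactory.newInstance();
        XMLStreamReader reader =
                factory.createXMLStreamReader(new FileInputStream("records.xml"));

        List<String> batch = new ArrayList<>();
        StringBuilder text = new StringBuilder();

        // Pull events one at a time; only the current record is in memory.
        while (reader.hasNext()) {
            switch (reader.next()) {
                case XMLStreamConstants.START_ELEMENT:
                    if ("record".equals(reader.getLocalName())) {
                        text.setLength(0); // start accumulating a new record
                    }
                    break;
                case XMLStreamConstants.CHARACTERS:
                    text.append(reader.getText());
                    break;
                case XMLStreamConstants.END_ELEMENT:
                    if ("record".equals(reader.getLocalName())) {
                        batch.add(text.toString());
                        if (batch.size() >= BATCH_SIZE) {
                            insertBatch(batch); // flush to DB, then discard
                            batch.clear();
                        }
                    }
                    break;
            }
        }
        if (!batch.isEmpty()) {
            insertBatch(batch); // flush the final partial batch
        }
        reader.close();
    }

    // Hypothetical placeholder: a real implementation would use JDBC batch
    // inserts (PreparedStatement.addBatch / executeBatch) inside a transaction.
    private static void insertBatch(List<String> records) {
        System.out.println("Inserting " + records.size() + " records");
    }
}
```

Batching the inserts matters as much as the streaming parse: committing every record individually will make the DB side the bottleneck long before memory is.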