Parsing XML file in Chunks.
Joined: Mar 21, 2008
Jun 19, 2013 10:00:09
I have already parsed the huge XML file and loaded the information into my POJOs. I then used these POJOs to commit the data to the database.
Now I have a requirement to parse the huge XML file in chunks. Meaning: parse partially up to a threshold limit, commit to the database, and then parse the remainder. Can we do this using the StAX API?
For example, I have 50K entries and want to set a parse threshold of 500. In this case, can I parse 500 entries, store them in the database, and then continue parsing the remaining entries?
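The chunked approach described above can be sketched with StAX roughly as follows. This is a minimal sketch, not from the original post: the element name "entry", the tiny threshold value, and the commitBatch() stub standing in for a real database commit are all assumptions for illustration.

```java
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamException;
import javax.xml.stream.XMLStreamReader;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class ChunkedStaxParser {

    static final int THRESHOLD = 2; // would be 500 in the scenario above

    static int committedBatches = 0;

    // Stand-in for a real database commit of one batch of POJOs.
    static void commitBatch(List<String> batch) {
        committedBatches++;
        batch.clear();
    }

    // Streams the document, collecting entries into a batch and
    // flushing the batch every THRESHOLD entries. Returns the total
    // number of entries seen.
    public static int parseInChunks(String xml) throws XMLStreamException {
        XMLStreamReader reader = XMLInputFactory.newInstance()
                .createXMLStreamReader(new StringReader(xml));
        List<String> batch = new ArrayList<>();
        int total = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "entry".equals(reader.getLocalName())) {
                batch.add(reader.getElementText()); // one entry -> one "POJO"
                total++;
                if (batch.size() >= THRESHOLD) {
                    commitBatch(batch); // commit, then keep streaming
                }
            }
        }
        if (!batch.isEmpty()) {
            commitBatch(batch); // flush the remainder
        }
        reader.close();
        return total;
    }

    public static void main(String[] args) throws XMLStreamException {
        String xml = "<root><entry>a</entry><entry>b</entry><entry>c</entry></root>";
        int total = parseInChunks(xml);
        System.out.println(total + " entries in " + committedBatches + " batches");
    }
}
```

Because StAX pulls one event at a time, the parser's position in the file is naturally preserved across commits, so nothing special is needed to "resume" after each batch.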
Author and all-around good cowpoke
Joined: Mar 22, 2000
Jun 19, 2013 13:18:23
SAX or StAX is exactly what you want, then. Since the parser only deals with one node at a time, it uses very little memory.
The downside will be somewhat more complex programming for you.
Beware the trap of thinking that the characters() method delivers a complete text node in a single call.
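A minimal SAX sketch of avoiding that trap: accumulate the pieces delivered by characters() in a buffer and read the complete text only in endElement(). The "entry" element name is a placeholder, not from the thread; note that an entity reference like &amp;amp; is a typical reason the parser splits one text node across several characters() calls.

```java
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.helpers.DefaultHandler;
import javax.xml.parsers.SAXParserFactory;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class TextAccumulatingHandler extends DefaultHandler {

    private final StringBuilder text = new StringBuilder();
    final List<String> entries = new ArrayList<>();

    @Override
    public void startElement(String uri, String local, String qName,
                             Attributes atts) {
        text.setLength(0); // reset the buffer at the start of each element
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        // May be invoked several times for one text node, so append
        // rather than assume this call carries the whole node.
        text.append(ch, start, length);
    }

    @Override
    public void endElement(String uri, String local, String qName) {
        if ("entry".equals(qName)) {
            entries.add(text.toString()); // complete text is known only here
        }
    }

    public static List<String> parse(String xml) throws Exception {
        TextAccumulatingHandler handler = new TextAccumulatingHandler();
        SAXParserFactory.newInstance().newSAXParser()
                .parse(new InputSource(new StringReader(xml)), handler);
        return handler.entries;
    }

    public static void main(String[] args) throws Exception {
        System.out.println(parse("<root><entry>one &amp; two</entry></root>"));
    }
}
```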
I agree. Here's the link: