Parsing XML file in Chunks.

Skanda Raman
Ranch Hand

Joined: Mar 21, 2008
Posts: 205

I have parsed a huge XML file using StAX and loaded the information into my POJOs. I then used these POJOs to commit the data to the database.

Now I have a requirement to parse the huge XML file in chunks. Meaning, parse up to a threshold limit, commit to the database, and then parse what is left. Can we do this using the StAX API?

For example, I have 50K entries and want to set a parse threshold of 500. In this case, can I parse 500 entries, store them in the database, and then continue parsing the remaining ones?

Please advise.
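The batching described above can be sketched with the StAX cursor API. This is a minimal, hypothetical example: the element name `entry`, the batch size, and the `commitBatch` method are all placeholders standing in for your POJOs and JDBC code, and it assumes each entry element contains only text (so `getElementText()` applies).

```java
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class ChunkedStaxParser {

    static final int BATCH_SIZE = 500; // your threshold limit

    /** Streams over the document, committing every BATCH_SIZE entries; returns total parsed. */
    static int parse(InputStream in) throws Exception {
        XMLStreamReader reader =
                XMLInputFactory.newInstance().createXMLStreamReader(in);
        List<String> batch = new ArrayList<>();
        int total = 0;
        while (reader.hasNext()) {
            if (reader.next() == XMLStreamConstants.START_ELEMENT
                    && "entry".equals(reader.getLocalName())) {
                // assumes <entry> holds only character data, no child elements
                batch.add(reader.getElementText());
                if (batch.size() >= BATCH_SIZE) {
                    commitBatch(batch);     // commit this chunk to the database
                    total += batch.size();
                    batch.clear();          // free memory before continuing
                }
            }
        }
        if (!batch.isEmpty()) {             // flush the final partial chunk
            commitBatch(batch);
            total += batch.size();
        }
        reader.close();
        return total;
    }

    static void commitBatch(List<String> batch) {
        // placeholder: map entries to POJOs, insert, and commit the JDBC transaction here
    }

    public static void main(String[] args) throws Exception {
        String xml = "<entries><entry>a</entry><entry>b</entry></entries>";
        int n = parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        System.out.println("Parsed " + n + " entries");
    }
}
```

Because the reader only holds one event at a time, memory use stays flat regardless of whether the file has 50K entries or 50 million; only the current batch of 500 is ever in memory.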

William Brogden
Author and all-around good cowpoke

Joined: Mar 22, 2000
Posts: 13028
SAX or StAX is exactly what you want then. Since the parser only deals with one node at a time, it takes very little memory.

The downside will be somewhat more complex programming for you.

Beware the trap of thinking that the characters() method grabs a complete text node.
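That characters() warning applies to SAX: the parser is free to deliver one text node across several characters() calls, so the handler must accumulate them in a buffer and only treat the text as complete at endElement(). A minimal sketch (the `entry` element name is a stand-in for whatever your document uses):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

public class EntryHandler extends DefaultHandler {

    private final StringBuilder text = new StringBuilder();
    final List<String> entries = new ArrayList<>();

    @Override
    public void startElement(String uri, String local, String qName, Attributes atts) {
        text.setLength(0); // reset the buffer at each element start
    }

    @Override
    public void characters(char[] ch, int start, int length) {
        // may be called several times for one text node, so append — never assign
        text.append(ch, start, length);
    }

    @Override
    public void endElement(String uri, String local, String qName) {
        if ("entry".equals(qName)) {
            // only here is the element's text guaranteed to be complete
            entries.add(text.toString());
        }
    }

    public static void main(String[] args) throws Exception {
        EntryHandler handler = new EntryHandler();
        String xml = "<entries><entry>first</entry><entry>second</entry></entries>";
        SAXParserFactory.newInstance().newSAXParser().parse(
                new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)),
                handler);
        System.out.println(handler.entries);
    }
}
```

Appending in characters() and reading only in endElement() is what makes the handler safe against the parser splitting text at buffer boundaries or around entity references.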
