Hi, I have a situation here. I am currently storing huge amounts of data (half a GB, one GB, at most 2 GB) in text files (CSV style) and then parsing them with simple Java streams. I read each file once, calculate some summaries, and fill in an Oracle table. Now, however, I am thinking of storing the data in XML format instead and using SAX for parsing. The file size would surely shoot up, maybe double, but more important is parsing performance. Is XML suited for this amount of data? Will SAX parsing be any better than simply reading the text file with Java streams and tokenizing it?
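To show what I mean, my current pass over the CSV is roughly like this (a simplified sketch; the file name data.csv and the column layout are made up, and the real code also inserts the summaries into the Oracle table):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class CsvSummary {
    public static void main(String[] args) throws IOException {
        long rowCount = 0;
        double total = 0.0;
        // One pass over the file, line by line, tokenizing each row.
        try (BufferedReader in = new BufferedReader(new FileReader("data.csv"), 1 << 16)) {
            String line;
            while ((line = in.readLine()) != null) {
                String[] fields = line.split(",");
                rowCount++;
                // Hypothetical layout: the third column holds a numeric amount to sum.
                total += Double.parseDouble(fields[2]);
            }
        }
        System.out.println("rows=" + rowCount + " total=" + total);
    }
}
```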
Can someone please throw some light on this issue? Thanks,
"Let the one among you who has never sinned throw the first stone.." -A Hero
Yes, you can write huge amounts of XML, though not by using SAX or DOM directly but by using the SAX extensions that are available. I would suggest you read this article; it is a good one that talks about exactly that. Hope it helps.
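For instance, one streaming way in the JDK to write XML without building a tree is a TransformerHandler from javax.xml.transform.sax (just a rough sketch; I can't say whether the article uses this exact approach, and the file and element names here are made up):

```java
import java.io.FileWriter;
import java.io.Writer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXTransformerFactory;
import javax.xml.transform.sax.TransformerHandler;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.helpers.AttributesImpl;

public class StreamingXmlWriter {
    public static void main(String[] args) throws Exception {
        SAXTransformerFactory factory =
                (SAXTransformerFactory) TransformerFactory.newInstance();
        TransformerHandler handler = factory.newTransformerHandler();

        // SAX events are serialized straight to the writer, so nothing
        // resembling a DOM tree is ever held in memory.
        Writer out = new FileWriter("out.xml");
        handler.setResult(new StreamResult(out));

        AttributesImpl noAttrs = new AttributesImpl();
        handler.startDocument();
        handler.startElement("", "", "rows", noAttrs);
        for (int i = 0; i < 1000000; i++) {          // one element per data row
            handler.startElement("", "", "row", noAttrs);
            char[] value = String.valueOf(i).toCharArray();
            handler.characters(value, 0, value.length);
            handler.endElement("", "", "row");
        }
        handler.endElement("", "", "rows");
        handler.endDocument();
        out.close();
    }
}
```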
SCJP 1.4, SCDJWS, SCJA
I can do ALL things through CHRIST who strengthens me.
Since XML parsing will add LOTS of overhead, I can't imagine how you could avoid a major slowdown. Any XML processing involves creating lots of objects, converting to and from String, etc. IF (big if) your data is all ASCII, you will be much faster handling the input as byte streams and byte buffers rather than character streams, and staying well away from String conversion until the last minute. XML shines when the data structure is complex; anything that can be represented as CSV is not a good candidate.

Bill
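Roughly what Bill describes, as a bare sketch: it assumes pure ASCII data, a made-up file name data.csv, and that the second column is a non-negative integer to sum; real code would need sign, decimal, and error handling.

```java
import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;

public class ByteLevelCsvSummary {
    public static void main(String[] args) throws IOException {
        long rows = 0;
        long sum = 0;
        int column = 0;   // column index within the current row
        long value = 0;   // digits accumulated for the column being summed
        byte[] buf = new byte[64 * 1024];
        try (InputStream in = new FileInputStream("data.csv")) {
            int n;
            while ((n = in.read(buf)) != -1) {
                for (int i = 0; i < n; i++) {
                    int b = buf[i];
                    if (b == ',') {
                        column++;
                    } else if (b == '\n') {
                        sum += value;          // end of row: fold in and reset
                        rows++;
                        column = 0;
                        value = 0;
                    } else if (column == 1 && b >= '0' && b <= '9') {
                        // Build the number straight from the bytes; no String is created.
                        value = value * 10 + (b - '0');
                    }
                }
            }
        }
        System.out.println("rows=" + rows + " sum=" + sum);
    }
}
```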