Hello, our team is doing a Java project which needs to use XML to store all the data. In this project, we need not only to read data from XML files, but also to write to XML files. In addition, some of the XML files are very big, with more than 1000 rows and more than 30000 nodes. Currently, we are using DOM to parse the XML files, but it takes too long to get the data, and sometimes we run out of memory. We have thought about JDOM, but it also has to build the entire tree in memory, which might cause the same memory problem. And SAX cannot update the data. Does anybody have good suggestions for how we can improve our project? Thanks!
Originally posted by lei jiong: some XML files are very big, which might have 1000 rows, more than 30000 nodes.
Just wondering why you went for XML to carry such big data instead of using a standard database approach. In any case, please check data binding tools like Castor or JBind, which can process large files efficiently. Castor: http://www-106.ibm.com/developerworks/xml/library/x-bindcastor/ JBind: http://java.sun.com/xml/jaxp/dist/1.0.1/docs/binding/DataBinding.html
Exactly what kind of update do you want to do? You can use SAX events to write a new XML data file on the fly. If your update changes are simple, such as changing a few attributes or the contents of an element, they can also be done on the fly. The advantage is that the memory requirement is much lower and the speed much greater. Bill
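To make the on-the-fly idea concrete, here is a minimal sketch using the standard SAX and JAXP APIs: an XMLFilterImpl intercepts the SAX events, rewrites an attribute as it streams past, and a Transformer serializes the result straight to a new file, so the whole document is never held in memory. The element name "record", the attribute "status", and the file names are just assumptions for illustration; adapt them to your own schema.

```java
import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;
import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.AttributesImpl;
import org.xml.sax.helpers.XMLFilterImpl;

// Hypothetical example: change the "status" attribute of every <record>
// element while streaming the file through, never building a full tree.
public class StreamingUpdate extends XMLFilterImpl {

    public void startElement(String uri, String localName, String qName,
                             Attributes atts) throws SAXException {
        if ("record".equals(qName)) {
            // Copy the attributes so we can modify them before passing on.
            AttributesImpl newAtts = new AttributesImpl(atts);
            int i = newAtts.getIndex("status");
            if (i >= 0) {
                newAtts.setValue(i, "processed"); // the on-the-fly update
            }
            super.startElement(uri, localName, qName, newAtts);
        } else {
            super.startElement(uri, localName, qName, atts);
        }
    }

    public static void main(String[] args) throws Exception {
        StreamingUpdate filter = new StreamingUpdate();
        filter.setParent(SAXParserFactory.newInstance()
                .newSAXParser().getXMLReader());
        Transformer t = TransformerFactory.newInstance().newTransformer();
        // Reads input.xml and writes the updated copy to output.xml;
        // memory use stays constant regardless of file size.
        t.transform(new SAXSource(filter, new InputSource("input.xml")),
                    new StreamResult("output.xml"));
    }
}
```

After the copy is written you can replace the original file with the new one. The same pattern handles element-content changes by overriding characters(), and deletions by simply not forwarding the events for the unwanted subtree.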