You mention in the Semantics section that if performance is too much of an issue, one should possibly revert to tab-delimited files. My question is this: you won't know that you have a performance problem until you actually have one, which is pretty late in the game--so is there some kind of base rule for knowing when the processing may be overwhelming? Does it depend on how many levels of nesting there are, how long the file is, or something else? I know this is a universal question that may have been asked of other authors, but I am curious about your take. Obviously one would want to POC a system before getting too far, which would help avert this problem, but what about the case when you can't POC? Thanks in advance for your response, Wendy
I'm very skeptical of performance problems. In these days of PCs with dual 2 GHz 64-bit processors and gigabytes of RAM there are very few applications that really have performance issues, especially performance issues related to XML. You certainly shouldn't start out with the assumption that you'll have a performance problem, and most problems that are discovered in testing can be fixed by simple changes. The only heuristic I trust for determining when there'll be a performance problem is when I'm executing a known hard problem, like modeling the weather or some such. If the math is so difficult that performance is an issue, then I should be writing the code in Fortran and running it on a supercomputer. But that has little to do with XML. There really aren't that many hard problems that are XML-bound. The bottleneck tends to be in either I/O (most often) or the algorithm (less often). It's rare to encounter a problem where the speed of XML parsing is the limiting factor.
Elliotte Rusty Harold<br />Author of <a href="http://cafe.elharo.com/web/refactoring-html/" target="_blank" rel="nofollow">Refactoring HTML</a>
I'm very skeptical of performance problems. In these days of PCs with dual 2 GHz 64-bit processors and gigabytes of RAM there are very few applications that really have performance issues, especially performance issues related to XML. Well, I beg to disagree on that one. As developers, we certainly don't have access to those machines for everyday work. I say this because recently, on one of the projects I was involved in, there was much unhappiness expressed about the performance (of the entire project, not just the XSLT approach we used). We had the choice of using XSLT or a DOM model to exchange data between two systems using a common XML format. On the system that I work with, we chose the XSLT approach, while the other system went with the DOM approach. There were, of course, a lot of valid reasons for our choice: time-to-market, server-side processing, multi-platform support, etc. Our team is now in the performance testing phase... one of these days we will know better and educate ourselves. - m
Madhav, how long does it take for your server to parse/transform a typical XML message? 200ms? 300ms? 500ms? How significant a portion is that of the whole response time of the application (including network latency and request processing time)? XSLT is notoriously slow, no doubt about that... If you've got very big pockets, you might want to consider using dedicated hardware.
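One quick way to get numbers like these is to time a single transformation directly with the standard JAXP API. A minimal sketch; the inline stylesheet and `<msg>` document here are hypothetical stand-ins for whatever real message formats the application uses:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class TransformTimer {
    public static void main(String[] args) throws Exception {
        // Hypothetical stylesheet and document; substitute your real files.
        String xsl =
            "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
          + "<xsl:output method='xml' omit-xml-declaration='yes'/>"
          + "<xsl:template match='/'><out><xsl:value-of select='msg'/></out></xsl:template>"
          + "</xsl:stylesheet>";
        String xml = "<msg>hello</msg>";

        // Stylesheet compilation is deliberately done outside the timed section,
        // since in a server it would normally happen once, not per request.
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(xsl)));

        long start = System.nanoTime();
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        System.out.println("Transform took " + elapsedMs + " ms");
        System.out.println(out);
    }
}
```

Timing one call on a dev machine is only a rough indicator, of course; for real numbers you'd run many iterations after JVM warm-up, on hardware resembling production.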
So, Lasse, do you suggest using the DOM model instead of XSLT? I do hope that people will be aware of speed when dealing with data exchange using XML... I guess dedicated hardware is not possible for most of the industry, so they might want to consider using the DOM model... Actually, I don't know the details of the speed difference between them... I've been thinking that it's the developer's choice to decide which one to use... :roll:
"Madhav, how long does it take for your server to parse/transform a typical XML message?" We are still in the performance testing phase, so I don't have valid numbers. I personally beg to differ with some of my team members on the numbers they have, for reasons that I think are valid: we are testing on dev machines (instead of real servers), we have a geographically distributed team, and at any given time I have three to four apps (and servers) running on my dev machine. So the testing numbers we have are not good, but that's just me(!). "XSLT is notoriously slow, no doubt about that..." So, you are saying XSLT is slow compared to DOM? I have no experience with XSLT performance, to be honest. Any additional info on this would be extremely helpful. Are there tools to "optimize" stylesheets? Some pointers/links/articles on this would be greatly appreciated. "If you've got very big pockets, you might want to consider using dedicated hardware." You don't always have access to dedicated hardware. Damn economy(!). - m
I didn't mean to imply that DOM would be automatically better -- XSL is a great tool for transforming XML documents, and using DOM and Java to do the same job by hand can bite you in the long run. One additional technique for improving the performance of XSL transformation is to do it "statically": compile the XSL stylesheet into Java code that does the same work, thus bypassing stylesheet interpretation at runtime (although I think XSLT engines tend to cache the internal "compiled" stylesheet object anyway). I can't remember a tool that does this right now, but I'm sure it won't take long to Google for it...
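Even without a stylesheet-to-bytecode compiler, the standard JAXP API supports the caching idea mentioned above: `TransformerFactory.newTemplates()` produces a `Templates` object, the processed (and, in many engines, internally compiled) form of a stylesheet, which is thread-safe and can be built once and shared. A sketch; the inline stylesheet here is a hypothetical placeholder:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Templates;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class StylesheetCache {
    // Hypothetical stylesheet; in practice this would be loaded from a file or classpath.
    private static final String XSL =
        "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "<xsl:output method='xml' omit-xml-declaration='yes'/>"
      + "<xsl:template match='/'><out><xsl:value-of select='msg'/></out></xsl:template>"
      + "</xsl:stylesheet>";

    // Templates is the processed, thread-safe form of the stylesheet:
    // build it once at startup and reuse it for every transformation.
    private static final Templates TEMPLATES;
    static {
        try {
            TEMPLATES = TransformerFactory.newInstance()
                    .newTemplates(new StreamSource(new StringReader(XSL)));
        } catch (Exception e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    public static String transform(String xml) throws Exception {
        // Transformer instances are cheap to create from a shared Templates object,
        // but are not themselves thread-safe, so create a fresh one per call.
        Transformer t = TEMPLATES.newTransformer();
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(transform("<msg>cached</msg>"));  // prints <out>cached</out>
    }
}
```

This avoids re-parsing the stylesheet on every request, which is usually the most expensive part of a naive per-request `newTransformer(stylesheetSource)` call.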