Altova Mudslinging

By Michael Kay on October 28, 2006 at 02:18 p.m.

A number of users have written to me suggesting that I respond to Altova's latest customer newsletter, in which they make the rather surprising claim that their XSLT processor is three times faster than Saxon. At least they had the decency to describe how they measured it - by running a thousand or so conformance tests from the W3C test suite using a batch script in which each transformation was run individually from the command line. Any reasonably competent user knows that that's a hopelessly inefficient way of running Java programs, because the initialisation cost of loading the Java VM a thousand times far exceeds the time spent actually doing useful work. But of course, Altova know that a lot of people will only read the headlines.
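To see why the batch-script approach is misleading, compare it with timing the transformations inside a single Java VM, with the stylesheet compiled once and a few warm-up runs before measurement begins. The sketch below is just an illustration using the standard JAXP API: the file names and run count are placeholders, and selecting Saxon through its TransformerFactoryImpl class is only one way of wiring it up.

```java
import javax.xml.transform.*;
import javax.xml.transform.stream.*;
import java.io.File;
import java.io.StringWriter;

public class WarmBenchmark {
    public static void main(String[] args) throws Exception {
        // Select the processor under test via the standard JAXP system property,
        // rather than launching a fresh command-line invocation for every run.
        // (Class name shown is Saxon's JAXP factory; adjust for other processors.)
        System.setProperty("javax.xml.transform.TransformerFactory",
                "net.sf.saxon.TransformerFactoryImpl");

        TransformerFactory factory = TransformerFactory.newInstance();
        // Compile the stylesheet once and reuse the compiled Templates object.
        Templates templates = factory.newTemplates(
                new StreamSource(new File("test.xsl")));   // placeholder stylesheet

        // Warm-up runs so that JIT compilation happens before timing starts.
        for (int i = 0; i < 10; i++) {
            runOnce(templates);
        }

        int runs = 1000;
        long start = System.nanoTime();
        for (int i = 0; i < runs; i++) {
            runOnce(templates);
        }
        long elapsed = System.nanoTime() - start;
        System.out.printf("Average per transformation: %.2f ms%n",
                elapsed / 1e6 / runs);
    }

    private static void runOnce(Templates templates) throws Exception {
        Transformer t = templates.newTransformer();
        t.transform(new StreamSource(new File("test.xml")),  // placeholder input
                new StreamResult(new StringWriter()));
    }
}
```

Measured this way, the one-off costs of starting the VM and compiling the stylesheet are paid once rather than a thousand times, which is far closer to how a real application would use the processor.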

Well, I did respond: I've put a counter-claim on my web site (see the last item in the Questions and Answers section) that Saxon is ten times faster than Altova. I freely admit that my methodology is no more scientific than theirs is, though I think my numbers are probably closer to the truth. It would be nice to see some objective numbers produced carefully by an unbiased third party. Vendor figures, including my own, are not very useful, because a vendor always knows how to make their own product run at maximum speed, and they rarely have either the knowledge or the motivation to do the same for their competitors.

Meanwhile, I'm actually quite proud of the fact that Saxonica is now so highly regarded that Altova should feel it necessary to throw mud in this way. When I started the company, nearly three years ago now, I could almost have set that as a business objective: "in three years' time, Saxonica will have developed such a strong reputation as a technology leader that competitors will resort to knocking us in their advertising". In marketing circles it's well known that knocking your competition is a strategy that can easily backfire, because it raises the profile of the competitor among your own customers, and acknowledges them as a leader. Let's hope that turns out to be true in this case. Certainly, it seems a strategic error to make competitive claims that are so easily disproved.

It's worth remembering also that performance is only one dimension. I've worked hard on Saxon performance over the years, and I think it's pretty good (and certainly very competitive), but performance has never been the number one objective. Correctness comes first, in the sense of 100% conformance to the W3C specifications, closely followed by usability, of which the most important ingredient in my view is the quality of the error messages seen by users during the development cycle. At the moment I think Saxon is well ahead of Altova on all three fronts. It's nice to see that Altova are running the W3C conformance tests, but slightly worrying that they are only running a subset; and if I were a user, I would certainly be asking to see their results.

I rather enjoy watching the robust way in which my friends at Stylus Studio attack Altova. But I prefer watching a good fight from the ringside to taking part in it! It's tempting to join in when people start throwing things at you, but it's not really my style to do business in that way. I'll try and stand back and leave that to others. I prefer to think (perhaps it's an old-fashioned British attitude) that the way to achieve success is to focus on producing the best product.