XPathMark
An XPath Benchmark for XMark

We evaluated some state-of-the-art XML engines using the XPathMark benchmark. We ran all the tests on a 3.20 GHz Intel Pentium 4 with 2 GB of main memory under Linux version 2.6.9-1.667 (Red Hat 3.4.2-6.fc3). All reported times are CPU response times in seconds. For each engine, we ran all the supported queries on XMark documents of increasing size. The document series is the following (XMark factors):

(0.001, 0.002, 0.004, 0.008, 0.016, 0.032, 0.064, 0.128, 0.256, 0.512, 1)

corresponding to the following sizes (in MB):

(0.116, 0.212, 0.468, 0.909, 1.891, 3.751, 7.303, 15.044, 29.887, 59.489, 116.517)

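As a minimal illustration of what a single measurement looks like, the Java sketch below loads one XMark document, evaluates one XPath query with the standard JAXP API, and reads the CPU time of the running thread. It is not the harness used for the engines below, which are driven through their own interfaces; the file name and the query string are placeholders, and whether document loading is charged to the response time depends on the engine.

    import java.io.File;
    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadMXBean;
    import javax.xml.parsers.DocumentBuilderFactory;
    import javax.xml.xpath.XPath;
    import javax.xml.xpath.XPathConstants;
    import javax.xml.xpath.XPathExpression;
    import javax.xml.xpath.XPathFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.NodeList;

    // Illustrative only: times one XPath query on one XMark document using the
    // standard JAXP XPath engine shipped with the JDK.
    public class QueryTimer {
        public static void main(String[] args) throws Exception {
            // Hypothetical XMark document generated with factor 0.001 (about 0.116 MB).
            File file = new File("xmark-f0.001.xml");
            // Placeholder query in the style of the benchmark (not one of Q1-Q47 verbatim).
            String query = "/site/closed_auctions/closed_auction/annotation";

            DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
            dbf.setNamespaceAware(true);
            Document tree = dbf.newDocumentBuilder().parse(file);

            XPath xpath = XPathFactory.newInstance().newXPath();
            XPathExpression expr = xpath.compile(query);

            // CPU time of the current thread, in nanoseconds (available since Java 5).
            ThreadMXBean bean = ManagementFactory.getThreadMXBean();
            long start = bean.getCurrentThreadCpuTime();
            NodeList result = (NodeList) expr.evaluate(tree, XPathConstants.NODESET);
            long stop = bean.getCurrentThreadCpuTime();

            System.out.println("answers: " + result.getLength());
            System.out.println("CPU response time: " + (stop - start) / 1e9 + " s");
        }
    }
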
In the computation of the completeness index we did not consider queries using the namespace axis, since this axis is no longer supported in XQuery. In the following we report the evaluation outcomes for the different XML engines.

Evaluation for Saxon
We tested Saxon-B 8.4 on the Java 2 Platform, Standard Edition 5.0. The query test set (benchmark queries that are supported by the engine) is {Q1-Q13,Q15-Q47}, of cardinality 46.
  • The evaluation outcome in XML format (according to the benchmark evaluation DTD)
  • The evaluation outcome in Gnuplot format:
    • The average benchmark response times [plot1, plot2, plot]: for each XML size in the document series (x-axis) we plot the average query response time over the entire benchmark (y-axis)
    • The benchmark response speeds [plot1, plot2, plot]: for each XML size in the document series (x-axis) we plot the response speed of the entire benchmark (y-axis)
    • The data scalability factors for the benchmark [plot]: for each pair of consecutive XML documents in the document series (x-axis) we plot the data scalability factor of the entire benchmark (y-axis)
    • The average query response speeds [plot]: for each supported query in the benchmark (x-axis) we plot the average query response speed over the document series (y-axis)
    • The average data scalability factors for the queries [plot]: for each supported query in the benchmark (x-axis) we plot the average data scalability factor over the document series (y-axis)
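
The indices listed above can be derived from the raw per-query response times. The following Java sketch shows one plausible way to compute them; the definitions used here for the response speed (document size divided by response time) and for the data scalability factor (ratio of response times of consecutive documents divided by the ratio of their sizes) are assumptions made for illustration, and the numbers in the example are placeholders, not measured values. The benchmark publications give the exact definitions.

    // Sketch of how the plotted indices could be computed from raw per-query
    // response times. The definitions of speed and scalability below are
    // assumptions (size/time, and time ratio over size ratio, respectively).
    public class Indices {

        /** Average of the per-query response times on one document. */
        static double averageResponseTime(double[] queryTimes) {
            double sum = 0.0;
            for (double t : queryTimes) sum += t;
            return sum / queryTimes.length;
        }

        /** Assumed definition: megabytes processed per second of response time. */
        static double responseSpeed(double sizeInMB, double responseTime) {
            return sizeInMB / responseTime;
        }

        /**
         * Assumed definition: how much the response time grows from one document
         * to the next, relative to how much the document itself grows.
         * A value around 1 suggests scaling that is linear in the document size.
         */
        static double dataScalabilityFactor(double timeSmall, double timeLarge,
                                            double sizeSmall, double sizeLarge) {
            return (timeLarge / timeSmall) / (sizeLarge / sizeSmall);
        }

        public static void main(String[] args) {
            // Hypothetical response times (seconds) for the supported queries on
            // two consecutive documents of the series (0.116 MB and 0.212 MB).
            double[] timesSmall = {0.05, 0.07, 0.04};
            double[] timesLarge = {0.09, 0.13, 0.08};

            double avgSmall = averageResponseTime(timesSmall);
            double avgLarge = averageResponseTime(timesLarge);

            System.out.println("avg time (0.116 MB): " + avgSmall + " s");
            System.out.println("speed    (0.116 MB): " + responseSpeed(0.116, avgSmall) + " MB/s");
            System.out.println("scalability 0.116 -> 0.212 MB: "
                    + dataScalabilityFactor(avgSmall, avgLarge, 0.116, 0.212));
        }
    }
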
Evaluation for Galax
We tested version 0.5. The query test set (benchmark queries that are supported by the engine) is {Q1-Q9,Q12,Q13,Q15-Q24,Q30-Q47} of cardinality 39.
  • The evaluation outcome in XML format (according to the benchmark evaluation DTD)
  • The evaluation outcome in Gnuplot format:
    • The average benchmark response times [plot1, plot2, plot]: for each XML size in the document series (x-axis) we plot the average query response time over the entire benchmark (y-axis)
    • The benchmark response speeds [plot1, plot2, plot]: for each XML size in the document series (x-axis) we plot the response speed of the entire benchmark (y-axis)
    • The data scalability factors for the benchmark [plot]: for each pair of consecutive XML documents in the document series (x-axis) we plot the data scalability factor of the entire benchmark (y-axis)
    • The average query response speeds [plot]: for each supported query in the benchmark (x-axis) we plot the average query response speed over the document series (y-axis)
    • The average data scalability factors for the queries [plot]: for each supported query in the benchmark (x-axis) we plot the average data scalability factor over the document series (y-axis)
A comparison between Saxon and Galax
The query test set (benchmark queries that are supported by both engines) is {Q1-Q9,Q12,Q13,Q15-Q24,Q30-Q47}, of cardinality 39.
  • The average benchmark response times [plot1, plot2, plot]
  • The benchmark response speeds [plot1, plot2, plot]
  • The data scalability factors for the benchmark [plot]
  • The average query response speeds [plot]
  • The average data scalability factors for the queries [plot]
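
The comparison plots are obtained by restricting both engines to the common query test set and plotting their indices side by side at each document size. The short Java sketch below illustrates this kind of per-document comparison; all numbers in it are placeholders, not measured values.

    // Sketch of a per-document comparison over the common query set; the
    // arrays below are placeholders, not measured values.
    public class Comparison {
        public static void main(String[] args) {
            double[] sizesMB  = {0.116, 0.212, 0.468};   // document series (truncated)
            double[] saxonAvg = {0.8, 1.1, 1.9};         // hypothetical average times (s)
            double[] galaxAvg = {1.2, 1.9, 3.5};         // hypothetical average times (s)

            for (int i = 0; i < sizesMB.length; i++) {
                double ratio = galaxAvg[i] / saxonAvg[i];
                System.out.printf("%.3f MB: Saxon %.2f s, Galax %.2f s (ratio %.2f)%n",
                        sizesMB[i], saxonAvg[i], galaxAvg[i], ratio);
            }
        }
    }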