Storage benchmarks from the likes of the Storage Performance Council (SPC) can help users evaluate competing products, but you’ve got to be careful how you use them.
Vendors, of course, use the tests to prove that their products are better than those of their competitors. Network Appliance, for example, commissioned VeriTest, a service of Lionbridge Technologies, to prove that the NetApp FAS3070 was superior to EMC’s CLARiiON CX3-80. Not to be outdone, EMC commissioned its own benchmark to prove the CX3-80 could outperform the midrange systems of its rivals, including the HP EVA8000 and IBM DS4800.
Why is it, then, that so many vendors can produce numbers to support their alleged leadership? One trick is to optimize the software running on storage hardware so that it performs exceptionally well on specific benchmarks.
“Those optimizations may do little to improve real-world performance,” says Geoffrey Noer, senior director of product marketing for Rackable Systems. “Similarly, hardware configurations may be unrealistic, with drives being short-stroked or being organized in an unusual RAID configuration in order to increase performance.”
Short stroking is the practice of using only a small portion of each hard drive to improve performance, usually by placing data on the outer tracks of the platters to reduce the seek time of the mechanical heads. Another way to boost benchmark results is to use a SQL database small enough to be held entirely in the controller’s cache, so the numbers reflect cache speed rather than actual write-to-disk performance.
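To see why short stroking flatters results, consider a rough back-of-the-envelope model. The sketch below uses illustrative, assumed figures (they are not measurements from any of the products mentioned): if random requests are confined to the outer fraction of the platter, the average distance the heads travel, and with it the average seek time, shrinks roughly in proportion.

```python
# Rough illustration of why short stroking boosts benchmark numbers.
# All figures are illustrative assumptions, not vendor measurements.

FULL_STROKE_SEEK_MS = 8.0    # assumed average seek time across the whole platter
ROTATIONAL_LATENCY_MS = 3.0  # assumed average rotational latency (half a revolution)
TRANSFER_MS = 0.1            # assumed transfer time for a small random read

def avg_service_time_ms(stroke_fraction: float) -> float:
    """Approximate average random-read service time when I/O is confined
    to the outer `stroke_fraction` of the drive (0 < fraction <= 1).
    Seek time is modeled, simplistically, as scaling with that fraction."""
    seek = FULL_STROKE_SEEK_MS * stroke_fraction
    return seek + ROTATIONAL_LATENCY_MS + TRANSFER_MS

for fraction in (1.0, 0.5, 0.1):
    t = avg_service_time_ms(fraction)
    iops = 1000.0 / t
    print(f"using {fraction:>4.0%} of the drive: ~{t:.1f} ms per I/O, ~{iops:.0f} IOPS")
```

Even this crude model shows per-drive IOPS more than doubling when only a tenth of each disk is used, which is exactly why the configuration details behind a headline number matter.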
So what should end users watch out for when evaluating benchmark results?
“If the results from a benchmark do not include enough detailed information to independently reproduce the benchmark result, I would assert that any information provided is questionable and of marginal value,” says Walter Baker, an administrator and auditor for SPC.
It is important, therefore, to ask for complete technical details of the workload, including the parameter values used to generate the benchmark stimulus, as well as a comprehensive description of the system and storage configuration that was measured. In addition, demand to know all configuration parameter values used in the benchmark, with particular attention to any that were changed from their defaults.
Making Use of Benchmarks
Richard McCormack, senior vice president of marketing at Fujitsu Computer Systems, is a benchmarking fan. As well as storage benchmarks such as those from SPC, his company uses TPC tests on its servers to measure performance.
“TPC is an industry standard that allows you to compare one system to another,” he says. “We run them on lots of our systems, as they help us in our efforts to keep ahead of the Joneses.”
Plenty of users rely on benchmarks, too, and not just for product comparison during the selection process. James Yaple, IT specialist at the U.S. Department of Veterans Affairs (VA) Austin Automation Center, uses SPC results to determine chargeback rates for his storage customers.
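One way benchmark results can feed a chargeback model is to apportion a storage tier’s cost according to the performance the benchmark says it can deliver. The sketch below is a simplified illustration with made-up numbers, not the VA’s actual method or rates.

```python
# Simplified chargeback sketch: made-up figures, not the VA's actual rates.
# Idea: use benchmark-measured performance to apportion each tier's cost.

tiers = {
    # tier name: (annual cost of the array in dollars, benchmark-measured IOPS)
    "tier-1 FC array":   (500_000, 100_000),
    "tier-2 SATA array": (150_000, 25_000),
}

for name, (annual_cost, measured_iops) in tiers.items():
    cost_per_iops = annual_cost / measured_iops
    print(f"{name}: ${cost_per_iops:.2f} per sustained IOPS per year")
```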
But SPC isn’t the only game in town. Besides TPC for server transactional performance, there are plenty of other tests available for measuring storage gear. The VA, for example, uses the Oracle I/O Numbers (ORION) benchmarking tool, available free from Oracle’s Web site.
“The ORION benchmark helps me compare performance and throughput for different tiers and types of disk when using an Oracle database,” says Yaple.
Chip Nickolett, president of systems integrator Comprehensive Solutions Inc. of Brookfield, Wisc., encourages end users to consult benchmarks like SPC in making high-level vendor, product and configuration comparisons. The results allow you to get an idea about performance given a specific load profile. Based on the benchmark results, it is often possible to identify configurations or isolate components that make sense given your requirements and budget.
“It can be difficult to try to use benchmarks for more than that due to the very specific hardware and software configurations used, not to mention frequent changes and upgrades in hardware,” he says. “Device configuration that is ideal for a benchmark might not be well suited for your production environment. These types of tests help get you to the right neighborhood, but not necessarily to your exact destination.”
Once the hardware has been purchased and is being configured, other simple benchmarks can be used to assist with optimal configuration. Nickolett cautions that most default configurations are optimized for reading, yet everyday environments have significant amounts of write activity. As a result, he provides his customers with a simple write benchmark (“WriteBench,” available for free download at http://www.comp-soln.com/WriteBench.zip) to simulate various scenarios based on the size of the disk writes as well as the volume of activity.
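The sketch below captures the same basic idea in Python. It is not Nickolett’s WriteBench, just a minimal illustration: it times synchronous writes of a configurable block size and total volume so you can compare how a freshly configured array handles different write profiles. The target path and sizes are placeholder assumptions.

```python
import os
import time

def write_benchmark(path: str, block_size: int, total_bytes: int) -> float:
    """Write `total_bytes` to `path` in `block_size` chunks, sync to disk,
    and return the achieved throughput in MB/s. Simplified illustration only."""
    block = os.urandom(block_size)
    written = 0
    start = time.perf_counter()
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        while written < total_bytes:
            os.write(fd, block)
            written += block_size
        os.fsync(fd)  # ensure data actually reaches the disk, not just the cache
    finally:
        os.close(fd)
    elapsed = time.perf_counter() - start
    return (written / (1024 * 1024)) / elapsed

if __name__ == "__main__":
    # Placeholder path and sizes; point at the volume under test and adjust.
    for bs in (4 * 1024, 64 * 1024, 1024 * 1024):
        mbps = write_benchmark("/tmp/writebench.dat", bs, 256 * 1024 * 1024)
        print(f"{bs // 1024:>5} KB writes: {mbps:.1f} MB/s")
```

Running something like this with several block sizes against each candidate configuration gives a quick feel for how write-heavy workloads will behave before the array goes into production.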
“As long as the benchmarks are from an independent, recognized organization such as the SPC or SPEC [Standard Performance Evaluation Corporation], it is safe to assume that the tests are fair and valid,” says Nickolett. “The bigger question is whether or not they are representative of your environment.”
Moosa Matariyeh, an enterprise storage specialist at reseller CDW Corp., notes that not all vendors are part of SPC. Therefore, it is not always possible to compare manufacturers based on the same criteria using SPC benchmarks.
“For storage vendors that are not part of SPC, it would be appropriate to go to a third party group like Gartner Group, International Data Corp. or Enterprise Strategy Group,” he says. “CDW does use the benchmarks to an extent, but uses customer feedback even more.”
In the second article of this two-part series, we’ll discuss SPC benchmarks in more detail, including the well-known SPC-1 benchmark and the newer SPC-2, plus upcoming SPC tests called SPC-1c and SPC-2c.