
Monthly Archives: April 2007

The benchmarks are here, but don’t lose sight of the big picture

Posted by Eric Horschman
Director, Product Marketing

Performance testing of virtualization platforms is important to our customers and partners, and their desire for benchmark data is easy to understand. Virtualization is still a new technology, and they want assurance that the end-user experience won't suffer when applications are moved into VMs. They also want performance data to support capacity planning and to compare hardware platforms. To meet those needs, VMware publishes benchmarks of our products, supports testing by our users and partners, and regularly releases new performance resources. Our performance team is also working with the SPEC organization to define standard virtualization benchmarks, and has delivered VMmark, a comprehensive tool for measuring the performance of virtualized systems that is now in beta.

If you’re considering virtualization, make intelligent use of the performance data that VMware, our partners, and our competitors are providing, but don’t become overly enamored of the benchmarks. The bigger picture, which includes the reliability, management scalability, and security of your virtual infrastructure, is always going to be more important than a single attribute like performance. John Humphreys of IDC has made the same point: “While the focus in the vendor community has been on performance (i.e., should a customer chose a paravirtualized or native virtualized path), I would suggest the efforts and energy applied to wringing out a few extra performance percentage points would be better placed in demonstrating the underlying platform stability and reliability.”1 VMware has taken this approach from the beginning, and we’re delivering the most stable hypervisor available: one so solid that many of our customers proudly report uptimes for their VMs exceeding 1,000 days.

Virtualization users are coming to the same conclusion about the relative importance of performance in their deployment decisions. Consider some findings from our own surveys comparing the concerns of virtualization newbies and veterans. Prospective users invariably rank performance as their top concern. They’ve heard about “virtualization overhead” and worry it will lead to a revolt of angry end users. What’s fascinating is how quickly performance concerns drop in the rankings as users progress from VMware evaluations to production deployments. The veterans have learned that when they migrate three-year-old physical machines, typically running at 10% utilization or less, into VMs on newer hardware, performance just isn’t a problem. The ability of VMware Infrastructure to raise utilization into the 50-80% range, combined with the relentless march of Moore’s Law, means you can comfortably migrate most of your servers into VMs.
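A quick back-of-the-envelope calculation shows why consolidation works out this way. This sketch uses hypothetical numbers, not VMware sizing guidance; the relative host speed and target utilization are assumptions you would replace with your own measurements:

```python
# Illustrative consolidation math: how many lightly loaded physical
# servers can share one newer virtualization host. All inputs below
# are hypothetical examples, not vendor recommendations.

old_server_util = 0.10      # typical CPU utilization of an aging physical box
relative_host_speed = 2.0   # assumed: newer host CPU capacity vs. the old box
target_host_util = 0.60     # utilization ceiling we want on the virtual host

# Each old server consumes (its utilization / host speedup) of the new
# host's capacity once converted to a VM.
per_vm_load = old_server_util / relative_host_speed   # 0.05 of the host

# Number of such servers that fit under the target utilization ceiling.
vms_per_host = round(target_host_util / per_vm_load)  # 12

print(f"Each VM needs {per_vm_load:.0%} of the host; "
      f"about {vms_per_host} such servers fit at a "
      f"{target_host_util:.0%} target.")
```

Real capacity planning must also account for memory, I/O, and peak (not average) load, but the arithmetic illustrates why 10%-utilized servers consolidate so comfortably.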

If you’re planning to conduct your own virtualization benchmarks, consider what makes testing in virtual machines unique. Our own performance testing has uncovered many pitfalls that can skew results. Something as simple as relying on virtual time inside a VM rather than true wall-clock time can produce benchmarks that show virtual machines performing faster than native systems. There are also recommended physical and virtual machine configuration settings needed to prevent resources from being sapped by activity unrelated to the benchmark being run. Ensuring the equivalence of the VM and physical machine configurations is important when measuring virtual performance relative to native. We’ve assembled benchmarking guidelines and performance tuning best practices. We have also implemented a benchmark review process, and the real, live people behind that process will help our users design valid test plans for any of our products.
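The virtual-time pitfall above suggests a simple precaution: let the clock source be injected rather than hard-coded, so benchmark timings can come from a machine outside the VM. This is a minimal sketch with hypothetical helper names; in practice the external timestamps would come from a second, physical host (e.g., via NTP or a small socket service):

```python
# Sketch of the "take your clock from outside the VM" precaution.
# The workload and clock-injection pattern here are illustrative only.
import time

def run_benchmark_iteration():
    # Placeholder for the real workload under test.
    return sum(i * i for i in range(100_000))

def timed_run(clock=time.time):
    """Time one iteration using a caller-supplied clock source.

    Inside a VM the guest's own timers can run fast or slow, so when
    benchmarking, pass a clock function backed by a physical machine
    instead of accepting the guest default used here.
    """
    start = clock()
    run_benchmark_iteration()
    return clock() - start

elapsed = timed_run()
print(f"elapsed: {elapsed:.4f}s")
```

Keeping the clock injectable also makes it easy to compare in-guest and external timings for the same run, which is a quick sanity check for timer drift.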

The VMware benchmark review process is simple.  If you want to publish VMware product performance testing results, just fill out our web form to get started.  We’ll contact you to arrange a review of your test plan and results and we commit to completing the whole process within 30 days.

That all sounds nice, but does VMware really meet its commitment to support testing of our products? Just ask the more than 50 commercial and academic users who’ve had their benchmark tests reviewed and approved by VMware. Better yet, ask our friendly competitors at XenSource, who recently published results comparing their latest product to VMware Infrastructure 3. We asked for a few changes (some we got and some we didn’t), but in the end both VMware and XenSource agreed on the validity of the results.

Benchmarks and performance test results are useful tools when planning a move to virtualization, but don’t get fixated on them. Virtualization overhead is an inevitable tradeoff for the tremendous management and provisioning efficiencies the technology delivers. However, the incredible pace of hardware performance improvements and the universally low CPU utilization in most data centers reduce that overhead to insignificance. Sure, there will always be workloads that push the limits of a physical server and aren’t good virtualization candidates. But don’t spend too much time worrying about how to bring them into your virtual infrastructure; AMD, Intel, and VMware will do the heavy lifting to turn a server that taxes a dedicated box today into a slam-dunk P2V candidate tomorrow.


1. IDC Link, “The Battle for the Hypervisor,” Doc #lcUS20612507, March 22, 2007