Previously, I detailed how we were curious as to how two identically configured clusters — one vSphere/VSAN and one vSphere/Nutanix — might stack up in terms of relative cost and performance. The comparisons were eye-opening, especially for those of us who have spent time studying storage performance.
We hit a snag along the way, as Nutanix recently changed their EULA to expressly prohibit any form of publishing performance results without prior written permission. We applied for permission to publish, and they declined.
However, we still have useful data to share!
In that post, I shared the VSAN side, sampling results under a variety of very demanding synthetic workloads.
Recently, we completed an internal series of head-to-head VMmark tests. VMmark is a very interesting test, as it looks at overall consolidation ratios when running a variety of real application workloads. While not the easiest benchmark to set up and run, it has served as a useful estimator of overall cluster performance. While storage is certainly a component of VMmark, it is not a dominant component.
If you’re not familiar with VMmark, application bundles are composed into “tiles”. Each tile is gated in how much workload it can drive. More tiles, more workloads — and presumably better consolidation ratios. In addition to performance, resource utilization (CPU and memory) is tracked. The results give a wealth of data on how well a given configuration will perform, or — equivalently — how efficient it is.
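To make the tile idea concrete, here is a toy model — not VMmark’s actual scoring formula, and all the numbers (per-tile work, cluster capacity, VMs per tile) are placeholder assumptions — showing why adding tiles raises total work and consolidation ratio only until the cluster saturates:

```python
# Toy sketch of the VMmark "tile" concept. This is NOT the real VMmark
# scoring methodology; per_tile_work, cluster_capacity, vms_per_tile,
# and hosts are all hypothetical placeholder values.

def cluster_work(num_tiles, per_tile_work=1.0, cluster_capacity=8.0):
    """Total work delivered: each tile is gated at per_tile_work, so
    offered load grows linearly with tiles until the cluster's
    capacity caps it."""
    offered = num_tiles * per_tile_work
    return min(offered, cluster_capacity)

def consolidation_ratio(num_tiles, vms_per_tile=8, hosts=4):
    """VMs consolidated per host for a given tile count (assuming a
    fixed, hypothetical number of VMs per tile)."""
    return num_tiles * vms_per_tile / hosts

# Below capacity, work scales with tiles; past capacity, it plateaus.
print(cluster_work(4))    # 4 tiles fit comfortably
print(cluster_work(12))   # offered load exceeds capacity
print(consolidation_ratio(4))
```

The point of the sketch is the shape of the curve: a more efficient platform effectively has a higher `cluster_capacity` for the same hardware, so it sustains more tiles before plateauing — which is exactly the difference the tests were designed to surface.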
Our premise was that VSAN would be not only faster, but much more efficient with CPU and memory consumption, thus enabling customers to achieve higher consolidation ratios with vSphere/VSAN than with vSphere/Nutanix — essentially doing more work for less money.
While we are not permitted to publish direct comparison results, our tests showed significant and meaningful differences in both performance and resource efficiency as the number of tiles increased. Interested parties are certainly encouraged to run their own tests and reach their own conclusions.
In the meantime, here’s a previous study from April showing how VSAN 6 behaves under heavy VMmark loads, as well as clear evidence of the performance improvements between VSAN 5.5 and VSAN 6.