At Nutanix, customers and prospects often ask about performance data. There is a lot of performance data published, such as in best practice guides, on third-party sites such as the SAP SD Benchmark site (where Nutanix has a published, certified SAP 2-Tier benchmark), and on blogs such as this one. We are able to readily provide insightful data across a wide range of applications, systems and situations. Yet there are still people who claim, for some reason, that we don’t publish performance data (maybe their Google-Fu is weak?). Nutanix performance is about more than just latency and IOPS; we analyze real application characteristics. It’s about time this was resolved with a third-party-vetted report.
Nutanix worked with ESG to independently review and report on the performance of the Nutanix architecture and platform, using real applications as the basis of the analysis. Andy Daniel (formerly of PernixData, who joined Nutanix through the acquisition) did a great job of pulling together the various parties, including my team, to contribute to the work. Andy explains the basis of the testing in his article here, and thanks go to Mike Leone from ESG for his expert analysis.
We use real applications because customers run real applications, and a micro-benchmark isn’t a production application. Martijn Bosschaart explains the reasoning behind this testing method very well in his article To Finish First, You First Have To Finish. The applications covered include Microsoft SQL Server, Oracle, Exchange and XenDesktop for VDI. The report shows the typical performance that can be expected from the platforms under test, some of which are not the latest generation, so performance is likely to have improved on the latest models. The report also shows the predictability and consistency of performance across the test scenarios. If you are interested in the performance of the Nutanix platforms, I highly encourage you to review the ESG Lab Review – Performance Analysis: Nutanix. Hopefully this report can clarify any performance concerns that customers, partners and prospects may have.
Performance isn’t just one element; it is the result of a whole set of elements across a business solution, and it should be considered both when things are going well and when they are not. That includes upgrades, failed components, recovery operations, provisioning times and other factors. All of these are part of measuring the success and performance of a business solution. In the absence of specific requirements and a specific solution to a business problem, benchmarks and other performance data can give an indication of how a platform might perform or behave under certain scenarios. But it’s important to be able to relate those scenarios back to your individual requirements. Dheeraj Pandey, Nutanix CEO, explains in his Quora article how Nutanix differs in many ways from other HCI players. The differences can be quite nuanced, and the architectural decisions of the core platform really make a difference, both when things are going well and when things are not going so well.
The Nutanix team is available to help answer any customer/partner/prospect questions on this report or any other topic relating to the Nutanix platform.
This post first appeared on the Long White Virtual Clouds blog at longwhiteclouds.com. By Michael Webster. Copyright © 2012 – 2017 – IT Solutions 2000 Ltd and Michael Webster. All rights reserved. Not to be reproduced for commercial purposes without written permission.