Dell DVDStore is a freely available database benchmark that you can use to test a variety of different databases, such as Microsoft SQL Server, Oracle, PostgreSQL, and MySQL, on either Windows or Linux. It has been around since 2007 and was originally developed by Dave Jaffe at Dell and Todd Muirhead at VMware. It is also included as one of the workloads in the VMmark 2 VMware virtualization benchmark. The current version is 2.1, released in 2011, and it has proven to be a solid and reliable tool. I’ve heard through reliable sources that an updated version may be in the works, so it is worth looking out for. This article briefly covers some of the tips and tricks I’ve found during the testing I’ve been conducting with DVDStore, such as for my recent article VMware vSphere: A Great Place to Run Enterprise Databases.
Before we get started, some important links. First, the download location for Dell DVDStore – http://linux.dell.com/dvdstore/. There is also a community wiki, which includes a short video you may find useful – http://en.community.dell.com/techcenter/extras/w/wiki/dvd-store. This article won’t go through all the steps in detail, as they are well documented in the product documentation.
The DVD Store is described as follows in the readme “a complete online e-commerce test application, with a backend database component, a web application layer, and driver programs. The goal in designing the database component as well as the mid-tier application was to utilize many advanced database features (transactions, stored procedures, triggers, referential integrity) while keeping the database easy to install and understand. The DS2 workload may be used to test databases or as a stress tool for any purpose.”
After you’ve created your database VM and downloaded DVDStore, extract it. I suggest reading through all the readme files, FAQs and documentation that come with it. There are a number of default scripts for various database sizes, and you can also customize the size of the database you wish to test. The standard sizes are small (10MB), medium (1GB) and large (100GB).
For my testing I used SQL Server on Windows 2008 R2, 2012 and 2012 R2. I had a VM template with 4 vCPU, 32GB RAM, and 6 virtual disks (OS/App, Backup/DS2 Files, DB Data, TempDB Data, TempDB Log, DB Log). I was testing against a 100GB database and doing long-running tests, so I made sure I had plenty of space on the drives. I also used multiple virtual SCSI controllers, in this case PVSCSI: one for OS/App/Backup/DS2, one for DB Data, one for TempDB Data and one for transaction logs (TempDB and DB). For different tests I changed the number of vCPUs configured and the number of database VMs being tested, but all other configurations stayed the same. I used thin-provisioned storage and virtual disks in all my tests, and my back-end storage also had compression enabled.
The way I ran the tests was to provision fresh databases from template each time. After each test I would power off and delete the databases, then freshly provision them again from template for the next test run. Provisioning 12 x 300GB VMs takes about 10 minutes in my lab environment, including the time to customize the OS and be ready to run a test. So provisioning from template each time really isn’t a problem, and it means the tests can be more consistent.
You’ll need to create some data input files to load into the database, which will set the foundation for the test. The benchmark has been written to simulate an online DVD store and generates a transactional type of workload, and the generated data isn’t just zeros but semi-realistic random test data. However, there is a catch: the generated data is in Unix/Linux text format. If you are using Windows, you’ll need to convert the data files to Windows text format before trying to run the database load scripts. I used Unix2Win to convert all the data files prior to loading them.
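If you don’t have Unix2Win handy, the conversion itself can be sketched with a couple of shell commands. This is a hypothetical example (the file name and path are invented for illustration, and it assumes GNU sed); unix2dos or any editor that can save CRLF line endings does the same job against the real DS2 data files:

```shell
# Hypothetical example: give each line a Windows (CRLF) ending before the
# DS2 load scripts read it. File name/path invented for illustration.
mkdir -p /tmp/ds2data
printf 'CUSTOMERID,NAME\n1,Smith\n' > /tmp/ds2data/cust.csv  # Unix (LF) data
sed -i 's/$/\r/' /tmp/ds2data/cust.csv                       # append a CR to every line
```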
There are various options for a test run, including whether you want to collect detailed statistics, and whether to run the test via a web/app tier or direct against the database. For the testing I did, I went direct to the database. Before you run the test programs you’ll want to set up the driver configuration file.
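For reference, the DS2 driver configuration is a plain key=value text file. The fragment below is illustrative only – the parameter names are taken from the DS2 readme, but the values are examples rather than the exact settings used for these tests, so double-check them against the documentation in your copy:

```ini
; Illustrative DS2 driver config – values are examples only
target=sqlserver1        ; database server to drive load against
n_threads=32             ; number of driver threads (simulated users)
ramp_rate=10             ; rate at which threads are started
run_time=90              ; test duration in minutes
db_size=100GB            ; must match the size of the loaded database
warmup_time=1            ; warm-up minutes before statistics are counted
think_time=0.05          ; think time per simulated user action
pct_newcustomers=20      ; percentage of orders from new customers
detailed_view=N          ; Y to collect detailed statistics
```

The inline comments are for explanation here; the actual file should contain just the key=value lines.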
When I first started running tests I just watched the output on the screen. But I wanted to record the output to a file for later analysis, and also so I could come back to a test later and confirm it was successful (especially important when one of my tests runs for 28 days). I found that simply trying to pipe the output to a text file didn’t work. After trying everything I could think of, I contacted Todd Muirhead, one of the authors. He explained that during the original development and compiling they ran into a problem that meant they had to send output to StdErr instead of StdOut. But there was an easy solution: I could just recompile the code (which comes with the benchmark). Here’s an extract from the very quick response I got from Todd:
“So the DVD Store driver program currently outputs everything via Console.Error.WriteLine() where it really should be using Console.WriteLine() function instead. This output to the error console causes redirects to text files to not work. So all that is needed is to do a search and replace on Console.Error.WriteLine with Console.WriteLine on the ds2xdriver.cs source file. Then just recompile following the command in the header of the ds2xdriver.cs file. For example this is what I used to compile the attached binary (you will have to change the paths match your environment):
\Windows\Microsoft.NET\Framework\v4.0.30319\csc.exe /out:ds2oracledriverNewOutput.exe ds2xdriverOutput.cs ds2oraclefns.cs /d:USE_WIN32_TIMER /d:GEN_PERF_CTRS /r:C:\app\client\Administrator\product\12.1.0\client_2\ODP.NET\bin\4\Oracle.DataAccess.dll”
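If you go the recompile route, the search-and-replace itself is a one-liner. The sketch below fakes a single line of ds2xdriver.cs just to demonstrate the substitution (it assumes GNU sed; on Windows any editor’s find/replace works), after which you would recompile with the csc.exe command shown above:

```shell
# Stand-in for the real ds2xdriver.cs, reduced to one line for illustration
printf 'Console.Error.WriteLine("opm");\n' > /tmp/ds2xdriver.cs
# Swap every Console.Error.WriteLine for Console.WriteLine, as Todd describes
sed -i 's/Console\.Error\.WriteLine/Console.WriteLine/g' /tmp/ds2xdriver.cs
# Then recompile following the command in the header of ds2xdriver.cs
```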
At the time I was working with SQL Server and wondered if there was another way around this problem. The good news is I found a very easy method to pipe the output to a log file without recompiling the driver: a command-line redirection of StdOut to a file and StdErr to StdOut. The command line looks like this:
c:\ds2\sqlserverds2\ds2sqlserverdriver.exe --config_file=driverconfig.txt 1> SQLServerPerf.Log 2>&1
There is more information on output redirection in the Microsoft KB article Redirecting Error Messages from Command Prompt: STDERR/STDOUT. This will create a log file called SQLServerPerf.Log containing all of the performance run data, including orders per minute, latency, etc.
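The trick generalizes to any program that writes to StdErr. A minimal shell demonstration (the stats line here is made up, loosely mimicking the driver’s per-interval output):

```shell
# A command that writes only to stderr, like the unmodified DS2 driver.
# '1> file' sends stdout to the file; '2>&1' then folds stderr into stdout,
# so the stderr text lands in the log file.
sh -c 'echo "et=90.0 n_overall=1000 opm=667" 1>&2' 1> /tmp/SQLServerPerf.Log 2>&1
```

Note the order matters: redirections are processed left to right, so the file must be attached to stdout before stderr is pointed at stdout.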
The tests I’ve been running are between 90 minutes and 28 days long. If you’re not in a position to destroy your test VMs after each run and recreate them in as little as 10 minutes, you’ll want to clean up the data from previous test runs. You can do this very easily using the osql command-line tool as shown below (assuming your DS2 files are in c:\ds2).
c:\ds2> osql -Usa -P<pw> -S <server> -i c:\ds2\sqlserverds2\build\sqlserverds2_cleanup_100GB.sql
This will get the database back to its pre-test state. At this point you may want to update your statistics or do any other cleanup operations (if you’re testing SQL Server).
The previous article I published, VMware vSphere: A Great Place to Run Enterprise Databases, used Dell DVDStore to test the performance of SQL Server VMs on vSphere 5.5. I have recently upgraded my test hosts to vSphere 6.0 and I’m seeing between 8% and 25% improvement in performance with the same test method on the same hardware. This is a great example of how software investments can pay big dividends in performance just by adopting new versions. As with anything, your results will vary based on your configuration and you may not see the same thing, but I’m happy with the results I’ve got from vSphere 6.0. With each major release, VMware engineers do everything they can to improve performance, and you get the benefit even if you don’t have the luxury of upgrading your hardware at this point in time.
Dell DVDStore is a great free tool for testing an OLTP transactional type of database on any platform. The authors have done an excellent job of putting together a consistent and reliable benchmark. It’s useful for baselining database configuration changes, as well as for straightforward database performance benchmarking.
If you want tips on how to set up a SQL Database to achieve best performance for business critical applications then I would recommend you take a look at the book Virtualizing SQL Server with VMware: Doing IT Right (VMware Press 2014). I am one of the co-authors of this book.
This post first appeared on the Long White Virtual Clouds blog at longwhiteclouds.com. By Michael Webster. Copyright © 2012 – 2015 IT Solutions 2000 Ltd and Michael Webster. All rights reserved. Not to be reproduced for commercial purposes without written permission.