
vSA vSphere Storage Appliance Performance Benchmark Test

The article below goes into depth on my experience with VMware vSA performance benchmark testing. I’ve tried to be as detailed as possible to give you a complete picture of my findings. I believe there is a space where storage virtualization may thrive, but after recent experience with the VMware vSA product, I am less than satisfied with the results, manageability, and most of all performance. I believe storage virtualization needs a few more years to mature before it can truly be considered a serious candidate for small and remote office scenarios. This statement holds true for other two/three-node storage virtualization technologies, including FalconStor’s storage virtualization.

VMware Version Information

VMware vCenter Server 5.1.0, 947673
VMware vStorage Appliance 5.1.3, 1090545
VMware ESXi 5.1 U1, HP OEM Bundle, 1065491 (VMware-ESXi-5.1.0-Update1-1065491-HP-5.50.26.iso)

HP ProLiant DL385 G2 Hardware Configuration
– 4 CPUs x 2.6 GHz
– Dual-Core AMD Opteron Processor 2218
– AMD Opteron Generation EVC Mode
– HP Smart Array P400, 512MB Cache, 25% Read / 75% Write
– RAID-5, 8x 72 GB 10K RPM Hard Drives
– HP Service Pack 02.2013 Firmware

vStorage Appliance Configuration
– 2 Node Cluster
– Eager Zero Full Format
– VMware Best Practices

IOZone Virtual Machine Configuration
– Oracle Linux 6.4 x86_64
– 2 vCPU
– 1 GB Memory
– 20 GB Disk, Thick Eager Zero Provisioned
– VMware Tool (build-1065307)

IOZone Test Parameters
/usr/bin/iozone -a -s 5G -o

-a   Used to select full automatic mode. Produces output that covers all tested file operations for record sizes of 4k to 16M for file sizes of 64k to 512M.

-s #   Used to specify the size, in Kbytes, of the file to test. One may also specify -s #k (size in Kbytes) or -s #m (size in Mbytes) or -s #g (size in Gbytes).

-o   Writes are synchronously written to disk. (O_SYNC). Iozone will open the files with the O_SYNC flag. This forces all writes to the file to go completely to disk before returning to the benchmark.
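To turn the tabular auto-mode report into something graphable (like the charts below), the data rows can be filtered with awk. A minimal sketch, assuming iozone's usual auto-mode column order; the heredoc numbers are made up to stand in for real output, and a real run would pipe `/usr/bin/iozone -a -s 5G -o` into the filter instead:

```shell
# Fabricated fragment of `iozone -a` output; a real run would pipe
# `/usr/bin/iozone -a -s 5G -o` here instead of the heredoc.
csv=$(cat <<'EOF' | awk '$1 ~ /^[0-9]+$/ { printf "%s,%s,%s\n", $1, $2, $3 }'
              kB  reclen    write  rewrite     read   reread
         5242880       4     1200     1350    48000    52000
         5242880       8     2100     2300    61000    64000
EOF
)
# One CSV row per test point: file size (kB), record size (kB), write KB/s.
echo "$csv"
```

The awk pattern keeps only rows whose first field is numeric, which drops iozone's header lines and leaves clean CSV for a spreadsheet or gnuplot.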

VMware ESXi/vSA Network Configuration

VMware vSA Architecture

IOZone Performance Benchmark Results

vSA Read Graph

vSA Stride Read Graph

vSA Random Read Graph

vSA Backward Read Graph

vSA Fread Graph

vSA Write Graph

vSA Random Write Graph

vSA Record Rewrite Graph

vSA Fwrite Graph

Download RAW Excel Data


The vSA performed far worse than the native onboard storage controller, which was expected due to the additional layer of virtualization. I honestly expected better performance out of the 8-disk RAID-5 even without storage virtualization, since the drives were 10,000 RPM. On average, across all the tests, there is a 76.3% difference between the native storage and the virtualized storage! Wow! That is an expensive downgrade! I understand that the test bed was not using the latest and greatest hardware, but disk performance is generally limited by the spinning platter. I would really be interested in seeing the difference on newer hardware.
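A percent difference like the 76.3% figure above can be computed per test point as sketched below. The two throughput numbers are illustrative placeholders, not values from the raw data:

```shell
# Illustrative throughput numbers in KB/s (NOT from the actual results).
native=185000   # native controller
vsa=43800       # through the vSA

# Percent difference: (native - vsa) / native * 100
awk -v n="$native" -v v="$vsa" 'BEGIN { printf "%.1f%%\n", (n - v) / n * 100 }'
# prints 76.3%
```

Averaging that figure over every operation and record size in the raw Excel data gives the overall gap quoted above.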

I believe this depicts only a fraction of the entire picture: performance. There are other concerns I have at the moment with storage virtualization, such as complexity and manageability. I found the complexity very frustrating while setting up the vSA; there are many design considerations and limitations with this particular storage virtualization solution, most of which were observed during the test trials. The vSA management interface is a Flash-based application which had its quirks and crashes as well. Crashes at the storage virtualization layer left me thinking this would be a perfect recipe for data loss and/or corruption. In addition, a single instance could not manage multiple vSA deployments due to IP addressing restrictions, which was a must for the particular use case I was testing for.

For now, storage virtualization is not ready, in my opinion, for any production use. It has a lot of room to grow, and I will certainly be interested in revisiting this subject down the road since I believe in the concept.



VMware vSA vSphere Storage Appliance Installation Parameters

While recently troubleshooting an installation of VMware’s vSphere Storage Appliance, I uncovered two installation parameters that may be needed depending on your environment and installation.

The first one allows you to change the username and password that the vSA will use to connect to vCenter.

VMware-vsamanager.exe /v"VM_SHOWNOAUTH=1"

VMware vSA Install Screenshot

The other parameter allows you to specify the vCenter IP address or FQDN.

VMware-vsamanager.exe /v"VM_IPADDRESS=<fqdn/ip>"

VMware vSA Install Screenshot

… or the two can be combined:

VMware-vsamanager.exe /v"VM_SHOWNOAUTH=1 VM_IPADDRESS=<fqdn/ip>"