In a previous article I analyzed the alternatives available when designing a VMware home lab, and I ended up choosing the WhiteBox way. In the following weeks I looked through the products on the market to find my own solution. I need to thank my friend Luca Roman, co-owner of runstore.it, who helped me choose, source and assemble the components.
The main component to choose is the motherboard/CPU combination. No motherboard can host every CPU, so the choice of one of the two components constrains the other.
My first choice was the CPU, and I picked an AMD FX-6100, a six-core CPU clocked at 3.3 GHz. The main reasons for choosing it were:
– 6 cores in a single socket give a good vCPU/pCPU ratio at a fair price compared to Intel; for a whitebox, dual-socket motherboards are too expensive, so a single processor has to supply plenty of cores
– low power consumption: 95 W versus 130 W for the six-core Intel i7 Extreme Edition
– AMD Cool’n’Quiet support, nice for keeping noise down when the box runs at home
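To make the vCPU/pCPU reasoning above concrete, here is a tiny sketch. The overcommit target and VM sizes are my own illustrative numbers, not figures from the build itself:

```python
# Rough capacity sketch for a 6-core host (illustrative numbers only).
PHYSICAL_CORES = 6   # AMD FX-6100: six cores in one socket
TARGET_RATIO = 4     # example vCPU:pCPU overcommit target for a lab

def max_vms(vcpus_per_vm: int) -> int:
    """How many identically sized VMs fit under the overcommit target."""
    total_vcpus = PHYSICAL_CORES * TARGET_RATIO
    return total_vcpus // vcpus_per_vm

print(max_vms(2))  # 24 vCPUs available at 4:1 -> 12 two-vCPU VMs
```

With a modest 4:1 overcommit, six physical cores already back a dozen two-vCPU lab VMs, which is why a single-socket six-core part looked sufficient.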
The CPU choice was only provisional until I found a good motherboard for it. After some searching I picked an ASRock 970 Extreme4, for these reasons:
– the FX-6100 is obviously supported, but so are bigger CPUs, so in the future I will be able to upgrade (maybe to even more cores)
– it has 4 RAM slots. Even though each slot now holds a 4 GB module, it can reach 32 GB of RAM once 8 GB modules become cheaper…
– it supports IOMMU, which means I can use VMDirectPath on ESXi
– it has several PCIe slots, good for installing add-on cards
To keep costs down, I used the cheapest ATX case we could find.
I installed the memory modules and a 1 GB USB key I already had, which I use to install and run ESXi 5. I did not install any HDD, to cut both cost and power consumption.
Networking: since I would like a realistic, production-like environment, I installed an Intel Pro/1000 VT quad-port Gigabit Ethernet card in my WhiteBox, shown in the picture before installation.
Together with the onboard NIC (Realtek 8168 Gigabit), which ESXi 5.0 recognizes, I have a total of 5 network uplinks to test teaming and failover, and to run a dedicated storage network for iSCSI and NFS.
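As a sketch of how five uplinks might be split between teaming, failover and storage traffic, here is a hypothetical assignment. The vmnic names and port-group roles are my own example, not the actual configuration from this lab:

```python
# Hypothetical uplink plan for 5 NICs: the 4 ports of the Intel quad card
# (vmnic1-vmnic4) plus the onboard Realtek (vmnic0). Names are examples.
uplink_plan = {
    "Management": {"active": ["vmnic0"], "standby": ["vmnic1"]},            # failover pair
    "VM Network": {"active": ["vmnic1", "vmnic2"], "standby": []},          # teamed uplinks
    "Storage":    {"active": ["vmnic3"], "standby": ["vmnic4"]},            # iSCSI/NFS
}

def uplinks_in_use(plan):
    """Return the set of physical NICs referenced anywhere in the plan."""
    return {nic for pg in plan.values() for role in pg.values() for nic in role}

print(sorted(uplinks_in_use(uplink_plan)))
```

The point of the sketch is simply that five ports are enough to give every port group either a team or an active/standby pair, which is what makes teaming and failover testing realistic on a single host.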
Finally, the price: the grand total was 449 EUR.
Hi,
I was reading your article while searching for an ESXi 5 whitebox, and your system is just what I was looking for. Reasonable price!
One question: do you know if ESXi 5 works with the onboard storage controller of the ASRock 970 Extreme4?
Hi Peter,
one of my friends tested the same WhiteBox with a plain SATA disk and ESXi booted without a hitch. I have not tested it myself, but it sounds like you can use local storage with it.
Luca.
Hello Sir,
I find your article very interesting.
I do have some questions.
What video card and memory type are you using?
Since you’ve made a good working config, I would like to build one based on your experiences.
Many thanks in advance
Bart
Great article! I use an even cheaper solution that doesn’t support all the advanced VMware features, but it’s good for running some virtual machines for testing:
http://www.misco.it/store/hp-633724-421-proliant-microserver-s130077.html
Bye
Gian Paolo
Hi Bart,
the video card is a small one I had lying around; there is no real need for a dedicated card, since the machine stays without a monitor most of the time.
RAM is this one:
http://www.corsair.com/vengeance-8gb-dual-channel-ddr3-memory-kit-cmz8gx3m2a1600c9b.html
Luca.
Hi,
did you try VGA passthrough, or passing any other devices to a VM?
no, I’ve not tested any PCI passthrough at the moment; I would like to in the future…
Luca.
Hi,
I like this setup.
Did you already test DirectPath?
Regards,
Remco
Hi, no, I did not test DirectPath. This article is quite old; if you look at the page describing my lab, I now have a different setup.
Luca.