==Hardware==
We mostly followed the original NVIDIA DevBox hardware spec, updating drive capacities and other minor details, since many of these parts were available as salvage from other boxes. We did have to buy the ASUS X99-E WS motherboard (as well as some new drives) just for this project. Building one of the more recent variants from Lambda Labs, Bizon-tech, etc. would have meant buying a lot more parts, and it isn't clear they would deliver much more performance than this build; their specs (which don't say which motherboard they use) are here for reference:
*https://developer.nvidia.com/devbox
*https://lambdalabs.com/deep-learning/workstations/4-gpu
*https://bizon-tech.com/us/bizon-g3000
We opted for a Xeon E5-2620 v3 processor rather than the Core i7-5930K we had on hand. Both mount in the LGA 2011-v3 socket, provide 40 PCIe lanes, and have 6 cores and 15 MB of cache. The i7 has a faster clock speed, but the Xeon takes registered (buffered) ECC DDR4 RDIMMs, which means we can put 256 GB on the board rather than the 64 GB the i7 tops out at with unbuffered DIMMs. For GPUs we have a TITAN RTX and an older TITAN Xp available to start, and we can add a 1080 Ti later, or buy some additional GPUs if needed (a quick sanity check that the cards are all visible is sketched below). We put the whole thing in a Rosewill RSV-L4000 case.
{| class="wikitable"
! Qty !! Part
|-
| 2 || ARCTIC F8 PWM Fluid Dynamic Bearing Case Fan, 80mm PWM Speed Control, 31 CFM at 22 dBA
|}
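Since the box mixes GPU generations (a Turing TITAN RTX alongside a Pascal TITAN Xp, with a 1080 Ti possibly added later), it is worth confirming that every card is enumerated with the expected memory before training. Below is a minimal sketch of such a check; it assumes an NVIDIA driver, CUDA, and PyTorch are installed, which this page does not otherwise specify.

<syntaxhighlight lang="python">
# Sketch only: enumerate the CUDA devices the driver exposes and report
# name, memory, and compute capability for each. Assumes PyTorch with CUDA
# support is installed on the box (not specified elsewhere on this page).
import torch

def report_gpus():
    if not torch.cuda.is_available():
        print("CUDA not available -- check the driver install")
        return
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        print(f"GPU {i}: {props.name}, "
              f"{props.total_memory / 1024**3:.1f} GB, "
              f"compute capability {props.major}.{props.minor}")

if __name__ == "__main__":
    report_gpus()
</syntaxhighlight>

Running <code>nvidia-smi</code> gives the same device names and memory totals from the driver side, without needing PyTorch.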