Revision as of 14:21, 27 October 2017


McNair Project: GPU Build
Owner: Oliver Chang, Kyran Adams
Project status: Active



Single vs. Multi GPU

  • "I quickly found that it is not only very difficult to parallelize neural networks on multiple GPUs efficiently, but also that the speedup was only mediocre for dense neural networks. Small neural networks could be parallelized rather efficiently using data parallelism, but larger neural networks... received almost no speedup."
  • Another possible use of multiple GPUs: training several different models simultaneously, "very useful for researchers, who want to try multiple versions of a new algorithm at the same time."
  • This source recommends GTX 1080 Tis and does a cost analysis of them.
  • If the network doesn't fit in the memory of one GPU (11 GB), it has to be split across cards (see intra-model parallelism below).
  • Consider getting two graphics cards: one for development and a cheap one to drive the operating system's display [1]
  • Intra-model parallelism: if a model has long, independent computation paths, it can be split across multiple GPUs, each computing a part of it. This requires a careful understanding of the model and its computational dependencies.
  • Replicated training: start multiple copies of the model, train them in parallel, and then synchronize their learning (the gradients applied to their weights & biases).
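Replicated training can be sketched in a few lines: each replica computes gradients on its own data shard, then the replicas average (synchronize) their gradients before applying a shared update. The sketch below is framework-agnostic NumPy, not any particular library's API; the toy linear model, shard layout, and learning rate are all invented for illustration.

```python
import numpy as np

# Toy replicated (data-parallel) training: a linear model y = X @ w
# trained with mean-squared error on two data shards, one shard per
# hypothetical GPU.

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w

shards = [(X[:32], y[:32]), (X[32:], y[32:])]  # one shard per replica
w = np.zeros(3)                                # replicated weights

def grad(w, X_s, y_s):
    """Gradient of the MSE loss 0.5*mean((X_s @ w - y_s)**2) w.r.t. w."""
    err = X_s @ w - y_s
    return X_s.T @ err / len(y_s)

for step in range(200):
    # Each replica computes a gradient on its own shard...
    grads = [grad(w, X_s, y_s) for X_s, y_s in shards]
    # ...then the gradients are synchronized (averaged) and applied.
    w -= 0.1 * np.mean(grads, axis=0)

print(np.round(w, 2))  # close to true_w = [1, -2, 0.5]
```

Because the shards are equal-sized, averaging the per-replica gradients reproduces the full-batch gradient exactly; with unequal shards a weighted average would be needed.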

TL;DR

Pros of multiple GPUs:

  • Can train multiple networks at once (either copies of the same network or variants), so long-running experiments can continue while new ones start.
  • Possible speed-ups if the network is big enough to be split up, though TensorFlow's support for this is limited.
  • More memory available for very large batches (not clearly necessary).

Cons of multiple GPUs:

  • Adds a lot of complexity.


Misc. Parts

  • Cases: Rosewill 1.0 mm Thickness 4U Rackmount Server Chassis, Black Metal/Steel RSV-L4000[2]
  • DVDRW (Needed?): Asus 24x DVD-RW Serial-ATA Internal OEM Optical Drive DRW-24B1ST [3]
  • Keyboard and Mouse: AmazonBasics Wired Keyboard and Wired Mouse Bundle Pack [4]


Other Builds/Guides

  • How to build a GPU deep learning machine: https://medium.com/@SocraticDatum/getting-started-with-gpu-driven-deep-learning-part-1-building-a-machine-d24a3ed1ab1e
  • Deep Learning Computer Build (useful tips, long): https://www.slideshare.net/PetteriTeikariPhD/deep-learning-workstation
  • Another box: https://www.tooploox.com/blog/deep-learning-with-gpu

Questions to ask:

  • Approx. dataset/batch size
  • Network card?
  • DVD drive?
  • How much RAM/storage needed?
  • Do we need the same motherboard/CPU? There are cheaper/better CPUs; for example, the v4 of the same CPU is the same price.

Single GPU Build

Double GPU Build

PCPartPicker build

Motherboard

  • Should have enough PCIe slots
  • Motherboards: ASUS Z10PE-D16 [5], dual LGA 2011-v3, DDR4, 16 slots, up to 32GB per RDIMM

CPU/Fan

  • CPU choice is not critical, but the CPU is used for data preparation
  • If using multiple GPUs, want at least one core (two threads) per GPU
  • Chips: Intel Haswell Xeon E5-2620 v3, 6 cores @ 2.4 GHz, 6 x 256KB L2 cache, 15MB L3 cache, socket LGA 2011-v3 [6]
  • CPU Fans: Intel Thermal Solution Cooling Fan for E5-2600 Processors BXSTS200C [7]
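The one-core-(two-threads)-per-GPU guideline is easy to check against the chosen parts. A quick sanity check, assuming the E5-2620 v3's standard 6-core/12-thread configuration and the two GPUs in this build:

```python
# Guideline: at least one core (two threads) per GPU.
# E5-2620 v3: 6 cores / 12 threads; this build: 2 GPUs.
cores, threads, gpus = 6, 12, 2

cores_per_gpu = cores // gpus    # 3
threads_per_gpu = threads // gpus  # 6

assert cores_per_gpu >= 1 and threads_per_gpu >= 2
print(cores_per_gpu, threads_per_gpu)  # comfortably above the guideline
```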

GPU

  • 2x GTX 1080 Ti

RAM

  • At least twice as much RAM as total GPU memory (2 GPUs x 11 GB [GTX 1080 Ti size] x 2 = 44 GB by this rule of thumb, so the 32 GB kit below falls slightly short of it)
  • RAM: Crucial - 32GB (2 x 16GB) Registered DDR4-2133 Memory [8]

PSU

  • Rules of thumb vary: some say 1.5x-2x the combined GPU+CPU wattage, others say GPU+CPU+100W
  • PSUs: Corsair RM Series 850 Watt ATX/EPS 80PLUS Gold-Certified Power Supply - CP-9020056-NA RM850 [9]
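The two sizing heuristics give quite different numbers for this build. A quick check, assuming the published TDPs (roughly 250 W per GTX 1080 Ti and 85 W for the E5-2620 v3; treat both as approximations):

```python
# PSU sizing check for 2x GTX 1080 Ti (~250 W TDP each)
# plus one Xeon E5-2620 v3 (~85 W TDP).

gpu_w = 2 * 250
cpu_w = 85
base = gpu_w + cpu_w             # 585 W combined TDP

rule_a = (1.5 * base, 2 * base)  # "1.5x-2x of GPU+CPU"
rule_b = base + 100              # "GPU+CPU+100W"

print(rule_a)  # (877.5, 1170)
print(rule_b)  # 685
```

By these numbers the 850 W unit above satisfies the additive rule but sits below the 1.5x rule's floor, so it is worth double-checking headroom if both GPUs will be loaded at once.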

Storage

  • SSD: Intel Solid-State Drive 750 Series SSDPEDMW400G4R5 PCI-Express 3.0 MLC - 400GB [10] or 800GB [11]
  • HDD: WD Red 3TB NAS Hard Disk Drive [12] - 5400 RPM class, SATA 6 Gb/s, 64MB cache, 3.5-inch
  • Possibly better HDD: Seagate - Barracuda 3TB 3.5" 7200RPM Internal Hard Drive [13]