Now for something a little different...
This build came about because I needed a large amount of storage for work and home data, files and media - storage that would be easily accessible over the network and protected by redundancy and error correction. At first, I considered the typical pre-built NAS systems from Synology, QNAP, Thecus, etc. However, the price premium seemed very high for what you get: basic four-bay bare (no drives) NAS systems start at around USD 300-400 yet come equipped with only dual-core ARM SoC processors and 1GB of RAM. On the upper end, more advanced eight-bay bare NAS systems start around USD 850-900 but with Atom CPUs and 2GB of RAM, and easily reach USD 1500-2000 for beefier Core i3 (or above) models with eight bays or more of storage. Given that a NAS is basically just a decently powerful computer with a large number of storage bays and SATA/SAS connections, at least a couple of Gigabit LAN ports, and a software package to manage the storage, I decided to try building my own.
My goals for this NAS build were quite simple:
- Storage: hold at least six 3.5" drives in a small form factor size competitive with pre-built NAS systems.
- Size: minimize the total volume of the NAS as much as possible, i.e. find the highest ratio of storage bays to total case volume.
- Compute power: have enough CPU power and RAM capacity to enable other applications aside from storage, such as Plex, virtualization, and other services.
- Connectivity: support at least dual Gigabit ethernet connections to ensure there is enough bandwidth for serving and receiving data.
- Reliability: minimize the maintenance required by ensuring that the case allows for ample airflow and cooling of drives and CPU, while also including sufficient dust protection since it will run 24x7.
- Noise: run as quietly as possible since it will be stored in a home environment.
My first objective was to find a suitable PC case. To minimize the size of the NAS, a Mini-ITX case was clearly the preferred choice, since ATX and mATX cases are much larger to allow for four to seven extra PCI slots of expansion - something that would not really be needed here. So I used PCPP's hardware search tools to look for ITX cases with a large number of 3.5" drive bays. It turned out there were only a handful, with volumes and features as below:
- BitFenix Phenom ITX: 30.8L, 6 x 3.5" drive bays
- Fractal Design Node 304: 19.6L, 6 x 3.5" drive bays
- Lian-Li PC-Q25: 20.3L, 7 x 3.5" drive bays
- Lian-Li PC-Q26: 32.3L, 10 x 3.5" drive bays and 1 x 2.5" drive bay
- Lian-Li PC-Q35: 29.2L, 5 x 3.5" drive bays and 2 x 2.5" drive bays
- Silverstone DS380: 21.6L, 8 x 3.5" drive bays, 4 x 2.5" drive bays
(Note: I also considered the Fractal Design Node 804 - an mATX option and a popular choice here. However, at 41L for 8 x 3.5" drive bays / 2 x 2.5" drive bays, it was too large. The 804's wide shape was also not optimal for placing under a desk, which was to be where the system would reside permanently.)
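To make the space-efficiency comparison concrete, here is a small script that ranks the shortlist by 3.5" bays per liter, using the volumes and bay counts from the list above (the mATX Node 804, by comparison, works out to roughly 0.195 bays/L):

```python
# Rank the shortlisted ITX cases by 3.5" drive bays per liter of case volume.
# Volumes and bay counts are taken from the shortlist above.
cases = {
    "BitFenix Phenom ITX":     (30.8, 6),
    "Fractal Design Node 304": (19.6, 6),
    "Lian-Li PC-Q25":          (20.3, 7),
    "Lian-Li PC-Q26":          (32.3, 10),
    "Lian-Li PC-Q35":          (29.2, 5),
    "Silverstone DS380":       (21.6, 8),
}

# Sort by bays-per-liter, densest first.
ranked = sorted(cases.items(), key=lambda kv: kv[1][1] / kv[1][0], reverse=True)
for name, (liters, bays) in ranked:
    print(f"{name:24s} {bays / liters:.3f} bays/L")
```

This puts the DS380 and PC-Q25 at the top on raw density (both eliminated for other reasons), with the PC-Q26 close behind - and makes the Phenom's and PC-Q35's poor ratios obvious at a glance.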
From the shortlist, I eliminated a few options for the following reasons:
- Lian-Li PC-Q35: only five 3.5" drive bays with a relatively large case volume at almost 30L.
- Silverstone DS380: seemed like a promising option at first, but feedback I found online showed users complaining about the cramped internal layout and sub-optimal airflow resulting in drives running hot.
This left the Phenom, Node 304, PC-Q25 and PC-Q26. The 304 and PC-Q25 were quite similar at around 20L with six or seven 3.5" drive bays, while the Phenom's 30L size was more comparable to the PC-Q26 at 32L - but with only six drive bays, the Phenom was clearly less space efficient than the PC-Q26 with its ten. On that basis I eliminated the Phenom, leaving a decision between the Node 304 / PC-Q25 and the PC-Q26.
Studying other users' builds in the Node 304 and PC-Q25, it seemed that building in these cases could be very tight. In the Node 304, the drives (and their accompanying SATA and power connections) sit directly over the motherboard, so any cooling airflow has to pass over both the densely packed drives and cabling and the motherboard - a real challenge in such a small case. The PC-Q25 takes a different approach, with the power supply over the motherboard and the drive bays beside and below it, but its internal layout is also quite densely packed.
Reconsidering the original goals of the build, and given the possible cooling challenges for the Node 304 and PC-Q25, and also given that a 50% increase in case volume would provide an additional four or five 3.5" drive bays and one 2.5" drive bay as well as better cooling potential, I decided to go with the PC-Q26.
As it turns out - the PC-Q26 was actually discontinued by Lian-Li, so it was incredibly difficult to find. A lot of searching online later, and with a bit of luck, I found it available at a reasonable price.
The next consideration: how on earth was I going to find a Mini-ITX motherboard with ten or eleven SATA ports? After a lot of searching on PCPP and elsewhere online, it turned out there were almost none - I would most likely need to add a SATA or SAS controller to the PCI-E slot to get all the SATA ports required for the PC-Q26. Looking at the options with at least six SATA3 ports:
- AsRock C236 WSI: 8 x SATA3 ports through the Intel C236 chipset
- AsRock C2750D4I: 2 x SATA3 / 4 x SATA2 ports through the Intel C2750, 6 x SATA3 ports through two Marvell controllers
- AsRock E3C236D2I: 6 x SATA3 ports through the Intel C236 chipset
- Gigabyte GA-B150N-GSM: 6 x SATA3 ports through the Intel B150 chipset
- Supermicro X10SDV series: 6 x SATA3 ports through the Xeon SOC
Another consideration, aside from SATA ports, was also whether the motherboard options had dual Gigabit LAN and also whether there was any built-in display controller and output support. The last consideration might seem like a strange one - but actually is a great option to enable the NAS to be used as a media playback system or enable it to be partially used as a workstation, while also serving NAS duties. This gives the system more flexibility for future usage scenarios.
Studying the options with lots of (again) online research, I eliminated a few options:
- AsRock C2750D4I: though this board had enough SATA ports for eleven drives and I could still add a video card, the CPU is an Avoton C2750 eight-core which is considerably less powerful than the other options. Various user reviews also indicated that this board's reliability was quite poor - in particular the Marvell SATA controllers.
- AsRock E3C236D2I: an additional SATA/SAS controller would be needed (i.e. no room left to add a video card in the PCI-E slot), and since there are no on-board HDMI or DisplayPort outputs, it would not be possible to connect a modern display.
- Gigabyte GA-B150N-GSM: I seriously considered this one, but it was impossible to find anywhere at a reasonable price.
- Supermicro X10SDV series: relatively expensive, and shares the E3C236D2I's lack of digital display outputs.
This left one option - the AsRock C236 WSI. With eight SATA3 ports, plus HDMI and DisplayPort output when paired with an appropriate Xeon E3 CPU (one with an integrated GPU - some Xeon E3 models lack one), this board would give me enough SATA ports while also letting me connect a display in case the NAS later doubles as a media playback system or partial workstation.
With the main challenges addressed - a suitable case, and an ITX motherboard with enough SATA ports plus an on-board display controller - I was ready to put this NAS system together.
Intel Xeon E3-1245 V5 3.5GHz Quad-Core
Based on the Skylake architecture and comparable to a Core i7-6700, the E3-1245 V5 includes an integrated GPU so that an additional video card isn't needed. (Xeon E3 CPUs ending in "5" include the integrated GPU, for reference.) With four cores (or eight virtual using HyperThreading), there is far more power in this processor than almost any pre-built NAS system from Synology etc.
be quiet! DARK ROCK TF 67.8 CFM Fluid Dynamic Bearing
Taken from my other build and used here, since the PC-Q26's maximum CPU cooler height is 150 mm. I could have used the Intel-supplied Xeon E3 cooler, but I wanted to ensure the CPU and motherboard get enough cooling while running quietly. From use in my previous system, I know this is a great CPU cooler.
ASRock C236 WSI Mini ITX LGA1151
As explained above, the only ITX motherboard option with an on-board display controller and outputs that also includes eight SATA3 ports. A relatively good quality board from AsRock, with sufficient space around the CPU socket to support the Dark Rock TF cooler. Accessing the ports and headers is quite difficult with the CPU cooler installed, so all cables need to be connected beforehand. With two DDR4 DIMM slots, the board supports a maximum of 32GB of ECC RAM, and it has dual Intel Gigabit LAN for great connectivity.
Kingston KVR21E15D8/16 32GB (2 x 16GB) DDR4-2133 ECC
To ensure there is enough RAM for the NAS operating system, caching, and any applications or virtual machines, I maxed out the RAM at 32GB of ECC.
Crucial MX300 750GB 2.5" SSD
Something I picked up on a Black Friday sale, it will be used for caching and any application data. It sits at the very bottom of the case below the 3.5" drive cage.
Seagate IronWolf Pro 10TB 3.5" 7200RPM x 10
After studying the cost per TB of various disk sizes up to the then-current maximum of 10TB, I found the cost-per-TB differential between the largest disks and smaller ones was not too large. I also wanted to avoid the waste of swapping out existing disks for bigger ones later due to insufficient capacity - buying smaller disks (e.g. 4TB or 6TB per disk) now, running out of space, and then buying a full set of larger disks would likely cost more in total. So I decided to buy the largest disks available and not have to change them later.
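The buy-big-once arithmetic can be sketched in a few lines. The per-drive prices below are hypothetical placeholders for illustration (actual prices aren't listed here); only the logic mirrors the reasoning:

```python
# Rough total-cost comparison behind "buy the biggest drives once".
# PRICE_6TB and PRICE_10TB are assumed placeholder prices (USD), not real quotes.
N_BAYS = 10
PRICE_6TB, PRICE_10TB = 250.0, 420.0

# Option A: fill all bays with 10TB drives up front.
buy_big_once = N_BAYS * PRICE_10TB

# Option B: fill with 6TB drives now, then replace the whole set with
# 10TB drives after running out of space.
start_small = N_BAYS * PRICE_6TB + N_BAYS * PRICE_10TB

print(f"10TB up front:        {buy_big_once:.0f}")
print(f"6TB now, 10TB later:  {start_small:.0f}")
```

Under any plausible prices, option B ends up paying for the small drives on top of the big ones, which is why starting with the largest disks wins on total cost.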
For NAS drives that are to run 24x7, there are only a few options at the larger disk sizes - WD's Red series which maxes out at 8TB per disk, or Seagate's new IronWolf and IronWolf Pro series which max out at 10TB per disk. (There were also WD Gold and Seagate Enterprise options at 10TB, but these were 50-100% more expensive than the Red and IronWolf options - so I did not consider these.) When considering the cost per TB, the IronWolf series was more competitive than the WD Reds.
For the IronWolf series there are two options - IronWolf and IronWolf Pro. The former comes with a 3-year warranty, while the latter comes with a 5-year warranty and a data recovery service for the first two years. The IronWolf Pro is rated for drive arrays of up to 16 bays, while the plain IronWolf is rated for 1-8 bay arrays. The Pro carries about a 13% premium, but given the much longer warranty, the data recovery service, and the support for a 10-bay array, I went with the IronWolf Pro since I will use this server for as long as possible (i.e. until it dies).
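The premium looks even better when spread over the warranty period. The base price below is a hypothetical placeholder; only the ~13% premium and the 3- vs 5-year warranty terms come from the comparison above:

```python
# Cost per warranty-year: IronWolf (3yr) vs IronWolf Pro (5yr, ~13% premium).
# PRICE_IRONWOLF is an assumed placeholder price (USD) for illustration only.
PRICE_IRONWOLF = 400.0
PREMIUM = 0.13

price_pro = PRICE_IRONWOLF * (1 + PREMIUM)

per_year_base = PRICE_IRONWOLF / 3  # 3-year warranty
per_year_pro = price_pro / 5        # 5-year warranty

print(f"IronWolf:     {per_year_base:.2f} per warranty-year")
print(f"IronWolf Pro: {per_year_pro:.2f} per warranty-year")
```

A 13% higher price over 67% more warranty time means the Pro is actually cheaper per covered year - before even counting the data recovery service.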
Overall, this IronWolf Pro 10TB is a great drive - runs very cool (as it is a sealed helium design) and runs fast. Acoustics are quiet during idle, but somewhat louder during random accesses - typical for spinning disk drives. Interestingly - the startup and power down of ten of these disks in one system sounds like a jet engine powering up or spinning down.
Lian-Li PC-Q26B Mini ITX Tower
This is a very well designed case which works well for this build. Some considerations:
- Airflow is designed to move from front to back. Therefore, I have the three front fans as intakes, with the rear top fan acting as exhaust (which will hopefully also help reduce dust build-up on the dust filter at that fan location).
- The dust filters on the front sides of the case for the three intake fans are easily accessible after removing the side panels. The dust filter for the top exhaust fan is much more difficult to remove, however (it requires removing the fan itself).
- The quality of the build is impeccable - I did not find any issues and the aluminum and black powder coating is beautiful.
- Installing and removing drives is super easy. The case comes with screws for each drive which allow you to slide the drives in and out directly into the SATA backplane. Then the drives are all secured using a locking mechanism on the drive cage.
- Removing the side panels is very easy, and makes the internals very easily accessible. One only needs to take care on the back side panel to ensure the cables are stowed appropriately so the back panel will fit back on properly.
- The space for the power supply is quite short due to the drive cage, so as short a PSU as possible is recommended, since some room is needed for the cables coming out of the PSU.
- The case only comes with one BP2SATA dual SATA backplane. Therefore, I had to get four extra ones (quite difficult to find).
- Wiring the cables can get pretty tight, but the cable channel in the middle back of the case helps a lot.
Corsair RMx 650W 80+ Gold Certified Fully-Modular ATX
A 160 mm long PSU with just the right number of SATA and Molex connectors to wire everything up. It runs quietly, as the fan does not spin up until a certain load level is reached (a level not reached during the stress testing I did later).
NoiseBlocker NB-eLoop B12-PS 58.1 CFM 120mm x 4
Having used these very efficient (in terms of airflow vs noise) fans in my other build, I found these to be a perfect fit to keep this system cool and run quietly.
LSI LOGIC SAS 9207-8i Storage Controller LSI00301
A very good SAS controller which supports up to eight SATA drives directly through its two SAS ports. I chose it because it supports full SATA3 speeds. I also deliberately avoided a SAS RAID controller, as it is preferable to set up software RAID through FreeNAS or unRAID, which support more advanced RAID technologies and file systems like ZFS.
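To give a feel for what software RAID costs in capacity, here is a rough sketch for the ten-drive array under ZFS RAIDZ layouts (the pool layout is an assumption here - the NAS software hadn't been decided yet - and this simple formula ignores ZFS metadata/padding overhead and TB-vs-TiB conversion):

```python
# Approximate usable capacity of a ZFS RAIDZ vdev:
# (number of drives - number of parity drives) * drive size.
def raidz_usable_tb(n_drives: int, drive_tb: float, parity: int) -> float:
    return (n_drives - parity) * drive_tb

# Ten 10TB drives under each common RAIDZ level.
for parity, name in [(1, "RAIDZ1"), (2, "RAIDZ2"), (3, "RAIDZ3")]:
    usable = raidz_usable_tb(10, 10, parity)
    print(f"{name}: ~{usable:.0f} TB usable of 100 TB raw")
```

So, for example, RAIDZ2 on this array would give roughly 80 TB usable while tolerating two simultaneous drive failures.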
Lian-Li Accessory BP2SATA 2Bay to HDD SATA Hotswap Backplane Black Retail x 4
Strictly speaking I didn't need these SATA backplanes, but using them reduced internal cabling because each backplane powers two drives from a single SATA or Molex power connector. They also make swapping drives much easier when needed, without having to mess with any wiring - a big help, as it can get very confusing which SATA cable goes to which drive.
SilverStone SST-CP11B-300; Ultra slim SATA 6G 300mm Cable, black x 7
As space in the case is quite tight, I used these ultra slim SATA cables to wire up six 3.5" hard drives and one 2.5" SSD. These SATA cables are incredibly slim and flexible, so they worked very well.
SilverStone SST-CPS03-RE 36 Pin 0.5 m Mini SAS SFF-8087 to SATA 7 Pin Cable
This cable was used to connect one of the SAS ports on the SAS 9207-8i SAS controller to four 3.5" SATA hard drives, making a total of ten 3.5" hard drives connected to the system.
SilverStone SST-SA011 Mini SAS SFF8087 to SFF8088 Adapter
SilverStone SST-CPS02; shielded Mini-SAS 36pin to 36pin cable
For future expandability, this Mini-SAS cable was connected to an external SAS connector. This will allow me to add extra drives using an external SAS drive enclosure and expand the storage capacity of the NAS system at full SATA speeds.
Overall, this NAS build turned out quite well, though there were a few challenges along the way. The case selection required careful consideration, as some cases which initially seemed appealing turned out to have shortfalls which would cause cooling or reliability issues later. The motherboard had to be selected so that every drive the case could hold would be supported, and being able to choose a board with on-board digital video support was a nice bonus. Fortunately, I was able to work through those decision points and maximize the potential of the system. The other challenge was the build itself - despite the case not being too cramped, cable routing and management still needed careful planning to avoid obstructing airflow and cooling. Lastly, the size of the finished system is quite good and it sits well under the back of my desk. For size comparison, there is a photo in the gallery of the NAS box next to a standard PS4 console.
I have been putting this build through its paces - first by running drive integrity and burn-in tests to ensure the IronWolf Pro drives are functioning at 100% without any errors or bad sectors. To do this I used a combination of the smartctl and badblocks tools built into FreeNAS. Some photos of the burn-in statistics are shown in the photo gallery. The burn-in tests were run twice and thankfully completed without errors (though each round took a week). During the testing, I noted that at around 25-27 C ambient, the drives ran at a maximum of 42-44 C - an excellent result. To ensure that the drives run cool at all times, I have set all of the fans to run at full speed at 1200 rpm. Noise from the fans is negligible; most noise comes from the disks themselves during random accesses.
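For anyone wanting to script temperature monitoring across ten drives rather than eyeballing it, the readings can be pulled out of `smartctl -A` output with a minimal parser like the one below (the sample line is a typical SMART attribute-table row for illustration, not output captured from my drives; real output varies by drive model):

```python
# Extract the drive temperature from `smartctl -A` attribute-table output.
# The raw value of attribute 194 (Temperature_Celsius) is the last field.
sample = "194 Temperature_Celsius 0x0022 119 099 000 Old_age Always - 42\n"

def drive_temp_celsius(smart_output):
    """Return the Temperature_Celsius raw value, or None if not present."""
    for line in smart_output.splitlines():
        if "Temperature_Celsius" in line:
            return int(line.split()[-1])
    return None

print(drive_temp_celsius(sample))
```

Feeding each drive's `smartctl -A /dev/adaN` output through this in a loop would make it easy to log all ten drive temperatures over time.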
I have not yet decided what kind of software to use on this NAS system. There are a couple of contenders, namely FreeNAS and unRAID. I am also considering using Ubuntu which would give me more flexibility in how I can utilize the NAS - i.e. not only as a storage service, but also for running applications such as media playback services or virtualization or as a part-time workstation. As it will take me some time to determine the most appropriate software setup for this NAS system, I shall report back later with any notes.
If you made it this far - thank you for reading! I hope you enjoyed reading it as much as I enjoyed the process of researching and building it.