

The 100TB NAS

by hklenny


Part List


Date Published

Feb. 3, 2017

CPU Clock Rate

3.5 GHz


Now for something a little different...

This build came about because I needed a large quantity of storage for work and home data, files, and media that would be easily network accessible and protected by redundancy and error correction. At first, I considered the typical pre-built NAS systems from Synology, QNAP, Thecus, etc. However, I found the price premium very high for what you get: basic four-bay bare (no drives) NAS systems start at around USD 300-400 yet come equipped with only dual-core ARM SoC processors and 1GB of RAM. On the upper end, more advanced eight-bay bare NAS systems start around USD 850-900 but with Atom CPUs and 2GB of RAM, easily reaching USD 1500-2000 for beefier Core i3 (or above) models with eight or more bays of storage. Given that a NAS is basically just a decently powerful computer with a large number of storage bays and SATA/SAS connections, at least a couple of Gigabit LAN ports, and a software package to manage the storage, I decided to try to build my own.

My goals for this NAS build were quite simple:

  • Storage: hold at least six 3.5" drives in a small form factor size competitive with pre-built NAS systems.
  • Size: minimize the total volume of the NAS as much as possible, i.e. find the highest ratio of storage bays to total case volume.
  • Compute power: have enough CPU power and RAM capacity to enable other applications aside from storage, such as Plex, virtualization, and other services.
  • Connectivity: support at least dual Gigabit ethernet connections to ensure there is enough bandwidth for serving and receiving data.
  • Reliability: minimize the maintenance required by ensuring that the case allows for ample airflow and cooling of drives and CPU, while also including sufficient dust protection since it will run 24x7.
  • Noise: run as quietly as possible since it will be stored in a home environment.

My first objective was to search for suitable PC cases. In order to minimize the size of the NAS system, clearly a Mini-ITX based case would be the preferred choice, since ATX and mATX cases would be much larger as they allow for four to seven extra PCI slots for expansion - something which would not really be needed here. So I used PCPP's hardware search tools and set out to search for ITX cases with a large number of 3.5" drive bays. It turned out there were only a handful, with volumes and features as below:

  • BitFenix Phenom ITX: 30.8L, 6 x 3.5" drive bays
  • Fractal Design Node 304: 19.6L, 6 x 3.5" drive bays
  • Lian-Li PC-Q25: 20.3L, 7 x 3.5" drive bays
  • Lian-Li PC-Q26: 32.3L, 10 x 3.5" drive bays and 1 x 2.5" drive bay
  • Lian-Li PC-Q35: 29.2L, 5 x 3.5" drive bays and 2 x 2.5" drive bays
  • Silverstone DS380: 21.6L, 8 x 3.5" drive bays, 4 x 2.5" drive bays
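As a quick sanity check on the size-efficiency goal, the bays-per-litre ratio for each case can be computed directly from the figures listed above (a small awk sketch; all numbers are the ones in the list):

```shell
# Bays-per-litre for each shortlisted case (name, volume in litres, 3.5" bays)
awk -F, '{ printf "%-24s %.3f bays/L\n", $1, $3 / $2 }' <<'EOF'
BitFenix Phenom ITX,30.8,6
Fractal Design Node 304,19.6,6
Lian-Li PC-Q25,20.3,7
Lian-Li PC-Q26,32.3,10
Lian-Li PC-Q35,29.2,5
Silverstone DS380,21.6,8
EOF
```

The DS380 and PC-Q25 lead on raw density, with the PC-Q26 close behind despite its larger absolute size, while the Phenom and PC-Q35 trail well behind - which lines up with the eliminations that follow.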

(Note: I also considered the Fractal Design Node 804 - an mATX option and a popular choice here. However, at 41L for 8 x 3.5" drive bays / 2 x 2.5" drive bays, it was too large. The 804's wide shape was also not optimal for placing under a desk, which was to be where the system would reside permanently.)

From the shortlist, I eliminated a few options for the following reasons:

  • Lian-Li PC-Q35: only five 3.5" drive bays with a relatively large case volume at almost 30L.
  • Silverstone DS380: seemed like a promising option at first, but feedback I found online showed users complaining about the cramped internal layout and sub-optimal airflow resulting in drives running hot.

This left the Phenom, Node 304, PC-Q25 and PC-Q26. The 304 and PC-Q25 were quite similar at around 20L with six/seven 3.5" drives, while the Phenom's 30L size was more comparable to the PC-Q26's 32L; but with only six drive bays, the Phenom was clearly less space-efficient than the PC-Q26 with its ten. So on this basis I eliminated the Phenom, leaving a decision between the Node 304 / PC-Q25 and the PC-Q26.

Studying other users' builds in the Node 304 and PC-Q25, it seemed that building in these cases could be very tight. In the Node 304, the drives (and their accompanying SATA and power connections) sit directly over the motherboard, so any cooling air has to pass over both the densely packed drives and cabling and the motherboard - which could be very challenging in this particular case. The PC-Q25, on the other hand, places the power supply over the motherboard and the drive bays beside and below it; its internal layout is also quite densely packed.

Reconsidering the original goals of the build, and given the possible cooling challenges for the Node 304 and PC-Q25, and also given that a 50% increase in case volume would provide an additional four or five 3.5" drive bays and one 2.5" drive bay as well as better cooling potential, I decided to go with the PC-Q26.

As it turns out - the PC-Q26 was actually discontinued by Lian-Li, so it was incredibly difficult to find. A lot of searching online later, and with a bit of luck, I found it available at a reasonable price.

The next consideration was - how on earth was I going to find a Mini-ITX motherboard that has ten or eleven SATA ports? A lot of searching on PCPP and online later, it turns out there were almost none - I was most likely going to need to add a SATA or SAS controller to the PCI-E slot in order to get all the SATA ports I needed for the PC-Q26. Looking at the options with at least six SATA3 ports:

  • AsRock C236 WSI: 8 x SATA3 ports through the Intel C236 chipset
  • AsRock C2750D4I: 2 x SATA3 / 4 x SATA2 ports through the Intel C2750, 6 x SATA3 ports through two Marvell controllers
  • AsRock E3C236D2I: 6 x SATA3 ports through the Intel C236 chipset
  • Gigabyte GA-B150N-GSM: 6 x SATA3 ports through the Intel B150 chipset
  • Supermicro X10SDV series: 6 x SATA3 ports through the Xeon SOC

Another consideration, aside from SATA ports, was whether the motherboard options had dual Gigabit LAN and whether there was any built-in display controller and output support. The last consideration might seem like a strange one - but it is actually a great option, enabling the NAS to double as a media playback system or be partially used as a workstation while also serving NAS duties. This gives the system more flexibility for future usage scenarios.

Studying the options with lots of (again) online research, I eliminated a few options:

  • AsRock C2750D4I: though this board had enough SATA ports for eleven drives and I could add a video card, its CPU is an Avoton C2750 eight-core, which is considerably underpowered compared to the other options. Various user reviews I read also indicated that this board's reliability was quite poor - in particular the Marvell SATA controllers.
  • AsRock E3C236D2I: an additional SATA/SAS controller would be needed (i.e. no room left to add a video card to the PCI-E slot), and since there are no on-board HDMI or DisplayPort outputs, it would not be possible to connect a modern display to it.
  • Gigabyte GA-B150N-GSM: I almost considered using this, but it was impossible to find anywhere for a reasonable price.
  • Supermicro X10SDV series: relatively expensive, and also had the same issue as the E3C236D2I with the digital display options.

This left one option - the AsRock C236 WSI. With eight SATA3 ports, HDMI and DisplayPort output when using the appropriate Xeon E3 CPU (which needs to include an integrated GPU - some Xeon E3 models do not have one integrated), this option would give me enough SATA ports yet also give me the option to connect a display to the system in case the NAS would serve an additional purpose as a media playback system or partial workstation.

With the main challenges resolved - selecting a suitable case and finding an ITX motherboard with enough SATA ports and an on-board display controller - I was ready to put together this NAS system.

Intel Xeon E3-1245 V5 3.5GHz Quad-Core

Based on the Skylake architecture and comparable to a Core i7-6700, the E3-1245 V5 includes an integrated GPU so that an additional video card isn't needed. (Xeon E3 CPUs ending in "5" include the integrated GPU, for reference.) With four cores (or eight virtual using HyperThreading), there is far more power in this processor than almost any pre-built NAS system from Synology etc.

be quiet! DARK ROCK TF 67.8 CFM Fluid Dynamic Bearing

Taken from my other build and used here, since the PC-Q26's maximum CPU cooler height is 150 mm. I could have used the Intel-supplied Xeon E3 CPU cooler, but I wanted to ensure the CPU and motherboard get enough cooling while running quietly. From use in my previous system, this is a great CPU cooler.

ASRock C236 WSI Mini ITX LGA1151

As explained above, this was the only ITX motherboard option with an on-board display controller and output support while also including eight SATA3 ports. A relatively good quality motherboard from AsRock, with sufficient space around the CPU socket to support the Dark Rock TF cooler. Accessing all of the ports and connections is quite difficult with the CPU cooler installed, so they need to be connected beforehand. With two DDR4 DIMM slots, this board supports a maximum of 32GB of ECC RAM. It also has dual Intel Gigabit LAN for great connectivity.

Kingston KVR21E15D8/16 32GB (2 x 16GB) DDR4-2133 ECC

To ensure there is enough RAM for the NAS operating system, caching, and any applications or virtual machines, I maxed out the RAM at 32GB of ECC.

Crucial MX300 750GB 2.5" SSD

Something I picked up on a Black Friday sale, it will be used for caching and any application data. It sits at the very bottom of the case below the 3.5" drive cage.

Seagate IronWolf Pro 10TB 3.5" 7200RPM x 10

After studying the cost per TB of various disk sizes up to the then-current maximum of 10TB, I found that the cost-per-TB differential between the largest and smaller disks was not too large. I also wanted to avoid the waste of swapping out existing disks for new ones later due to insufficient capacity. So I decided to buy the largest disks available for this system: starting with smaller disks (e.g. 4TB or 6TB per disk), running out of space, and then replacing them all with larger disks would probably be more expensive in terms of total cost.

For NAS drives that are to run 24x7, there are only a few options at the larger disk sizes - WD's Red series which maxes out at 8TB per disk, or Seagate's new IronWolf and IronWolf Pro series which max out at 10TB per disk. (There were also WD Gold and Seagate Enterprise options at 10TB, but these were 50-100% more expensive than the Red and IronWolf options - so I did not consider these.) When considering the cost per TB, the IronWolf series was more competitive than the WD Reds.

For the IronWolf series there are two options - IronWolf and IronWolf Pro. The former comes with a 3-year warranty, while the latter comes with a 5-year warranty plus a data recovery service for the first two years. The IronWolf Pro is rated for drive arrays of up to 16 bays, while the plain IronWolf is rated for only 1-8 bays. The Pro carries about a 13% premium over the IronWolf, but given the much longer warranty, the data recovery service, and the support for a 10-bay drive array, I went with the IronWolf Pro, since I will use this server for as long as possible (i.e. until it dies).
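To make the premium concrete, here is a back-of-the-envelope sketch. The base drive price is hypothetical (illustrative only - not what I paid); the ~13% premium and the 3-year vs 5-year warranty terms are as described above:

```shell
# Hypothetical $420 IronWolf 10TB base price (illustrative); the ~13% Pro
# premium and the warranty lengths are from the comparison above.
awk 'BEGIN {
  iw  = 420          # assumed IronWolf 10TB price (hypothetical)
  pro = iw * 1.13    # IronWolf Pro at a 13% premium
  printf "IronWolf:     $%.2f over 3 yr -> $%.2f per warranty-year\n", iw,  iw  / 3
  printf "IronWolf Pro: $%.2f over 5 yr -> $%.2f per warranty-year\n", pro, pro / 5
}'
```

On a per-warranty-year basis the Pro actually comes out cheaper, before even counting the data recovery service.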

Overall, this IronWolf Pro 10TB is a great drive - runs very cool (as it is a sealed helium design) and runs fast. Acoustics are quiet during idle, but somewhat louder during random accesses - typical for spinning disk drives. Interestingly - the startup and power down of ten of these disks in one system sounds like a jet engine powering up or spinning down.

Lian-Li PC-Q26B Mini ITX Tower

This is a very well designed case which works well for this build. Some considerations:

  • Airflow is designed to move from front to back. Therefore, I have the front three fans as intakes, while the rear top fan acts as exhaust (which will also hopefully help reduce dust build-up on the dust filter at that location).
  • The dust filters on the front sides of the case for the three intake fans are easily accessible after removing the side panels. The dust filter for the top exhaust fan is much more difficult to remove, however (it requires removing the fan itself).
  • The quality of the build is impeccable - I did not find any issues and the aluminum and black powder coating is beautiful.
  • Installing and removing drives is super easy. The case comes with screws for each drive which allow you to slide the drives in and out directly into the SATA backplane. Then the drives are all secured using a locking mechanism on the drive cage.
  • Removing the side panels is very easy, and makes the internals very easily accessible. One only needs to take care on the back side panel to ensure the cables are stowed appropriately so the back panel will fit back on properly.
  • The space for the power supply is quite short due to the drive cage. Therefore, as short a PSU as possible is recommended, since some space is needed for the cables coming out of the PSU.
  • The case only comes with one BP2SATA dual SATA backplane. Therefore, I had to get four extra ones (quite difficult to find).
  • Wiring the cables can get pretty tight, but the cable channel in the middle back of the case helps a lot.

Corsair RMx 650W 80+ Gold Certified Fully-Modular ATX

A 160 mm long PSU which had just the right number of SATA and Molex connectors to wire everything up. Runs quiet as the fan does not spin up until a certain load level is reached (which was not reached during stress testing I did later).

NoiseBlocker NB-eLoop B12-PS 58.1 CFM 120mm x 4

Having used these very efficient (in terms of airflow vs noise) fans in my other build, I found these to be a perfect fit to keep this system cool and run quietly.

LSI LOGIC SAS 9207-8i Storage Controller LSI00301

A very good SAS controller which can support up to eight SATA drives directly through its two SAS ports. I chose it because it supports full SATA3 speeds. I also deliberately avoided a SAS RAID controller, as it is preferable to set up software RAID through FreeNAS or unRAID, which support more advanced RAID technologies and file systems like ZFS.
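As a sketch of what that software-RAID setup can look like under ZFS (device names ada0-ada9 and the pool name "tank" are assumptions - adjust to your system; FreeNAS exposes the same operation through its web UI):

```shell
# Create a RAID-Z2 pool across the ten drives: any two drives
# can fail simultaneously without data loss.
zpool create tank raidz2 ada0 ada1 ada2 ada3 ada4 \
                         ada5 ada6 ada7 ada8 ada9

# Verify the pool layout and health
zpool status tank
```

RAID-Z2's double-parity is a common choice at ten drives wide, trading two drives' worth of capacity for protection during the long rebuild times that 10TB disks imply.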

Lian-Li Accessory BP2SATA 2Bay to HDD SATA Hotswap Backplane Black Retail x 4

Though I didn't strictly need these SATA backplanes, using them let me reduce internal cabling, because each backplane can power two drives from a single SATA or Molex power connector. They also make swapping drives much easier when needed, without having to mess with any wiring - a big help, as it can get very confusing which SATA cable goes to which drive.

SilverStone SST-CP11B-300; Ultra slim SATA 6G 300mm Cable, black x 7

As space in the case is quite tight, I used these ultra slim SATA cables to wire up six 3.5" hard drives and one 2.5" SSD. These SATA cables are incredibly slim and flexible, so they worked very well.

SilverStone SST-CPS03-RE 36 Pin 0.5 m Mini SAS SFF-8087 to SATA 7 Pin Cable

This cable was used to connect one of the SAS ports on the SAS 9207-8i SAS controller to four 3.5" SATA hard drives, making a total of ten 3.5" hard drives connected to the system.

SilverStone SST-SA011 Mini SAS SFF8087 to SFF8088 Adapter

SilverStone SST-CPS02; shielded Mini-SAS 36pin to 36pin cable

For future expandability, this Mini-SAS cable was connected to an external SAS connector. This will allow me to add extra drives using an external SAS drive enclosure and expand the storage capacity of the NAS system at full SATA speeds.

Wrap Up

Overall, this NAS build turned out quite well, though there were a few challenges along the way. The case selection required careful consideration, as some cases which initially seemed appealing turned out to have shortfalls that would cause cooling or reliability issues later. The motherboard needed to be selected appropriately to ensure all the drives that could be installed would be supported, and having the option of a board with on-board digital video support was a nice bonus. Fortunately, I was able to work through those decision points and maximize the potential of the system. The other challenge was during the build itself - despite the case not being too cramped, cable routing and management still needed to be carefully planned in order to reduce obstruction to proper airflow and cooling. Lastly, the size of the finished system is quite good and it sits well under the back of my desk. For size comparison, there is a photo in the gallery of the NAS box next to a standard PS4 console.

I have been putting this build through its paces - first by running drive integrity and burn-in tests to ensure the IronWolf Pro drives are functioning at 100% without any errors or bad sectors. To do this I used a combination of the smartctl and badblocks tools built into FreeNAS. Some photos of the burn-in statistics are shown in the photo gallery. The burn-in tests were run twice and thankfully completed without errors (though each round took one week). During the testing, I noted that at around 25-27 C ambient, the drives ran at a maximum of 42-44 C - an excellent result. To ensure that the drives run cool at all times, I have set all of the fans to run at their full speed of 1200 rpm. Noise from the fans is negligible; most noise comes from the disks themselves when they perform random accesses.
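The burn-in steps above can be sketched roughly as follows for one drive (the device name ada0 is an assumption; note that badblocks -w is DESTRUCTIVE and must only be run on drives holding no data):

```shell
# SMART extended self-test (runs inside the drive's firmware, non-destructive)
smartctl -t long /dev/ada0

# Destructive write/read surface scan: four patterns across the whole disk.
# This full-surface pass is what makes each round take on the order of a
# week on a 10TB drive.
badblocks -ws -b 4096 /dev/ada0

# Afterwards, check the SMART attributes that matter for early failure
smartctl -A /dev/ada0 | grep -iE 'reallocated|pending|uncorrect'
```

Running the sequence twice, as described above, catches marginal sectors that only fail on a repeat pass.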

I have not yet decided what kind of software to use on this NAS system. There are a couple of contenders, namely FreeNAS and unRAID. I am also considering using Ubuntu which would give me more flexibility in how I can utilize the NAS - i.e. not only as a storage service, but also for running applications such as media playback services or virtualization or as a part-time workstation. As it will take me some time to determine the most appropriate software setup for this NAS system, I shall report back later with any notes.

If you made it this far - thank you for reading! I hope you enjoyed reading it as much as I enjoyed the process of researching and building it.

Comments

eli_harper13 6 Builds 8 points 26 months ago

Wow this is amazing. Has to be one of the best thought out and researched builds I've seen!

hklenny submitter 2 Builds 1 point 26 months ago

Thank you sir!

FL350 1 Build 4 points 26 months ago

Nice to see a different kind of computer on PCPP! Your write up is very detailed and I appreciate you taking the time to explain your choices for those of us who might consider doing something similar.

+1 for a well though out and executed build.

hklenny submitter 2 Builds 3 points 26 months ago

Thank you so much! Glad you enjoyed reading it, as it took me ages to write up! :)

PigWithAMustache 2 Builds 4 points 26 months ago

I think you've successfully beat a Linus server killer. +100.

hklenny submitter 2 Builds 2 points 26 months ago

Thank you sir! :D

80-wattHamster 1 Build 2 points 26 months ago

This is wonderfully executed; top marks. Reminds me a bit of my days filling bays with 10,000 RPM Ultra 160 drives and messing around with RAID5. Of course, I only had four...

hklenny submitter 2 Builds 1 point 26 months ago

Thanks! I do remember those 10k rpm drives from back in the day and had always wondered about them. Interesting that they never really went mainstream - most likely due to SSDs.

80-wattHamster 1 Build 1 point 26 months ago

The price hike over a 7200RPM drive without a corresponding performance benefit made 10k units dead in the water for end users long before SSDs became commonplace, in addition to being loud and hot. I was using 36 GB drives that cost used what a secondhand 80+ GB (IIRC) would set you back in ATA. My failure rate was pretty high, too. Again, these were used drives, but there are probably only 3 or 4 left operational of the 8 I bought over the course of the project/experiment. Also, not everyone can stumble across an Ultra 160 SCSI controller for cheap, which is what enabled this in the first place. WD's Raptor is probably the only IDE/SATA 10k drive anyone's ever made.

hklenny submitter 2 Builds 1 point 26 months ago

What you said certainly makes sense. Interesting little experiment you did!

saddlepiggy@gmail.com 1 Build 2 points 26 months ago

my build is worth less than 2 of you HDDs XD

saddlepiggy@gmail.com 1 Build 1 point 26 months ago

woops, actually 1 XD

fn230 9 Builds 2 points 26 months ago

Oh, this is wonderful. I love how we're seeing more non-gaming computers nowadays.

hklenny submitter 2 Builds 4 points 26 months ago

Second that thought - gaming computers are cool, but it's nice to see how computer builds can be used for other special purposes. Thank you very much!

comanderbham 1 Build 1 point 26 months ago

This was incredibly well thought out and executed. I found myself reading the entire summary because I found it so insightful and interesting. +1 man this is awesome!

hklenny submitter 2 Builds 2 points 26 months ago

Thank you so much! It took me quite a while to write it up - so I'm very glad you enjoyed reading it. :)

Sockens 1 point 26 months ago

I.... I don't.... How...? Money... I can't even begin to comprehend this. SO MUCH NAS. SO LITTLE TIME.

hklenny submitter 2 Builds 1 point 26 months ago

Hopefully this investment will save me a bunch of time in the end! :D

Sockens 1 point 26 months ago

That is the intention. Good luck trying to fill this thing up. My total software downloading on my past THREE computers hasn't exceeded 1.5 terabytes.

hklenny submitter 2 Builds 2 points 26 months ago

Thanks. This NAS will be used for storing a lot of photos and 4K videos shot by a DSLR, so I will certainly put this system to good use.

girlshoot947 1 point 26 months ago

just just just NASA would be proud of you now you just need a bigger case with duel titans really pop the party

hklenny submitter 2 Builds 1 point 26 months ago

Lol. That would be awesome, though I probably don't need that much compute power... :)

Dalton334 1 point 26 months ago

Jesus Christ.

Wolfemane 7 Builds 1 point 26 months ago

Very nice build!!! This is almost exactly what my wife and I are planning. I've been on the fence about the IronWolf Pros. Over the last five years we've had a total of 11 failed seagate drives, and dealing with their customer service and warranty has been an utter nightmare.

The board/CPU you are using is also on our short list, along with the SUPERMICRO MBD-X10SDV-TLN4F-O Mini ITX server motherboard with the embedded Xeon D-1541 processor. Only 6 SATA ports though, so I'd have to get an expansion card to go above 5 HDDs. It's a trade-off; still undecided.

Great write up though. You've brought some new ideas that I hadn't considered.


hklenny submitter 2 Builds 1 point 26 months ago

Thanks! Yes, I had done a bit of research about WD vs Seagate. There are some who swear by WD because of bad experiences with Seagate, yet there are others who haven't had any problems with Seagate. As it's impossible to take exact statistics based on reviews left on Amazon etc, it seems that it could go either way. Most of the PCs I've used in my life have used Seagate drives, and I have never had one fail on me, so I decided based on cost per TB and warranty duration (here's to hoping that their customer service has improved since your experiences) to go with Seagate and the IronWolf Pros. The other consideration was that WD does not have a 10TB Red model available yet, and the WD Gold 10TB was 50% more expensive than Seagate.

The SuperMicro X10SDV series boards are quite good if you want a pure headless server setup, as they have an IPMI controller built in for full remote management. They can also support much more RAM (up to 64GB of unregistered ECC memory across four DIMMs) if you need it. However, I didn't think I would need more than 32GB of RAM and also valued having a digital video output, hence the AsRock C236 WSI.

Which case are you thinking of using, by the way?

Wolfemane 7 Builds 1 point 26 months ago

I'm pretty sure we are going to go with the Seagate helium drives. It's just hard to convince my wife they are the way to go. It's mainly been her drives that have croaked, and restoring takes WEEEEEEKS. Hence why we have decided to build a central server and storage system (we have a decent FreeNAS system, but it's much too small for our fast-growing needs).

We haven't decided on a case yet to be honest. My wife is adamant about moving all the computers out of reach of our two boys. We've both thought long and hard on building a secure server room in our garage and moving all computer hardware into server cases and rack mounting them. This was the primary reason for going with a pure headless board (and I love SoC mITX boards). If we go this route I'll probably build in a 2u case. If we don't decide to go that route... well... we really haven't looked... until I saw your post. Such perfect timing. You have a very elegant solution with an mitx board (which I really really really want).

hklenny submitter 2 Builds 2 points 26 months ago

I think helium drives are the way to go - they run cooler and quieter than regular drives. I think the reliability of them may actually also be better since the internal atmosphere is separated from its external environment (aside from temperature). But only time will tell.

I know you are interested in the SuperMicro X10SDV series and did you know that SuperMicro also has a chassis for it?

SuperServer 5028D-TN4T

It is a very small form factor, but unfortunately only has four 3.5" drive bays (plus two 2.5" ones). I'm not sure if it has enough drive bays for you (you could use more of them, though your ratio of CPU to disks would be higher than with a bigger server that has more drive bays), but it could be an option if you don't go rack mounted and want to save space. If you want to learn more, there is a guy who did some detailed reviews of them if you just search YouTube.

Regardless, good luck with your build research and decision making. Feel free to drop me a line if you'd like someone else to bounce ideas off of...as I'm interested to hear what you end up with!

ImperiousBattlestar 1 point 26 months ago

100TB ain't overkill, is it? +1 :)

ImperiousBattlestar 1 point 26 months ago

What is it going to be used for, a whole office's worth of data? :)

hklenny submitter 2 Builds 2 points 26 months ago

Yup, plus a lot of photos and videos taken over the years (and well into the future) with my DSLR. And my CD and movie collection digitized with lossless compression so I needn't swap discs anymore.

s.a.wood15 3 Builds 1 point 26 months ago

+1 because reasons.

qassemaleid 1 point 26 months ago

Sir, I'm now putting together my build, but many have told me that ZFS needs 1GB of RAM per TB. Based on your experience, is that true?

hklenny submitter 2 Builds 2 points 26 months ago

Sorry - the other thing I forgot to mention is that you can use an SSD for L2ARC caching if there is insufficient RAM, though of course that is not the same as adding more RAM.
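For completeness, attaching an SSD as an L2ARC read cache is a one-liner under ZFS (a sketch - the pool name "tank" and device name ada10 are assumptions):

```shell
# Add the SSD as a level-2 read cache; it supplements the in-RAM ARC
# rather than replacing it, and its contents are rebuilt after reboot.
zpool add tank cache ada10
```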

qassemaleid 1 point 26 months ago

Thanks for reply , appreciated

Wish you the best

hklenny submitter 2 Builds 1 point 26 months ago

Thanks! Good luck with your build.

hklenny submitter 2 Builds 1 point 26 months ago

I have heard that many times from various sources as well, but in reality it isn't that simple. The 1GB of RAM per 1TB of storage guideline applies to small and medium-sized business NAS systems. See the link below on the FreeNAS website, specifically the section listing the requirements for small and medium-sized businesses:

FreeNAS Hardware Requirements

If you take a look at their best practices page below:

FreeNAS Best Practices

The section on RAM indicates that 16GB is the minimum for over 24TB, and 32GB is recommended when going past 100TB.

For my build, as it is not going to be used as a small or medium size business NAS (i.e. not heavy workloads with dozens of users), I believe 32GB should be more than sufficient.

hdtrejo 1 point 26 months ago

Wow! Good stuff sir. I'm looking at getting the Lian-Li PC-Q26 case for an unRaid server I'm putting together. I'm using the basic version of unRaid which only allows 6 storage devices. I can upgrade to the Pro version which allows 12 storage devices. In terms of quality and build, do you still recommend this case?

hklenny submitter 2 Builds 1 point 26 months ago

Yes, I can definitely recommend the PC-Q26 if you need more than 4-6 drives. If you only need 4-6 drives, it is possible to use some slightly smaller cases like the Node 304 from Fractal Design (up to 6 drives, but can be a bit tight) or the PC-Q25 from Lian-Li which supports up to 7 drives comfortably.

The build quality of the PC-Q26 is very good and I have no complaints there. Acoustics are quite good when the drives are idle and with my NoiseBlocker eLoop fans, but I have noticed the drives can be quite loud when they are doing random accesses since there are 10 drives going all at once. So some sound insulation might be useful, though it's not absolutely required.

If you have any questions for your build, feel free to give me a shout. What kind of CPU and motherboard are you planning to put in, by the way?

Dm721 1 point 26 months ago

2.6/10 actually 100.75 TB because of ssd

schaef87 3 Builds 1 point 26 months ago

Nice write up! Fantastic work.

hklenny submitter 2 Builds 1 point 26 months ago

Thank you sir! Very glad you enjoyed it.

Gothri 1 point 25 months ago

Very nice build. The step-by-step analysis of your planning was helpful to read. I am just wondering ... did you also consider the Silverstone CS380 ATX mid-tower case along with the DS380? The former looks like it contains the same drive cage with the hot-swappable bays, but includes fans on the drive cage for cooling the drives and is situated in a case which seems as though it would provide better airflow and cooling.

hklenny submitter 2 Builds 2 points 25 months ago


The Silverstone CS380 looks interesting...but I see it only came out last October, so I must have missed it, as I did most of the research and planning in September/October last year, with the building from October/November (it took a while to get all the parts from around the world - yes, it took me until February to post this build on PCPP).

I think the CS380 could be a good option for those needing the flexibility of a larger ATX motherboard with more expansion and memory slots. One thing I am missing is 10 Gb Ethernet - still really expensive now, but potentially useful in future - and since the LSI SAS card occupies my only slot to add another four SATA ports, I don't have room for a 10Gb Ethernet card anyway. (Note: I also considered the SuperMicro X10SDV motherboards with built-in 10 GbE, but decided against them due to the much higher cost (almost double) and their lack of HDMI/DisplayPort and on-board display controller support, which gives me more flexibility in how I can utilize the NAS.) I wonder if one day there will be USB 3.0 based 10 Gb Ethernet adapters...even though USB 3.0 only goes up to 5 Gb/sec, that would still be faster than regular ol' 1 Gb Ethernet. We shall see.

Anyway, comparing the other specs of the CS380 with the Lian Li PC-Q26B: the CS380 is almost 40% larger in volume and has eight 3.5" drive bays compared to ten in the Lian Li. I suppose you could add an Icy Dock to its two 5.25" bays to get three additional 3.5" bays, for a total of eleven 3.5" hard disks. Build-wise, I think the Lian Li is much nicer due to its full aluminum structure and panels, though it does cost a bit more.

So in summary, I think the CS380 could be a good option if you need more expansion slots for other purposes while running up to eleven hard disks, and are OK with a larger case volume. But if the main purpose is a NAS in the smallest possible volume, then the PC-Q26B is a good choice.

Do you have plans to build a NAS?

cray77 1 point 25 months ago

Once you added the HBA, were you able to use its 8 SATA connections plus the 8 that were on the board already? Or can you only use the SATA connections coming off the HBA?

hklenny submitter 2 Builds 2 points 25 months ago

Yes, after adding the HBA I am still able to use the eight SATA connections from the AsRock C236 WSI. The way I have it set up now is as follows:

AsRock C236 WSI SATA Ports

  • One SATA port utilized for 1 x Crucial MX300 750GB SSD
  • Six SATA ports utilized for 6 x Seagate IronWolf Pro 10TB HDDs

LSI SAS 9207-8i SAS Ports

  • One Mini SAS port split into four SATA ports for 4 x Seagate IronWolf Pro 10TB HDDs
  • One Mini SAS port connected to the external SAS connector

With the above setup, I could add four more disks to the system using an external SAS enclosure if I need to in future. There is still one more SATA port available on the AsRock C236 WSI which I could use for a second SSD (maybe mounted on top of the current SSD I have), but it's in quite a tight spot so it might be a bit difficult to use it.

I added a photo of all the connections to the SATA ports on the motherboard if you're interested. It's on the fourth page of the photo gallery.

hdtrejo 1 point 25 months ago

Did the HBA require any kind of flashing? I read that some require the HBA to be flashed in "IT Mode" so the NAS OS can have full access to the drives such as SMART info.

hklenny submitter 2 Builds 1 point 25 months ago

The SAS controller I bought was a plain HBA, without any RAID functions. So it was already in IT mode. All of the SMART info is fully accessible, and I used that to check temps and perform self-tests.

However, I noticed that the BIOS and firmware on the controller was a couple of years old, which was also causing very slow POST times in the HBA's BIOS screen. Once I flashed the latest BIOS and firmware, the POST time was much faster. So I'd definitely recommend you do that if you get the same type of HBA.
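For reference, the 9207-8i is typically flashed with LSI's sas2flash utility from a UEFI shell or bootable stick. A rough sketch of the process - note the firmware and BIOS image filenames below are placeholders; use the ones from the exact firmware package you download:

```shell
# Sketch only: run from a UEFI shell or boot stick with the firmware
# package unpacked alongside sas2flash. Image filenames are placeholders.
sas2flash -listall                           # note current firmware/BIOS versions
sas2flash -o -f 9207-8.bin -b mptsas2.rom    # flash IT firmware + boot BIOS
sas2flash -listall                           # confirm the new versions took
```

Flashing the boot BIOS (`-b`) is optional; skipping it entirely also eliminates the HBA's POST screen, which further shortens boot times if you don't boot from the HBA's drives.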

Lynxphp 1 Build 1 point 24 months ago

Very nice build! Congrats.

I just ordered the C236 WSI mobo myself. I am wondering how you plugged in your fans. Did you use Y-splitters to plug 4 case fans into one fan header? If so, can that fan header handle all that amperage? I want to do something similar, but was warned that drawing too much amperage from one fan header can potentially damage the motherboard.

hklenny submitter 2 Builds 2 points 24 months ago


I have a total of six fans in my build - 2x be quiet fans on the Dark Rock TF CPU cooler and 4x NoiseBlocker eLoop fans for the front intake and top exhaust.

There are two PWM fan headers on the AsRock C236 WSI, and I use them as follows:

  • CPU fan header -> Noctua NA-SYC1 Y-Cable PWM fan splitter -> 2x be quiet fans on Dark Rock TF CPU cooler
  • System fan header -> Gelid PWM 1-to-4 Splitter -> 4x NoiseBlocker eLoop fans for front intake and top exhaust

The Gelid PWM 1-to-4 Splitter powers the fans using a Molex connection for additional power. Therefore, there isn't a risk of the fans drawing too much amperage from the motherboard, yet the PWM signal still goes to the motherboard. The link to the Gelid splitter is below.


What case are you using for your build, by the way?

Lynxphp 1 Build 1 point 24 months ago

Hey! Thanks for your answer and the link to the powered splitter. I'll see about getting one.

My build is modest compared to yours, but it should cover my needs. I ordered a node 304. It has 3 case fans. Here is the full part list: https://pcpartpicker.com/user/Lynxphp/saved/Pr8sYJ

hklenny submitter 2 Builds 2 points 24 months ago

Your build looks fantastic! The Node 304 is a great little case.

For the Node 304, I see that it has a built-in fan controller for the three included fans. So I believe that the fans are powered by the fan controller (presumably via Molex connector) and you wouldn't need to connect them to the motherboard - unless you wanted to use software to control the fan speeds. Is that what you were thinking of? In that case, you would need the powered splitter. But note that the fans included with the Node 304 are 3-pin (not PWM), though still controllable via the motherboard.

By the way, what kind of NAS software or OS are you planning to use? I tested FreeNAS and unRAID, but ended up using Ubuntu with ZFS, which is included from version 16.04 onwards, I believe.

Lynxphp 1 Build 1 point 24 months ago

Thank you!

Yes, you're right - I'd like to use software to control the fans. I ordered replacement Noctua fans: two NF-A9s for the front and an NF-A14 for the back.

I got my power supply today, the SF450. I just noticed that its cables can power either 4 SATA and 4 Molex devices, or 8 SATA. For now, I'm only going to have 4 SATA drives in the build. However, I'd like to upgrade to maybe 8 drives at some point in the future (with some creative handicrafting; I know the Node 304 only supports 6, but I found forum posts of people reporting they managed to cram 6 HDDs + 2 SSDs into the case).

So for now, I can run my case fans with a Molex. But in the long run, I can either buy 3 Molex-to-SATA adapters and be satisfied with a total of 7 drives, or I'd have to split the 3 fans across the motherboard header and hope that I won't burn the circuitry. When I get my mobo, I'll write to ASRock support and ask how much amperage the fan headers tolerate (the 3 fans would amount to 0.33 A).

I was thinking of using unRAID as I want to use my NAS mainly for Plex. What was it you disliked about unRAID? What made you go for Ubuntu?

EDIT: I just ordered the Gelid splitter. Thanks for the tip.

hklenny submitter 2 Builds 2 points 24 months ago

Good idea to get a small SFX PSU like the SF450. I was actually going to suggest that but I forgot to. It will save space inside the case which will help with cooling your drives. It will also make it easier to utilize the PCI slot of your C236 WSI for something if you need it.

I think there's an easier way to get around the limitations of 4 SATA/4 Molex or 8 SATA on the PSU cables. To be able to fully utilize 6 HDDs + 2 SSDs, what if you did this?

  • Use the 8 SATA PSU cable.
  • Plug 6 SATA power connectors into the 6 HDDs.
  • Split 1 SATA power connector into 2 SSDs. Since SSDs use less power, it should be OK.
  • Get a SATA to Molex converter and use that to power the Gelid splitter to power your 3 case fans.

I think that would probably be the least complicated route (and least cabling) to get all 8 drives working.

Sorry, I should clarify - I don't think FreeNAS and unRAID are bad; I just wanted more flexibility and control over what I could run on the NAS machine. Using Ubuntu allows me to run ZFS for speed and redundancy, yet also install other apps (namely some software development tools) and configure and use them more easily. It also lets me use the machine as a desktop (locally or via remote desktop) for other things like software development, so I get more use out of it since it won't be serving high volumes of data constantly.

Between FreeNAS and unRAID, I think I'd go for unRAID, though. The interface of FreeNAS doesn't seem as polished as unRAID's to me, and I think you get better support with unRAID, though you need to pay a little to use it.

cobbleking 1 point 23 months ago

Most expensive ITX build.

SimonOcean 1 point 23 months ago

Thanks for doing this in-depth write-up. I am in the process of planning a NAS build myself (having been disappointed by the price/performance offered by turnkey solutions from QNAP/Synology). Reading build logs with in-depth component selection analysis like this really helps people following in your footsteps. Thanks again.

hklenny submitter 2 Builds 1 point 23 months ago

You're most welcome. Should you have any questions or would like someone to bounce ideas off of, feel free to message here or PM me.

Good luck with your build!

Lynxphp 1 Build 1 point 22 months ago

Hi hklenny! I've got another question for you. I also have the C236 WSI mobo. Despite all my efforts, and trying 2 different sticks of ECC RAM, I couldn't get my RAM to run in ECC mode. More on that story here: http://forum.asrock.com/forum_posts.asp?TID=4623&PID=28185&title=c236-wsi-mitx-kaby-lake#28185 Anyway, I'm actively looking for 16GB sticks of ECC RAM that run in ECC mode on that mobo. Did you happen to upgrade your BIOS to 2.10? If so, did you check in the BIOS whether your RAM runs in ECC mode (this info is only visible with the newer BIOS version)? And if it does, can you confirm that your RAM is the Crucial 16GB CT16G4WFD8213?

hklenny submitter 2 Builds 1 point 22 months ago

Hey there, sorry for the late reply.

So prior to your message, I hadn't actually checked whether my RAM was running in ECC mode or not. I'm still on the v1.00 UEFI BIOS, so after a bunch of Googling and testing, I found one way to check it - use memtest86 v7.3 Free Edition:


Just create a USB stick using the download and boot from it. Make sure you boot the stick in UEFI mode to get v7.3 running. After checking my NAS, it confirms that ECC mode is enabled for my RAM. Note that I'm using two Kingston KVR21E15D8/16 16GB DDR4 ECC RAM sticks. (See photo number 16 in my build's photo gallery.) Also note that this RAM isn't on ASRock's memory QVL, but it seems to be working in ECC mode just fine according to memtest86.

Give memtest86 a try on your system and let me know how it goes!

Lynxphp 1 Build 1 point 21 months ago

Hi again!

Glad to hear that your RAM is running in ECC. I tested mine with all of the following methods (except the memtest one): https://www.pugetsystems.com/labs/articles/How-to-Check-ECC-RAM-Functionality-462/#UbuntuLiveCD-ecc_check_c Sadly, it isn't supported. I'm starting to think that my processor, the G4560, doesn't support ECC even though it should according to the information from Intel.

ASRock proposed to take my mobo back, but I don't live in the US and I'd have to provide a US return address. I won't go that way because of the extra cost and the long time without a mobo, and thus without my NAS... The next step will be getting the E3-1245 v6.

hklenny submitter 2 Builds 1 point 21 months ago

Hey there! Interesting to hear about the results of your test.

About the ecc_check-via-Ubuntu method shown in the Puget article you linked: note that I tried it (plus the other methods described in the article) and it could not detect my RAM running in ECC mode either. I believe that method is outdated for our hardware, hence it didn't show the correct result when I ran it.

I'd recommend you try using the test method I used (memtest86 v7.3 Free Edition) to test whether your RAM is running in ECC or not before you decide to change components.

Let me know how it goes!

Lynxphp 1 Build 1 point 21 months ago

Ow! Thanks for the heads-up! I'll make sure to try your method and let you know.

nelson01 1 Build 1 point 22 months ago

Excellent build and great write up. I'm thinking of expanding my nas in a define R5. I also plan on running two Windows VMs, one for gaming using a GTX 1080 Ti. Your build is inspiring. Outstanding job!

hklenny submitter 2 Builds 1 point 21 months ago

Thank you very much! I'm glad you enjoyed reading it. If you post your new NAS build up here, let me know! I'd love to see it.

nibbled 1 point 21 months ago

Really enjoyed reading about this build - so much so that I decided to do one myself based on similar components. I used the ASRock C236 WSI and Lian Li PC-Q26 from the build, and I must say I'm loving the case; it's ideal for a small NAS, although hard to come by now that they've stopped making it. Currently I'm using six 8TB WD Red drives and an SSD. With only 1 SATA port left on the motherboard, I'm thinking about how to expand. I could go the simple route and just get a basic 4-port SATA PCI-E card and use 3 of those ports to get to 11 drives. However, I've also got even more future expansion in the back of my mind. I'm a bit of a noob with regards to HBA and SAS cards, so I'm trying to get my head around the best route to go, but it can be a minefield - hopefully you'll be able to help guide me.

What I'm interested in doing is purchasing another Lian Li PC-Q26 case and filling it with a further 10 drives, giving me 20 in total. I'd rather go down this route than buy a much bigger case and retire the Q26 I already have, since it's such a great case - quiet and very compact. What I'm struggling with is how to connect them both up to serve as one machine: the one I have now acts as a NAS feeding my Plex share, and the other I'd like to add as a DIY DAS box with minimal components inside to save on cost. If I'm right in what I've been reading, I can get away with only having a motherboard and power supply in the DAS, with some form of PCI-E or SAS expansion card to connect all the drives and route the connection out to the NAS I already have. What I'm unsure about is what cards/cables I'll need to hook it all up, and whether it can even be done, bearing in mind the limited number of expansion slots available - only one on the NAS with the AsRock C236 motherboard.

So ultimately I'd like to know if I can hook 2 of these machines up together, and if so, how I'd go about doing it. Am I limited in how many extra drives I can connect? With your setup you mentioned only being able to connect a further 4 drives - would that be my limit too, or can I get additional hardware to connect more drives? Worst case, none of this is achievable and I'll end up with two separate servers that I switch between, but it would be nice to have them both connected somehow - though I realise that could be an issue with a small build such as this.

Thanks in advance for your help, really enjoyed reading about your build, very detailed and sound reasoning behind each component selection.

hklenny submitter 2 Builds 1 point 21 months ago

Sorry for the late reply! The notification for your message was lost in a deluge of e-mails, so I had forgotten to reply to this.

I'm thrilled that you enjoyed reading about my build and it inspired you to do one yourself. (That means it was worth the time spent writing it up!) I'm also very pleased that you've enjoyed using the PC-Q26 with the C236 WSI in your build. It's a great combination at this point in time and it's a pity that the PC-Q26 is discontinued. Maybe Lian-Li will consider re-introducing it or a new version in the future.

Anyway, with regards to your expansion project using a second PC-Q26 case, I believe it is possible to use one NAS + one DAS in the manner you have described. This is because SAS adapters can actually expand to more drives than the number of connectors they have, using SAS expanders. Note the below is based off my existing understanding (without doing much additional research), but I think it would work like this:

  1. In your existing NAS, you would need to get a suitable SAS HBA which supports enough SATA devices. The Broadcom LSI SAS 9207-8i which I used in my build supports up to 256 SATA devices, so this will be just fine.


For the SAS cables, you can follow my build exactly because these are the same cables you would need. In other words, you need a Mini SAS to 4 SATA cable (Silverstone SST-CPS-03) to connect up the extra hard drives in your NAS using one of the internal SAS ports from the HBA, and then a Mini SAS to Mini SAS cable (Silverstone SST-CPS02) and a Mini SAS to External SAS SFF8088 adapter (Silverstone SST-SA011) to route the other internal SAS port externally.

  2. In your DAS, you need a good power supply, a basic ITX motherboard (I don't think it needs to have a CPU or RAM), and a SAS expander card like this one:


In this box, basically what you need to do is connect everything up like a regular computer, but there's no need for a CPU or RAM as the motherboard is only used to trigger the power for the power supply and supply power to the SAS expander and any fans you have inside the DAS.

  3. Lastly, you'll need an external SAS SFF8088 to SFF8088 cable to hook the NAS and DAS computers up. You'll plug this cable into the Silverstone external SAS port on your NAS and into the external SAS port on the SAS expander in your DAS.

I think that theoretically the above setup should work. Let me know if you end up building this as I'm curious to see how it turns out!

Rockwolf 1 point 20 months ago

32gb ram is nowhere near enough for this. In this system, you need a minimum of 100gb of ram as it is a 10TB server.

motonack 2 Builds 1 point 19 months ago

Where did you end up finding the PC-Q26? I'm in the same boat as you, and I'm currently checking eBay, r/hardwareswap, and the typical e-tailers, hoping for this case to pop up one of these days. There's really no case on the market like it, and my only other option is to spring for a proper rack-mounted setup.

hklenny submitter 2 Builds 2 points 19 months ago

Unfortunately, I believe the PC-Q26 was discontinued by Lian-Li at least a year ago, which is why it is so difficult to find. When I was gathering parts around a year ago, my local retailers didn't have any stock. I eventually found it at Computer Universe, a German e-tailer that still carried it. However, I just checked their website and it is completely gone now.

I also have seen it at Amazon.co.jp in the past, but I just checked that and it is completely gone too.

I'm afraid the PC-Q26 may have gone the way of the Dodo. It might pop up on eBay one day if someone has some old stock, or if there are any used ones available. Hope you are able to find it!

motonack 2 Builds 1 point 19 months ago

I actually have a very solid lead on one from a famous tech YouTuber right now. Ironically, they're upgrading their data server hardware and are willing to part with the case as soon as they can strip the hardware from it. I must have this case lol.

hklenny submitter 2 Builds 1 point 19 months ago

Nice! After you get it all built, if you post your build on PCPP, please send me a link!

Then I wouldn't be all by my lonesome as the only person with a PC-Q26 build listed in PCPP... :)

motonack 2 Builds 1 point 19 months ago

Will do, currently working on the parts list. May or may not be pulling the trigger on the parts this year. We'll see how my current storage holds up before I get the ball rolling on this proper NAS setup.

motonack 2 Builds 1 point 19 months ago

What kind of storage configuration did you end up deciding on for your 10TB drives? I'm kind of in a black hole of constantly reconsidering drive setups and RAID configs. I'm getting really paranoid about data integrity, so I'm thinking of creating a mirrored vdev for every two drives I add, and then just having all the vdevs in a pool. This way I won't have to worry about ridiculous striping and rebuild times for either raidz2 or raidz3.


hklenny submitter 2 Builds 1 point 19 months ago

I'm very sorry I hadn't replied earlier - I've been very busy recently and just got back to this through my e-mails.

I'm using raidz2 on my 10TB drives, which I believe should be sufficient for my purposes. I think the choice of RAID setup really depends on how much you value your data. I'd imagine that not all of the data stored on your NAS will have the same value to you - so perhaps it might be worth dividing your NAS into a critical portion (a mirrored setup) and a normal portion (maybe raidz2).
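To put rough numbers on the trade-off being weighed here (hypothetical figures for ten 10 TB disks, ignoring ZFS overhead and the TB/TiB distinction):

```shell
# raidz2 spends two disks on parity; striped mirrors spend half the disks.
disks=10; size_tb=10
echo "raidz2 usable : $(( (disks - 2) * size_tb )) TB"
echo "mirrors usable: $(( disks / 2 * size_tb )) TB"
```

So the mirrored-pairs approach trades roughly 30 TB of usable space (here, 80 TB vs 50 TB) for faster resilvers and simpler rebuilds.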

Ideally, if you really have critical data you do not wish to lose, you should be backing it up at periodic intervals at a remote site. Even pure mirrored storage won't be any use if something catastrophic happens to your NAS, e.g. it is destroyed in a fire or natural disaster.

Let me know what you end up choosing for your setup, as I'd be interested to hear your considerations!

willfeltner 1 Build 1 point 18 months ago

I plan to base my photography NAS off of your build. Thank you very much for detailing everything :)

hklenny submitter 2 Builds 1 point 18 months ago

You're most welcome! Glad typing all of this up proved useful to you. :)

Scootervb 1 point 17 months ago

Just to be clear, the Crucial RAM listed in the parts list is not correct, and you went with the Kingston KVR21E15D8/16 16GB DDR4 ECC RAM?

Was there a reason for this?

Also, any reason not to go with the Xeon E3-1245 V6?

Thanks for posting the build!

hklenny submitter 2 Builds 1 point 17 months ago

Welcome! Hope you find the build information useful.

Sorry, my bad - I mixed up Crucial and Kingston. You are correct: two sticks of Kingston KVR21E15D8/16 16GB DDR4 ECC RAM were used. I've fixed the parts list and description accordingly.

It's been a while since I built this NAS, but I believe I chose Kingston due to price and compatibility with the motherboard.

As for the Xeon E3-1245 V6, it wasn't available when I built this last year. It is now and if it works, by all means go for it!

musclemuffins 4 Builds 1 point 16 months ago

Sweet mother of Terabytes. This is by far the best build I've come across so far. Awesome build!

lch106 1 point 9 months ago

Thanks very much!

tamalk 1 point 14 months ago

As others have said, this is absolutely amazing! Thank you for the wonderful write up.

Please update us on what NAS system you ended up going with and how it's performing!

lch106 2 points 9 months ago

Thank you! And sorry I missed this message in my inbox.

I ended up going the semi-custom NAS route: an Ubuntu desktop installation running ZFS, with Webmin for web-based remote administration of all server services. I chose the desktop installation for ease of use during setup (the system has more than enough horsepower to run the desktop while serving as a NAS) and for the ability to run any apps I'd need as a workstation, while the ZFS filesystem is well supported under Ubuntu with easily downloadable utilities. Though the ZFS setup requires some command-line work, it is fairly easy. I went for a RAIDZ1 setup with one drive's worth of redundancy. I think that should be sufficient, given that I will be backing up the vital files to another, smaller NAS as well as offline storage.
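For anyone curious, the pool creation itself boils down to a couple of commands once the ZFS utilities are installed. A minimal sketch - the pool name and device ids below are made up for illustration; substitute your own /dev/disk/by-id entries:

```shell
# Ubuntu 16.04+: ZFS utilities are in the standard repositories.
sudo apt install zfsutils-linux

# Create a raidz1 pool (one drive of redundancy) from four disks.
# Device ids here are hypothetical placeholders.
sudo zpool create tank raidz1 \
  /dev/disk/by-id/ata-drive1 /dev/disk/by-id/ata-drive2 \
  /dev/disk/by-id/ata-drive3 /dev/disk/by-id/ata-drive4

zpool status tank    # verify the vdev layout and drive health
```

Using /dev/disk/by-id paths (rather than /dev/sdX) keeps the pool stable if the kernel enumerates the drives in a different order after a reboot.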

Webmin was an essential component in making the setup and administration easier - everything is accessible remotely without being at the console. Setting up shares via Samba was as easy as click-to-install via Webmin. If anyone is curious about how to set up a NAS using Ubuntu + ZFS + Webmin, I'd be happy to add some info here on request.
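Under the hood, what Webmin writes out is roughly a share section in smb.conf - a sketch of the command-line equivalent, with a made-up share name and path:

```shell
# Hypothetical share - appends to /etc/samba/smb.conf, then reloads Samba.
cat <<'EOF' | sudo tee -a /etc/samba/smb.conf
[media]
   path = /tank/media
   read only = no
   browseable = yes
EOF
sudo systemctl restart smbd
```

You'd still need a Samba user (`sudo smbpasswd -a <username>`) before clients can authenticate; Webmin handles that part through its GUI as well.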

Performance with this setup has also been excellent - ten disks running in this configuration far outpace the Ethernet link speed of this system. One potential point of improvement would be 10 Gbps Ethernet - the ASRock board does not have it built in, and I am out of PCI-E slots. I wonder if one day there might be such a thing as a USB 3.0 based 10 Gbps Ethernet adapter? Even if it only had as much bandwidth as USB 3.0, it would still be faster than 1 Gbps Ethernet. I think it might be a while, though, as 10G equipment (adapters and switches) is still very expensive and quite rare.

Anyway, things are working great! If there are any specific questions, please let me know and I'd be happy to entertain them.

willfeltner 1 Build 1 point 8 months ago

Our thought processes are pretty similar, thank you for sharing every detail :)

kfloyd 1 point 2 months ago

Really loved this build, but since the Lian Li PC-Q26 is discontinued, are there any similar alternatives I could use to build something like yours?

hklenny submitter 2 Builds 1 point 7 days ago

I'm very sorry I never replied to your original message - I must have missed it.

Unfortunately, yes, the PC-Q26 is discontinued. Did you find an alternative? SilverStone has a bunch of cases which might work.
