Instead of updating my previous Completed Build, I think it's better to create a new one, since the chassis is a major component of this system.
For the sake of keeping things brief here: I've written an extensive review of the hardware, and some realizations that set in after initial setup, on my blog. I'll be adding more soon about IPMI/BIOS setup, benchmarking, and setting up the OS and software.
A bit more detail for the curious: This is the second home server I've built, and it's replacing my old one. The goals for the replacement are to at least double my storage capacity, increase redundancy, allow for more advanced home server tasks than I run now, run cooler than the old system, and provide room to grow in the future. Beyond that, it will immediately serve as a home NAS and media server (via Plex). I'll be setting up Docker and/or VMs to run other services like Sonarr, too.
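As a rough illustration of the Docker route, here's a minimal `docker-compose.yml` sketch for Plex and Sonarr. The image tags, host paths, and user IDs are my assumptions for illustration, not from the build log, so adjust them to your own layout:

```yaml
# Hypothetical docker-compose.yml -- paths and PUID/PGID are assumptions.
version: "3"
services:
  plex:
    image: lscr.io/linuxserver/plex:latest
    network_mode: host          # simplest option for Plex client discovery
    environment:
      - PUID=1000
      - PGID=1000
    volumes:
      - /tank/media:/media      # assumed media dataset mount point
      - ./plex-config:/config
    restart: unless-stopped
  sonarr:
    image: lscr.io/linuxserver/sonarr:latest
    ports:
      - "8989:8989"
    environment:
      - PUID=1000
      - PGID=1000
    volumes:
      - /tank/media/tv:/tv
      - ./sonarr-config:/config
    restart: unless-stopped
```

With something like this in place, `docker compose up -d` brings both services up, and adding more services later is just another stanza in the file.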
It's a butt ton of storage and RAM out of the gate for the express purpose of not needing to upgrade any time soon.
The reason I've rebuilt the system in a new chassis is that the Silverstone CS380 I initially got had a serious problem cooling the drives. The Silverstone is the only consumer case with 8 hot-swap drive bays, though. You can get server chassis that would support all the hardware I'd be moving without needing to add your own hot-swap drive bays, but they're a decent bit more expensive than parting it out yourself. Although the new system is still not doing a great job with drive temps, the Silverstone was worse and would have required destructive modifications to get temps lower, which is a bit much for my taste. I'm of the opinion that if I can return it, why destroy it to make it work? The new chassis is better and a lot easier for me to tweak as needed. Frankly, at this point I'm happy enough with the drive temperatures.
The only remaining problem with the system is that it's loud, though 8 drives spinning 24/7 will do that. I've ordered some sound-damping material to put on the chassis' lid, which will hopefully help without increasing heat further.
A few things to note if you're considering doing anything similar in this Rosewill chassis:
- Shockingly, the Noctua CPU HSF fits. The fan initially sat a bit high on the heatsink and barely rubbed against the chassis' lid, but after lowering the fan it's all fine.
- There's very little room for anything resembling good cable management. I did what I could.
- You may notice the 120mm Noctua fan sitting at an angle to the PCI slots. There's a really great 3D-printable model on Thingiverse for mounting it there. The 3D printer I currently have set up is too small to print it, so I had a "guy" (read: this "guy" was actually a teenager) on 3D Hubs print one for me. It turned out pretty crap, but it works for now. I'll print another when I have the time to set up my larger-build-volume printer again.
- Airflow through the case is pretty good even with the mess of cables. CPU temps are fantastic, and with the M.2 SSD heatsink added everything idles in the low 40s C... except the hard drives, which hover in the mid-to-high 40s at idle. I've done everything I reasonably can to get them lower, but at the end of the day they're purpose-built 7200 rpm NAS drives with a peak operating temp of 70C, so I'm fine with it.
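If you want to keep an eye on those drive temps from the OS, SMART attribute 194 (Temperature_Celsius) is the usual place to look. Here's a small Python sketch that pulls the temperature out of `smartctl -A` style output; the sample text is embedded since the actual device names (`/dev/sda`, etc.) depend on your system, and in practice you'd capture it with `subprocess`:

```python
# Abridged sample of `smartctl -A /dev/sdX` output; column layout here is
# simplified for the example.
SAMPLE = """\
ID# ATTRIBUTE_NAME          FLAG     VALUE WORST THRESH TYPE      RAW_VALUE
190 Airflow_Temperature_Cel 0x0022   054   045   040    Old_age   46
194 Temperature_Celsius     0x0022   046   052   000    Old_age   46
"""

def drive_temp_c(smart_output):
    """Return the drive temperature from attribute 194, falling back to 190."""
    rows = {}
    for line in smart_output.splitlines():
        fields = line.split()
        if fields and fields[0].isdigit():
            rows[fields[0]] = fields
    for attr in ("194", "190"):          # prefer Temperature_Celsius
        if attr in rows:
            return int(rows[attr][7])    # RAW_VALUE column in this layout
    return None

print(drive_temp_c(SAMPLE))  # -> 46
```

Wrap that in a cron job or a monitoring agent and you can alert long before the drives approach their 70C ceiling.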
It does a fantastic job keeping the CPU temp low at idle, stays well under thermal throttle limits at full load, and the CPU drops back to idle temps quickly.
I'm using this in a Rosewill 4U server chassis, and it's a tight fit. It does fit, but the lid is barely grazing the heatpipes and the fan had to be adjusted a bit. Most people shopping for this will be putting it in a traditional tower case, so you're fine. If you're considering using this in a server chassis, though... check your measurements.
Overall the board is very solid and feature-packed for a great price. I only have two complaints:
- The board's fan control methods are perfectly fine in a server room, but if you're like me and considering this for home use, they're janky at best. These boards aren't really meant for home use, so it's difficult to hold that against them, but I still think it's worth pointing out. No one makes server-grade compatible "prosumer" hardware for home-lab folks, and I think we need to start asking for it.
- I've been building on ASUS motherboards for years now, and I'm so used to their front panel pin adapters I forgot what it's like not having one. The Supermicro's front panel jumper pins are a bit strangely labeled.
Perfectly fine. As this is ECC RAM intended for workstations and servers, it's difficult to comment on it, really... There's no crazy cool sniper rifle shaped heat spreaders, no RGB, it's just RAM. Perfectly functional, easy to install RAM.
Quite happy with this SSD.
I did pick up a heatsink attachment for it on Amazon since it was idling at ~51C. Not terrible, but not great. Being the Evo series, it makes sense that Samsung skimps on things like thermal protection, since these devices aren't expected to see heavy-duty use. However, the heatsink I got dropped idle temps by 10C for $10, which is nothing to scoff at.
I'm coming from "first gen" Seagate NAS 4TB drives (before the Ironwolf branding). Both my old 4TB's and these new 6TB's run hotter than I'd expect in their given setups, idling in the mid 40s C. They're rated to operate at up to 70C, so I honestly think it's perfectly fine. Considering reputable sources (Google, Backblaze, Microsoft, etc.) can't agree that 40C is the ideal temperature for hard drive reliability, I'm thinking I'm fine.
Otherwise, these are a bit loud, too. That said, I do have 8 of them in a ZFS RAID, so in a smaller drive pool it won't be as noticeable.
I'm rating this as a standalone PSU but including some commentary on my situation.
Overall, like every other Corsair PSU I've owned, it's great. I went for an 80+ Gold certified PSU for this build since it's a home server running 24/7, and I wanted to keep efficiency up to help with the power bill.
A note to the wise: I initially put this PSU in a Silverstone CS380 case, paired with a "traditional server motherboard" where the 24-pin and 8-pin power connectors are "above" the CPU instead of to its right. That was a problem, as the included cables aren't long enough to reach that far. In ANY traditional setup, the PSU's cables are perfectly fine. This is only a problem in a truly monstrously huge case, a case that isn't designed well (the Silverstone), or with a badly chosen parts list like a full ATX tower and a Mini-ITX motherboard.
This Blu-ray burner has followed me around for a while now; I believe it's in its 4th computer at this point. It's solid, works well, and is cheap.
Versus the competition these are still some of the best.
I don't like Noctua's color scheme, but that's a long-standing gripe that I'm basically over at this point. I appreciate the sound-dampening silicone corners on the fans, but I had to remove them in my situation so the fans would fit properly. They are a bit louder and don't move quite enough air in my setup, but without going to something ridiculous, it's the best I can do.