I am in an unusual situation in that I have to buy my entire rig from scratch – everything from a keyboard to my SD card reader. My goal was to bargain hunt parts for a 1080p gaming rig that would last me the next 5 years. I spent all of January scouring online auctions, internet retailers, classified ads and brick-and-mortar stores. Everything was purchased new unless noted as an open box return. I feel my efforts paid off, as I saved over $300 on this build.
When I was 16 years old, I lusted after Steve Jobs’s NeXT Cube. I mean, who didn’t want a Borg Cube as their computer back then? The HAF XB EVO isn’t a true cube, but it’s as close as I can get in a budget ATX form factor. The design is well thought out and the build quality is quite good. The panels remove and, more importantly, reattach easily. Cable management and motherboard installation were easy and intuitive. I removed the two Cooler Master 3-pin 120mm intake fans. I recognize the Chinese model number; they are the same fans that every brand (Antec, Corsair, NZXT, etc.) slaps their name on. I replaced them with two Noctua 140mm PWM intake fans and added a Noctua 120mm PWM rear exhaust. The case has two layers, with the motherboard on top and the power supply and drive cages on the bottom. With only a single 2.5” SSD below, I feel like the bottom is going to waste.
The only complaint I have about this case is that the front panel has a microphone and headphone jack with barely visible black-on-black icons. Every manufacturer of black chassis gets lazy and does this, but it’s been nearly 17 years since the PC99 standard was introduced. Applying the pink and green rings to the front bezel isn’t too much to expect in a modern case design. Oh, and the plastic cable ties Cooler Master provides are absolute junk – they snapped in half on first use! Thankfully, the EVGA power supply came with nice Velcro cable ties.
I bought the Gigabyte GA-Z170X-Gaming 3 motherboard deeply discounted as an open box return. The on-board audio appears to be a Sound Blaster Z, albeit with some different components. It has the Creative Sound Core3D DSP, but uses a Realtek codec instead of a Cirrus Logic. There is an upgradable operational amplifier to cater to the op-amp swapping fad. I have strong feelings against such things, but it’s none of my business if people want to do it. There is a power conditioner on two USB ports and I can confirm, after borrowing a USB DAC, that it does make a noticeable (but not necessary) difference. There are two PCIe 3.0 x4 M.2 slots, which is what drew me to the board, because I got a great deal on a PCIe M.2 drive before I bought this motherboard. The back panel has 2 USB 3.1 ports, 3 USB 3.0 ports and 2 USB 2.0 ports. I thought this was a lot, but I’ve already filled up the USB 3.0 and 2.0 ports. I have two USB 3.0 ports on the front of my case, but I like having permanent devices plugged into the back. I’m thinking of adding a PS4 controller, so I will probably buy a $5 USB bracket to plug into the USB 3.0 header.
The Gigabyte UEFI is adequate; I had no problem finding and adjusting options. However, the automatic recovery feature of the DualBIOS that Gigabyte has been touting for years didn’t work! When I first used the board, I was stuck in an endless boot cycle without the ability to enter the UEFI. I had to short the Clear CMOS pins for at least 3 cycles before I broke the loop!
This is the first time I’ve bought an i5 instead of an i7 for my own use. I’m curious to see if I’ll suffer without Hyper-Threading. Most of my daily activities involve editing photos and gaming, though I do plan on running OS X in a virtual machine. I don’t think any of that will be aided significantly by jumping to an i7. I pushed the turbo frequency from 3.9GHz to 4.3GHz without changing the voltage. For now, I’m happy at 4.3GHz and I’ll seriously sit down and do a balanced overclock when I have more time.
I initially purchased 8GB of Crucial DDR4 2133MHz for a crazy low price from a private seller nearby who accidentally bought DDR4 instead of DDR3. At the time, I hadn’t found a good deal on RAM yet and just wanted to get my system up and running. A week later, I found a crazy mad deal online for 16GB of Corsair Vengeance LPX 3000MHz. The following day, I found a super crazy mad deal on open box 16GB G.Skill 4000MHz sticks at a local computer shop. I tested all of them at their rated XMP speeds a few days ago. Surprisingly, at 3000MHz, I could squeeze about 3fps extra from the Unigine 1 and 4fps from the Unreal 4 demo engines. At 4000MHz, the gain was 3.3fps in Unigine 1 and 4.5fps in Unreal 4. The scaling isn’t quite what I expected for the cost, so I decided to keep the Corsair 3000MHz in my system and sell the other two pairs.
So, I mentioned that I bought the Samsung SM951 256GB AHCI M.2 PCIe 3.0 x4 (32Gb/s) for my operating system and program files. It was the first component I bought, because I acquired it for nearly half price and I just wanted to play with it. The motherboard has a second M.2 slot, but I opted for the Samsung EVO 500GB SSD as my data drive. So far, I can’t feel the difference between the PCIe and SATA III drives. In my head, I knew that throughput speed and load time didn’t scale linearly, but my heart wanted to be thrilled and amazed by how fast my computer had become! Oh well. I’m sure it’s measurable, but we’re probably talking about a 1 second difference, at the most. I can see the NVMe PCIe variants being very useful in virtualization and maybe daily archive replication. However, it’s not worth it for any home user to buy a PCIe M.2 until the cost comes down to match a SATA III SSD.
The EVGA GTX 960 4GB FTW looks to be a decent AIB card. NVIDIA’s reference base/boost is 1127/1178MHz and this model comes pre-overclocked to 1190/1342MHz. I was able to push the boost to 1503MHz without changing the voltage. I stress tested with FurMark on max settings. I must say, the Maxwell microarchitecture is very impressive. Running at full load, I can keep the GPU heatsink fans at a quiet 1000rpm and never exceed 69°C. That being said, I did install an Asus STRIX for someone recently and its heatsink did a MUCH better job cooling at the same noise level. However, the maximum GPU temperature for the NVIDIA GTX 960 is 98°C, so I don’t ever have to worry about overheating. What this does mean, though, is that the ambient temperature around the computer increases more with the EVGA than the ASUS during a gaming session. Not much of an issue in winter, but in summer it could be noticeable... During my overclocking tests, I noticed that VRAM usage was 2802MB in most engines on the Ultra setting. A few times, while testing the Unreal 4 engine, it very briefly peaked at 3907MB every few minutes. I waffled back and forth between buying 2GB and 4GB and now I’m glad I chose the latter.
I paired a set of Micca MB42X (version 3) bookshelf loudspeakers with a Dayton DTA-1 Tripath amp. Out of the box, the imaging is good and I can hear clear details. I listened to a few songs, watched a TV show and played the opening of Filip Victor’s Half-Life 2: Update. In my opinion, the 4kHz band needed a modest boost to improve vocals and the 8kHz band a major boost to increase brightness in cymbals. I bumped up 125Hz to add warmth in the bass range and slightly cut 500Hz to remove the boxy sound that remained. I was limited to Creative’s EQ, but when I have more time, I’ll download Equalizer APO and adjust the entire frequency range to suit my taste. I honestly wasn’t expecting sound this good from a pair of $90 speakers and a $30 amp.
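For anyone curious what those tweaks will look like once I move to Equalizer APO, they map onto a handful of peaking filters in its config.txt. This is just a sketch of the format – the gain and Q values below are placeholders I made up, not final settings:

```
# Hypothetical Equalizer APO config.txt sketch - gains and Qs are placeholders
Preamp: -6 dB
Filter 1: ON PK Fc 125 Hz Gain 2.5 dB Q 1.41
Filter 2: ON PK Fc 500 Hz Gain -1.5 dB Q 1.41
Filter 3: ON PK Fc 4000 Hz Gain 2 dB Q 1.41
Filter 4: ON PK Fc 8000 Hz Gain 4 dB Q 1.41
```

The negative preamp is there to leave headroom so the boosted bands don’t clip.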
I really like having a small amp on my desk to physically adjust the volume while gaming or watching movies. I bought some yoga blocks for $3 each and put my speakers on them, keeping the tweeters at ear level. These blocks are basically rectangular chunks of ethylene-vinyl acetate – the same material gym mats and shock absorbers in shoes are made from. I thought it would be good material to dampen the acoustic energy before it could resonate through my wood desk - it appears to be working swimmingly. I was originally going to make a DIY Sound Anchor by filling a paint can with sand, but this was easier.
Now that I’m using loudspeakers instead of a gaming headset, I had to look into options for microphones. It just so happened that I was borrowing a USB DAC from a musician friend to test the power-conditioned USB ports on the Gigabyte motherboard. She also gave me a few USB microphones to test and the one I liked the most was the Audio-Technica ATR2100. It has a cardioid pattern, which picks up sound from the front and rejects sound from the sides and rear. I recorded a raid with my guild and sure enough, my voice was loud and clear while the music and other guild members talking in the background were largely suppressed. That’s impressive considering the loudspeaker was only 2ft behind the microphone. I can’t express how liberating it is to finally be free of a gaming headset!
This is my first LCD desktop monitor! I know that sounds ridiculous, but I was using a Sony GDM-FW900 CRT for the past 10+ years. It’s basically the equivalent of a 22” IPS 1440p display, so I had no reason to upgrade. I would have kept using it if I hadn’t moved recently – it’s just too big to manage. So anyway, I bought the AOC I2269Vw 22” (it’s actually 21.5”) monitor. It seems to be the best 1080p IPS bang for the buck. I was told directly by AOC that the entire 69 series of monitors uses an 8-bit LG AH-IPS panel. I negotiated a ridiculously low price on an open box return – never underestimate how badly a big box retail store wants to get rid of large opened inventory. It has an analog D-Sub and a digital DVI-D input. That’s all you need unless the monitor has built-in speakers. I had to buy a DVI-D cable from a thrift shop for a buck because the monitor only comes with an analog one. That annoyed me greatly. There is a 75mm VESA mount on the back, so I bought a discontinued Vogel’s EFW1130 desk arm that was heavily discounted online. It’s nice being able to pull the monitor close to my face when I need to do photo editing and then push it back for gaming. The monitor itself is good, but the bezel on the bottom is glossy piano black. It reflects my keyboard and fingers and is driving me nutsoid! I need to cover it up with black gaffer’s tape or something.
My keyboard is a Rantopad MT Aegis. Never heard of it? Well, me neither. I cut a deal with a small Chinese knick-knack importer near Baltimore to buy the last of his inventory for half price. It’s a mechanical keyboard that appears to use a Cherry MX Blue clone of Rantopad’s own design. It has several LED backlighting options, laser-etched keycaps and a braided USB cable. One of the more interesting features is a swappable acrylic face plate. It comes with a fluorescent yellow-green one that makes it look like a firefly with the backlight on. The keycaps glow the same yellowish color but the backlight underneath is violet-blue. It’s a nice effect and I have to admit, even though I never look at my keys, I still turn it on at night just so I can see it peripherally. The backspace key requires a heavier push. I don’t know if it’s by design, but it’s simultaneously infuriating and really nice, as it prevents accidental deletion of multiple characters. Overall, it’s a pretty awesome keyboard and I don’t regret buying it.
I’m a thumb trackball user and Logitech has a monopoly on them within the U.S. Their M570 model is designed for office use and it really can’t stand up to the rigors a gamer will put it through. The left button’s micro-switch is horribly designed and manufactured; longevity is anywhere from 8 months to a year for most gamers. To Logitech’s credit, they do replace the trackball for free, but it’s a PITA to always need a new one. What really ticks me off is that the issue has been well documented by numerous end users for the past 6 years, yet the button hasn’t been re-engineered. It’s hard to give my loyalty to a company that just doesn’t give a crap about its own products.
Which leads me to the Elecom M-XT3URBK trackball, which I bought out of frustration with the Logitech. I really don’t know if it’s going to be any better for gaming, but after 5 replacements, I’m willing to take a risk. I had to spend an extra $10 on a proxy shipping service because retail stores in Japan refused to ship it to me! I tried all the big shops that were (sort of) English friendly. Anyway, I’m quite happy with the trackball. It’s the third revision and, from what I’ve read in forums, it fixes a lot of problems from the previous two models. I replaced the ball with a really old Logitech red one I found in storage. I don’t know if I’m imagining it, but it seems to be a lot smoother than the black ball that came with it. The chassis under the palm is matte, but the left side that holds the ball is glossy plastic! Somebody in charge of material choice turned their brain off, because it should be obvious that this section will attract dust that gets funneled under the ball. It’s nothing anti-static fluid can’t fix, but I shouldn’t have to buy it in the first place. Overall, it’s worth it – especially if you kept an old Logitech ball and can swap it in like I did.
Buying all this and overclocking would be for naught if it didn’t actually make a real-world difference, right? So, I tested framerates with Unigine Valley using DirectX 11 and OpenGL at 1080p with all graphical options maximized. I also tested with an Unreal 4 engine demo, but deleted the statistics like a dope and can’t recover the file. The averages were calculated from at least 275 framerate samples logged by Guru3D/MSI Afterburner while monitoring a single cinematic demo loop. The still title screen and credits were eliminated, as they register as a constant 60fps and skew the results. Only the actual rendered animation was included in my calculations.
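That trimming step is simple enough to sketch in a few lines of Python. The sample list and the 60fps cap below are my own illustration, since Afterburner just logs raw framerate values:

```python
def trim_static_screens(samples, cap=60.0):
    """Drop leading/trailing runs pinned at the framerate cap
    (the still title screen and the credits) before averaging."""
    start = 0
    while start < len(samples) and samples[start] >= cap:
        start += 1
    end = len(samples)
    while end > start and samples[end - 1] >= cap:
        end -= 1
    return samples[start:end]

def summarize(samples):
    """Min / max / average of the remaining framerate samples."""
    return min(samples), max(samples), round(sum(samples) / len(samples), 2)

# Hypothetical log excerpt: capped title screen, gameplay, capped credits
log = [60.0, 60.0, 29.8, 41.2, 55.0, 33.4, 60.0, 60.0]
print(summarize(trim_static_screens(log)))  # -> (29.8, 55.0, 39.85)
```

Without the trim, the two pinned runs would drag the average up and hide how the demo actually performed.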
To recap: I pushed the i5-6600K turbo boost from 3.9GHz to 4.3GHz, the GTX 960 boost from 1342MHz (already OC’d from 1178MHz) to 1503MHz and the DDR4 SDRAM from 2133MHz to 3000MHz (factory OC via XMP). This is a thoroughly mediocre, quick-and-dirty overclock! Remember, I didn’t touch the voltage. For this test, I couldn’t underclock the GPU to NVIDIA’s stock 1178MHz boost; the lowest I could go was 1216MHz. I also capped the framerate at 60 with Guru3D RivaTuner Statistics Server, since the AOC monitor is 60Hz. Why waste system resources on something you will never see, right?
Unigine Valley DX11, stock speed: Min 21 Max 55 Avg 30.75
Unigine Valley DX11, overclocked: Min 22.6 Max 60 Avg 33.48
Unigine Valley OGL, stock speed: Min 19.1 Max 49.8 Avg 28.54
Unigine Valley OGL, overclocked: Min 21.4 Max 55 Avg 31.62
That’s a 2.73fps (8.9%) average increase in DX11 and a 3.08fps (10.8%) average increase in OpenGL. This is in line with my expectations of a modest overclock at the highest graphical settings. With a custom setting that lowers resource-gobbling options like shadows, the framerate increase should be much greater.
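If you want to double-check the percentages against the averages above, the arithmetic is a one-liner:

```python
def pct_gain(stock_avg, oc_avg):
    """Percent increase in average framerate from stock to overclocked."""
    return round(100 * (oc_avg - stock_avg) / stock_avg, 1)

print(pct_gain(30.75, 33.48))  # DX11:   8.9
print(pct_gain(28.54, 31.62))  # OpenGL: 10.8
```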
I guess that’s it. Honestly, I didn’t think I had so much to say! If you have questions, feel free to ask. If I’ve written something in error, let me know, so I can correct myself. Thanks for looking!