In the two games I play the most, Mordhau and Insurgency: Sandstorm, my FPS dips into the high 20s during heavy moments when a lot is happening on screen, which hurts my gameplay drastically. It never dipped that badly with my old RAM.
So, anyways, I swapped out my two sticks of DDR4-2400 32GB RAM (G.SKILL Flare X for AMD, 2 x 16GB, PC4 19200, Model F4-2400C15D-32GFX) for two sticks of DDR4-3600 16GB RAM (G.SKILL Ripjaws V, 2 x 8GB, PC4 28800, Model F4-3600C19D-16GVRB). I'm running the new kit at 3400, because trying to run it at 3600 gives my computer an error message. Note that I'm running a Threadripper build, and I've heard those are more reliant on RAM speed.
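For what it's worth, the "reliant on RAM speed" part is mostly about memory bandwidth (and, on first-gen Threadripper, the Infinity Fabric clock that is tied to it). A rough back-of-the-envelope sketch of the theoretical peak bandwidth difference between the two kits, assuming two sticks populate two channels:

```python
# Illustrative arithmetic only, not a benchmark: theoretical peak DDR4
# bandwidth = transfers/s * 8 bytes per 64-bit channel * channel count.

def ddr4_bandwidth_gbs(mt_per_s: int, channels: int) -> float:
    """Theoretical peak bandwidth in GB/s for a DDR4 configuration."""
    bytes_per_transfer = 8  # 64-bit bus per channel
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

# Two sticks fill two of the 1950X's four channels (dual-channel).
print(ddr4_bandwidth_gbs(2400, 2))  # old kit at DDR4-2400 -> 38.4 GB/s
print(ddr4_bandwidth_gbs(3400, 2))  # new kit at DDR4-3400 -> 54.4 GB/s
```

So on paper the new kit should be faster, which matches the higher FPS in calm scenes; the dips under load have to come from something else.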
Everything seemed peachy at first, and it felt like the faster RAM genuinely improved my FPS. I tried ARK and it was giving me over 100 FPS with most of the settings maxed out; I'd never gotten that much in that game. Then I tried Insurgency and Mordhau, and both games were giving me over 100 FPS at first (or at least Mordhau was), but when things got chaotic, my FPS would dip so low it affected my performance.
Why would faster RAM make those games chug more during those heavier moments? It has half the capacity of my old RAM, but I don't see how that would matter, since I never use that much memory. Is it because my new sticks aren't on my MOBO's QVL while my old ones were? Or is it because I left the RAM voltage at its default when I went to change the speed? It feels like my computer is underperforming considering I have a 1080 Ti.
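On the voltage question: one thing worth checking is what speed and voltage the sticks are actually configured at. A sketch of how to read that on Windows 10 from an elevated PowerShell prompt (the property names are from the standard Win32_PhysicalMemory WMI class; values depend on what the board's firmware reports):

```shell
# ConfiguredClockSpeed is what the sticks are actually running at (MHz);
# ConfiguredVoltage is reported in millivolts. DDR4-3600 kits like the
# Ripjaws V are typically rated for 1.35 V (1350 mV), while the DDR4
# default is 1.2 V, so raising the speed without raising the voltage
# can cause instability.
Get-CimInstance Win32_PhysicalMemory |
    Select-Object BankLabel, ConfiguredClockSpeed, Speed, ConfiguredVoltage
```

If ConfiguredVoltage comes back at 1200 mV while the sticks are clocked above their JEDEC speed, that would fit the "left the voltage as is" theory.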
GPU - EVGA FTW3 Hybrid 1080 Ti
CPU - Threadripper 1950X
MOBO - ASRock X399 Taichi
Memory - G.SKILL Ripjaws V 16GB (2 x 8GB) DDR4-3600 (PC4 28800), Model F4-3600C19D-16GVRB
PSU - EVGA SuperNOVA 1600 T2 80+ Titanium
OS - Windows 10 Pro