Yes, but remember, little software takes advantage of more than four cores.
Meaning that the technical advantage of having more cores doesn't translate into extra performance in practice, for most software.
It's having power you don't use, and can't use, until the software you use starts taking advantage of it.
When it comes to performance in software that only uses 4 cores or fewer, fewer but more powerful cores are best.
OK, you can run some software better on an FX 8350, but not all software. Software that is really processor intensive does better on an Intel 3570K or 3770K, unless that specific software is thoroughly optimized to take advantage of the extra cores of the FX series CPUs.
Aahhh... well, in that case, I don't see any reason why one shouldn't overclock his/her GPU to within safe limits.
However... I'd say safe temps are around 85 degrees (Celsius) under full load (tip: run Crysis 3 on MAX settings @ 1080p for most setups), for around 15 minutes.
Well, GPUs come with a BIOS. That's firmware stored on the GPU, similar to a motherboard BIOS. It gives the card basic information for how it should operate.
You can flash an HD 7950 to an HD 7970 GHz Edition BIOS, for example, or a GTX 670 BIOS to a GTX 680 BIOS, which will give your GPU higher performance by default: higher clocks, etc. ...
Aahhh... that makes sense.
What about AMD cards?
Really? In single-threaded applications, Intel runs twice as fast (if not more) as AMD CPUs, even their latest top-tier Vishera processors.
I think you need to research these things before you say them.
Until much more software takes advantage of 6 or 8 cores, we won't see AMD come out on top in benchmarks.
If you can't see that Intel is better in single-threaded scenarios, you might want to check benchmarks from a review site that isn't an AMD loyalist/fanboy outlet.
Now that's surprising. I'll look into that. After all, if they really do allow you to burn your GPU by overclocking, and give you a new one if you push it too far... well, then overclocking might be too good to pass up!
Hhmmm... interesting. But... if you burn your GPU by overclocking, is it still within the warranty, and will you get a replacement one?
Yes, but as I've said, that depends on the software being optimized for heavy multi-threading, across 6 or more cores.
You don't get that from most software currently, and really, almost no game has that much multi-threaded optimization.
You can get better productivity on an AMD CPU in some software, but not most games, and not everything.
If you'll really benefit from the FX series CPUs in those very specific programs where they're better, and you don't need any of the Intel-exclusive motherboard or CPU features on Ivy Bridge or Z77, go for it.
But if you plan on having a more balanced CPU for diverse tasks (including gaming, but not limited to gaming), or if you need Z77 chipset or Intel CPU exclusive features, go with an Ivy Bridge solution.
Hhmmm... I'll check the GTX 670 by GIGABYTE. Is it the 3-slot version, or the 2-slot version you're talking about?
The 2-slot, OC, WindForce 3 version of the GTX 670 by GIGABYTE seems right on the money. It looks very good; I'm just not sure yet whether it beats the MSI and EVGA alternatives on value.
Do you have any comparisons which could help me make that determination?
Well, the best air coolers will indeed be massive.
You should check your case to see the maximum size of compatible air coolers. It'll probably say something like "compatible with coolers up to 155mm" or something.
Really? I thought there was a certain limit beyond which you would void the warranty, when overclocking your GPU. Like, pushing it past 1.2GHz on air with no modifications, causing your GPU to burn up.
Yeah. But, then again, if you have a GTX 670, just save its original BIOS, and flash in the new GTX 680 BIOS. Especially if you have a non-reference cooler, and your board is overclocked by default to begin with.
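For reference, the usual sequence with nVidia's nvflash utility looks roughly like this. This is a sketch from memory of flashing guides of that era: the exact switches vary between nvflash versions, and the "gtx680.rom" file name is just a placeholder, so double-check everything against your version's own help output before running anything.

```shell
# Back up the card's current BIOS first, so you can always flash back.
nvflash --save gtx670-original.rom

# Flash the new BIOS. On nvflash versions of this era, -6 overrides the
# board-ID mismatch check, which a cross-model flash like 670 -> 680
# will trip. "gtx680.rom" is a placeholder file name.
nvflash -6 gtx680.rom
```

If anything goes wrong, flashing gtx670-original.rom back restores the card, which is exactly why saving the original BIOS comes first.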
Well... now you know. xD
Actually, although the motherboard and CPU are important, for graphics you want to invest in a good graphics card.
I'd say, go with an HD 7950 by GIGABYTE, for around $300. It's the biggest bang for your buck if you plan on going for a multi-monitor setup.
For a new CPU, maybe you should go with an i5 3570 (Non-K, if you don't plan on overclocking, because most games don't benefit from overclocking anyways), DDR3 1600 RAM @ 1.5V @ CAS 9.
Motherboard... try the ASRock Z77 Extreme4, or the ASUS P8Z77-V LK.
Well, dollar for dollar, it's not worth it.
However, with nVidia's latest beta drivers, you can see them starting to make their drivers better for gaming, reflecting a new focus. They're now concentrating on their dedicated graphics business, and drivers are a big part of that.
They're also focusing on GeForce Experience (as they should have been doing for a long time now), too.
Well... uuhhh, a higher resolution is much better for gaming, professional work, etc.
Try getting a 1080p monitor. If gaming is important, look for anything with 2ms GTG response time, or a TN Panel monitor with 5ms (non-GTG) response time.
Although, if they can make it come with a really powerful battery, have a really good response time (so it's as quick to react as the human body), and so forth, it'll be awesomesauce!
Those are nice, but they're still 5ms GTG, not 5ms native.
They're nice for an IPS multi-monitor setup, but not for a gaming multi-monitor setup. Thanks for the links. n.n
Well, seems the price of the FX 8150 is a bit high.
But yeah, if you can get the FX 8320 for under $155, it's absolutely worth it.
Actually, what happens if a person wearing the suit falls down stairs? Or if the suit freezes up and the person is stranded when it runs out of batteries?
Well... get the Noctua NH-D14 2011 model. It's the top of the line Noctua model.
You could also go with the Phanteks PH-TC14PE, which comes in several different colors.
There's also the Thermalright Silver Arrow Extreme you can get.
Choose which review you think is more accurate, and go with that CPU cooler. Although, all three of those seem rather effective.
Yeah. All they'd have to do is make their own PhysX-style library that uses OpenCL, make it free, and make it easy for developers to incorporate into their game engines (like CryEngine, Unreal Engine, etc.).
I don't remember. Not a Terminator fan.
Regarding this computer... it's OVER NINE THOUSAND!!!
Seems pretty good. Don't know much about the case, but, meh, a case is a very personal choice.
Right now, AMD has the better strategy, and they're poised to win big time.
I mean, they have marketing, bundles, and a great focus. Now if only AMD would start hiring more hardware engineers, and more software engineers too.
If AMD can keep working on their drivers, and the AMD engineers can put more effort towards their GPU line, it'll be awesomesauce of epicness!
OK... well, it goes a lot deeper than that.
There are differences, like in performance with AA (anti-aliasing), AF (anisotropic filtering), tessellation, etc.
nVidia does have CUDA, but most games don't use it. nVidia has PhysX, but if you check the nVidia page, you'll see only a very short list of games actually makes use of it.
nVidia GPUs normally have more stable framerates, meaning their maximum and minimum framerates stay closer together: fewer high peaks, and fewer moments of lag.
AMD, however, is the opposite. They normally run at much higher framerates, with higher peaks, but also longer periods of lag.
AMD has "the holy trinity" of console GPUs (and CPUs), and right now it has better price/performance than nVidia.
Really, what nVidia needs to do is start improving their drivers for gaming, like crazy. They need to make gamers feel like they're the better option again. They need to stop putting their developers and engineers to work on that Tegra cr*p and put them to work on the GTX 700 series instead: look at how to make better reference coolers (maybe take lessons from MSI, GIGABYTE and ASUS?), and get GeForce Experience to be awesome, with full driver integration by default (adjusting in-game settings automatically, without requiring any user intervention).
nVidia has their strategy all wrong. They're trying to get into the mobile arena, when it's simply a lot more interesting for them to focus on their core market, the video card market, and on developing awesome software to support their products.
Also, the nVidia bundle was just pathetic; there are no other words to describe it. It doesn't even pale in comparison; it can only aspire to dream of coming close to being as awesome a bundle as the AMD Never Settle Reloaded bundle.
nVidia is going to drop big time, if this continues. They need to rethink their ways, and stop trying to make Tegra work; if they gain the mobile market, but lose their GPU market (for consumers, at least), how will that play out for them?
I think if nVidia doesn't start focusing on their core market again, nVidia to AMD in the GPU market will be like AMD to Intel in the CPU market.
Well... if you want a video card that can handle 4 monitor outputs, and it's not going to be running any graphics-intensive tasks (like 3D gaming), then I'd recommend this: http://www.superbiiz.com/detail.php?name=AT-7870XT2
The HD 7870 XT has more than enough power for you, although you could get away with an HD 7770 if you aren't doing anything too demanding.
Remember, though, you'll have to use monitors which are compatible with the four outputs. Most monitors won't offer HDMI, DisplayPort AND DVI connectivity; the ones that do will be expensive.
I'd recommend you get converter cables, and use 4x ASUS VN247H-P, because they're cheap monitors, offer great multi-monitor support (due to being frameless), and they're not heavy monitors.
You could get a 4-monitor stand, if you wanted to. Alternatively, you could go with a 3-monitor setup, like here: http://pcpartpicker.com/forums/topic/1541-triple-monitor-gaming-imax-style-youtube-video-and-setup-tips
Buy the HD 7870 XT. It has more Stream Processors, and it'll use up one less PCIe slot.
It's in stock over there. And it's the better model of the HD 7870 XT.
Well... the answer is that I'm pretty confident the HD 7770 wouldn't be bottlenecked by your Athlon X2.
However... you have to take into consideration the PCIe specification of the motherboard you're using. If it's not PCIe 2.0, you might as well just stick with the HD 7770.
Did you just pick the most expensive version of every component, just for the heck of it?
You don't need two soundcards.
And you wouldn't be able to get much power out of the PCIe SSDs, because so many of the lanes would already be used for the GTX TITAN cards.
You'd also be better off with the Swiftech H220 AIO cooler, because of its extra oomph in performance!
And why not get the Cooler Master Cosmos II at that price?
Also, what possible advantage could you have in getting that optical drive that you wouldn't get elsewhere?
Also, with that much GPU power, why waste it on a 1080p @ 60Hz monitor? It's senseless!
You'd be much better off with 3x 1080p, or even 3x 1440p ...
Just... meh, it sounds like it's $38,000 more than what you could get if you chose better components, more wisely.
Well, we have several different types of BlackWidow keyboards. The Razer BlackWidow Ultimate is LED backlit, and uses Cherry MX Blue switches. The Razer BlackWidow Non-Ultimate uses the Cherry MX Blue switches, but isn't LED backlit.
The Razer BlackWidow Ultimate Stealth is much quieter than its non-Stealth counterpart, featuring Cherry MX Brown switches (which don't make as much noise), and uses green LED backlighting. The Tournament Edition is a budget version, and I don't know much about it, honestly.
I'd go for the Stealth Edition, because I'm not too keen on the loud noise of the Cherry MX Blue keys. Cherry MX Brown keys are much quieter because they drop the audible click but keep the tactile feedback; that is, you feel it with your fingers when the switch actuates.
Cherry MX Red and Cherry MX Black don't have tactile feedback. They're smooth all the way down when you press them, and they're the quietest of the Cherry MX switches.
If they make a 20-foot-tall mecha version of the suit, that'd be awesome. I would never drive again; I'd just run, while the men and women around me look on in fear and awe, and drivers flee from my general direction as fast as humanly possible.
If they add rocket boosters, jet accelerators, wheels...
Hehehe, well, one can dream. Although if we replaced cars with mecha suits, that would certainly give America the excuse it was looking for to start exercising. I mean, if you had a mecha suit, wouldn't you work out all the time with it?
(this topic is lol-tastic!)
Well, how about the Razer BlackWidow Ultimate?
Seems like a great case. Send me some reviews of it. n.n
Well, the purpose is so people in this forum, and builders in general, can know what to buy given a certain budget.
Like, "what's the best $100 case to buy?" ...
Well... meh, you're right.
For a single monitor @ 1080p, you're absolutely right, a single HD 7950 should be enough.
As for why I chose the Cooler Master Seidon 240M, it's because of acoustic performance, mostly. Remember, water cooling definitely makes a big splash, because you can run one fan at minimum speeds, and still keep your CPU frosty chilled.
The CrossFire setup would be for running games @ 144fps @ 1080p (what the monitor is for). 2x HD 7950 could handle that for most games at ULTRA settings, but 3x HD 7950 would be for huge multi-monitor setups.
Hhmmm... well, actually, I hate to admit it, but you're right.
The CrossFire advantage of 3x 1536 Stream Processors isn't going to match up as nicely as 2x 2048 Stream Processors, especially considering the latter run at a higher clock speed.
However... considering that 3x 1792 Stream Processors in CrossFire @ 900MHz can probably outperform that, I think it might be worth it.
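To make the shader arithmetic concrete, here's a tiny illustrative sketch comparing theoretical single-precision throughput, using 2 FLOPs per Stream Processor per clock and assumed reference clocks (roughly 975MHz for the HD 7870 XT and 1050MHz for the HD 7970 GHz Edition; the 900MHz figure is from the post). Note that the raw totals favor the 3-card setups; the catch is that 3-way CrossFire scales much worse than 2-way in practice, so treat these as theoretical ceilings, not real-world numbers.

```python
# Crude throughput proxy: cards * stream processors * clock * 2 FLOPs/cycle.
# Real CrossFire scaling is far from linear, so these are upper bounds only.

def proxy_gflops(cards, stream_processors, clock_mhz):
    """Theoretical single-precision GFLOPS across all cards."""
    return cards * stream_processors * clock_mhz * 2 / 1000

setups = {
    "3x 1536 SPs @ 975MHz (HD 7870 XT)": proxy_gflops(3, 1536, 975),
    "2x 2048 SPs @ 1050MHz (HD 7970 GHz)": proxy_gflops(2, 2048, 1050),
    "3x 1792 SPs @ 900MHz (HD 7950)": proxy_gflops(3, 1792, 900),
}

# Print from highest to lowest theoretical throughput.
for name, gflops in sorted(setups.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {gflops:.0f} GFLOPS (theoretical)")
```

On paper this puts 3x 1792 on top, then 3x 1536, then 2x 2048; the real-world ordering depends entirely on how well a given game scales past two GPUs.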
Furthermore, I wouldn't run an overclocked CPU on an LGA 2011 system. Also, the reason for going with an i7 3820 is so the motherboard can accept an Ivy Bridge-E CPU upgrade later.
Remember, this is a theoretical build for $2,500, meant for upgradability and maximum framerate.
Budget solutions could be made at around $1,600, but that wouldn't allow for a third (or fourth) GPU, future CPU upgrades, etc.
Also, why overclock, considering that gaming doesn't see any meaningful benefit from it? Overclocking is an unnecessary luxury for those who want synthetic benchmarks and numbers to brag about. It isn't practical in real-world scenarios.
So why pay more for an option you aren't going to use, and that isn't going to give you anything back for your money?
Also, if you're paying over $2,000 for your system, you'd more than likely want to be able to perform meaningful CPU and GPU upgrades in the future, so your system can still be considered top-tier for the next one or two years.
Well, thanks man.
Hope it serves ya nicely! n.n
OK. Topic coming up soon.
It still performs like a beast, and has great acoustics.
If you want air filters like no other, try the Corsair 550D. SO many fan filters, it'll drive you half-mad! (But then, that half is already a technology enthusiast, which means we're half loony; the other half is a hardcore gamer... OK, we're all just full-blown insane.)