I'll third that, but I think a similar level of "security" could be achieved with a simple burner email and an old laptop with linux (and no important data) on coffee shop wifi.
(In the context of clicking the "1e3ka" link.)
How much of an overclock can you get on the 4790k?
Assuming you can get 4.5 GHz on all cores, that's 25% faster than the i3-8100. That more than makes up for the IPC improvement since Haswell (I think ~20% in the most extreme cases, but usually less).
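If anyone wants to check that math, here's the idea as a quick sketch (this assumes a 4.5 GHz all-core overclock and the i3-8100's fixed 3.6 GHz stock clock, and deliberately ignores IPC):

```python
# Clock-speed comparison only; ignores IPC differences.
# Assumes a 4.5 GHz all-core 4790K overclock and the
# i3-8100's stock 3.6 GHz (the i3-8100 has no turbo).
oc_4790k_ghz = 4.5
i3_8100_ghz = 3.6

advantage_pct = (oc_4790k_ghz / i3_8100_ghz - 1) * 100
print(f"{advantage_pct:.0f}% higher clock")  # prints "25% higher clock"
```

So the overclocked 4790K roughly cancels out the newer chip's IPC advantage on clocks alone.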
If you were hoping to game while also doing editing, then I guess it's probably the most cost-effective way, and it would be near the i7-4790K's gaming performance.
Found a lightly used HP ProDesk 600 G1 microtower on Craigslist. i5-4570, 8 GB RAM, 120 GB SSD, with a 320 W PSU, and the PCIe x16 slot supplies the full 75 W. It's a bit bigger than I was hoping for, but I was doubtful anything better would come up. (The RX 460 worked, in case anyone has the same question in the future.)
Thanks for the reply.
Yeah I got the RX 460 at a pretty decent price ~$92 (open box). I guess maybe I should try for a GT 1030? That should at least sit around 35W peak power draw, but it's a significantly weaker option.
She just has a 720p TV and we're not exactly the competitive gaming types; we're more in the get drunk on a Thursday night and goof off while gaming crowd. I think with Vulkan, the RX 460 can stay above 60 fps in Doom especially at 720p.
I'll keep my eye out on craigslist, FB marketplace, etc. If nothing comes along by the end of the month, I'll take my chances with a Dell or something.
Eh, most people are happier just drinking the Kool-Aid anyway.
The best way to compare CPUs is with synthetic and real-world benchmarks in the situations where you expect to use your computer. Do core count, clock speed, or architecture really matter if your computer meets your performance expectations?
what kind of FPS are you getting?
Yes, I guess I didn't word my comment the best. I'm anticipating that optimizations in Windows or game engines that prevent unnecessary crossing between the two modules could improve game performance. Hopefully it isn't too difficult to implement.
It's interesting they didn't mention the latency involved with threads crossing between CCXes. I assume that eventually Windows will implement a completely new thread-scheduling designation to keep interdependent threads on the same CCX as much as possible.
I'm anticipating that this change alone could vastly improve some moderately multi-threaded tasks (such as gaming in DX12), especially since threads within a CCX can actually migrate between cores with lower latency than on Intel's current architecture.
I know reference coolers aren't very popular, but there's a similar deal today with the same price after rebate. Also still with Doom.
MSI Radeon RX 480 4GB reference
If I decide to spend my money on a PC upgrade, I'll go with Ryzen. (I know I'm a scrub, but Jesus, I still don't notice any lag when gaming on my FX 8350)
Personally I thought AMD's marketing was pretty realistic here and the 1800X actually benchmarks better than I thought it would. Anyone who thought AMD would wipe the floor with Intel in every category (price, TDP, IPC, clockspeed, and feature set) was a bit delusional.
Honestly, this is right where the consumer wants AMD to be: finally nipping at Intel's heels. If AMD had octa-core CPUs that ran at 4.5 GHz with Kaby Lake IPC, they'd be charging $999 too...
This is how almost all high-end semiconductor products (CPUs and GPUs especially) get released. The full-die product ships first because only full-size dies are produced. Almost every lower-cost, less powerful processor is a cut-down from a faulty full-size chip (or, once demand stabilizes, chips are selectively locked/neutered to meet price points). Good recent examples are the RX 480 --> RX 470 and the GTX 1060 6GB --> GTX 1060 3GB.
Right now AMD probably doesn't even have enough faulty full size chips to warrant a 6 or 4 core CPU release. Why sell your 8-core locked as a 4-core when you would make more money on the 8-core anyway? Faulty chips will pile up over the next month or so and then you'll have your less expensive CPU.
I agree with the HyperX Cloud II suggestion.
I purchased the original HyperX Cloud just over 2 years ago and use it quite often (at least 15 hours/week). I've never used a headset at this price or cheaper that I liked more (admittedly it was only 2 or 3 others). By all accounts the Cloud II is as good or better so I certainly recommend it.
Was your GPU overclocked? Do you know what the temps were during the test?
Pure carbon doesn't have the band gap required for CPUs (as they are currently designed) like silicon does. Carbon alloyed/composited with something else might have potential.
I think GaAs probably has the best potential currently and it is theoretically possible to hit 200+ GHz on GaAs semiconductors. Just need more research on how to generate nearly perfect crystal alloys.
Employees are very hit or miss there. I usually just get everything I want then let some random person put their "commission sticker" on it.
You can check out his comment history. I think it's self-explanatory.
Ooooo! A safe place for AMD fanboys! No Intel owners allowed! /s
I bought my FX 8350 around the end of when they were (arguably) competitive. I purchased it the week that the Haswell Refresh was announced. I very nearly picked up a i7-4820k. At the time of purchase, I had been using mainstream laptops exclusively as my computers. I knew there was going to be a huge performance increase no matter the choice made. I didn't know yet what I would end up doing - lots of video editing? Just gaming? Other computational stuff? I decided the FX platform probably offered more flexibility than the slightly more expensive i5-4670k, while saving money compared to the i7 4770k/4820k (very similar prices at Microcenter). I picked up my FX 8350 for $150 at Microcenter.
I swear on my life, I have never actively thought, "I wish my computer was faster" or "I'm getting unsatisfactory FPS in _ game." Granted, I'm no hardcore gamer. Right now I'm mostly playing League of Legends and Minecraft. However, I've played the Metro games, Max Payne 3, Sleeping Dogs, Battlefield 3/4/1, etc., so there are some more demanding titles in there. I probably couldn't tell the difference between 45 and 60 fps. My sister and I have used it for basic video editing. It usually took a while, but the i5-4670K almost certainly would have been worse in that regard. I've dabbled in some computational software (molecular modeling). Very reasonable performance - similar to the supercomputer nodes I have access to at work [dual quad-core 2.8 GHz Intel Xeon X5560 "Nehalem EP" processors sharing 24 GiB of system memory]. Although I can access more or less as many nodes as I want simultaneously, so that vastly outpaces the FX 8350 (or even an i7-6950X) in raw compute power.
I used to recommend the FX 8350/8320E before Broadwell dropped. In specific situations, like video editing/encoding on a budget, it was still competitive (it never was the best for gaming). I wouldn't anymore. It's still a great CPU and will probably remain viable for a long time (just like the i7-2600K is still a great CPU), but with options like the i5-6400, i5-6600K, i7-5820K (and now Kaby Lake), it's very difficult to justify much older hardware.
I don't regret my decision, but I wouldn't make the same decision with the options I have today.
Both. I run a G3258 (stock speeds) in my linux test bench.
My personal experience (regarding the G3258 with Minecraft) is mostly limited to Ubuntu LTS with the default Intel Linux drivers. I don't know the standard 1.11 Minecraft settings, but after the initial terrain generation, I typically saw 35-45 fps. This would have been on Java 7.
When I saw this topic, I did a quick check to see if Windows performance was similar, and it seems to be the case. I did not find a specific benchmark with 1.11 on the G3258, but I think it's a fairly safe assumption that if the linux performance has been pretty steady across updates, Windows is probably similar.
If you like high definition texture packs, a graphics card will improve performance.
If you like running around in creative and starting forest fires and blowing up TNT, a better CPU.
Lowering the render distance will also improve performance.
Honestly, I think you'll find the performance acceptable.
Theoretically and actually!
The iGPU on the G3258 should keep frame rates above 25 fps @ 720p most of the time. This is just personal opinion, but I consider that very reasonable for a game like Minecraft.
1080p or higher resolutions will result in lower fps; what you consider "playable" at that point is up to you.
Call you names and wish cancer on you while potentially sabotaging your team's ability to win.
If you stay out of ranked play, chances are significantly better that you won't run into as many angry (over-competitive, quick to blame others, raging) players.
It's a fun game if you like MOBAs. Especially if you find a champion you really like early on. Since it's free, I'd recommend at least trying it. I'll warn you that the first 20 hours of play probably won't be that fun (takes some time to get the hang of last-hitting, etc.)
I pretty much only play League with friends now. I used to play about 20 hours/week. Now I play maybe 15 hours/month.
I used to do this more. Grinding MMORPGs (surely when I get that next legendary weapon, the game will be more fun!) or trying to 100% a couple games (surely the satisfaction from fully finishing this will be worth it!).
Back when Max Payne 3 was released, I bought 1 and 2 to play through so I would have the experience of the first hand back story and not the little dribbles that the game developers wanted to share. It's probably the fondness I have for older games, but I think I actually enjoyed playing through Max Payne 1 more than 3 (I definitely still liked 3 though).
About 2 years ago I decided life is too short to waste your free time on things that aren't either productive or fun. If I'm not actively enjoying the gameplay experience, I stop playing. I'll come back and give it another chance after a while, but I'll stop again if it isn't fun. Even if I paid decent money for something, there's no sense in wasting both time and money.
This seems pretty normal. AMD's "ZeroCore" is a power-saving measure that downclocks or disables compute units when there is little or no demand (idle, web browsing, display off).
When under load, it should be around 990 MHz (the base core clock), and 60-ish (or 70-ish) degrees Celsius is perfectly fine. If you can consistently stay within +/- 10% of the benchmarks listed at various 3rd-party review sites (some examples: PCWorld, JayzTwoCents, Guru3D, or TechPorn), then everything is fine and your card is working as it should.
For reference I have a Tri-X R9 290 that usually downclocks to 300 MHz core when I'm not gaming and it works as intended.
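For what it's worth, the "+/- 10% of review benchmarks" sanity check I'm describing is nothing fancy; here's the idea as a sketch (the fps numbers are made up):

```python
def within_tolerance(measured, reference, pct=10.0):
    """True if `measured` is within +/- pct% of `reference`,
    e.g. your benchmark result vs. a review site's number."""
    return abs(measured - reference) <= reference * pct / 100.0

# Hypothetical example: a review site shows 60 fps in some test
print(within_tolerance(55, 60))  # True  (within 6 fps of 60)
print(within_tolerance(50, 60))  # False (off by more than 10%)
```

If you're consistently outside that band in multiple tests, that's when I'd start looking at temps, drivers, or background load.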
Not to get too specific, but it has terrible weighting across different categories. It compares base clock speed, turbo clock speed, and compute units across completely different GPU architectures (apples to oranges, basically). It also uses power consumption and noise ratings from OEM versions (usually a minority of the cards purchased) and has a pretty heavy emphasis on synthetic benchmarks (not indicative of normal use).
It also tends to be biased towards Nvidia graphics cards, because they have higher clock speeds and tend to do better in certain synthetic benchmarks than AMD cards, even when gaming performance is roughly equal. The comparison methodology is also pretty heavily biased towards newer video cards.
You didn't give much background information. Is this under load? At idle? What are your 100% load and idle temperatures?
How badly is the cooler damaged? It's an ATI PCB, correct? If so, you could pick up this Sapphire X800 GTO off eBay and then just swap coolers. Aside from a different SKU on the barcode sticker on the back, I think they are identical except for the cooler.
If the exact brand is the whole sentimental part, sorry for implying this is a suitable replacement - I understand it's probably a difficult situation. It should look the part though.
Hey! This is pretty sweet! How long will you be keeping the group buy open? I want to ask a couple of my friends if they have some interest.
This is about as overkill as you can reasonably get. If you need to save a couple thousand, get just one of the ASUS ROG PG27AQ monitors. Then you could use the leftover money to build a killer custom watercooling loop and overclock your 6900K.
Honestly, a computer that cost about 1/8 this much would probably still bring 95% of the satisfaction. Then you could spend the leftover 8 grand on a used RX-8 and spend some time getting out and cruising or on a track. Be a lot more fun in my opinion.
This build is obviously just a guideline, modify as you need. Hard to recommend specific hardware when I don't know your specific uses.
PCPartPicker part list / Price breakdown by merchant
Considering it comes with Windows 10 Pro already installed, you could probably get $200, maybe $250. Otherwise, the GPU is probably worth $45 or $55 (1 GB vs. 2 GB model, respectively). The motherboard is pretty old/inexpensive and only runs SATA II. The CPU and motherboard together are maybe worth $90. The case, power supply, and RAM round out the price.
You could probably sell it reasonably fast at $150, but if you don't mind waiting a while, getting $200+ is possible.
Most of the hardware is a couple generations old and it wasn't top of the line at that time either.
Just my thoughts. Good luck.
I currently use a Deathadder 2013. I also own a SteelSeries Rival and have used a Logitech G403, as well as a handful of other cheaper mice.
I prefer the Deathadder over those two. The Deathadder has slightly crisper L/R clicks, and the Rival has more pronounced bumps when spinning the scroll wheel (tactile bumps, not the physical grooves in the rubber). I could go into more specific comparisons if you want.
I think at this point (although there might be better or better-value mice), I've become so used to the Deathadder 2013 (which I've been using since 2013) that I'll probably just stick with it and buy a new one when it fails.
Have you tried seeing if a Linux OS will identify your Deathadder? If that doesn't work, I think it might be dead.
The R9 390X outperforms the RX 480 so it's not an upgrade (in terms of raw compute power) at all. If you are looking for a GPU with lower power draw or smaller size, then you might have an argument for the RX 480. Price to performance, the RX 480 is better, but since you already have the better card... it doesn't make much sense to change now.
It's unfortunate that you purchased your R9 390X right before the RX 480 launch, but you must not have been doing much regular research; the Polaris launch wasn't exactly quiet or unexpected.
If you can find hardware for good prices, then I think anything is up for consideration used/refurbished.
I'd say RAM, cases, and CPUs are pretty safe; not much can go wrong. GPUs are probably OK as well (just avoid cards from cryptocurrency mining machines). Everything else, your mileage may vary.
I've even bought used storage a few times (obviously not for anything critical) and it gets periodically backed up anyway. Either way, I've never had an unexpected drive (new or used) failure.
Ha! Yes you are right! I was running this test on Ubuntu and I forgot I hadn't updated to java 8 yet.
For my test I was generating a new world with the custom preset "Caves of Chaos", and as soon as I could, I would spawn in a bunch of TNT and set it off. During terrain generation (before the map even loads), one core was maxed out; then, while terrain was still generating but physics was active in the already-generated terrain, I could get two more cores maxed out. (Due to lots of darkness in this preset, entities spawn immediately, and a large amount of gravel, sand, water, and lava is instantly falling/flowing.)
I think this is a pretty demanding scenario.
When I get the time (probably next week), I'll have to try again with Java 8 and see how it goes.
The default settings look pretty good, so I think you made a mistake somewhere in your overclock settings. The CPU turbo voltages seem a bit high, but we'll overlook that for now.
I'm going to sleep now, so I don't have time to help on a case-by-case basis. I noticed you have a Gigabyte motherboard, but if by chance it uses similar settings to ASUS boards, you can follow this overclocking guide from overclock.net, which has really good general overclocking settings.
I can give more specific help in about 20 hours when I'm on holiday for new years. Unfortunately I only have a Sabertooth 990FX R2.0 w/ FX 8350 and a couple MSI mobos w/ A10s as reference so I can't reliably provide Gigabyte specific help.
You don't need to overclock right away, and if you are unsure of what you are doing I would recommend that you don't attempt anything because you might damage your hardware if you make a big mistake.
Dual-core CPUs will probably be around for quite a long time (BGA-type embedded applications will probably exist essentially forever; I doubt your microwave will ever need more than a single-core CPU).
In consumer applications, they will surely be slowly phased out (in regards to current silicon semiconductor technology), but this is very dependent on demand. Fact is, the majority of computer users (and probably a growing percentage) just need web browsing and enough power for office-type software. Dual cores are a great performance tier for these applications because they allow low-demand background tasks to operate unhindered by the small spurts of higher-demand applications. I imagine we're talking on the order of 20 years before dual-core silicon CPUs no longer have a meaningful place in the consumer PC market (laptops and SFF included).
On an unrelated note, MC certainly has received some optimization regarding parallel computing, but I can't get it to use more than ~300% of a core (37.5% total) on my FX 8350 (4.4 GHz). I think it's still a maximum of 1 thread for terrain generation and a max of 2 threads for "physics".
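For anyone confused by those percentages: monitoring tools like top/htop report CPU usage as a percentage of a single core, so on an 8-core chip the conversion is just:

```python
# top/htop-style per-core percentage -> whole-CPU percentage,
# for an 8-core chip like the FX 8350.
cores = 8
per_core_pct = 300            # "~300% CPU" as reported by top

total_pct = per_core_pct / cores
print(f"{total_pct}% of the whole CPU")  # prints "37.5% of the whole CPU"
```

In other words, ~300% there means roughly three of the eight cores doing work.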
Are you sure you set the multiplier correctly? I suggest you return everything to its default values (BIOS and otherwise) and then run a stress test. Show us a screenshot of AMD OverDrive during the stress test; that way we can see if you have a bigger hardware issue.
Yep, as long as you know what you want, it's a great store.
They actually sell most CPUs as a "loss leader" and lose money on them to increase foot traffic in stores so people will purchase high profit margin hardware as impulse buys.
Generally speaking they have excellent prices on CPUs, motherboards, certain hobbyist items, and OEM HDDs. Even after tax, these items will often be cheaper than from an online retailer offering free shipping.
PSUs, SSDs, RAM, cases, and refurbished TVs/monitors (and rarely a prebuilt PC) are often at least competitive in pricing.
Pretty much everything else (accessories, networking items, laptops, software, other storage, cables, GPUs, etc.) is typically not sold at a great price.
I'm a big fan of the Microcenter near Minneapolis, I go probably at least once a month.
Are we actually getting CPUs of some kind or is it literally going to be an empty $45 box?
It's free, so yes, it's pretty cheap (but almost every other Linux distro is also free). It's a pretty good place for someone new to Linux to start. There is a lot of information on the internet if you have problems in Ubuntu. It still has a lot to offer power users as well. (obviously a very simplistic description).
I currently use Ubuntu 14.04 for about 80% of all computer use (work desktop and my main desktop at home) - been working great for me. My laptop is windows.
I mean... for $45, I hardly mind that there's probably a relatively large chance that it's a scam.
EDIT - got one too. If this ends up being real, I'll be pissed I didn't buy the rest of the stock!
I would probably just recommend getting the NUC6i5syh and be done with it. Buy the SSD/HDD size you want and the amount of RAM you want with it.
There is currently a deal on Newegg where you get 8GB of RAM free with it.
Well, I suppose the nice thing is that you can use different skins/launchers to get the exact "feel" that you want. I imagine a good analogy would be: "I've been driving Ford cars my whole life, what are all other cars like?"
Simultaneously very similar, but different. You aren't forced into a certain look or app drawer. Apparently sound levels (across alarms/calls/media) have more customization on Android, and maybe notification types as well. File system management is a little different (I don't think most people would notice). Phone-wide commands are more consistent, I think. iPhones were kind of wonky with "single hand" mode changing how stuff interacted.
Hard to say more. I think just watch as many different reviews on the Axon 7 as you can find.
For reference, I've been a semi-power user of Android devices for 6 years. Never had an iPhone (or anything made by Apple)
Yes, I'm aware. If that Zen ES chip is only clocked at a similar speed, or has a similarly low TDP, then it would suggest AMD has competitive server CPUs. I agree, it's too early to really predict anything. We only have AMD's hidden blender test suggesting IPC is close to Broadwell.
Hopefully we won't have to wait long, maybe January 17th according to a few sites; although it's also rumored that Zen might have a TLB-type bug like the original Phenom that stunts performance by 30-40% to ensure stability. Tweaktown source - I suppose that could delay the launch? (Otherwise we'll just get another Bulldozer! hahaha)
There is this other TechPowerUp article (you've probably seen it) where a (supposedly) 8-core Zen ES is pretty solidly in the pack of 10-core Intel Xeons. There is an Ivy Bridge right above and below and a Broadwell CPU pretty close also. This might mean IPC is in a good place.
It doesn't say much for sure because we don't know the TDP, clockspeed, or even officially the core count. It at least suggests that AMD will have competitive server chips.
Just contributing to the rumor! I don't like to talk much about new hardware until 3rd party reviews are out though.
I've purchased many things in life used, open-box, and refurbished. In applicable examples: TVs, monitors, full computer (with dGPU), laptop, game console, motherboards, heatsinks, HDDs, mouse and keyboards. Everything has worked perfectly. My luck with electronics truly has been absurdly impeccable.
Got burned on a $1400 1996 F-150 though... I deserved that one haha.
I have a Sapphire R9 290 TriX. When it was brand new, max temps were 73 degrees C (default fan curve) in a Fractal Define R4. I assume the TIM has gotten a little old now because after 2.5 years, it regularly reaches 80 C in the same situations. I've loved the card; it overclocks pretty well and I've never had any issues.
The Node 202 has quite a bit more restricted airflow, but I think you can still make a Tri-X cooler work since the case has nice fan openings. Temperatures might just be slightly higher (and I'm assuming you are getting an older card, potentially with aged TIM as well), so maybe temperatures in the 85 C range, which is still within manufacturer specifications.
Good luck with the build.
Currently I would say I'm using 80% linux at home and 70% linux at work. (Work was previously about 90% windows, but 2 months ago I started working with new software so it jumped to mostly linux). Currently using Ubuntu (not that I particularly like it, but nearly all of my work is CLI anyway... so I don't deal with the desktop environment much).
I think there are plenty of examples of the advantages/disadvantages of linux in this thread, so I won't really rehash them. It mostly just comes down to software. Unless you work in a scientific or technical field, Windows will always have better (or the only) software support.
In my experience (I have very little experience with OS X), stable linux releases have very minimal issues - if something is expected to work, it works. Windows rarely goes more than a week or two in my hands without making me very pissed off about some stupid driver issue, etc.
In my (quite possibly naive) eyes, you aren't a power user unless you have at least an intermediate knowledge of linux. I can understand software limitations and I consider these exceptions.
In direct response to the OP, I wouldn't exactly say Linux has improved its GUI tenfold, but it's certainly on par (or can be) with Windows 10 and OS X. There have been considerable advances lately in ease of installation and QoL improvements, for sure. I 100% agree that linux is a win/win.
I'm assuming it's used. Is this the single or double fan version?
Depends on what you mean by pair. You cannot pair a GTX 1060 with a GTX 750 Ti via SLI. You can still have both connected to the system and use the 750 Ti as a PhysX card, or use them for separate tasks.