Unfortunately, nope. Texas area.
You should also check out the TechQuickie YouTube channel. Linus from LinusTechTips explains a lot of different things there - some better than I do in this topic. (In fact I should include some of his videos in this post, if I ever get around to updating it once more.)
I just follow the LinusTechTips YouTube channel and TekSyndicate's YouTube channels (mainly the regular one, the enterprise one, and the hardware channel), and read the articles from PCPer.com and Videocardz.com ... the info is all there. (I don't work in the computer industry, though, unfortunately.)
Yeah. I'm leaning FX-8320 or FX-8310, personally. I don't need 125W TDP, I'm fine with 95W.
Faster cores < More cores. (For me.)
PNY isn't bad. But I want guaranteed compatibility, just in case I have to RMA for whatever reason. And I also want the guaranteed stability. Even if untested RAM will usually work, tested RAM always will. And for me, that's worth the extra few dollars. (Plus it avoids that green PCB from PNY.) If it weren't for the green PCB and compatibility, I'd pick PNY over ADATA any day of the week. I want stability, and this is an exception due to technicalities (and my own OCD).
Oh my GabeN! (Sorry, fellow gamers. I shouldn't take the name of GabeN in vain. =P )
That's well worth it. =) I should get it next chance I get. It might be a few months until I can. Hope that deal is still there when I save up. Thanks! (8 cores and 95W? I'm lovin' it!)
The thing is that I'm also looking at motherboard stability. I've heard really good things from Wendell at TekSyndicate about the stability of the ASUS 990FX Pro R2.0, and how he said that after he soldered back on a new replacement capacitor, it was one of the most stable platforms he'd ever encountered in his career.
As for the RAM, it's not on the compatibility list. (I'm not worried about PNY. They also sell Nvidia's own QUADRO cards, so that tells me all I need to know about their reliability. They aren't known for being a gamer- or consumer-facing company. They're a business-facing company, selling their brand as reliable. So I know all about PNY.)
I think I may have found a different RAM kit that's compatible, lowest-latency supported (officially) by that motherboard:
Here's my new parts list:
FX-8350 version: http://pcpartpicker.com/p/DwdXGX
FX-6300 version: http://pcpartpicker.com/p/bpCbf7
Now the big question for me right now is this: does 6 cores over 8 cores make a huge difference in gaming on the FX-series of processors? (And how much of a difference does it make on compilation time?)
I need to know if it's worth saving or not for the price difference.
Nah. I've got a GTX 970, and I'm running a 1440p monitor by X-Star (the DP2710LED, PLS "Matte" Pixel Perfect version). Basically, one of the best monitors in terms of ultra-low response time (effective; not in terms of advertised specs).
You can get a QNIX for 220$ (off-grade) here:
Or you can get the same monitor I have for 350$ (including shipping) here:
Mine can overclock to 104Hz no problems, although I keep it at 60Hz due to not having AdaptiveSync (and since I watch a lot of YouTube videos).
1440p is very easy and smooth on this monitor with my GTX 970. Keeping settings at high in most games with framerates in 60fps range (sometimes dipping under that) is fairly easy. And even ultra-modded Skyrim with ENB and the S.T.E.P. enhancements (and SMAA injection) is still running at max (and then some, with over 100 mods and a lot of 2K textures) at 60fps locked at all times (and I could probably overclock my monitor to 75Hz, and raise the frame limiter to 75fps in the .ini file).
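For reference, if I remember right, ENB's frame limiter lives in enblocal.ini under a [LIMITER] section; raising the cap to 75fps would look something like this (values are illustrative, check your own install):

```ini
; enblocal.ini (ENBoost) - the frame limiter I mentioned, raised to 75fps
[LIMITER]
WaitBusyRenderer=false
EnableFPSLimit=true
FPSLimit=75.0
```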
Minecraft, even modded, will be no problems at 1440p. League, Skyrim, and Battlefield should also not pose any problems.
144Hz is tempting, but don't fall for it. The beauty of an IPS panel is worth it. It makes Skyrim's colors "pop" in a way that I can't really explain in text alone. It's something you'll have to see for yourself in a side-by-side comparison of still images, videos and actual gameplay. Take my word for it (or don't - it's your call).
If you want to check response time in monitors so you can verify that the X-Star is truly worth it, check out this video:
Have at it. And good luck.
Just this: Do. Not. Buy. Until. Reviews.
Multiple inputs will likely add input latency. That means input lag between your commands (mouse, keyboard, game controller) and the image displayed on your monitor. That means the 5ms response time is rendered insignificant, and the 120Hz just means you'll have a really smooth image as your mouse moves through molasses.
Also, just because you have DisplayPort 1.2a (that monitor isn't guaranteed to have it, mind you) doesn't mean it'll be perfect. FreeSync as it currently stands has a 48Hz to 144Hz window, outside of which you encounter tearing or the problems associated with V-Sync. Until we have a frame limiting solution (like the kind you can patch into Skyrim using the .ini file) and a capable way of lowering that FreeSync window, it'll still be an issue. Because current-gen scalers lack the programming and sufficient onboard DRAM (enough to hold the full screen image in raw pixel format, read: bitmap), we won't see the FreeSync window go below 48Hz. Monitors have a certain "life expectancy" for a pixel on the screen before it fades out. A pixel doesn't have persistence like e-Ink displays, or like solid-state NAND flash (like in SSDs). It's more like DRAM, which needs to receive a refresh signal every so often or it will lose its information (in the pixel's case, it'll lose its color value or start to fade to darker shades, affecting image quality).
The panel can't keep a single frame forever without refreshing the image. So that image has to be stored somewhere if a new frame doesn't arrive within a certain amount of time (below 48Hz means more than 1/48th of a second, or around 20ms, between frames). Including DRAM in scalers has been part of VESA's AdaptiveSync spec since 2011 (I think - I may be wrong). But it hasn't been implemented by scaler manufacturers yet, even though the rest of AdaptiveSync has. (FreeSync is AMD's implementation of AdaptiveSync.)
The scaler is just the little chip that takes your DisplayPort or DVI cable input, and decodes that information before "writing" it onto the screen for you. It takes the "binary gibberish" and turns it into images, basically. It also handles the refresh rate of your monitor, and can allow overclocking to work beautifully (or not at all). It can also do other amazing things, like adding variable refresh rate to your monitor (enabling FreeSync/AdaptiveSync). But if it doesn't include DRAM and the programming to self-refresh the frame when it doesn't receive a new one within a given time, then it won't remember the last frame - it'll just have part of the current frame, so if it refreshes an incomplete frame you'll end up with screen tearing.
What you want (ideally) is for your monitor, when the frame rate drops below 48Hz, to turn its refresh rate up to 96Hz (or even 144Hz if possible) and self-refresh its current frame, so that the input latency is as low as possible when the next frame arrives. (Sudden, dramatic drops in framerate aren't that common anyway; in most graphs we see curves rather than drops from 120fps to 10fps within 10ms - unless your game was optimized** by Ubisoft.)
** Optimized = "may contain up to" some optimizations, meaning possibly none at all. If your game is from Ubisoft after 2011, it's not a game; it's an exercise in masochism and patience.
That has to do with how monitors work. Turning up the refresh rate when you don't receive a new frame, to lower overall input latency (or to simulate lower response time and give a smoother refresh rate experience), can work beautifully if done right. But that's up to the programmers of the scalers. DRAM will add cost to AdaptiveSync monitors with wider windows, but gamers must demand this for the cost in R&D and production to be worth it. Initially some people will receive some benefit from it. Not all. And there will be the infamous "early adopter tax", so don't expect to get off without selling a kidney or two for the "privilege" of beta-testing hardware, all so engineers can ditch the Gen1 product you took a third mortgage on, avoid the mistakes they made with it, and make their "latest and greatest Gen2 product, best product that will ever be... until Gen3 in 6 months".
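Loosely sketched, that "repeat frames to stay inside the window" idea looks like this (my own illustration in Python, not anything a real scaler's firmware actually runs; 48-144Hz is just FreeSync's current window):

```python
# Illustrative only: pick a panel refresh rate inside the variable-refresh
# window by repeating each frame a whole number of times whenever the
# game's frame rate falls below the window's floor.

WINDOW_MIN = 48   # Hz, lower bound of the (current) FreeSync window
WINDOW_MAX = 144  # Hz, upper bound

def effective_refresh(fps):
    """Return (panel_refresh_hz, repeats_per_frame) for a given frame rate."""
    if fps >= WINDOW_MIN:
        # In the window: refresh once per frame, clamped to the panel max.
        return min(fps, WINDOW_MAX), 1
    # Below the window: show each frame 2x, 3x, ... until the panel's
    # refresh rate is back inside the supported range.
    repeats = 2
    while fps * repeats < WINDOW_MIN:
        repeats += 1
    return fps * repeats, repeats

print(effective_refresh(60))  # (60, 1) - normal operation
print(effective_refresh(30))  # (60, 2) - each frame shown twice
print(effective_refresh(10))  # (50, 5) - each frame shown five times
```

Same principle as the 48Hz-to-96Hz doubling above, just generalized to any multiple.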
Hope that helps. Basically: don't become an early adopter if you value your wallet. Let them work this sht out first. Let early adopters sell kidneys for "the privilege of beta-testing" (note: people used to get paid to do that, and there were jobs revolving around that). Don't get a Gen1 product. Wait until DRAM is included, and the AdaptiveSync window is below 20Hz. Or just wait until the reviews come out. As TotalBiscuit would say: Don't pre-order. (Especially with hardware. Remember Sandy Bridge's Cougar Point SATA 3.0Gbps ports on their PCH, and the costly fiasco it was for Intel? Look up the cost of hardware failure for early adopters. Let them deal with that sht.)
Thanks. I really appreciate that, man.
Regarding if I've written other guides: Yes. There are several other guides elsewhere on the forum, but they're all buried all over the place.
I could write some other guides, but I'm not as knowledgeable about other topics. Monitors don't evolve that fast, compared to SSDs, video cards, etc. I'd rather write guides about topics that don't require as much constant updating.
So I'm thinking if I do write other topics, I'll write about mice, keyboards, mousepads and other peripherals. (Case fans, heatsinks and other airflow-related components are a bit out of my scope, though.) I might also be able to write about other topics like power supplies, thermal compound, optical drives, wireless adapters and wireless routers, modems, anti-vibration and other sound-dampening case accessories, screws and tools for computer builders, cable management, computer case cleaning, etc.
But to be honest, I think I'd maybe just write about mice and mousepads in the future. People know a lot about power supplies, and there are likely more knowledgeable people to write about that subject than I am. Case fans and heatsinks are also not something I know that much about. CPUs, GPUs and SSDs/HDDs evolve too quickly. Optical drives are dead with high-speed internet anyways. So here's what I could write about next:
Sweet. Good luck, man. Once you 1440p, you never go 1080p. =P
I'm not entirely sure about that. DisplayPort 1.2a allows for adaptive refresh rates, but uses DisplayPort 1.2 cables. The video card and monitor are the only things which need to support DisplayPort 1.2a for the adaptive refresh rate to work.
DisplayPort 1.2 (and therefore 1.2a) does allow 2560x1440 to run at 144Hz. However, that can also be accomplished using DVI-D if you prefer. I've overclocked my own 1440p monitor to 104Hz without any issues, but I did need to do some tweaking of the software, video driver and OS to accomplish this.
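For the curious, here's a quick back-of-envelope bandwidth check in Python (active pixels only, ignoring blanking intervals, so real cable requirements run somewhat higher; 17.28 Gbps is DisplayPort 1.2's HBR2 data rate after 8b/10b encoding):

```python
# Rough uncompressed video bandwidth: width * height * refresh * bits-per-pixel.
def gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

DP12_DATA_RATE = 17.28  # Gbps usable on DisplayPort 1.2 (HBR2, after 8b/10b)

for hz in (60, 104, 144):
    rate = gbps(2560, 1440, hz)
    print(f"1440p @ {hz}Hz: ~{rate:.1f} Gbps (fits DP 1.2: {rate < DP12_DATA_RATE})")
```

Even 1440p at 144Hz comes in around 12.7 Gbps of active pixel data, comfortably inside what DP 1.2 carries.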
You CAN use Thunderbolt to connect to a monitor, because Thunderbolt WILL carry a DisplayPort video signal with it. But you need to split the DisplayPort video signal from the PCIe signal, and that can cost a lot of money. You're better off just connecting a DisplayPort monitor to your DisplayPort on your video card, to be honest. Thunderbolt monitors are too expensive and not really worth it.
Also, please look at other Korean imported monitors, as they can be both cheaper and of amazing quality. DVI is also not bad, and is better than HDMI in a lot of ways. (Maybe not HDMI 2.0, but we don't yet have many monitors using HDMI 2.0 at the time of writing this.)
Short answer: For RPG and horror games, I've really been enjoying my X-Star DP2710 LED "Matte" PLS monitor. I'm sure you'd enjoy getting one too. 310$ is the price, and I think it's fairly reasonable.
If that's a bit out of your price range, check out the QNIX monitor for 230$: http://www.newegg.com/Product/Product.aspx?Item=9SIA4JH2CA3490 (Tip: it's a steal. And shipping is included.)
And then there's this 1080p version for 170$: http://pcpartpicker.com/part/acer-monitor-umkg7aa001 (But only if you can go to Microcenter.)
Long answer: read this topic. (It's been updated.)
Thanks. I've just updated it a bit more to include HDMI 2.0, DisplayPort 1.2a and DisplayPort 1.3 so now they should be enough to keep most people happy (for a while).
If I edited down the verbosity and toned down the humor, I'm sure I could submit it to PCGamer, PCWorld or some other magazine.
Just did, due to feedback like yours. Hope it helps.
Thanks, man. I've updated this topic due to such great feedback. ;)
I'd recommend 1440p. I'm running a GTX 970, which performs similarly to the R9 290 and R9 290X. (I needed a GPU with a max length of 8.5 inches, so Nvidia was the only way to go, and I had 350$ to get something.)
I'd pick the X-Star DP2710LED "Matte" on eBay from dream-seller. You'll find it in the updated guide, along with great videos as to why it's such a great choice. I also was able to overclock it to 104Hz no problems, but 105Hz started to have artifacting. No bueno.
FreeSync is nice, but I'm not sure when we'll see a truly mature, well-developed FreeSync monitor. If the X-Star DP2710 LED "Matte" PLS-panel monitor can get updates to the firmware on its scaler (and a DisplayPort connector), I'd gladly pay an extra 100$ for it (total of 400$ would still be a bargain!).
The issue is that 4K demands too much GPU horsepower for any single graphics card for the next 1 to 2 years on current games, never mind the games out by then. 1440p is playable now with current hardware at 60fps at High to Ultra.
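The raw pixel counts make the point (just arithmetic, nothing vendor-specific):

```python
# Pixels per frame at each common resolution, relative to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

base = 1920 * 1080  # 2,073,600 pixels
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels ({pixels / base:.2f}x of 1080p)")
# 4K pushes exactly 4x the pixels of 1080p, and 2.25x those of 1440p.
```

So a card that just manages 60fps at 1440p has to render 2.25x the pixels to do the same at 4K.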
Don't buy 4K until the market is ready for it. Let graphics card manufacturers catch up first and make affordable 4K-capable GPUs, just like they're finally making sub-150$ GPUs that can handle 1080p max details @60fps now. Wait a few years before 4K. Also, scaling issues.
1440p is the way to go. Also, more affordable which gives you more money for something else.
The topic has been updated, due to positive feedback like yours. Thanks!
That monitor hasn't been at that price for a while. For 149$, it was reasonable at the time. Nowadays there's much better options for lower prices.
Thanks, man. This topic is seriously outdated, and is in need of some updating for various things.
Just got to that part of the video now. I started replying before seeing the whole thing. >.<
Why the GTX 770, instead of the GTX 970?
You get better performance for nearly the same cost. You could even afford an ASUS GTX 970 STRIX, one of the quietest video cards on the market right now that still performs great, for just 20$ more.
And the Corsair Vengeance RAM could go for a less-marketed brand, like G.SKILL, ADATA or Kingston.
Tiered Power Supply List
This helps show you the most stable, reliable, top-performing power supplies on the market. Basically, anything that's a Tier 2b and upwards is something that's worth trusting your system and its components with. The Corsair CX series is a Tier 3 PSU.
Next, there are reviews where Nvidia or AMD comes out on top for the same price bracket, and that happens very often. It's not surprising that some reviews turn out like that. Also, if you're comparing stock coolers against each other, of course you're going to end up with Nvidia on top, because of the heat constraints in AMD cards.
If I apply the same "use aftermarket coolers for one team" logic, I could end up with this benchmark: http://www.anandtech.com/show/7406/the-sapphire-r9-280x-toxic-review/3
Or, if seeing is believing, you can check out this video and this link:
Or this benchmark: http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-toxic-edition-oc-3gb-review/10/
The point is that benchmark numbers will change often between the red and green team. Yet, despite this, overall the red team offers better price-performance value, especially here in the US where you can find the R9 280X for 280$ or sometimes lower.
(Like this Sapphire Dual-X R9 280X for 269$ from NCIX: http://pcpartpicker.com/part/sapphire-video-card-100363l )
So sure, you can go that route. But even with the cheapest GTX 770 in the US (this PNY card: http://pcpartpicker.com/part/pny-video-card-vcggtx7702xpb ) at 305$, that's still over a 10% price difference, placing the price/performance in the red team's camp. Never mind the fact that some R9 280X cards have now come out with the Tahiti XT2 GPU, a less power-hungry, less heat-intensive revision that AMD added after the launch of the R9 280X. (It's available by default in some GIGABYTE cards, for example, and provides better overclocking.)
There's more to it than simple benchmarks and prices.
Yep. Except it's going to be very loud once the included Intel OEM cooler on your i7 heats up, and the case won't dampen the noise at all.
So yeah, you might have a bit more performance, but at the cost of having your gaming experience tainted by the sound of a small hairdryer at your feet. (Maybe it can be fixed by having an air conditioner in the room and some really good, active noise canceling headphones... except that negates the cost benefit.)
Also, between R9 280X and GTX 770, the 770 performs best at 1080p while the 280X does best at 1440p and above because of the memory bandwidth.
Also, the case is very subjective anyway; the HDD is in-store pickup only (and that's a deal-breaker for a lot of people); and the motherboard, while stable for a non-overclocking board, lacks higher-quality onboard sound, which means anyone who wants that would need a PCIe sound card later, or even a USB sound card.
Also, an 80Plus Bronze PSU isn't always the most stable or feature-rich in power protection. It's one area never to skimp on when building a PC. One small spike in electrical current from bad internal wiring, and your PSU or motherboard is toast! And there are some things no surge protector can fix, like a brownout (low voltage over an extended period of time).
The problem with this build is that your average Joe would build it, and not find the OS included. Sure, Linux is an option, but not for your average Joe, and not for most gamers out there (since most AAA games are on Windows - SteamOS may change that in the future, but for now it isn't even out of beta yet).
It also lacks an optical drive, which would also make it difficult for your average Joe to install Windows or Linux unless they already have their OS on a bootable flash drive or external storage drive.
It also doesn't include a keyboard, mouse, monitor or headphones. And while yes, those are very personal, I think having some recommendations would be nice. Like the CoolerMaster Devastator Mouse+Keyboard combo (around 30$), for example, or the Superlux HD681 headphones (also around 30$). Very good stuff for an entry-level price.
But besides this, I think your build is solid. I'd just add these as "recommended budget peripherals", include an OS (typically 90$ to 100$ standard), optical drive (maybe an external USB optical drive), and let the user pick the monitor.
For the price, I'd also consider the Fractal Design Define R4 (possibly Blackout Edition, if available for under 100$), and I might also consider a different motherboard (like a full-sized ATX for future CrossFire support, so there's one slot of space between GPUs for cooling, and room for add-on cards if wanted).
I'd also have given users a recommendation of a 120GB to 250GB SSD as an alternative to an HDD, or as an addition so there's more speed with Windows and so the OS and games don't have to compete for IO requests from the mechanical hard drive arm going back and forth like mad.
Still, all that being said, these are minor gripes. Overall, you guys did a great job and pulled off a minor miracle on such a tight budget! There's plenty of CPU overclocking headroom left if needed as well, thanks to the included liquid cooler, although I hear Devil's Canyon can reach 5GHz on air, so liquid cooling might push it a tad over that. Keep up the good work, guys!
No problem. This is a really old topic, and I don't even visit this forum anymore. I hope this old topic of mine has been useful for you. (I mostly use TekSyndicate nowadays.)
Well, we need cabling technologies that can handle it. DisplayPort 1.3 is coming, and I'm hoping we'll see 96Hz at 4K, since it can handle 3D at 4K according to the draft information available before its Q2 2014 release.
Check it out. DisplayPort 1.2 is already capable of 60Hz at 4K, so going to 96Hz using DisplayPort 1.3 shouldn't be too much of an issue. And since the Oculus Rift uses DisplayPort, we might see future video cards carrying more DisplayPort connectors. I think we'll also see DVI eventually fade out in favor of smaller, digital connectors.
I'm thinking we'll see maybe two or even three DisplayPort (or at least mini-DisplayPort) connectors on the back of video cards for output, since that would help enable multiplayer with the Oculus Rift on a single computer or console.
Imagine: instead of splitscreen multiplayer, you could have two Oculus Rifts each running 3D 1080p@60Hz (per eye), each using its own cable.
(Also, given today's modern encoding technologies, like Nvidia ShadowPlay streaming 1080p over Wireless AC or Gigabit Ethernet, we could see triple-antenna wireless Oculus Rift models running on batteries alone. I think we could see such technologies come around when batteries improve; and since there's not much light coming from such a small display, only the encoding would be a real battery drain.)
Anyways, enough speculation for now. (Well, unless I go into motion tracking using magnetic fields, infrared, cameras, ultrasound, or some other method. Or low-persistence displays. Or how 72Hz or 96Hz could be used to display 24fps movies with extra smoothing, which could help bridge the gap between multimedia displays and computer monitors; with something like G-Sync, we could also see perceptually much smoother monitors. But I could save that for another time.)
Take care. =)
Thanks for your input. I recently changed my opinions on this, and I've since been much more interested in IPS-like panels.
Although I'm hoping IGZO and other 4K panels will improve their response time, and get framerate up to 72Hz to 96Hz by default. Maybe G-Sync?
Aahhh... good to know.
Something that should be noted: the first advantage only applies to 3D Vision gaming. If you don't game in 3D mode, it's not a serious benefit. And if you're not running at 100+ fps, it's not worth it.
Regarding frame latency: please note that AMD will fix their drivers on the 31st of July, or so I hear. Other similar issues will be resolved quickly as well.
Regarding Shadowplay: that's important if you do Let's Play videos on YouTube, live streaming via TwitchTV, or something similar.
Finally, you've got to remember that it's quite possible AMD will launch the HD 9000-series cards very shortly (October is the rumor), and the difference between the GTX 770 and HD 9970 will be very similar to the difference between the GTX 580 and HD 7970 at launch. That's because it's a new-generation architecture and a die shrink. (The GTX 580 was 40nm, and the HD 7970 is 28nm... the GTX 770 is 28nm, and the HD 9970 is supposed to be 20nm.)
There's also the Never Settle Bundle.
Also worth noting is that, in spite of this, nVidia often has better driver experience, including GeForce Experience to optimize game settings automatically, and the APEX/PhysX libraries, which are very nice.
I've heard it can handle it. Otherwise, there would be no point in making an AM3+ motherboard with PCIe 3.0 ...
At least, Vishera CPUs are supposed to handle it. I suppose you could look it up, if you wanted to. But that would involve google searches into CPU specs.
If I'm not mistaken, it does support PCIe 3.0 on an FX-8350. Or at least that's what I heard. I heard that PCIe 3.0 is supported by Vishera CPUs, but not by the 990FX chipset.
I still think the 990FX will be known as the chipset that killed a platform.
No problem. n.n
Glad I could help. Apparently the research on that LG monitor helped show that it wasn't as thin as I thought, or as thin as it seemed in the pictures.
I think that Acer 23" is what I'd get, if I were looking for a gaming-grade sub-200$ IPS monitor.
Although I still think the ASUS VG248QE is likely to be amazing. 144Hz, 1ms, LED backlit, around 250$... even though it's a TN panel, it's still awesome. Now, for multi-monitor, I might go with the Acer 23".
Personally, I think if I were going to build a system today, I wouldn't get a multi-monitor setup. I think I'd go for 1x ASUS VG248QE and 1x Acer 23" LED-backlit 5ms monitor. That way, when playing high-FPS games, I can use the ASUS. But when I just want the best colors, I can stick with the Acer.
Well, I'm looking on YouTube as I write this. Apparently, it's 11mm thick. The pictures are indeed misleading, at least as shown in this video:
However, mind you; 11mm is still really thin.
But due to the misleading pictures and marketing, I'd like to point out that you might help LG change its marketing by going with one of the Acer monitors instead. The 23" Acer monitor did seem nice. But the 21.5/22 inch monitor is also quite interesting. The price difference didn't seem that big: 160$ vs. 180$. You're getting a little extra size, which can help you put the monitor a bit further from your face, making viewing less tiring and more enjoyable. Anyways, hope that helps.
Out of the ones listed here, I think these offer some of the thinnest bezels:
Personally, I think the first monitor I've just listed (LG 23EA63V-P) has the thinnest bezel of them all, and it's still gaming-grade.
However, if you're looking for 1ms response time, you won't get an IPS screen. In that case, I'd say the VN247H-P.
If you're looking for a cheaper monitor (better for multi-monitor support), I'd say this guy is best: http://www.newegg.com/Product/Product.aspx?Item=N82E16824009483
Although I think you could still do well with this guy: http://www.newegg.com/Product/Product.aspx?Item=N82E16824009484
So, to wrap it up:
Thinnest bezel of any IPS 5ms monitor: LG 23EA63V-P (worth it around 200$ or less)
Thinnest bezel of any sub-2ms TN-panel monitor: ASUS VN247H-P (worth it around 180$)
Best deal for value of any thin bezel 5ms IPS monitor: Acer H226HQLbid (worth it around or below 160$)
Yeah, nice. I'm looking forward to the K95 too. And hopefully a new Logitech G710+ followup.
Yes. So, it's 32 PCIe 3.0 lanes, and some extra PCIe lanes of other widths.
Yes, it is a great board. 32 (or more) PCIe lanes is what you'd expect on most LGA 2011 boards, or on those 300$+ elite Z77 overclocking boards.
It's really refreshing to see PCIe 3.0 on AMD systems, for a nice price of 200$. There are still a lot of things which could be better; I would love to see wireless included (with RP-SMA jacks for aftermarket antennas), a/b/g/n up to 300Mbps (or 450Mbps) using dual-band.
I'd love to have seen 4x USB 3.0 headers, 8x USB 3.0 on the back panel. The audio chipset being a mere ALC 892 is sort of underwhelming, really. I'd expect better from the premier AMD performance board. Speaking of which, why not make a ROG AMD PCIe 3.0 board soon?
Also, why not dual LAN with Bigfoot NICs? There are a lot of little things that could be done to make this board better. A 24-pin ATX plus 2x 8-pin CPU power connectors would allow for MASSIVE overclocks, absolutely insane stuff.
Heck, this wishlist of an AMD board would have to be paired with at least that rumored 5GHz AMD Centurion chip we're hearing about.
No probs. n.n
Yes. Remember, practice is important.
Also, necessity helps. For example, try chatting (using very long messages) with six people at the same time, with very fast response times. So fast that they will even have trouble keeping up. I can literally keep up with six other people at the same time, in dedicated IM text chats, while offering responses that are 4 to 8 times longer than their messages. I often put 1,000 or more characters into each response, while they struggle to get past 200...
Well... the more you type, the faster you get at it. Note: I type using about 4 fingers at most. I've never taken professional typing lessons; I've gained my speed through practice.
If I used all my fingers, it's likely I could type around 120wpm... maybe more.
Yeah. I've been IM-ing since ICQ in 2002, and I've spent an average of at least 3 hours online every day since. (Even on vacations, which I might have been offline for weeks, I often spend 7 to 12 hour days online, so the average stays.)
And I love writing for blogs, IM-ing, forum posts, articles, etc.
Must be laziness and/or procrastination. Heh, it's probably why I haven't created the topic about user-submitted articles yet. =P
I'm a really good typist: 78+wpm (words per minute - accurate words with no spelling errors).
Meh, for me this is just a basic forum reply that I'm putting light effort into.
I know that sounds arrogant... but really, it just comes naturally to me, because I think about these things in my spare time, and my mind has a good way of organizing, explaining, wording and expressing my ideas in a logical, practical, thorough way (with explanations). It's not hard, it just demands practice. Now if only I could speak this way to regular non-forum people, since this is how I normally speak to most other people 'IRL'. (Yep... it's sad. Well, at least it's not "oh my gawd I just watched Charlie the Unicorn for the third time, what's wrong with me?" sad... or "why am I listening to Justin Bieber's Baby, when I'm a composer of fine classical music that's world-renowned and well-respected by my peers?" sad...)
Add a link too, if you don't mind. (Also, is the Amazon seller a reliable vendor that will continue to sell the part for some time to come?)
Thanks man. I'll write to philip about that, and offer it to him as a suggestion.
As for more power delivery... well, I think better power delivery would be better. I know "green gaming" is a big thing with gamers... but I'm not interested in a "low power" GPU. I'd much rather have a GPU that consumes 300W and has a 300W-grade cooler on it (go Nvidia vapor chambers, to keep temperatures down).
I mean, why would I want more overclocking potential in a GPU when instead I could get a better default reference cooler, better (more efficient and silent) fans, and higher TDPs right off the bat?
Heck, go include some pro-gamer and pro-reviewer base fan profiles (for "silent", "performance", "hardcore gaming", "energy-saver", "casual gamer", and whatnot). Add the option for the community to share, test and rate their fan profiles.
Also add an area for gamers to vote on and share their favorite software - imagine a community-based software rating and recommendation system. Something like YouTube, but instead of videos, it's ONLY gaming software (not games!). Things like GPU overclocking, CPU overclocking, CPU monitoring, GPU monitoring, benchmarks, tutorials, guides, comparisons, etc.
And then allow users to also subscribe to users to see things like: "Check out IAmLord666AwesomesaucinessOfDoom1337!!!11!1!1liI1!WhyDoIMakeTheseRidiculousUserNamesOhGodKillMeNow 's favorite software, comparisons, personal reviews, etc.
Thus, a fairly good and reliable user might be able to offer good insight into things. He might also be able to make lists of his PC components, modifications, software used, etc. Maybe also link some YouTube tutorial videos he made (or others made) on how to do something, how to mod a game (like Skyrim), etc.
Anyways, I figured that these sorts of ideas would really help shape the gaming community, and if adopted by a gaming manufacturer, that would be REALLY awesome.
Because that would pretty much make a gamer-specific social network that's meant for the purpose of sharing information about gaming. Add some manufacturers, retailers/e-tailers, some magazines, news sites, blogs, TV Channels, YouTube Channels, and you've got a backbone ready. Add live streaming support, and now it's getting ready to be epic; then add support for Android Streaming (via WiFi or LAN) for things like OUYA, iOS support for mac gamers (obviously this is to be taken as seriously as the term "hardcore text-based DOS 8086 gamer"), Windows XP, Vista, 7, 8, those enterprise versions for "hard-work/hard-play" gamers (I don't believe a word of it - it's one or the other, or no sleep period), Linux (yeah! Steam for Linux!).
Then add integration with Steam accounts (to show your achievements), Facebook (to log in, and to share with your friends per your privacy settings), and Twitter (same purpose as the Facebook integration, except you can only type the first 1.40 characters of your username, roughly 11.2 bits).
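For the curious, the 1.40-characters bit works out like this (a throwaway sketch: the 140 is Twitter's old tweet-length limit, and 8 bits per character assumes plain ASCII):

```python
# Working out the Twitter joke: 1% of the old 140-character limit
# is 1.40 characters, which at 8 bits per ASCII character is 11.2 bits.
BITS_PER_CHAR = 8
chars = 140 / 100          # 1.40 characters
bits = chars * BITS_PER_CHAR
print(bits)                # 11.2
```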
I'll write the topic request later. n.n
Well then, next time I'll increase the inside gaming/YouTube references by a factor of exactly 8.41!! (After I get approval from The Federal Procrastination Committee For Forum-Posting Amateur Journalistic Seriousness, of course.)
Hehehehe... I like using my amateur articles as an excuse to make tech jokes based on currently available information (and a fairly high dosage of opinion, bias, and insanity).
Anyways... I'd like some feedback on my humor: Too acid? Too intellectual? Too witty? Too complex? Or not enough (insert_any_prior_adjective_here)? Or just right?
Thanks for the compliment, dude. Do you know of any online blog, website, reviewer, or portal that would be interested in articles like these? Of course, I'd have to polish up the language a bit, maybe simplify the jokes (and explain them for the simpleton who stumbles onto the page, or doesn't have time to research why a joke is funny and/or witty), tone down the acid and/or dark humor, and make the article sound more professional (better grammar, spelling, vocabulary, etc.). Those are all minor adjustments I can make on the fly, but I'd like some pointers: where, or with whom, could I apply for a position as a freelance writer/contributor and off-hand amateur semi-journalistic comedian?
Also, I'm going to propose something on the website, and I'd like to gain your support, along with other users', if possible. I want to get philip to implement a way for users to submit PCPartPicker-exclusive articles for publication here. Of course, all rights to the article's content would be assigned to PCPartPicker.com (or, better yet, given to CopyLeft/public domain/GPL/GNU/whatever/iLostTrackWhyAmIListingTheseAgain?), and there would be a disclaimer along the lines of "this content does not reflect the opinions or ideas of PCPartPicker.com in any way; it is user-submitted and reflects the author's opinions." Something to keep angry partners from attacking PCPartPicker or creating tensions, while also keeping PCPartPicker from being considered an "unreliable source of information" in case some article contains information that is later proven false.
Initially, I think this could work as a contest: a few users write sample articles, and the best ones get featured on PCPartPicker (maybe under a "News", "Articles", or "Reviews" tab on the front page). Make it fair for the community. Of course, guidelines would apply, and only approved articles/reviews/news would get featured (since some submissions might not be appropriate or accurate); entries would be filtered manually before voting, so philip would choose the finalists to be voted on.
Heck, we could even include in the User Agreement / Terms of Service that any news/articles/reviews submitted to PCPartPicker (not the forum, though) give PCPartPicker the right to display, link, share, distribute, and advertise any content submitted. We could double up with a reminder right before a person submits an article: a small page that loads (with the text box in full view, say 300 px tall on screen) showing another disclaimer. We could even have PCPartPicker release the content to the public domain (and register it there) before it's approved for publication here.
If we can get user support for it... we can probably get philip to work on it. What do ya say?
Odd, the Nanoxia DS1B seems to have disappeared, but the Nanoxia DS2B still appears on NewEgg, here: http://www.newegg.com/Product/Product.aspx?Item=N82E16811281002
Thanks man. n.n
Thanks for the pointers.
I looked up tutorials on ASUS SSD Caching that pretty much say the same thing, although the bit you pointed out about stopping ASUS SSD Caching is something I hadn't heard before. It makes sense, though.