146 Comments

  • nathanddrews - Wednesday, May 6, 2015 - link

    I suspect there will be some upset Titan X customers quite soon.
  • chizow - Wednesday, May 6, 2015 - link

    Why? We'll have enjoyed 3+ months of top-end GPU performance, and there's no indication from any of these slides that AMD's upcoming graphics will change the performance hierarchy or the current pricing structure at all.

    But most importantly, we won't have to deal with AMD drivers and a half-baked ecosystem; we just get products and features that work, as advertised. Can't really put a price tag on that. :)

    My bet is AMD's single-GPU flagship slots in just under Titan X in performance, maybe 10-15% slower which will allow AMD to price it ~$700 until Nvidia responds with a 980Ti that will sit somewhere between that and the Titan X at that same $700 price point, forcing AMD to drop prices. GTX 980 drops to maybe $500 or so. 970 stays in that $300-330 range, and AMD's new parts are priced accordingly.
  • testbug00 - Wednesday, May 6, 2015 - link

    For 3+ month earlier performance, sure! Of course! Buying earlier for high end/lowest power/etc almost always has a premium.

    AMD's drivers really are on the same level as Nvidia's on the whole. If you want to talk about 4+ years ago, I can agree with you. My experience with Nvidia professional + consumer products has been TERRIBLE for professional and fine for consumer. My AMD consumer + consumer + consumer products have been fine, fine and fine.

    However, statistically, they are more or less even, perhaps with a slight edge to Nvidia.

    And, according to all leaks, it will have a bit over Titan X performance. These are the same leaks that were accurate for Titan X. So, I would say your predictions are very incorrect based on what has been shown to be accurate in the past. Hopefully for Nvidia those leaks are wrong. Hopefully for the consumer those leaks are right, driving down prices for everyone.
  • Manch - Wednesday, May 6, 2015 - link

    You're talking to a die hard NVIDIA fan. Might as well be talking to a tree.
  • chizow - Wednesday, May 6, 2015 - link

    It's easy to be a fan of the best, and of solutions that work and make life simpler/easier. Which is why it's hard to be an AMD fan.

    What's your excuse again for being a die hard AMD fan? Might as well ask a tree.
  • Impulses - Wednesday, May 6, 2015 - link

    What's anyone's excuse for being a brand fanatic over just buying the best bang for the buck? Seriously.
  • chizow - Wednesday, May 6, 2015 - link

    I guess, but that's also part of the problem, because the "bang" is going to be a lot different for everyone. Especially in this case, you're going to get a lot more "bang" from Nvidia for just a few more bucks.
  • Mikemk - Thursday, May 7, 2015 - link

    Agreed, though there are a couple brands I'm an antifan of.
  • Ziggurat - Thursday, May 7, 2015 - link

    If I may weigh in with a constructive point: I am a self-proclaimed Intel + Nvidia fanboy, but I get so happy when AMD makes great products or awesome breakthroughs. 10 years ago I was a fan of AMD CPUs, and last summer I bought an R9 series GPU because I wanted to check whether AMD had a product I'd love to use. Good news: I now have R9 series graphics in my HTPC.

    Right now Nvidia is objectively the best GPU maker for enthusiasts, and AMD has some GPUs that are reasonable picks in a few price classes. I would of course not switch teams in a heartbeat, but if the two companies switched places on the totem pole, I would of course "put on a red team shirt". This technology difference is reflected in GPU market share right now, and if AMD gave Nvidia a harder time holding that position, well, Nvidia would become better themselves. The improvements we see from both sides come from their plans to make more money, and competition is the instrument that bears the fruit we eat.

    I am a fanboy, or maybe I am not, because I'd not be angry if AMD bested Nvidia; I'd buy their graphics card.

    Now for AMD's CPUs, that's a sad state of affairs, but I can only hope they are not run out of the CPU business, because Intel would suck without competition. This became a long comment, sorry.
  • chizow - Thursday, May 7, 2015 - link

    @Ziggurat I mostly agree with your position. I was also a big fan of AMD CPUs until they momentarily gained the lead with the Athlon 64 and priced me and everyone else in that $300 range out of the market.

    I've also said many times to AMD fanboys that if current positions were reversed and any Nvidia GPU had an AMD logo on it, I'd buy it in a heartbeat, because clearly, right now, GeForce products are better top to bottom from both a hardware and a software/end-user-experience standpoint. We know for a fact AMD fanboys can't say the same thing, because they're still AMD fanboys using AMD cards!

    What I really dislike about AMD, however, is that they are incredibly dishonest when they don't have competitive parts: just lots of FUD and misinformation that they put out there and that their dim-witted fanboys parrot forever. Look at FreeSync as an example; we are still sorting through the BS that AMD has put out there through the whole FreeSync run-up. It's really hard to support a company that deals in BS like that.

    I also looked into an HTPC build last year and almost built a Kaveri-based system, but the lack of mini-ITX options at launch, the fact I would need a chassis and PSU, and ultimately, significantly higher price tag even with some really good Micro Center deals swayed me towards just building an Intel NUC. Slightly cheaper and a LOT less heat and footprint. Now? AMD wouldn't even be a consideration because you have ridiculously good Intel/Nvidia options like the Alienware Alpha.
  • testbug00 - Thursday, May 7, 2015 - link

    Sorry, but, your issue with AMD is what BS they say? Interesting. Never heard any of their competitors spew BS, or, outright lie about things to consumers... Oh... Wait...
  • chizow - Thursday, May 7, 2015 - link

    @testbug00

    AMD has a LONG history of downplaying and outright talking shit about their competitors' solutions when it suits them or they don't have a competitive solution. Start with PhysX. Or CUDA. Or GameWorks. Or 3D Vision. Or G-Sync. It's the same BS from AMD: we don't like closed and proprietary, we love open; their solution will fail because it's closed. And what do you get in return? Nothing! Empty promises and half-assed, half-baked, half-supported solutions that no one cares about. Open/Bullet Physics. OpenCL. GamingEvolved/TressFX. HD3D. FreeSync. Abandonware/vaporware.

    I know AMD fanboys like you will repeatedly cite the 970 VRAM issue, but again, the net result had no negative impact on the consumer. The 970, despite the paper spec restatement, is still the same outstanding value and performer it was the day it destroyed AMD's entire product stack. Unlike AMD, who repeatedly overpromise and underdeliver. FreeSync is just the latest addition to this long list of examples. Oh wait, it isn't really free, new panels cost a lot more. Oh wait, you do need new hardware. Oh wait, old panels can't just be firmware flashed. Oh wait, there's a minimum refresh rate. Oh wait, outside of a limited window, it's actually worse than non-Vsync. Oh wait, FreeSync disables VRR. Oh wait, it doesn't work with CrossFire.

    Yet AMD, this whole time, kept saying FreeSync would be better because it didn't require any "unnecessary" proprietary hardware or licensing fee, and would actually work better than G-Sync. Just a bunch of lies and nonsense; FreeSync is in disarray as a half-assed, half-broken situation, but coming from AMD, why would anyone expect anything less?
  • Enderzt - Thursday, May 7, 2015 - link

    What are you going on about?

    I am not a specific fan of either team, but it's crazy that you are talking about AMD as being the dishonest company when Nvidia is currently still suffering from the 3.5GB 970 debacle. Talk about being disingenuous to your user base. Pretending Nvidia is superior to AMD in terms of honesty is laughable.

    And what do you mean by FreeSync BS/run-up? They are offering an industry standard solution for adaptive monitor refresh rates that requires no license fee or proprietary display scaler like G-sync. This is an open VESA standard. I don't really understand how you could think AMD is the bad guy in this market when Nvidia is the one charging you 300 extra dollars for a monitor with the same features and less functionality. G-sync displays only have a single display-port input and the monitor company needs to add a completely separate scaler module if they wanted to add HDMI/DVI/VGA inputs, which again further raises the price of these monitors.

    So AMD goes open source with their refresh rate solution that will benefit the widest gamer audience with the least expense, and they're the dishonest bullshit company? Your arguments just don't hold much weight and pretty obviously come from a biased viewpoint.

    Do Nvidia's top-of-the-line cards currently outperform AMD's? Yeah, can't really argue with facts and numbers. Is the Nvidia experience great? Hell yeah! G-Sync kicks ass and the cards work well. But pretending they are better top to bottom, across all fields, is a pretty ridiculous statement. There are exceptions to every rule, and AMD has a nice fit in the price-to-performance market. And according to what we know about this next-gen card release, they will soon be competitive at the top of the line as well.
  • chizow - Thursday, May 7, 2015 - link

    @Enderzt, I won't repost everything just read my reply above. Same applies here.

    You aren't getting the same features and less functionality, you're getting a half-baked, half-assed solution with a ton of asterisks and glaring flaws, but as an AMD user and fan I can tell it will be hard for you to even notice.

    But yeah, that $300 cheaper monitor looks to be cheaper for a reason; it's just not very good.

    http://www.tftcentral.co.uk/reviews/acer_xg270hu.h...
    "From a monitor point of view the use of FreeSync creates a problem at the moment on the XG270HU at the moment, just as it had on the BenQ XL2730Z we tested recently. The issue is that the OD (overdrive) setting does nothing when you connect the screen over DisplayPort to a FreeSync system. "

    Looks like a recurring theme for AMD FreeSync monitors; I'm not sure how anyone can claim these technologies are even close to equivalent.

    Also, if you want to get into really deceptive behavior on AMD's part, you can rewind to driver-gate, where AMD was seeding press with overclocked/custom-BIOS 290/Xs to make them look and perform better. You can also look at their attempts to deny any problems with CF and runt frames, until PCPer made it clearly obvious and ultimately forced AMD to revisit and fix their CF frame pacing.
  • P39Airacobra - Wednesday, June 3, 2015 - link

    Really? And Nvidia was so honest about the 970! Yes, you are still a fanboy! Lay off the fluoride, and cut down on your vaccines!
  • 01189998819991197253 - Friday, May 8, 2015 - link

    @Ziggurat
    That was a good choice on your HTPC GPU. Nvidia has some pretty serious HDMI handshake issues, which makes using an Nvidia GPU in an HTPC incredibly annoying. I've got Nvidia in my desktop and wish I could use it in my HTPC, but having to reboot or restart the graphics drivers whenever the TV or receiver is turned off is a deal breaker for me.

    This issue has persisted for years now and it looks to me like Nvidia just doesn't care. So I'm stuck with AMD in my HTPC because Nvidia is too cheap and lazy to get the HDMI handshake right.
  • chizow - Friday, May 8, 2015 - link

    You must have a really old or low quality receiver, I haven't had any issues with this on 2 receivers, most recently a Yamaha VSX-677 that has standby HDMI passthrough. My old Sony STR DG1000 didn't have issues with this either.

    Nvidia does have some HDMI issues, but only because they explicitly honor what is in the manufacturer's INF files. If manufacturers did a better job of forming their INF files, these issues wouldn't be as prevalent.
  • Mr Perfect - Thursday, May 7, 2015 - link

    Because humans are emotional beings, Mr. Spock.

    Seriously though, I'm with you. Whoever builds the best card at the time I'm buying gets the sale.
  • Zefeh - Wednesday, May 6, 2015 - link

    But when you take head-to-head comparisons of GPUs AMD released in 2013 and compare them to the current lineup that Nvidia has launched this year, it's kinda stupid how little the performance increase is.

    I plugged in all the fps values from the AnandTech Bench comparison between the 290X and the GTX 980. I computed the percentage increase in FPS of the 980 over the 290X, then summed these and divided by the 40 test values to get the average performance gain in FPS as a percentage.

    Guess what: the GTX 980 has on average 12.85% more FPS than the 290X. 13% more FPS at an 83% cost increase. That is very bad. The reason Nvidia has more market share isn't because they have a better product; it's because their marketing is MUCH better than AMD's. Hard numbers are startling.
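    The averaging method described above can be sketched out like this; note the fps values here are made-up placeholders, not the actual 40-test AnandTech Bench data:

```python
# Per-test percentage gain of the GTX 980 over the 290X, then the mean
# across all tests. The fps values below are hypothetical placeholders.
fps_290x = [45.0, 60.0, 30.0, 75.0]
fps_980 = [52.0, 66.0, 34.0, 84.0]

# percentage increase of the 980 over the 290X for each test
gains = [(b / a - 1.0) * 100.0 for a, b in zip(fps_290x, fps_980)]

# sum the per-test gains and divide by the number of tests
avg_gain = sum(gains) / len(gains)
print(f"average gain: {avg_gain:.2f}%")
```

    One caveat with this approach: a plain mean of per-test percentages weighs a 45-to-52 fps change the same as a 75-to-84 fps change, which is fine for a rough comparison but not the only way to aggregate benchmark results.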
  • chizow - Wednesday, May 6, 2015 - link

    @Zefeh, at face value what you say is true, however, if you look historically and the fact we are stuck on the same 28nm process node, the advancements made this generation and Maxwell in particular are frankly stunning.

    Again, the key thing to remember is that we're on the same process here; to be able to increase performance nearly 2x (780 Ti to Titan X, or GTX 680 to GTX 980) on the same process node, without the benefit of a simple doubling of transistors, while simultaneously reducing or maintaining TDP, is nothing short of miraculous.

    Now, once you put that into context, and see we weren't even able to manage that with BOTH a new process node AND new architectures, you can see why 28nm will actually be remembered as a rather significant node (at least for Nvidia). They've increased performance something like five-fold (GTX 480 to GTX Titan X) on just a single process node (40nm to 28nm).

    Again, you can claim all you want that the only reason Nvidia has greater marketshare is because of marketing, but I can quite easily rattle off about a dozen of their technologies that I enjoy that make their product better than AMD. No marketing in the world is going to make up for that gap in actual features and end-user experience, sorry.
  • Zefeh - Wednesday, May 6, 2015 - link

    I ain't gonna do what I did again for other cards, but I'm quite sure AMD has competed pretty evenly in the GPU market since the 4000 series, and you basically disregarded that in that post.

    What you just said, "They've increased performance something like five-fold", can also be said of AMD. You blatantly disregard that AMD has its own points of success, just focusing on Nvidia's bonuses! "Nothing short of miraculous"? It's a company doing what it's supposed to do, the natural progression of technology...

    To be honest, I'm a consumer that runs 2 HD displays with a 3rd incoming. I am no datacenter and I am no dude working on multi-million dollar systems. I get these GPUs to game, and there's a reason I buy a GPU: the performance/cost ratio. Titan's cost is stupid; its performance is comparable to AMD's at $300+. I don't give a crap that power has been cut, I care about cold hard performance. Drivers have NEVER been an issue for me with AMD, and I've been with them since the HD 4870. The people I recommend AMD GPUs to also don't have a problem. What you are spewing is just random things that are expected of companies...

    You want to know what's "nothing short of miraculous"? The fact that AMD hasn't made a net profit as a company for YEARS and is STILL a better performer than Nvidia outside the corporate world (because they just don't have those resources).
  • chizow - Wednesday, May 6, 2015 - link

    Wow so you go straight from normal coherent comparisons to AT Bench to driving off the deep end lol.

    Same can't be said for AMD, yet, sorry. Nvidia has pulled the rabbit from the hat on 28nm, not once, but twice. We have no idea if AMD will be able to perform the same feat, but from the looks of some of these announcements today, it is not looking so good.

    But I'm a consumer too when I'm off work, I just wanted to make it a point that the data center I run is just one of tens of thousands in the world that are filled with Intel and Nvidia solutions, despite the hopes, dreams and testimony of AMD fanboys.

    But since you are an AMD user and run 2 screens, you should be able to tell me. Did they fix that full clock/high power issue yet with more than 1 screen? Because they hadn't fixed it as of 2 months ago when I had that 290X. Just wondering.

    But yeah, Titan performance is comparable to AMD at $300? Get real, man; seriously, you'll be lucky to see that level of performance from AMD at that price point next generation. The fact AMD hasn't turned a net profit and performed so poorly in the marketplace is why they're a fraction of the size they were even 4 years ago, and why today's announcement is flush full of rebrands rather than new components and new products.

    They've gambled their resources in a lot of the wrong areas, and while they have generally been competitive in raw performance and certainly in price, the market simply isn't interested in their products. AMD fans will insist it's something ethereal like marketing, but the market simply prefers better products, which often comes down to features and support when price:performance is close.
  • looncraz - Friday, May 8, 2015 - link

    nVidia had the same "bug", if you recall. However, yes, they fixed that "bug". Also, anyone who knew anything would just run special idle profiles. I still do this, which saves quite a few watts over the standard profile (AMD should really do this by default; there is no reason to push the memory clocks to full at any point in 2D outside of a few specific applications).
  • chizow - Saturday, May 9, 2015 - link

    @looncraz, the key word is Nvidia HAD. But they addressed it within 3 months of inventing the feature to downclock and reduce power while in desktop mode with Fermi in 2010. AMD not only took longer to implement their own version, but it is still NOT FIXED. See the difference? We are not talking 3 months to get their drivers in order, we are talking like 4 years now.

    You keep putting "bug" in quotes as if you don't acknowledge it as a problem; it is obvious this kind of attitude is what results in a non-fix for AMD users. Their users just accept it and blow it off, like dozens of other major issues, so they never get fixed.

    Also, custom profiles don't work, because contrary to what you claim, downclocking the memory does result in major issues like flickering screens, lag, and outright loss of range on monitors, which is most likely why AMD has to keep their memory clocks at full and their GPU clocks high to begin with.
  • Hicks12 - Friday, May 8, 2015 - link

    Sorry, but what is that five-fold increase tested against? Since we're on AnandTech, let's use their benchmarks: http://www.anandtech.com/bench/product/1135?vs=118...

    I don't know why you compare the Titan X to a 480, as the price is stupidly different; just because the year of release is different doesn't warrant excluding the price difference... The GTX 480 was $500 and the Titan X was $1000... Yeah, that's not a valid comparison. Even still, the benchmark shows less than a 3x improvement (sorry, it's a Titan Black, ha). Let's compare the GTX 980 to the 480, as that was $550, so very close in price... Oh look, it's much closer to 2x performance. And I have just checked the 5870 against the 290X and it's pretty much identical in improvement, so really, don't sit on that Nvidia high horse, as both teams have made good progress.
  • chizow - Friday, May 8, 2015 - link

    I'll be gentle with you since you are not just an AMD fan but also appear to be very young, judging from some of your less-than-coherent ramblings. :)

    So as you can see in my original comparison, I was drawing a comparison between what has been accomplished on 28nm, which is just 1 process node away from 40nm. GTX 480 was Nvidia's first flagship on that node. Titan X (GM200) will be Nvidia's last flagship on 28nm. The goal was to compare how much we have advanced on just 1 process node improvement and as we can see, the results are simply astounding. Price is irrelevant, as that was simply Nvidia taking advantage of the market opportunity AMD presented with the lackluster 7970.

    Again, you don't seem to be able to follow the relevant parts being compared here. Titan Black is 3x faster than 480, which is what you would expect as typical from one process node to another. But Nvidia pulled a pretty amazing feat on 28nm and pulled a rabbit out of the hat again with Maxwell. Titan X is ~1.7x faster than Titan Black, so 3*1.7=5.1x which is exactly as I said. That's what AMD still has left to take care of leaving off from the 290X, they have the difficult feat of pulling off similar while reducing or maintaining current thermals, which is clearly going to be an uphill battle for them given how hot and power hungry Hawaii was to begin with.
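    The compounding arithmetic in the paragraph above can be sketched out; the 3x and ~1.7x figures are the comment's own estimates, not measured data:

```python
# Compounding the two claimed speedups: 3x from GTX 480 to Titan Black
# (the 40nm -> 28nm node jump), times ~1.7x from Titan Black to Titan X
# (architecture gains within 28nm). Both figures are the comment's
# estimates, used here only to show how the ~5x total is derived.
node_gain = 3.0   # claimed 480 -> Titan Black speedup
arch_gain = 1.7   # claimed Titan Black -> Titan X speedup
total_gain = node_gain * arch_gain
print(f"overall: {total_gain:.1f}x")
```

    The point is simply that successive speedups multiply rather than add, which is how two moderate gains compound into the quoted five-fold figure.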

    Before you reply, please take a moment to read and really understand what I wrote, you might actually learn something. :)
  • looncraz - Friday, May 8, 2015 - link

    Not just nVidia's superior marketing, but also AMD's unfortunate image of being the "affordable" option. AMD BADLY needs to shake this image. They need to make themselves appear to be all about quality. They very well may need to create new product brands that downplay the AMD connection entirely to do this.

    I buy AMD because my experience with nVidia products is usually more frustrating. This is partly because AMD exposes options nVidia does not (the inverse is also true), and also partly because I'm simply more familiar with AMD's more recent hardware.

    I run specialized profiles for my R9 290 that get me BF4 ultra @ 1080p @ 60Hz at a v-sync-locked 60fps while drawing only 300 watts. My 7870XT (cut-down 7950) would pull 330 watts and not perform nearly as well. The 290 outperforms the nVidia 780 system I built for a customer AND pulls less power (while gaming, anyway; the 780 wins on idle, albeit not by much).
  • chizow - Saturday, May 9, 2015 - link

    @looncraz, I agree with that 2nd part, that AMD has been billed as the cheap, value brand, but that's really no one's fault other than AMD's. I said it when they launched the 4870: they badly underpriced that part relative to how it performed, and despite evidence that shows this, AMD fans still claim otherwise. While it was true AMD needed to capture mind and market share from Nvidia, they dug themselves a hole they have yet to climb out of. Only after they briefly took the lead with the 7970 did they finally make their move to raise their product stack, but it was ill-timed and ultimately allowed Nvidia to jump their entire product stack up an entire SKU level while creating a super-premium part in Titan with their biggest ASIC.

    As for the marketing bit: again, people seem to think marketing is the equivalent of nothing, or fluff. I fully disagree. You can't effectively market your products if you don't have the goods to back it up. Indeed, AMD spends a LOT of time marketing their own products, but more often than not by disparaging their competitors rather than demonstrating and espousing their own benefits and features. This only takes you so far, because at the end of the day the end-user will see through any marketing and realize Nvidia products are better because they deliver the solutions, features and support to differentiate themselves.

    Similarly, I buy Nvidia because I know for a fact AMD doesn't even offer solutions for some of the features and technology I've come to rely upon. It's also part of the reason I pay close attention to some of these releases: AMD loves to come out of the gate swinging, claiming their products WILL be better at some point in the distant future, but at the end of the day you get underwhelming results, delays, or features that just fade into memory as abandonware.

    Also, I'm actually pretty surprised you were able to run BF4 with Vsync on with AMD, certainly you set that in-game right? Because that's another pet peeve with using AMD, their driver-level Vsync simply doesn't work!
  • MisterAnon - Thursday, May 7, 2015 - link

    Is it easy to be a fan of the guys who release a card like the 970 with gimped memory and equal performance to AMD's 2013 equivalent? Ouch.
  • chizow - Thursday, May 7, 2015 - link

    Sure, the 970 is still a great card, a paper spec restatement has not changed this at all. Indeed, we may look back one day and say the 970 was the final nail in AMD's coffin as it is slaughtering Radeon in the marketplace.

    Now, is it easy to be a fan of the guys who say so much misinformation about FreeSync before it even exists, only to have it suck and not be even close to what they claimed originally?
  • Manch - Friday, May 8, 2015 - link

    Well, at least you don't deny it. Your constant railing against anything AMD, however, is as bad as Apple sheeple screaming their heads off about Android. You're like Tony Swash on DT. Me, a die-hard AMD fan? lol, I'm a fanboy of neither. I buy what gives me the best bang for my buck.
  • chizow - Friday, May 8, 2015 - link

    Nah, because in the case of Apple they are trading off superior end-user experience for inferior hardware. With Nvidia, there is no compromise. You get a better end-user experience AND better hardware capabilities.
  • stephenbrooks - Tuesday, May 19, 2015 - link

    The tree's response is: AMD supports open interfaces, whereas Nvidia has a tendency towards proprietary solutions. Also, Nvidia has a rather high market share right now, and I'd like to see something other than a monopoly, especially combined with the above tendency towards non-open technologies, so I always buy AMD at the moment.
  • P39Airacobra - Wednesday, June 3, 2015 - link

    Whatever happened to just being a fan of beautiful technology? I am an AMD or Nvidia or Intel user, I do not care; I just go with whoever offers me the best for my dollar. During Nvidia's 8000 and 9000 series I first had an 8800 GTS 320, then a 9800 GTX+, and during AMD's 5000 series I got a 5850. The 5850 lasted a long time, so when it died it was Nvidia's 600 series or AMD's 7000 series, and since at that time I just wanted something cheap but on the performance level of my old 5850, I got a 650 Ti, because it was cheaper than the 7850 and performed close to it. Then later I got an R9 270, because games were still not horrible console ports yet, and a cheap re-badged 7870 was perfect for 1080p gaming. Then game devs sold out and released horribly optimized ports, so I took forever deciding between an R9 290 and a GTX 970, and I probably annoyed many forums asking for advice! I finally decided on a GTX 970, because to me it offers the best bang for the buck in the high-end market. As for CPUs, I always used to use AMD. Now I use Intel because an i5 will last forever. But if AMD comes out with a competitive CPU I might go with them again. It makes no sense to me to get lost in the idiot fanboy BS! It makes sense to me to get the best value for my hard-earned dollar! I never cared much for being trendy! Trendy does nothing but drag you down and suck all your brain power away.
  • Alexvrb - Wednesday, May 6, 2015 - link

    He gives die hard Nvidia fans a bad name. Also I take issue with the tree comment - I've never encountered an arrogant blowhard tree before. :P
  • chizow - Wednesday, May 6, 2015 - link

    And on that note, Alexvrb, you'll be happy to see that your favorite turd-like ASIC, Turdga, rides again in AMD's 300 series.
  • Alexvrb - Thursday, May 7, 2015 - link

    An OEM rebadge? You're gonna have to troll a lot harder than that. I know you have it in you; you're a first-class gosu trollmeister. Anyway, I argued with your bogus pre-release claims about it throttling. I'd never buy one; I think it's overpriced.
  • chizow - Friday, May 8, 2015 - link

    You really think Turdga is only in OEM? LOL. It's their "newest" chip (but of course, still sucks) and has the most features; given how many times they've rebranded all their older ASICs, you REALLY think it won't be in their desktop stack?

    You argued stupidly over its performance and TDP. I already made provisions that they COULD prevent throttling by going with custom cooling, but in doing so they'd prove their stated TDP was bogus, and they did exactly that.

    But I am sure you were the first idiot to claim the 970 wasn't a 135W TDP part based on THG's review with a custom-cooled, custom-power-target part, amirite?

    Go find another AMD turd to polish, fanboy (here's a hint, there's a whole pile of turds in AMD's 300 stack).
  • Alexvrb - Friday, May 8, 2015 - link

    That's much better trolling. Unlike you, I'm not a fanboy. I did point out that the 285 models tested that "blew TDP" were all custom (hence the range of results), and I never knocked the excellent 970 for blowing power targets. Not once. I did point out that lots of Maxwell chips throttle, but only because you were so hellbent on attacking AMD for throttling. My oh my how your view changes depending on your precious.

    Personally, I don't think I've given anything but praise for the excellent 970. Sorry, not a diehard fanboy like you. But you're blind to that. If someone isn't a diehard Nvidia-only fanboy, then to chizow that makes them an AMD fanboy somehow. If you disagree with chizow on something relating to graphics cards, you're automatically an AMD fanboy somehow. Everyone sees how you turn into a total flamer on AMD articles.
  • chizow - Saturday, May 9, 2015 - link

    LMAO, you're not an AMD fanboy, yet you repeatedly defended what is quite honestly the worst ASIC AMD has developed in recent history. And now you're going around making excuses for AMD's Rebadgeon line of mobile and OEM chips. Riiight, you're not an AMD fanboy, not at all!

    You didn't acknowledge Turdga's throttling/TDP issues; you kept insisting it would be a higher performer at a lower TDP, and as I correctly pointed out, it would be EITHER/OR, but not both. And we later find out it's not even the fully-enabled ASIC, so if it were, it would certainly blow by Tahiti's TDP while performing better, just as I stated when drawing a comparison to the other GCN 1.1+ ASIC, Hawaii. So yes, in the end I was right.

    And of course you did say something negative about the 970: you tried to make claims about the TDP being misstated, but of course you, like every other AMD fanboy, were using non-reference, higher-power-target results from THG, which were later retracted.

    But yeah, you're not an AMD fanboy, at all! lol.
  • chizow - Wednesday, May 6, 2015 - link

    Sorry, yeah, can't really agree with that when there are at least 3 major features right now in AMD's drivers that I rely upon and that are still broken/supposed to be fixed, and that's before even getting into Day-1/game-compatibility type stuff or the various value-add features GeForce brings to the table.

    1) For starters, low-power mode while driving multiple monitors. I drive 4 monitors, and AMD still hasn't fixed this low-power-state problem with multiple monitors attached. The difference between 20W while working and 220W while gaming is immense; 220W all the time would be... unpleasant, to say the least.
    2) VSR. Very limited resolutions supported (no 2x2 or 4K for a 1080p panel on all cards except the 285), with no guarantee this is updated/fixed on these new cards. DSR works amazingly on newer Nvidia hardware right now especially for older games.
    3) FreeSync. Yeah, still a big "I" for Incomplete on this one. Would be REALLY hard to invest in a new $500-700 AMD GPU and a $600+ FreeSync monitor with all the questions and outstanding issues out there relating to VRR windows, ghosting, crossfire support etc.

    And when it comes to work? I administer a multi-million dollar data center that disagrees with you. :) We exclusively run Nvidia for our computational workloads. CAD and graphic design are more mixed, but still far more users request Quadro for SolidWorks, AutoCAD and Creative Suite/Cloud. The only "Radeons" are in the handful of Mac Pros we have, and Apple does most of the QA/support for them. I did have a user insist on a 290X once, but it ended up being shelved for 3 months while we waited for a firmware update from Lenovo to fix their UEFI boot issue. I guess that's not really AMD's problem per se; it just shows they're not very well supported by OEMs, either.
  • amilayajr - Wednesday, May 6, 2015 - link

    Holy cow you talk too much. I tell you what, go inside your washroom and deal with some business down there and release some stress...... It's just freaking graphics cards. Buy whatever you want and don't mind others whatever they want to buy. So what if AMD makes this and that product that doesn't meet your expectations. For all I care as a buyer, I will buy whatever my money can buy and what I can afford that will meet my requirements and needs. Now, go back and do some releasing. You definitely need one.
  • chizow - Wednesday, May 6, 2015 - link

    Cool, take your own advice, and make sure to flush. The great thing about the internet is that any idiot can make a claim; only those who truly know what they are talking about can back them up.
  • DinoBuaya - Wednesday, May 6, 2015 - link

    Exactly, idiots like you. Good of you to admit it.
  • chizow - Thursday, May 7, 2015 - link

    Like I said, a prime example of some idiot making a claim but lacking the knowledge to back it up.
  • DinoBuaya - Friday, May 8, 2015 - link

    By that you mean you haven't had even the basic knowledge to back up anything you've posted so far? You in fact prove the very observation you're trying to make about others. Yup, we can see now why you are the prime example.
  • chizow - Friday, May 8, 2015 - link

    I've backed up what I've said aplenty, and I'm more than happy to do so in more detail if you like, Dino. :)
  • 01189998819991197253 - Friday, May 8, 2015 - link

    @chizow
    This fits you to a tee.
    http://www.mattcutts.com/images/duty_calls.png
  • chizow - Friday, May 8, 2015 - link

    Oh great, another meme posting idiot on the internet. I am totally going to click that!
  • Barnassey - Wednesday, May 6, 2015 - link

    Don't forget the broken H.264 encoder that pretty much NO application uses.
  • MisterAnon - Thursday, May 7, 2015 - link

    As someone who owns both I haven't experienced any of those issues, and AMD still has features that Nvidia hasn't implemented like the ability to play games on multiple monitors with different resolutions.

    Everyone here knows that you're just salty that you wasted money on a Titan X (I didn't know people actually bought these terribly valued things), which is bad value even among Nvidia cards, let alone right before the R9 300 is coming. Lol, get over it.
  • chizow - Thursday, May 7, 2015 - link

    So you can confirm for a fact AMD GPUs fixed low power state while driving multiple monitors? Which driver? This was still broken as of the Dec. WHQL. And Nvidia can't play games on multiple monitors with diff resolutions? What kind of nonsense is this? Windowed mode right? I run multiple instances and multiple games on diff monitors no problem at all. Just more rubbish from usual AMD idiots.

    And yes, it sounds like you're salty because you can't afford better, don't worry, one day you might get there!
  • Alexvrb - Friday, May 8, 2015 - link

    Sounds to me like he was talking about playing one game across multiple monitors of differing resolution. Wait, uh oh, someone said something that wasn't positive about your precious! WHOOSH! Off to set these villains right!
  • chizow - Saturday, May 9, 2015 - link

    Whoosh, AMD fanboys wrong, as usual.

    https://www.youtube.com/watch?v=gJC8uRqoGRU
  • Crunchy005 - Monday, May 11, 2015 - link

    "I administrate a multi-million dollar data center that disagrees with you."

    I wonder how much money they might save by looking at something other than Nvidia. I'm sure he doesn't look at price/performance at all on anything but Nvidia and blindly buys Nvidia top line regardless of price. I kind of feel sorry for that datacenter.
  • chizow - Wednesday, May 13, 2015 - link

    LMAO, how much money would you save by having dozens of researchers, scientists, and doctors twiddling their thumbs because the CUDA-based tools and programs they wrote don't work on the hardware you purchased?

    What you and other AMD fanboys don't realize when espousing the bullshit slidedeck benefits of AMD open junkware like OpenCL is that people's time is money, and on a more morbid note, you can't get time back, so it's actually wasting what they consider valuable moments of their lives if they have downtime.

    We have people competing for time on our clusters, so yeah, no need to feel sorry for us; feel sorry for the guy who gets fired for trying a science experiment, hoping support arrives sometime in the next year or so, without first consulting his end-users. Honestly, comments like this show how little real-world experience you have; no one is going to risk their job and livelihood on an unproven commodity, which is why AMD has such a poor showing in the professional and HPC markets.
  • dragonsqrrl - Wednesday, May 6, 2015 - link

    What sort of major problems have you encountered with Nvidia professional products? Hardware, driver related? My personal experience with Quadro's at least (5000, K5000) has been great running workloads in Maya and Adobe CS6/CC.
  • testbug00 - Thursday, May 7, 2015 - link

    A "newer driver" broke some other parts of the system. There were also some issues with it forcing my laptop to shut down.

    Having to adjust the voltage by getting to the PCB to make the GPU stop crashing my computer is no fun.
  • chizow - Thursday, May 7, 2015 - link

    Huh? Sounds like more nonsense. You probably should've just put in for service or RMA.
  • testbug00 - Thursday, May 7, 2015 - link

    On my laptop? Out of warranty? Send it to who exactly?

    Apart from the weird driver bug (which I believe was statistically unlikely and is NOT the norm; given that my other experiences have all been perfectly fine, that is doubly true) and the one time opening it up was a general PITA, everything works.
  • chizow - Friday, May 8, 2015 - link

    So, a momentary case of PEBCAK. I guess you are going to claim your AMD-based systems have never crashed or BSOD'd too, right?

    How come not a single AMD fanboy has answered the simple question whether or not their graphics are at full power/clocks when driving multiple monitors? Still not fixed, right? You'd think AMD would make that a priority given how much emphasis they put on ZeroCore and PowerTune in their slidedecks.
  • 01189998819991197253 - Friday, May 8, 2015 - link

    @dragonsqrrl
    Nvidia has had serious HDMI handshake issues for years. That makes it awful for HTPCs.
  • 5150Joker - Wednesday, May 6, 2015 - link

    Pretty much what chizow said. Even if AMD could best the Titan X at a cheaper price, I'd still go with the Titan X or 980 Ti. Don't want AMD's half-assed drivers or FailSync that can't even do 144 Hz. Then there's their abysmal Crossfire profile support that everyone already knows about.
  • Scannall - Wednesday, May 6, 2015 - link

    I really haven't had any problems with AMD drivers for several years now. And only minor problems with NVidia drivers. That's becoming a non-issue. You do yourself a disservice by assuming things don't ever change or improve.
  • Gunbuster - Wednesday, May 6, 2015 - link

    Oh, case closed, this one person's anecdotal experience ties it all up. No "coming soon" driver promises for FreeSync CrossFire, no broken Enduro and cover-up, no broken frame pacing, no me-too GeForce Experience slathered with advertisements, no ugly plastic red branding, no FreeSync FUD a year after G-Sync... Keep the faith, AMD fans.
  • Refuge - Wednesday, May 6, 2015 - link

    lol same could be said for you.

    Frame pacing was fixed, and G-Sync may have been out sooner, but if we are talking market share they are both basically non-existent, so let's not call that a win so soon.

    Optimus was just as broken.

    We get it, neither is perfect; you just like green more than red. It's fine.
  • chizow - Wednesday, May 6, 2015 - link

    Optimus was broken? Seriously, you're going to have to do better than that if you want to give your posts any credibility. Be honest here, have you ever even used a laptop with Optimus?
  • Gigaplex - Wednesday, May 6, 2015 - link

    Typing this post on an Optimus laptop right now. Yes, it was broken at one point. Yes, it mostly works fine these days, unless you're a Linux user where it still causes major headaches.
  • chizow - Wednesday, May 6, 2015 - link

    What was broken about it? I've used Optimus since it was introduced in 2010, and while there were some app-specific issues that were addressed with context menu launchers, overall the switchable graphics aspect worked seamlessly. The same cannot be said of Enduro, even today.
  • Refuge - Friday, May 8, 2015 - link

    Yes, Optimus was broken and caused a rather weird variety of crashes, performance issues, etc. I honestly place the blame mostly on simple miscommunication between nVidia and OEMs, because a lot of the problems I was seeing at first were simply related to OEM firmware not playing properly with the Optimus drivers.

    But it's been fixed for quite some time now, just like Gigaplex said. Just making the point that things do change, and things do get better.
  • chizow - Wednesday, May 13, 2015 - link

    Except your point is nonsense, because Optimus was fixed early on without nearly as many issues, while Enduro is still junk today.
  • Gunbuster - Wednesday, May 6, 2015 - link

    Oh really? Did Optimus leave people with $1800 laptops unable to game for months because the driver was broken and then go around deleting forum threads of anyone who dare point it out?
  • Scannall - Thursday, May 7, 2015 - link

    Does NVidia pay you to troll forums? If not, why all the brown nosing? Personally, I am a fan of the best bang for the buck when I go to buy a video card. That seems to change month to month, if not week to week. Sometimes I buy NVidia, sometimes AMD.

    And really, drivers are a non-issue. Been that way for quite a long time now.
  • chizow - Thursday, May 7, 2015 - link

    No, they don't need to pay me a thing; I'm quite happy with my salary, and this is just a fun way to pass the time. It's always a pleasure holding AMD and their fanboys accountable, and I do enjoy going back and reading all the stupid crap AMD fans say over the years; it's like reading parallel-universe fiction.

    If your bang for the buck preferences and buying requirements change month to month and week to week, that just means your standards and use cases are extremely novice/superficial. The bang doesn't really change, and the buck is always going to be competitive.
  • 01189998819991197253 - Friday, May 8, 2015 - link

    @chizow
    What are you, some sort of GPU SJW?
  • Refuge - Friday, May 8, 2015 - link

    Agreed, I've not had GPU driver issues in years.

    Now games coming out in BETA on the other hand... Lets redirect some anger at them for that. When my hardware and drivers are proper, I shouldn't have to wait 3 months until after release for a game I pre-ordered to be playable....
  • chizow - Wednesday, May 6, 2015 - link

    Yeah, not going to dismiss your experiences, but I think it's important to keep in mind there are going to be different use cases beyond your own that may start to show some of the problems with AMD drivers.

    Don't get me wrong, if you have one monitor and occasionally play some games, AMD drivers may be just fine, but if you delve a bit deeper and come to rely on some of these more advanced technologies, and game support for them, that's where you may start running into problems.
  • just4U - Wednesday, May 6, 2015 - link

    I almost never have problems with Amd or Nvidia drivers on 100s of builds /w systems used in a variety of ways. I would be another one who'd say that both companies are on par when it comes to drivers.
  • MisterAnon - Thursday, May 7, 2015 - link

    On the other hand my Nvidia drivers crash when doing simple things like playing League of Legends.

    I think wasting money on a Titan X has fried your brain to the point where you're incapable of being impartial and realizing that Nvidia does everything you accuse AMD of.
  • chizow - Thursday, May 7, 2015 - link

    You must be doing it wrong? Nvidia sponsors League of Legends events and a number of top 10 teams, no drivers crash, must be a PEBCAK issue.

    The funny thing is, I would never dream of spending $1000 on AMD GPUs, there's just too many problems that would need to get resolved first, and no, waiting for a maybe solution 8-12 months down the road isn't going to work when these cards depreciate as fast as they do.
  • Hicks12 - Wednesday, May 6, 2015 - link

    I don't understand the issue with AMD drivers; I go for whatever team makes the best card for the money, be it the red or green team.

    I used to rock a GTX 480 (yay for free upgrades!) and moved to a 7950 later on. If anything, drivers have been more stable on the AMD side of late, as I have always had issues running multiple monitors on Nvidia GPUs (very finicky). I would assume it's been corrected in the latest batch, but I don't know, as I don't currently own an Nvidia card. But I can't find a single broken feature for myself on the AMD front.

    Crossfire does work, not sure where you got that information from? Does it work 100% of the time with 0 faults? Hell no, but neither does Nvidia's SLI solution.

    FreeSync works as advertised and was integrated back into the VESA standard, which shows how solid the idea is (yes, AMD simply revised an old system to work for this century). G-Sync has issues, and FreeSync is still gaining momentum; you pay about £50 more for a FreeSync monitor vs a standard one, so the small outlay is great if you're looking to replace one. G-Sync requires some consideration, as it's significantly more expensive and you're locked in with Nvidia. That's fine if you know you will always be rocking a green card, but honestly, no one knows what will happen in 5 years' time!

    G-Sync is good, but FreeSync does the job (seems a lot of reviewers came to the same conclusion) and will only improve over the next year. It would be nice if Nvidia simply integrated support for FreeSync; then no one would ever have to decide on a GPU manufacturer just because of this silly feature that should have been an industry standard years ago.

    Think I've lost my point... all I see is AMD bashing, but it seems unjust, as drivers are okay on average for both sides now. If you went back 5 years, then yes, AMD were flaky, but not any more.
  • chizow - Thursday, May 7, 2015 - link

    Hi, you may want to do some fact-checking before you post in reply.

    CrossFire was in reference to FreeSync; still broken, there was just an update on it saying the driver was delayed.

    FreeSync doesn't work as advertised; there are gaps in its VRR windows, at which point it drops and reverts to VSync off and exhibits all that unpleasant behavior. Also, on some panels it disables Overdrive completely, causing ghosting. And the new Asus panel? Its VRR window is limited to 90Hz using FreeSync but goes up to 144Hz without it. Certainly not the 9-240Hz AMD claimed and their fanboys parroted, inaccurately.

    You're locked in on the monitor either way, so once again, why not spend more on the better solution, especially given Nvidia offers the better GPU solution also? What is $1050 ($750 G-Sync + $300 970) in hardware compared to $900 ($600 FreeSync + $300 290X) in hardware to someone spending almost $1000 anyways?
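    The out-of-window fallback behavior being argued over here can be sketched as a toy decision function. To be clear, this is purely illustrative (the 40-144Hz bounds are example values for one panel, not a spec), and it is not any vendor's actual driver logic:

    ```python
    # Conceptual sketch of variable-refresh behavior relative to a panel's VRR window.
    # Window bounds and the above-max policy are illustrative assumptions, not vendor code.

    def refresh_behavior(fps: float, vrr_min: float = 40, vrr_max: float = 144,
                         vsync_above_max: bool = False) -> str:
        """Describe what the display does at a given frame rate."""
        if vrr_min <= fps <= vrr_max:
            # Inside the window: refresh rate tracks the frame rate.
            return "variable refresh (no tearing, no added latency)"
        if fps < vrr_min:
            # Below the window: falls back to fixed refresh behavior.
            return "below window: fixed refresh, tearing/stutter returns"
        # Above the window: either cap with vsync (latency) or tear with vsync off.
        return ("above window: vsync on, capped with added latency" if vsync_above_max
                else "above window: vsync off, tearing returns")

    print(refresh_behavior(60))    # inside the window
    print(refresh_behavior(25))    # below the window
    print(refresh_behavior(200))   # above the window, vsync off
    ```

    The point of contention in the thread is exactly the two fallback branches: what happens when the frame rate leaves the window.
    
    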
  • testbug00 - Thursday, May 7, 2015 - link

    Er, Freesync works exactly as intended:
    1. In the refresh range that the MONITOR MANUFACTURER CHOOSES, it has a variable refresh rate.
    2. What do you mean gaps in VRR? Any proof? All the "major issues" I've seen with FreeSync have NOTHING to do with FreeSync itself; they have to do with the monitor manufacturer, or are FUD.
    3. The 9-240Hz range is the specification of the refresh range that Adaptive-Sync supports. For a manufacturer to make a 240Hz monitor and get everything working so it runs 9-240Hz smoothly would be really expensive.

    And the fact that you consider the difference between $900 and $1050 negligible clearly shows that you are in the top sliver of the population. And, for the record, the difference between the 1440p monitors (what I would pair with a 970/290 for most people) is $480 to $749. If you're aiming for a 1080p display, G-Sync seems to be the better buy.

    However, well... given you're going for a "value 1440p" build, the AMD FreeSync version would currently be ~$1600 (http://pcpartpicker.com/p/MFvb6h) while the Nvidia version would be ~$1900 (http://pcpartpicker.com/parts/partlist/). You could equalize the price by going 1080p with Nvidia, or lower it a bit by going 2160p (with a max of 60Hz, however).

    G-Sync has some fringe advantages, but for $300? I could not recommend that to the people I deal with. And I spend a lot of time configuring computers for friends and people I barely know. For a 1080p build, however... Nvidia is currently the winner, hands down! :)
  • chizow - Thursday, May 7, 2015 - link

    1. Except that's not what AMD said during the run-up. Nowhere did they state such VRR window limitations in all the pressers and interviews they did. They also published a whitepaper that fanboys like Creig dishonestly quoted numerous times, stating 9-240Hz supported refresh rates, in a clear attempt to make FreeSync look better.

    2. There's plenty of proof: below the stated VRR window, VSync is forced off, so you get a lurch followed by ghosting, tearing and stuttering. Any review, even Anandtech's superficial one, covers this. At high refresh rates you get the same behavior, although you can choose to force VSync on at the top end. It is less noticeable at high refresh, though, since any tearing is going to be less pronounced.
    3. Again, this was quoted by MANY AMD fanboys as one of the reasons FreeSync was claimed to be superior to G-Sync, months before anyone even had a chance to prove it! See, this is exactly the kind of BS/misinformation that AMD put out there that simply dies hard. Same for it being free, firmware flashable, a standard every DP monitor will have, etc. It sets up FALSE and UNREALISTIC expectations that ultimately turn out to be bogus. That's what AMD loves to do, though; because they didn't actually have a working solution, they used this kind of FUD and misinformation to try and hold off adoption. Doesn't matter now, though; the world gets to see what they have been working on for the last 21 months, and it's not real good lol.

    I didn't say I couldn't tell the difference. I said $150 for anyone who is spending that much is not going to be too much to spend for the better solution, especially when there's a good chance they won't have to spend that much. With Nvidia, there's a good chance the cards you own already support G-Sync. With AMD, there's a good chance you're going to have to upgrade if you don't already own a 290/X.

    And what are you comparing at $480? The sucky Acer TN to the awesome Swift? Nah, that Acer is junk, but if you want a better comparison you can use the Asus IPS 1440p vs. the Acer IPS 1440p, and you can see the difference is only $150-$200. Which is about right, given the Asus only supports 90Hz refresh in VRR FreeSync mode. Once again, it looks like the Nvidia premium is justified.

    I guess that's good you recommend AMD to people you barely know; it'd probably be pretty hard to get repeat business from them making the kinds of recommendations you're giving without even knowing the kinds of issues you're setting them up for. Certainly better than explaining to them why there's ghosting they haven't seen since 2007 PMVA days, or why their FreeSync panel keeps popping in and out of tearing/stuttering modes, I guess.
  • testbug00 - Thursday, May 7, 2015 - link

    1. The FreeSync slide clearly says "published refresh rate range == 9-240Hz" because it's based off of Adaptive-Sync, which... has a published range of 9-240Hz!!!! The issue is not what AMD did. The issue is that people who support or don't support it don't seem to understand that is the SPEC, which does not reflect what shipping products will have. Shipping products are doing 40-144Hz. Hence, currently, G-Sync has a slight lower-range advantage. Depending on which FreeSync monitor you get, you may have a range that is even worse.

    A side note before we continue, Anandtech's Freesync review(s) have all been pitiful compared to what some other sites have done. The best one I've managed to read so far in terms of exploring the differences between Gsync and Freesync is techreport (http://techreport.com/review/28073/benq-xl2730z-fr...

    2. Ghosting is caused/controlled by the monitor firmware/control panel; FreeSync is not involved.
    Tearing and stuttering happen on every monitor to varying degrees. Below (or above) the VRR window, the monitor runs at a fixed max refresh rate, causing tearing and stuttering.

    3. Once more, AMD was very clear in their slides that that was the published range of the spec. The issue is people on both sides blowing things up. A monitor implementing the full Adaptive-Sync range would be better than current G-Sync monitors, given the firmware/control panel is tuned for ghosting/overdrive/etc. That product, QA and such would probably end up costing around the same as G-Sync does. So FreeSync can deliver a slightly worse product for a noticeable price reduction, or a better monitor for the same price. Now, "the same price" is my guesstimate. It could be wrong.

    As for the monitors... I've honestly stopped pushing for nicer IPS, provided the display can be calibrated properly. Personally, I would pay the extra. Most people I've dealt with aren't willing to. Hence the larger difference in price due to the monitor.

    Given it was me buying the setup, I would likely end up going for a FreeSync display, as once Adaptive-Sync is enabled by Nvidia/Intel it will essentially be hardware agnostic, and afterwards buy a cheap GPU (750/750 Ti level of performance) to drive the games I play at low settings until 16FF comes out. In general, recommendations end up being whatever is the best in the price range: ~3 years ago an i5 + 7870 (3-monitor setup). Since the 970 came out... well, that one's obvious. Same for when Hawaii came out.

    I do a bit of support for any issues with that stuff, and I've not noticed any major issues for either vendor. Of course, my sample size is likely not representative of the population as a whole, nor large enough even if it were. If you had ~80 identical systems, 40 Nvidia and 40 AMD, in a population that represents the average video card owner... perhaps you could draw some useful information. However, as you've said, you run pretty much exclusively Nvidia and have experience with a whole 1 AMD card in the sample.
  • chizow - Friday, May 8, 2015 - link

    1. Again, now you're in the awkward position of making excuses for AMD's published deceptive specs. Don't pull a Creig here and keep insisting that some day far off in the future FreeSync may support 9-240Hz, but that today it is better because an AMD spec sheet said so. It's dishonest, plain and simple. AMD should have published specs according to what they knew and what was available on their test samples, but again, we know this wasn't possible, because they published those BS specs when they did not have product!

    2. Did you read the TFT Central review I linked you? It is clearly linked to FreeSync, because the FreeSync command overrides and directly conflicts with the OverDrive command, thus disabling anti-ghosting measures only when FreeSync is enabled. Tearing and stuttering don't ever happen with G-Sync, actually, because Nvidia has provisions on both ends of the spectrum to explicitly prevent them, while AMD only deals with these fundamental issues within a much more limited VRR window. So surely you can agree that, with so many asterisks and special cases attached to FreeSync, AMD was dishonest when they said FreeSync would actually be free, and that all that hardware Nvidia was charging for might actually be worth the premium?

    3. No, it is not people on both sides; it is one side putting out nonsense and the other calling them on it. AMD put out bullshit misinformation because they didn't have product and they were trying to slow adoption. This is their MO. Their fanboys take this misinformation and run with it, scream it from the rooftops and perpetuate it, and it simply never dies. Even today you still have dim-witted AMD fanboys asking when the firmware flashes will be available for their half-a-decade-old monitors. Where did they get this idea? No, it couldn't be AMD at fault, could it? The ones who coined the misnomer FreeSync to begin with and told everyone who would listen that existing monitors on the market could support their spec with a firmware update, for essentially free? But I guess you will exonerate them of this as well when people start seeing some awful ghosting and tearing even in that 30-40Hz range and wonder why FreeSync isn't working at 9Hz like AMD claimed?

    Again, I don't think people who are going to pay $600+ for a monitor are going to balk at another $100-$150 if it means getting a better product. Indeed, we've seen the top G-Sync panels sold out consistently at launch, like the Asus ROG Swift and Acer 27" IPS. Meanwhile, the FreeSync panels are available in abundance everywhere they are sold.

    You could buy a FreeSync panel hoping Nvidia adopts Adaptive Sync someday, but you'd be making a pretty big mistake with that expectation. Again, what guarantee do you have that even if Nvidia were to adopt an Adaptive Sync solution, that panel would be compatible? That AMD would even allow it? You know these panels are certified under a logo program, one that is trademarked by AMD. In any case I would love to see it honestly, it would just be one less concern when buying an Nvidia GPU as you would get your choice of either Adaptive Sync or G-Sync. Win-win for Nvidia users.

    As for the last bit, sorry to burst your bubble, but the real world isn't a 40/40 split. It's more like 60/20 nowadays, and you still generally see fewer issues with Nvidia users and systems. Also, my experience with AMD isn't just limited to the 290X; my wife actually used an AMD card before we met, and her 5850, despite being a media darling, was a disaster when it actually came to game compatibility. Sims 3, one of the most popular games in the world; you would think AMD had their drivers straight there. Nope, no SM2.0 for pet hair/fur lol. Also, broken in-game MSAA support on AMD cards; you had to enable FSAA via driver, which was MUCH more performance intensive. Ultimately I "downgraded" her to an old GTX 280 I had lying around, and it was actually a much better experience despite being 15-20% slower on paper.
  • Hicks12 - Friday, May 8, 2015 - link

    Thanks, you're correct about FreeSync with CrossFire; I didn't know you were specifically talking about that, so I will give you that :).

    The price though... sorry, I don't see it as only a $150 difference; a quick look shows the Acer is $300 more for the G-Sync model... That's insane! You could spend that $300 on a decent GPU upgrade.

    Nope. G-Sync is more mature, but FreeSync is better for the market in the end, and it's exactly why VESA put it in their standard.
  • chizow - Friday, May 8, 2015 - link

    @Hicks12 Do you already own an R9 290/X? Because if not, that's your $300 right there. And the premium on the Asus is well deserved (it's actually $200 now) because it's a premium-build panel, unlike the Acer, which is pretty poor build quality before you even get to the inferior VRR results from FreeSync.

    But hey, feel free to go that route, I think it is important for everyone to go with their gut and see what works best for them!
  • Crunchy005 - Monday, May 11, 2015 - link

    Hmm a nice monitor with freesync and a Gcard UPGRADE for the same price as the other monitor. I don't see the issue there.
  • chizow - Wednesday, May 13, 2015 - link

    Because you are paying the same amount for an inferior solution at that point, but no surprise you don't see an issue there, you are an AMD fan after all.
  • Zefeh - Wednesday, May 6, 2015 - link

    If you haven't been keeping track of information on AMD's 390X, then you'd be sorely mistaken. The HBM memory gives the card around 640GB/s of bandwidth at 1.25GHz; that's DOUBLE the 290X's 320GB/s running GDDR5 at 5GHz effective. Also, the benchmark rumor mill has the 390X running just above the Titan X in performance at a ~$700 price point. Add in the fact that they are going to make a dual-GPU card using this chip, and it's amazing. Watch out, because AMD has a HUGE powerhouse coming thanks to HBM!
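    For what it's worth, the quoted figures fall out of the standard peak-bandwidth formula: bus width (bits) times per-pin data rate (Gbps), divided by 8. A quick sketch; the 4096-bit HBM bus and 512-bit GDDR5 bus at 5 Gbps effective are assumptions drawn from public spec sheets, and they happen to reproduce the quoted numbers:

    ```python
    # Peak theoretical memory bandwidth: bus_width_bits * data_rate_gbps / 8 bits-per-byte.
    # Bus widths below are assumed from public specs (4096-bit HBM, 512-bit GDDR5 on the 290X).

    def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
        """Peak theoretical bandwidth in GB/s."""
        return bus_width_bits * data_rate_gbps / 8

    # First-gen HBM: 4096-bit bus at 1.25 Gbps per pin -> the quoted 640 GB/s.
    hbm = peak_bandwidth_gb_s(4096, 1.25)

    # 290X GDDR5: 512-bit bus at 5 Gbps effective -> the quoted 320 GB/s.
    gddr5 = peak_bandwidth_gb_s(512, 5.0)

    print(hbm, gddr5)  # 640.0 320.0
    ```

    Note this is peak theoretical bandwidth; sustained real-world bandwidth is always lower.
    
    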
  • chizow - Wednesday, May 6, 2015 - link

    It's actually a full 1024GB/s of bandwidth, but bandwidth alone means nothing. Not much point in putting 20" racing tires on a hamster, for example. But keep hope alive! AMD needs it!
  • MisterAnon - Thursday, May 7, 2015 - link

    @chizow

    Considering a last generation part like the R9 290 matches Nvidia's current 970 in performance your comparison makes no sense.

    It just makes you look like a mad fanboy, but I don't blame you. I wouldn't be able to live with myself without being in perpetual denial either if I wasted money on a Titan X (lol).
  • chizow - Thursday, May 7, 2015 - link

    It actually makes plenty of sense, since the 290 is slower than the 970 and has more bandwidth already. So yeah, adding more bandwidth to a part that doesn't need it or can't make use of it, to any non-idiot, would be like putting more lipstick on a pig. But its OK, I'll let you AMD fanboys learn and be disappointed the hard way.

    And of course you wouldn't be able to live with yourself; AMD fanboys are generally tech bottomfeeders, so yeah, I know you wouldn't be able to wrap your mind around spending top dollar for the best components, and that likely won't change given the bargain perspective that is clearly most important to you.
  • MisterAnon - Thursday, May 7, 2015 - link

    You are one sad nvidia fanboy. Did his comment about you wasting money on a Titan X really trigger you that hard? Hit a little too deep?

    The performance of a R9 290 from 2013 is just about even with a 970 at 1080p and even greater at higher resolutions despite being a generation behind. If you actually wasted money on a Titan X I can see why you're so mad considering 400 dollar cards are about to come out that essentially match it in value and performance. Ouch.
  • chizow - Thursday, May 7, 2015 - link

    290 is slower and also uses nearly 2x the power but yeah, it is an older, inefficient part. The main benefit of the 970 was that it cut pricing on existing parts, including the 290/X and brought that level of performance down to a $330 price point. Big win for everyone, even AMD fanboys like yourself. How many 290/X do you own btw?

    I mean for $1000 you could buy maybe 3 or 4 bargain basement 290/X and Criss-Cross-Bonfire your way to victory! LMAO no thanks, the thought of dealing with CrossFire drivers and profiles that never get updated.... I'll pass, give me one of the best please: Titan X.
  • testbug00 - Thursday, May 7, 2015 - link

    290 uses about 70-80 more watts than a 970. Or, about 40-50% more. On average gaming load. Peak power while gaming is slightly higher difference.

    IF you run a power virus or heavy FP64 stuff, the 290 can approach 2x+ the power draw of the 970... But, it is also a lot faster than 2x in FP64.
  • chizow - Thursday, May 7, 2015 - link

    No, it's more like 70% if you use 135W for the 970, so yeah, nearly double. And the 970 is faster than the 290, closer to the 290X in performance, but when it comes to power it really is double.
  • testbug00 - Friday, May 8, 2015 - link

    Er, from what I see, under a gaming load the power draw of the 970 is closer to 150-160W, and the 290 closer to 235-245W. That's about a 50% increase. Power-virus loads are 160-170W versus 310-320W.

    A 980 uses approx 175-185. 290x about 255-265. Those are under gaming load.

    However, it is still a HUGE DIFFERENCE in LARGE favor of Nvidia. Good for their engineers (well, and their margins/stock price.)
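For what it's worth, the percentage gaps implied by those figures are easy to check. A rough sketch using the midpoints of the ranges quoted above (the wattages themselves vary by review and test scene, so treat this as illustrative only):

```python
# Percentage power-draw gap implied by the gaming-load figures quoted above.
# The wattages are midpoints of the quoted ranges, used purely for
# illustration; actual numbers vary by review, game, and test scene.
def pct_increase(low: float, high: float) -> float:
    """How much larger `high` is than `low`, as a percentage."""
    return (high - low) / low * 100

gtx_970, r9_290 = 155, 240    # midpoints of 150-160 W and 235-245 W
gtx_980, r9_290x = 180, 260   # midpoints of 175-185 W and 255-265 W

print(f"290 vs 970:  +{pct_increase(gtx_970, r9_290):.0f}%")   # about +55%
print(f"290X vs 980: +{pct_increase(gtx_980, r9_290x):.0f}%")  # about +44%
```

So on these (assumed) midpoints the gaming-load gap lands near 50%, not 2x; the near-2x figure only appears under power-virus loads.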
  • chizow - Friday, May 8, 2015 - link

    Yeah again, you can spin the numbers however you like, but at the end of the day, 970 uses almost half the power of the 290, and is still faster than it. Pretty damn amazing!
  • Crunchy005 - Monday, May 11, 2015 - link

    Love how your comments keep pointing out how much more power we can get from AMD than Nvidia. Man, 3-way CrossFire for the same price as one Nvidia card. Although can't you also get a lot more power from two 980s in SLI than a Titan X for the same price? Wow, that is one overpriced card.
  • chizow - Wednesday, May 13, 2015 - link

    How do those 3 CrossFire turdpiles perform with FreeSync? Oh right, lol, still broken and delayed. Money well spent on that Titan X. :)
  • StevoLincolnite - Saturday, May 9, 2015 - link

    At least the Catalyst drivers worked when Windows Vista launched; nVidia was responsible for almost 30% of all Windows Vista's blue-screen crashes at one point.
    Yup, great drivers.

    nVidia's drivers have been thoroughly problematic during Windows 10's development.

    AMD's drivers, on the other hand, have never had such massive issues across its entire market, so this fabrication of lies you tell needs to stop.
    These are multi-billion-dollar companies; if they make the wrong move (as all companies do!) then they should be ridiculed for it. nVidia's past driver mistakes make AMD's look gold-plated by comparison.
  • chizow - Wednesday, May 13, 2015 - link

    Nvidia had a similar stranglehold on the market back then, so it's really no surprise they had more issues with Vista.

    And Nvidia's drivers have been problematic during win10 development? More nonsense, they were first to have WDDM 2.0 drivers for Win10 preview and DX12 drivers, so clearly they are the leading platform for Win10.

    But I guess the most damning evidence is AMD's own Omega driver release, which details 400 bug fixes, some of them pretty egregious (grey/black screens). I guess on one hand it's great that AMD actually fixed all these bugs, but it also makes you wonder what took them so long on some of these. And more importantly, it shows AMD users like you are either full of it when they claim they don't experience driver problems, or you're too embarrassed to acknowledge them or file bug reports when they happen, as if that would hurt AMD's brand and image more than buggy drivers to begin with!

    http://techreport.com/review/27481/catalyst-omega-...
  • betam4x - Thursday, May 14, 2015 - link

    Not an AMD fanboy (my favorite card is my Maxwell-based 750 Ti), but you are trolling. I have a Radeon 6970 in my gaming machine. My friends all have GeForce GTX 760s and 960s. They always have issues with the games we play (GTA V, Counter-Strike: Global Offensive, etc.), mainly alt-tabbing and random crashing, especially when streaming. I've never had any issues at all with my 6970. The drivers are rock-solid stable.
  • WaltC - Thursday, May 7, 2015 - link

    Anyone who ran out and jumped on a Titan X will have ample time to reflect on the virtues of patience...;)
  • chizow - Thursday, May 7, 2015 - link

    Nah the time to reflect is from all the hopefuls waiting months, to MAYBE attain that level of performance at a lower price point. Life is too short, if the performance is there and you have the coin, why not?

    But I know, AMD fans have the virtue of patience...waiting for that performance, waiting for those driver fixes, wait...how many more months til we maybe see an updated AMD part again? I will be sure to reflect how long I was able to enjoy Titan X (almost 2 months already wow!)
  • 01189998819991197253 - Friday, May 8, 2015 - link

    @chizow
    Life is too short, yet you waste hours making dozens of posts here. It's a safe assumption that your time isn't very valuable.
  • chizow - Friday, May 8, 2015 - link

    My time is valuable, but when you are efficient there's plenty of time during the work day to pass the time. :)
  • Chaitanya - Wednesday, May 6, 2015 - link

    I hope the AMD Rx-300 family of GPUs is energy efficient like the much older Radeon 7xxx series of GPUs.
  • chizow - Wednesday, May 6, 2015 - link

    It will certainly be interesting to see how much total board power AMD will save by going to HBM. They quote ~50% less power than GDDR5, so if we take a typical 250W Hawaii-based part and assume maybe 50W of that is just RAM, a 50% reduction from moving to HBM would give 225W instead of 250W, roughly a 10% decrease just from the RAM, which wouldn't be bad at all. Maybe another 10-15% drop from refining the existing ASICs and you are looking at a 200W part instead of a 250W part. With some perf bumps from arch changes and clock bumps you might be getting close to Maxwell-level efficiency, but I wouldn't expect too much from this 2015 gen; the big 2x efficiency bump wasn't on the slide until 2016.

    The one area they will definitely benefit from HBM is with a dual-GPU design, where VRAM takes up massive amounts of board space, especially on an X2 part. They will be able to fit 2xGPUs on there np, but that probably won't necessarily mean a smaller cooler or board size, unless they slap two AIO water blocks on there.
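The back-of-the-envelope arithmetic above is simple enough to sketch. Note the 250 W board power and the 50 W memory share are assumptions for illustration, not official figures; only the ~50% memory-power reduction comes from AMD's HBM slides:

```python
# Back-of-the-envelope HBM board-power estimate from the comment above.
# board_w and gddr5_w are assumed round numbers, not official specs.
board_w = 250.0      # typical Hawaii-class board power (assumption)
gddr5_w = 50.0       # share of that attributed to GDDR5 (assumption)
hbm_saving = 0.50    # AMD's quoted ~50% memory power reduction

new_board_w = board_w - gddr5_w * hbm_saving
pct_drop = (board_w - new_board_w) / board_w * 100
print(f"{new_board_w:.0f} W board power ({pct_drop:.0f}% lower)")  # 225 W (10% lower)
```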
  • testbug00 - Wednesday, May 6, 2015 - link

    I think that AMD is going to probably push for higher clocks on current HBM cards. Unless they somehow manage to get Fiji into a notebook! A 400W+ notebook XD

    Fully agree with HBM dual GPU designs. ^__^
  • testbug00 - Wednesday, May 6, 2015 - link

    A suggestion for the site for things such as this, perhaps do a liveblog for these kinds of events and after that's done separate it into several stories? You have 7 pipeline stories where you probably only need 3-4.
  • Ryan Smith - Wednesday, May 6, 2015 - link

    This one is intentional. We wanted to break up each of these items individually.
  • Creig - Thursday, May 7, 2015 - link

    Any chance for a "vote up/down" button for comments? And negative comments that fall below a certain threshold get automatically hidden? There are certain posters in these comments sections who constantly rant/rave/insult and generally annoy the rest of the people who are trying to hold meaningful discussions. That way, the readers have control and can self-moderate these dialogues.

    Just a suggestion.
  • testbug00 - Thursday, May 7, 2015 - link

    Just having a better threading system would be nice. It gets hard to see who is replying to whom once you get past the first 3-4 replies, where they all sit at the same indent without an "in response to X" note or something like that.
  • chizow - Thursday, May 7, 2015 - link

    Yeah, it's annoying when people who don't even buy or use these products give bad advice over and over again. C'mon Creig, put your big boy pants on, stop crying, and join the conversation!
  • Creig - Friday, May 8, 2015 - link

    What's the matter, chizow? Afraid that you fall into the "rant/rave/insult" category? If the people who read these comments find your posts to be helpful and insightful then you won't have anything to worry about. If they find your comments to be annoying and immature, then you likely have bigger problems than being voted down and may want to engage in some serious self-reflection about your attitude.
  • chizow - Friday, May 8, 2015 - link

    I'm not worried about anything Creig lol, have you ever considered the only reason you find my posts annoying and immature is because I call you and like-minded fanboys out for all the stupid things you've said over and over throughout the years?

    Looks like it's working, you've been MUCH more responsible in your posting tendencies and recommendations lately.

    Indeed, I haven't seen you misinforming the community lately on why FreeSync is better than G-Sync because some dishonest guy at AMD put 9-240Hz supported on a whitepaper 10 months before AMD even had a viable VRR solution! Mission Accomplished!

    I don't expect you to reply to this either, but I am sure if there was a downvote button, you and all your AMD fanboy buddies would be mad-clicking it!
  • Crunchy005 - Monday, May 11, 2015 - link

    An up/down vote might give him a visual on how his posts are not wanted... might knock some sense into him.
  • chizow - Wednesday, May 13, 2015 - link

    Not wanted by whom? AMD fanboys? LOL, onoes, I'm not going to call out AMD fanboys on their typical BS because it hurts their delicate fanboy sensibilities! Yeah. No.
  • testbug00 - Thursday, May 7, 2015 - link

    two articles about GPUs this year and the next. An article on K12 that might as well be integrated into the Zen/skybridge article. However, it is certainly not my site, it is run well. I assume there are reasons for doing things the way they are done! Thanks for the reply.
  • rocketbuddha - Wednesday, May 6, 2015 - link

    So Mantle is nowhere to be found??
    Once again, only DX12.
    Why not also add OpenGL Next, aka Vulkan??
  • Refuge - Wednesday, May 6, 2015 - link

    They already announced an end to Mantle, well not in those exact words... but same point yes.
  • Senti - Wednesday, May 6, 2015 - link

    No need to worry about Mantle – as Vulkan is effectively Mantle 2.0.
    I'm also irritated with DX12 hype, it's way too far from broad adoption.
  • Stahn Aileron - Wednesday, May 6, 2015 - link

    Maybe so, but getting the hardware out in lockstep with the API is very useful. ATI did the same thing back in the day with DX9 and the old Radeon 9x00. Not many people like the idea of waiting on hardware releases to catch up with software. The same can be said vice versa, but getting software to match hardware is a LITTLE bit easier: software can be modded and released faster than implementing changes in hardware and getting those to the customers. Future-proofed hardware isn't bad. If anything, getting the hardware out ahead of the API release means devs will actually start using the new API sooner rather than later.
  • Krysto - Wednesday, May 6, 2015 - link

    Not to mention DX12 only has half of the "new-era graphics API" features that Vulkan has. DX12 is more like Mantle 1.0 or Mantle 1.1.
  • iniudan - Wednesday, May 6, 2015 - link

    Mantle is only experimental now; it's still supported, but there's no point in mentioning it anymore.

    As for Vulkan, it is still in development with an unknown release date, so there's little point in mentioning it for a Q2 product, as there will likely be no software using it before Q4 at the earliest. Valve is likely to be the first to bring something out using it: the experimental Intel Vulkan driver, the first driver with Vulkan support, was created by people subcontracted by Valve, so it's a safe bet that Source 2 will be the first engine using Vulkan, and we can estimate that the Source 2 release is planned around the SteamOS release, which is in Q4.

    Meanwhile, we know that DirectX 12 is likely coming out in late Q2, along with Windows 10 and the Xbox One update to it.
  • Arizzle4l - Thursday, May 7, 2015 - link

    AMD's site still shows Mantle.
    http://www.amd.com/en-us/products/graphics/desktop...
  • extide - Wednesday, May 6, 2015 - link

    Who says freesync can't do 144 hz? It can do any frequency the monitor supports, just someone has to release a monitor with a panel that supports those high rates..
  • dragonsqrrl - Wednesday, May 6, 2015 - link

    "just someone has to release a monitor with a panel that supports those high rates.."

    I think you may have just answered your own question.
  • kyuu - Wednesday, May 6, 2015 - link

    Don't bother arguing with chizow on anything related to AMD. They must have kicked his dog and raped his wife or something. That's the only way I can fathom his irrational viewpoint on anything related to them.
  • testbug00 - Thursday, May 7, 2015 - link

    Freesync can, the "controversy" is about the fact that ASUS released/is releasing a 144Hz Freesync monitor where the Freesync range is only from 40-90Hz. Nothing to do with what Freesync does, everything to do with how ASUS chooses to implement Freesync.
  • Krysto - Wednesday, May 6, 2015 - link

    I hope to see some fast Vulkan support once the spec is out too, but considering AMD is kind of terrible with open source stuff, I'm not going to keep my hopes up.
  • Pwnstar - Wednesday, May 6, 2015 - link

    Like nVidia is much better:
    http://wccftech.com/nvidia-drivers-open-source-fri...
  • ZeDestructor - Wednesday, May 6, 2015 - link

    At least Nvidia closed-source drivers work on *nix. With current versions of X. Can't say the same for AMD I'm afraid, though you're welcome to loan me a GCN-based AMD GPU.. the last AMD GPU I used was the 4650M, quite a long time ago.
  • testbug00 - Thursday, May 7, 2015 - link

    Vulkan is based heavily off of Mantle/PS4-code/XboxOne-code. Same for DX12. Now, there will certainly be differences and things they need to adapt to for both!
  • der - Wednesday, May 6, 2015 - link

    HBM all the wayyyyyyyyyy

    42 comment is love, 42th comment is life.
  • Refuge - Friday, May 8, 2015 - link

    42nd*
  • NinjaFlo - Thursday, May 7, 2015 - link

    I've bothered to read all of the comments here and I've come to a couple of conclusions. Admittedly, I am a fan of Wintelvidia (can you call it this? lol), but still hold AMD in somewhat high regard. I'd like to just express some of my thoughts and see what you guys think - agree or disagree, enthusiastic or salty, I don't really care; just want to see people's reaction and thoughts.

    First, in relation to all the comments thus far, despite all of the people against @chizow, I gotta hand it to you: you have some big brass ones, inciting the wrath of all the AMD fanboys - though you were kind of being a dick about it.

    Here are my conclusions, based on 'objective' viewpoints:
    1) Does the product you seek to buy offer the performance you wish to achieve?
    2) Does the product you have found meet those requirements at a reasonable [performance/power-consumption/thermal-output/noise] ratio?
    3) Does the product you have chosen offer you stability in all areas of work and/or games?
    4) Does the product you selected offer compatibility and/or support for other similar products?
    5) Does the product you picked limit your upgrade path and/or satisfy its intended life-cycle?

    Everything else, including monetary/financial/pricing concerns, are subjective... which is where I think a lot of Nvidia and AMD fans disagree.

    For myself, I set a goal: I must get at least 1080p at 60FPS with at least High quality settings. This can change for many people, when they choose that they want to play at 1440p or 2160p, etc. Then I decide what the best product is based on the goals I have set, which are not unreasonable goals. In fact, they are goals or criteria revolving around, and founded upon, the core concept of striving for efficiency. If I am being wasteful of my time, resources, effort, or energy, then I am not being efficient. And as an individual, whose preferences can be completely different from another person's (who has every right to choose what they want), I value efficiency.

    So when I look at a Radeon 290X, I can go "wow, that's pretty amazing for the price..." but then realize that it does not fulfill the requirements of Goals 2 and 3. Strictly-objectively speaking, the 290X runs too hot and consumes way too much power, even for something "last-gen". The same could have been said for Fermi's GTX 480, but it's 2015 now. We have seen over the last few years that Nvidia has improved my criteria of Goal 2, whereas AMD has succeeded in meeting Goal 1, but not 2 - thus risking 3. Where's the efficiency in that?

    I have friends who have AMD builds who are happy about their purchase, and I respect them for that, and I understand why they are happy. Simply because they had different goals and criteria from my own. Instead of striving for strict efficiency, their goal was to stay within a price range and pick the best product within their limited range, and this is perfectly fine... though I would most often argue that efficiency will, 90% of the time, get you the performance that you desire for a reasonable price. But it was their choice to go that route, and it was my choice to go mine. The difference between @chizow and me is that he prefers to troll and I... well I like to troll too : D

    I plan to upgrade to 4K before the end of this year, and I too am holding out to see what AMD has to offer... since Nvidia will then most likely release a product that fulfills my demands/goals at a lower price point than the Titan X. Back when the 295X2 was released, that was super tempting, but it still failed to meet 'rules' 2 and 3, while excelling in 4 (add 295X2 + 290X for Tri-CFX!? the temptation is real). But ultimately, I have reduced my build in physical size in my pursuit of quiet, small-footprint, and efficiently powerful performance. My current computer is a Mini-ITX Hadron Air build, running under 500W consumption, with an i5 4670K (4.4GHz) with an H100i, 16GB (2x8GB) DDR3L 1600MHz RAM, GTX 780 SC, 4TB hard drive, 2TB hard drive, and two SSDs of reasonable size. Don't ask me how I managed to fit an H100i and all that in there; no, it's not the Hydro version; it took a lot of time, effort, and planning, and was a pain in the arse to set up, but it was worth it.

    In the end, I feel that we should push for efficiency, but if your system ends up different from mine because you had different criteria to meet, then that's fine by me...

    ... after all, that's what the glory of PC gaming is about. Building creatively and producing machines for fast-paced, higher-quality gaming, with the level of flexibility that you set, versus exclusivity-bricks that we call consoles.

    #longlivepcmasterrace
  • 01189998819991197253 - Friday, May 8, 2015 - link

    @NinjaFlo
    Unfortunately Nvidia can't figure out HDMI handshaking. It's a shame because I would love to use an Nvidia GPU in my HTPC, but HDMI handshake issues are a deal breaker. This issue has persisted for years and Nvidia hasn't even acknowledged it.
  • Hicks12 - Friday, May 8, 2015 - link

    @NinjaFlo, see, this is how chizow should structure his messages... I wouldn't say he has balls for being an ignorant poster who simply dismisses every AMD-based post as completely made up and insists Nvidia will always be best because of its halo product (this is why Nvidia rushes to have the most expensive, title-holding card, as it means people like him religiously buy Nvidia :D).

    But anyway, I see your criteria for a 'good' card for your own preference, and it's great that you found the card and are happy with that purchase (it's awesome when we actually buy a card!). Can I just focus on that efficiency criterion, though?

    Going by Anandtechs benchmarks (http://www.anandtech.com/bench/product/1036?vs=105...
    The R9 290X does beat the 780 in almost every game benchmark apart from Thief, GRID and BioShock (depending on resolution); it loses by a very small margin in those but wins by a tangible amount elsewhere.

    The power usage in Crysis 3:
    780: 327w
    R9 290x: 365w

    That's not much... almost 12% more power, but the FPS is for the most part more than 10% greater than the 780's, so it seems fairly even on efficiency. If 40 watts is a crucial amount then the PSU is really being pushed way too far :P, at least a bit of breathing room is required ha!

    If the PC were on 24/7 at load, that 40W difference would be £56 a year (at least for me on my average energy tariff). Don't know about you; if you game 24/7 then you're lucky and that makes it a fair consideration, but personally I can muster about 3 hours of free time a night at a push due to other commitments, so it would be much less for me (£7)...

    Depending on when you bought it, the price could have been the same or different; at launch the 780 was $649 but the R9 290X was $550 (guess that makes up for the difference in power? :D).

    The stock coolers of the R9 290 series were horrible; that's the one thing I always say Nvidia has done right, providing a good stock cooler. Third-party coolers had no issue with the R9 290 series though, so it was hardly any different in noise/temps compared to the 780 when both wore the same third-party cooler.

    End of the day people pick what they want, you're spot on that the GPU is decided by the persons criteria, if someone asks me to spec a pc I always get what games they're playing as ultimately that is what everyone wants, some games just play horribly on AMD or Nvidia so its game dependent (to a point obviously).

    I think my reply has turned into something else. I am agreeing with you, but trying to dispel the idea that the R9 290X is an inefficient beast compared to the 780: it's not, as it provides a similar increase in FPS relative to the power it draws, and it doesn't draw much more than the 780 anyway. Comparing it to the 980 is a different matter; Nvidia and AMD have had different development schedules since the beginning of time, so of course it's Nvidia's turn to be a 'generation' ahead of AMD.

    Anyway, back to enjoying the wonders of pc gaming :D
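The £56/£7 energy-cost figures above check out, assuming a tariff of roughly £0.16/kWh (the comment doesn't state the rate, so it's inferred here to match the quoted numbers):

```python
# Annual cost of a 40 W power-draw difference, as estimated above.
# The £0.16/kWh tariff is an assumption inferred from the quoted figures.
def annual_cost_gbp(delta_watts: float, hours_per_day: float,
                    price_per_kwh: float = 0.16) -> float:
    kwh_per_year = delta_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

print(f"24/7 load: £{annual_cost_gbp(40, 24):.0f}/year")   # about £56
print(f"3 h/night: £{annual_cost_gbp(40, 3):.0f}/year")    # about £7
```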
  • chizow - Saturday, May 9, 2015 - link

    lmao, I've structured them like this in the past but I really shouldn't have to; it should be obvious to anyone who is interested in these parts that Nvidia wins in pretty much every consideration other than price:perf, and even then it is always close enough, except at the ultra high-end, that Nvidia offers a relevant alternative at a slight premium.

    Being called ignorant by someone who can't even follow a simple process/generation discussion, however, is certainly a first.
  • chizow - Saturday, May 9, 2015 - link

    @NinjaFlo haha, great post man, and right on with the comprehensive list of objective criteria. It's something I've laid out many times to AMD fanboys, giving examples of the features, tech, and considerations that make it a no-brainer to go with Nvidia, but then you just get responses in return from AMD fanboys downplaying or marginalizing such features as unimportant, not bugs, bad for PC gaming, etc. Wonder where they get that from? haha.

    Its like straight out of AMD's "marketing" playbook. AMD fanboys love to throw out marketing as if it is some nebulous pejorative thing out there, but when Nvidia and their fans point to actual features and support they use and enjoy daily, its all negative marketing responses in reply. I guess there is a big difference in marketing strategy though, Nvidia markets awesome stuff, AMD markets why you shouldn't care about that awesome stuff. In the end, they certainly do serve their respective portions of the market. Nvidia markets to the overwhelming majority of the market that is willing to pay a slight premium for better features/support, AMD markets to tech bottomfeeders that need to save a few bucks over all else.

    In any case you will want to wait for sure and hold off on that 295X2, as AMD doesn't support HDMI 2.0 on any of their current parts. That will greatly limit your 4K options to actual desktop monitors, but the option to run to both an HDTV or G-Sync 4K monitor would be important to me at this point I think.

    But yes in the end it is about buying products that make PC gaming better, which is another reason to favor Nvidia given all their work and money invested into tech like GameWorks that makes the PC gaming experience better than consoles. I don't think you can just leave it at pcmasterrace anymore though lol, my new thing is:

    #geforcemasterrace, and I hate the whole hashtag business! :)
  • Hicks12 - Saturday, May 9, 2015 - link

    You do realize that DisplayPort 1.2 is the most common connector used for 4K monitors... I haven't even seen an HDMI 2.0 monitor!

    It's important to note G-Sync doesn't support anything other than DisplayPort... says it all really, right? DisplayPort is the main connection on modern GPUs :).

    You still can't admit that AMD and Nvidia both produce good GPUs and that ultimately it comes down to the individual. Stop making the sweeping statement that Nvidia is better at everything, because it's flat-out bullshit and makes you look like a troll haha.

    Price performance is a serious category... You can't ignore that unless you're a die hard fan of the company and will buy anything they release.

    It's pointless even trying to get this through to you as others have said you don't listen and just keep saying the same old crap :).
  • chizow - Monday, May 11, 2015 - link

    Again, where do I confuse the two? You do realize that you have hundreds more options at 4K using HDMI 2.0 in the HDTV space, and often get a better quality panel (IPS, versus desktop monitors which are still mainly TN), right? I clearly state HDTV *OR* G-Sync 4K; it is all about leaving your options open, and Nvidia provides that second option with HDMI 2.0 support.

    Where did I say AMD didn't produce good GPUs? None of which changes the fact Nvidia produces BETTER at similar price points. Again, you can say it comes down to the individual, but that does NOT change the fact Nvidia offers more features and better support of their features in games that results in a superior end-user experience. So yes it is going to simply come down to how much importance you place on price over these other considerations, but as we have seen, the overwhelming majority of the market prefers Nvidia even if AMD leads in areas like price:performance (see 290/X getting slaughtered in the marketplace compared to GTX 970/980).

    It is pointless to keep pointing this out to you, as NinjaFlo and others have done, when you simply don't understand: there is more that goes into the end-user experience and a buying decision than the FPS in the corner of your screen and the price in your cart, and FreeSync vs. G-Sync is just one more of those considerations.
