29 Comments
BMNify - Thursday, April 14, 2016 - link
Now with things like this coming to GPUs, I'm still wondering why we have to be hamstrung by these glorified set-top boxes with 8GB of system memory and probably only 6 to 6.5GB usable. Maybe a game will use 7.8GB and be a console exclusive, just to make under-utilized gaming PCs (costing at least several hundred more than the console) artificially look like trash. Seriously, we're going to have people with gaming PCs with 16GB of RAM plus a GPU with 16GB of VRAM at the high end, and they'll still be held back by everyone on weak integrated graphics or otherwise weak systems. No one gaming on PC wins.
These console things are sorry excuses for gaming machines. They are the poorest value and have not moved gaming technology forward more than a fraction. I'm going to be sitting on a DirectX 13 card (if that ever comes about), and AMD will be forced to release a "Mantle 2" software API to actually utilize proper gaming hardware again. This has been a gaming generation of stagnation, with no tech progression that, to me, is meaningful. I despise VR, so don't bother trying to hype me up for something that literally makes me sick. I guess the loss of my $1000+ gaming rig to a thunderstorm surge is no big deal anymore, as there is nothing worth buying right now anyway. It's still rather annoying to only have this laptop left for my computer use. I don't really feel like dropping another thousand dollars on a desktop rig so soon.
By the time I can rebuild, all-new parts will be available and I'll be able to spend less, so I won't need to feel miserable that my rig is an outlier at several times the power of all current consoles, or that it's being undervalued, underutilized, and outright disrespected by the very game developers who want to sell me their console game on PC, which then runs like garbage on my supposedly super-powerful gaming rig because they didn't care enough to offer PC options.
How can I be happy that this monstrosity is coming to PC? A cut-down variant would still be a 16GB VRAM card, and games barely even utilize the 4 and 8GB cards we have now. The new cards from AMD are probably going to range from 2GB at the low end all the way to 16GB of VRAM, maybe even 32GB at the top of the GPU line.
I realize this is a workstation card, but that doesn't mean AMD doesn't have gaming GPUs coming with large amounts of VRAM. They most certainly do have HBM2 cards with plenty of memory to draw on. I wonder what PC games will make actual use of that power plus a top-end Zen/AM4 processor. Sigh.
Kutark - Thursday, April 14, 2016 - link
"I realize this is a workstation card..."No, no I don't think you do understand that.
There is so much /facepalm in this post it's actually mind-boggling. In almost 20 years of being a PC hardware and gaming enthusiast, I have never been able to wrap my head around why people continuously create long ranting diatribes about workstation or enterprise hardware and apply that "logic" to gaming hardware to justify their ranting.
You could have left it at the rant about consoles, which was perfectly justified and relatively accurate (as far as them holding back gaming by being outdated and frankly terrible hardware to begin with), but you had to go on; you just couldn't stop there.
On a side note, why the hell didn't you have a surge protector?
Notmyusualid - Thursday, April 14, 2016 - link
+1
Salvor - Thursday, April 14, 2016 - link
I'm more surprised a reader of AnandTech only spent $1k on their gaming rig, and isn't excited about the prospect of having an excuse to replace it.
Manch - Friday, April 15, 2016 - link
I'm surprised he called his $1K rig an outlier....
BrokenCrayons - Friday, April 15, 2016 - link
It really is an outlier. If I recall the sales statistics, the majority of computers sold cost much less and are often used to play video games of one sort or another. While a total component and software cost of over $1K isn't considered very expensive among people who build their own systems and take that sort of thing seriously, that group is a smaller portion of the total home computing market.
Manch - Friday, April 15, 2016 - link
Fair enough, but in the context of his post he implied that he built his rig. In the relatively small world of DIY gaming rigs, I don't see a $1K rig as an outlier.
bpwnes - Saturday, April 16, 2016 - link
who pays for software?Samus - Sunday, April 17, 2016 - link
LOL. True. I remember attending an Autodesk conference like... 15 years ago when I was in college. The CEO at the time mentioned that around 85% of Autodesk software in the field is pirated and that they are OK with that (echoing Adobe's stance), because the majority of it is people using it for non-profit work, students, or generally "cocking about" (his exact words), and these people will hopefully move on to a business application where they will be obligated to purchase a legitimate product.
That, in essence, is the RIGHT perspective for any software developer. How are people who are curious, or who want to learn but can't afford to take classes, or even people who take classes but don't want (or can't afford) the neutered student version, going to learn your expensive-ass software without stealing it?
HollyDOL - Friday, April 15, 2016 - link
I have to agree console games are a ball and chain on the leg of PC gaming... partially.
The fact is that developers these days greatly waste RAM, and those console restrictions should force them to use it more effectively... I am not entirely convinced they really do, though.
Other than that, there are plenty of ways more available RAM could be utilized (more loaded resources, pre-computation of stuff running on other available CPU cores, you name it...), but it won't happen until games are natively 64-bit binaries, due to the address-space restrictions of 32-bit processes.
In a workstation environment, that much RAM is already put to good use, for example when processing high-resolution cinematic camera recordings... or, for that matter, in special-effects studios, whose applications are a whole level more demanding on resources...
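As a rough sketch of that 32-bit point: the C++ below is illustrative only, and the Windows figures in the comments are the usual defaults rather than measurements from any particular game. Build it once as a 32-bit binary and once as 64-bit and the printed ceiling tells the whole story.

```cpp
// Illustrative only: why a 32-bit game binary cannot exploit big RAM/VRAM pools.
#include <cstdio>

int main() {
    const unsigned bits = sizeof(void*) * 8;  // pointer width of this process
    if (bits >= 64) {
        std::printf("64-bit process: address space is effectively unlimited (16 EiB)\n");
    } else {
        // A 32-bit Windows game normally gets 2 GiB of user address space
        // (up to ~4 GiB with LARGEADDRESSAWARE on 64-bit Windows), so extra
        // system RAM and large VRAM allocations simply cannot all be mapped.
        std::printf("%u-bit process: 4 GiB hard ceiling, 2 GiB user space by default\n", bits);
    }
    return 0;
}
```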
Achaios - Friday, April 15, 2016 - link
Seriously, where the heck did this guy come from, and why does such a post get thumbs up at AnandTech? Was this guy living in a cave somewhere?
This guy is saying that modern games do not even utilize 4 GB on the GPU; however, Grand Theft Auto V (GTA V) is utilizing 100% of the 3072 MB of GDDR5 on my GTX 780 Ti AND 8 GB of system memory, AND is reserving AN ADDITIONAL 14 GB of system memory on top of that! And that is at 1920x1080, where I don't have enough GPU RAM left to maximize the graphics settings (i.e. "Advanced Graphics" switched off), and even one or two settings in the Graphics section of GTA V are not set to ULTRA because I am using 100% of my GPU RAM.
So, indeed, we are living in an era where we are stuck with lesser-technology GPUs that also have far less RAM than they ought to. IMO, top-end mainstream gaming GPUs should have come with 12 GB of memory.
For the record, the only GPU that can play GTA V with everything set to max at 4K is the 12 GB Titan variant. They tried with a 6 GB GTX 980 Ti and the card was choking due to lack of memory. So much for the OP's claim that no game can utilize 4 or 8 GB of memory, ahahaha.
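For anyone who wants to check numbers like these rather than eyeball the in-game sliders, here is a minimal sketch of querying actual VRAM usage through DXGI 1.4. It assumes Windows 10 or later, assumes adapter 0 is the gaming GPU, and omits error handling for brevity.

```cpp
// Hedged sketch: query how much dedicated VRAM is in use right now (DXGI 1.4).
#include <windows.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    ComPtr<IDXGIAdapter1> adapter;
    factory->EnumAdapters1(0, &adapter);             // primary adapter (assumed discrete GPU)

    ComPtr<IDXGIAdapter3> adapter3;                  // DXGI 1.4 interface for memory queries
    adapter.As(&adapter3);

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("Dedicated VRAM in use: %.0f MiB of a %.0f MiB OS budget\n",
                info.CurrentUsage / (1024.0 * 1024.0),
                info.Budget / (1024.0 * 1024.0));
    return 0;
}
```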
Manch - Friday, April 15, 2016 - link
I don't think that +1 means what you think it means. That was for the critical reply to the dude's rant. Nobody gave him a thumbs up....
mapesdhs - Friday, April 15, 2016 - link
Modded games also use a lot of RAM. Skyrim is a good example: people with complex setups easily use 6GB+.
Heck, my customised Crysis config uses almost 4GB. :D
minasnoldo - Friday, April 15, 2016 - link
If I recall correctly, PC cards have issues with modern console games due to the Heterogeneous Memory Architecture of the consoles (I believe AnandTech did an excellent article on HMA a year or two back).
The idea is that game developers are given one large pool of memory to carve up as they see fit. It turns out that a lot of triple-A game makers have been choosing to devote large portions of it to textures, hence the issues on even top-tier cards. It seems it isn't a power issue so much as a resource-availability issue.
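To put rough numbers on that budgeting argument, here is a tiny sketch with entirely hypothetical figures (not taken from any real title or console); it just shows how a texture-heavy split of one unified pool can exceed a discrete card's VRAM.

```cpp
// Hypothetical budget split: a console title carving up one unified pool can
// demand more GPU-resident data than a typical discrete card's VRAM can hold.
#include <cstdio>

struct Budget {
    double textures_gib;
    double geometry_gib;
    double cpu_side_gib;   // game logic, audio, streaming buffers, ...
};

int main() {
    const Budget console_split = {3.5, 0.75, 1.0};   // assumed ~5.25 GiB game-visible pool
    const double pc_vram_gib   = 3.0;                // e.g. a 3 GB card like the 780 Ti above

    const double gpu_resident = console_split.textures_gib + console_split.geometry_gib;
    std::printf("GPU-resident assets: %.2f GiB vs %.2f GiB of VRAM -> %s\n",
                gpu_resident, pc_vram_gib,
                gpu_resident > pc_vram_gib ? "streams/swaps, so it stutters" : "fits");
    return 0;
}
```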
Samus - Sunday, April 17, 2016 - link
Poor programming, partially, but the root issue stands: we need more GPU memory. The GPU handles a lot of post-processing now, and with textures and resolutions increasing, we will only need more. 4GB is the minimum for modern AAA gaming at 1080p in full detail.
Gonemad - Tuesday, April 19, 2016 - link
Exactly. I had a 1GB graphics card that COULD run GTA, but not at 1080p without those nagging errors that GTA spits out. As a stopgap, I bought an R9 200 with 3GB of RAM, so I can at least bump to 1080p and keep a lot of things below ultra to make it playable at an average of 35fps, with dips into 25fps. All on a relic of an i7 920 CPU...
I don't mind the cinema standard of 24fps, but it turned out to be a good indicator of when the graphics card is choking. On the bright side, I have 12GB of RAM, and the GTA executable sucks up a whole 8GB for itself, leaving the rest for the system.
So yeah, theory disproven. I need every ounce of GPU and CPU power I can muster to play GTA 5 at least @ 1080p.
I'd have to dump my whole rig and buy a fresh machine with a Titan to dial everything up.
minasnoldo - Tuesday, April 19, 2016 - link
Is it possible there is something else going on? I am not a programmer, so I am completely out of my depth here (but I am thinking out loud, so please be patient).
By the "something else", I mean that I am reminded of the DX9 requirement (IIRC) that everything in video RAM be mirrored in system memory, which would seriously cut into your available RAM if you were running a 32-bit program.
A year or two ago, when AnandTech was writing articles about HMA and hUMA, they talked about some of the finer points of memory reservation/sharing between systems and GPUs. My uninformed thoughts on high system memory usage have me wondering whether there is something about how the consoles work that ends up causing a lot of memory duplication when the easiest porting methods are used (as opposed to ripping the game apart and practically remaking the various bits specifically for traditional PC architecture, which would cost them more money).
Anyway, those are just my thoughts. Anybody have insight deeper than just "poor optimization"?
06GTOSC - Friday, April 15, 2016 - link
People who can't afford $1500-2000 gaming rigs apologize to you for "holding back" your entertainment.BurntMyBacon - Friday, April 15, 2016 - link
@06GTOSC: "People who can't afford $1500-2000 gaming rigs apologize to you for "holding back" your entertainment."Apology accepted, but there is really no need; it's just a game. ... Oh, wait. You weren't talking to me.
BrokenCrayons - Friday, April 15, 2016 - link
Hey, games are serious business. People spend a lot of time arguing about nearly every aspect of video games because they play an important role in the formation of the lens through which they view the world. For some people, absolutely nothing else has ever or will ever matter more than giggling gleefully into a microphone about how amazing it was to cause the fictional death of another person's digital representation in the confines of a game's artificial world. Just because the rest of us muted all the audio and aren't even sitting behind our screens because we're busy living our lives in the material world doesn't diminish the importance to that one person.
Michael Bay - Friday, April 15, 2016 - link
You'll pay for that later in games, DLCs, and other assorted shit.
So, uh, maybe spend less next time? $1k is nothing for a serious gaming PC, but if it's really as bad as all that, you could drop around $600 and be as happy as someone apparently as bitter as yourself might ever expect to be.
mapesdhs - Friday, April 15, 2016 - link
"... consequently retakes their top spot in the market"I cannot see how that is remotely justified. So this card has a lot of RAM, how does that mean AMD now has the "top spot"? Based on what performance results, etc.? Has anyone tested a 500GB GIS set and compared the interaction performance with NV's current best? Differences could end up being platform dependent, rather than GPU potential. Big data compute on the GPU is so much more complex than oh this card is better 'cos it has more RAM. Storage I/O is also critical in many cases. Some tasks that use lots of GPU RAM may fit into 32GB and then not change much while the task is running, others are swapping out constantly to new data.
I expect generalisations in gamer articles, not pro topics.
shelbystripes - Friday, April 15, 2016 - link
The article already noted that "any possible performance impact is data set size dependent", and it was pretty clear from context that the "top spot" comment was specifically about RAM capacity. They're clearly not talking about benchmarked performance, since they don't purport to even have that yet. It's not hard to tell what they actually mean: AMD is offering a card with more RAM than Nvidia, which is directly relevant for some (but not all) of the professional market.
I'm not sure what your problem is.
ingwe - Friday, April 15, 2016 - link
Is the 274 W spec for the W9000 a typo, or is it actually specced at 274 W and not 275 W? Not a big deal, but I am curious.
That's the official spec.Marcelo Viana - Friday, April 15, 2016 - link
32 GB of VRAM is still a small amount of memory. When 64 GB of VRAM comes, then I'll take it seriously.
I wonder how this compares to Nvidia's offerings.
And to the person who was ranting about gaming: you're an idiot.
(You're probably also the same type of person who would comment on a Top500 supercomputer post with "But can it play Doom?" or rant about some other equally stupid crap like that. *rolls eyes*)
m7nz - Tuesday, June 20, 2017 - link
Interesting that this card is supposed to be for professional graphics workstation-type loads, but the discussion here mostly pertains to entertainment/game usage. Generally I've found that big numbers on professional-type equipment don't always carry over to exceptional performance in games.
It's also interesting that the technology seems to be advancing so quickly; six months later and the $1k cards have changed so much. 8GB of GDDR5 is like a minimum number for a real GPU right now. Not that you can always find a decent card, since the currency guys are snatching them up pretty quickly.