47 Comments

  • tipoo - Tuesday, November 26, 2013 - link

    Yep, that eSRAM sure does take up a lot of space that could have been used elsewhere, though the limited main memory bandwidth necessitates it. Weird that one pool of SRAM sits too far from the GPU logic to be useful to it, right between the two CPU core clusters. What's that for?
  • tipoo - Tuesday, November 26, 2013 - link

    I also don't like how they call a compute unit a core here, as individual shaders are also called cores, which could cause confusion. Already had a few people ask why they have 12-18 cores when PC GPUs have hundreds, haha.
  • dylan522p - Tuesday, November 26, 2013 - link

    Which is why Anand did this. "12 CUs (768 cores) at 853MHz vs. 14 CUs (896 cores)"
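
    A quick sketch of where those figures come from, assuming the usual GCN arrangement of 64 shader lanes ("stream processors") per compute unit:

        # GCN packs 64 shader lanes into each compute unit; marketing
        # counts every lane as a separate "core".
        LANES_PER_CU = 64

        def marketing_cores(compute_units):
            return compute_units * LANES_PER_CU

        print(marketing_cores(12))  # 768 -> the 12 CUs enabled on the Xbox One
        print(marketing_cores(14))  # 896 -> all 14 CUs physically on the die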
  • tipoo - Tuesday, November 26, 2013 - link

    I know, I appreciate that. My "they" was Chipworks; though I appreciate everything they do too, it should have been labeled a CU rather than a core.
  • patrickjchase - Monday, December 2, 2013 - link

    Actually, GPU core counts are hugely exaggerated, and ChipWorks' notation is at least as accurate as the way things are marketed in the consumer space.

    In almost every field except 3D graphics, a "core" is considered to be an entity that executes one or more independent instruction streams ("or more" if it happens to implement SMT/hyperthreading). A core that processes multiple *data* streams via SIMD is still counted as one core. For example, Haswell can process up to eight 32-bit data elements per AVX instruction, but it's still counted as one core.

    GPUs are in fact SIMD machines, just like Haswell's AVX engine. Each "Radeon Graphics Core" in XBone or PS4 executes one instruction stream. The fact that each such instruction is applied to 64 data elements in parallel does not make it "64 cores" any more than AVX support makes a single Haswell "8 cores".

    There is a [weak] counterargument here: GPUs have a trick that AVX doesn't (yet), namely the ability to *emulate* multiple instruction streams via vector predicates. What this means is that the entire vector engine executes the same instruction stream, but individual lanes can be selectively enabled/disabled. This allows divergent execution (different lanes appear to execute different code), though at a huge performance cost, since every lane in a GPU core has to execute every instruction that any lane requires (a rough sketch of the idea follows at the end of this comment). Most people other than GPU manufacturers would say that's not sufficient to qualify the lanes as independent cores.

    Short summary: ChipWorks has it right. Anand probably knows that as well as I do, but used the more common "marketing notation" to avoid confusing his readers.
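
    A minimal, deliberately simplified sketch of the predication idea described above: a single instruction stream drives every lane, and an if/else is handled by masking lanes on and off, so the whole vector pays for both sides of the branch. The lane count and the toy operations are purely illustrative, not taken from any real GPU ISA.

        # Toy model of predicated SIMD execution: one instruction stream,
        # per-lane masks. Every lane steps through BOTH sides of the branch;
        # the mask only decides whose results are kept.
        WIDTH = 8                             # illustrative lane count (GCN uses 64)
        x = list(range(WIDTH))                # one data element per lane
        result = [0] * WIDTH

        mask_if = [v % 2 == 0 for v in x]     # lanes taking the "if" side
        mask_else = [not m for m in mask_if]  # lanes taking the "else" side

        # "if" path: executed across the whole vector, kept only where masked in
        for lane in range(WIDTH):
            tmp = x[lane] * 10
            if mask_if[lane]:
                result[lane] = tmp

        # "else" path: again executed across the whole vector
        for lane in range(WIDTH):
            tmp = x[lane] + 1
            if mask_else[lane]:
                result[lane] = tmp

        print(result)  # [0, 2, 20, 4, 40, 6, 60, 8]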
  • SetiroN - Wednesday, November 27, 2013 - link

    Because it's intuitive enough to anyone who reads Chipworks.
  • tipoo - Thursday, November 28, 2013 - link

    More clueless people look at Chipworks images than you would think, lol. It was huge on the PS4 subreddit, blogs of all sorts, etc.
  • Kevin G - Wednesday, November 27, 2013 - link

    My guess would be some logic for coherency. There are two CPU clusters, a GPU, audio DSPs and other IO that all need to be coherent with at least one other part of the system.

    Another idea would be memory for exclusive usage by the system's hypervisor and/or DRM.

    I don't think it'd be an L3 cache, as Jaguar uses inclusive caches within a quad-core cluster. Scaling to eight cores would require an L3 cache more than twice the size of a cluster's L2 cache.
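
    A back-of-the-envelope version of that sizing argument, assuming the stock Jaguar configuration of a 2 MB shared L2 per quad-core cluster:

        # An inclusive L3 must hold a copy of everything in the caches below it,
        # so across two Jaguar clusters it only adds capacity once it exceeds
        # the combined L2s.
        L2_PER_CLUSTER_MB = 2
        CLUSTERS = 2

        floor_mb = CLUSTERS * L2_PER_CLUSTER_MB
        print("a useful inclusive L3 would have to be well over", floor_mb, "MB")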
  • tipoo - Wednesday, November 27, 2013 - link

    Perhaps, but the PS4 seems to lack such an area of SRAM outside of the CPU caches, so it must be something the One has that the PS4 doesn't. If it were for coherency, wouldn't both need it?
  • dylan522p - Tuesday, November 26, 2013 - link

    Are we ever gonna get a write up on Kings Landing? I have been reading tons about it and think I know most of what Intel showed off and understand it, but I wish you guys did one.
  • Mr Perfect - Wednesday, November 27, 2013 - link

    Kings Landing? Do you mean Knights Landing?
  • LemmingOverlord - Wednesday, November 27, 2013 - link

    A geek wrapped in a GoT fan mindset
  • dylan522p - Wednesday, November 27, 2013 - link

    Yup. I am reading the books currently and it accidentally popped up in my head and I typed it instead of Knights.
  • tipoo - Tuesday, November 26, 2013 - link

    So the larger pool of eSRAM can only be accessed by the CPU after copying to main memory, right? I wonder if the smaller pool right by the CPU part instead of the GPU part might have to do with the MOVE engines, or perhaps SHAPE, or if it's just a scratchpad between CPU and GPU...
  • DanNeely - Tuesday, November 26, 2013 - link

    IIRC the CPU cache is SRAM; maybe the block between the CPU clusters is a top-level cache for them.
  • Kevin G - Wednesday, November 27, 2013 - link

    The CPU should be able to read/write to the 32 MB of eSRAM. The catch is that bandwidth will be bottlenecked by the on-die CPU-GPU link, and accesses take a small latency hit from going through the GPU's memory controller. The bandwidth and latency should still be better than the DDR3 main memory.
  • milli - Wednesday, November 27, 2013 - link

    Better latency, yes, but I don't think more bandwidth. As far as I know, there's 'only' a 30GB/s link between the CPU cluster and the rest of the system. But I don't think anybody is going to use the eSRAM for the CPU.
  • Alexvrb - Wednesday, November 27, 2013 - link

    For the CPU, they won't be using the 32MB of eSRAM. The quad channel DDR3 provides way more bandwidth than the CPU needs, and it's fairly low latency to boot.
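
    A small sketch of the bottleneck argument in this sub-thread, using the figures quoted above (a ~30GB/s coherent CPU link, a ~68GB/s DDR3 interface, and the often-quoted 109GB/s eSRAM minimum peak); treat the exact numbers as approximate:

        # The effective bandwidth of a path is capped by its slowest hop.
        def effective_bw(*hops_gb_s):
            return min(hops_gb_s)

        CPU_LINK = 30   # GB/s, CPU cluster <-> rest of the SoC (figure quoted above)
        DDR3 = 68       # GB/s, 256-bit DDR3-2133
        ESRAM = 109     # GB/s, eSRAM minimum peak

        print("CPU -> DDR3 :", effective_bw(CPU_LINK, DDR3), "GB/s")
        print("CPU -> eSRAM:", effective_bw(CPU_LINK, ESRAM), "GB/s")
        # Both land at ~30GB/s, which is why the eSRAM buys the CPU better
        # latency rather than more bandwidth.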
  • Owls - Tuesday, November 26, 2013 - link

    I have no dog in this fight as I will get both consoles anyway (already got the PS4) but it's been proven by everyone that the PS4 has the superior design. However, I still have some hope that the XBone will even out in the graphics department down the line.
  • Morawka - Wednesday, November 27, 2013 - link

    I think the general consensus among reviewers is that the Xbox One is the one to get if you can only get one console. The podcasters are going nuts over the Xbox One's Kinect and TV features.
  • cjb110 - Wednesday, November 27, 2013 - link

    Though the TV features are currently lacking if you're anywhere but the US, and actually broken if you're in a PAL region, with the 60Hz/50Hz issue.

    Both of these are fixable though.
  • Wolfpup - Wednesday, December 4, 2013 - link

    If you can only get one, the one to get would be the one with more of the exclusive games you prefer (bearing in mind Microsoft has been horrible about that with the 360 these past few years).

    If you're neutral, or care more about third party games, PS4 is considerably more powerful and PS4 versions of games will always look or run better.
  • bill5 - Wednesday, November 27, 2013 - link

    "Superior design" meaning what?

    The Xbone design basically enabled DDR3 instead of GDDR5, which is a significant cost savings.

    People just don't see that because the XB1 is $100 more expensive. But if you check the P&L statements, you'll see Sony is very good at losing money and MS is pretty good at making it. The XB1 is likely already profitable while the PS4 is probably losing some money at launch. Plus, of course, the XB1 includes an extra expense in Kinect.

    I bet MS could rip Kinect out, swallow a few losses like Sony is already doing, and sell the XB1 for $299 right now. At that point they'd kill the PS4 in sales and you'd start to understand the positives of the XB1 design. No, it apparently can't stand up to the PS4 in raw power, but like past MS consoles (such as the 360, which equaled the PS3 in power despite launching a year earlier, or the original 2001 Xbox, which destroyed the PS2 in power), it's likely smartly and efficiently designed.

    In a few years the eSRAM will be shrinking nicely, but the GDDR5 vs. DDR3 cost discrepancy favoring the XB1 will remain forever. Whether MS chooses to pass the savings enabled by the design on to consumers or just line Bill Gates' swimming pool with more dollars is a different matter.

    Long story short: while I disagree with MS's general strategy here and would have preferred a more powerful system, I like the XB1 design for what it is. It's "close enough" to the PS4 that it will get every port, with visual differences the average Joe will never, ever notice, and one could argue that's really all Microsoft needs in the power department. Remember PS2 vs. Xbox? The PS2 trounced it despite what I'm guessing was a much more significant power gap than what exists between the XB1 and PS4 (for example, the Xbox had twice as much RAM as the PS2).
  • Andromeduck - Wednesday, November 27, 2013 - link

    Kinect is estimated at $75.

    The console + controller package is larger and costs more than the PS4's, and all for an HDMI passthrough and 8GB of flash?

    lol
  • 0ldman79 - Wednesday, November 27, 2013 - link

    I agree, the XBox One is good enough for the general user. I'm sure that the majority of the people out there will claim it looks better than the PS4, regardless of anything else.

    I disagree about the DDR3 vs. GDDR5 price difference. Just look at the memory market right now; if anything, the PS4's production volumes will push the cost of GDDR5 down.
  • darkfalz - Saturday, November 30, 2013 - link

    From the little I've seen, many PS4 games are running at 1080p while the XBone version is in many cases running at 720p - if you think that's not noticeable (particularly on a 50+ inch TV) then you're crazy. 720p looks like garbage on my 55" TV. Still, I'm excited because the quality of console ports is sure to improve (the last few years of upscaled 720p and/or 30 FPS locked console ports for PC have really been a trying time).
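
    For what it's worth, the raw pixel arithmetic behind that resolution gap (a quick sketch):

        # Pixels that have to be shaded and filled each frame.
        pixels_1080p = 1920 * 1080  # 2,073,600
        pixels_720p = 1280 * 720    #   921,600

        print(pixels_1080p / pixels_720p)  # 2.25x the pixels per frame at 1080p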
  • Wolfpup - Wednesday, December 4, 2013 - link

    From the estimates I've seen, both cost slightly less to make than they're sold for. When you factor in the store's cut and all the other expenses, both would be losing money.
  • ivan256 - Friday, December 6, 2013 - link

    "But if you check the P&L statements, you'll see Sony is very good at losing money and MS is pretty good at making it."

    This may be true, but they aren't making that money on XBox. They've lost over a billion dollars on it, and there is investor pressure to spin it off.
  • LordConrad - Wednesday, November 27, 2013 - link

    I'm staying away from the XBox this round for two reasons: I only use consoles for gaming so I have no interest in Microsoft's vision of controlling my living room, and I don't really care for the XBox exclusive games. The software of both consoles will improve over time but the hardware is set for the next 7-10 years, and Sony has the best gaming hardware this time around.

    Don't complain XBox fans, you just had your turn. The 360 was much loved by game developers because the CPU and Unified Memory made it much easier to program.
  • BehindEnemyLines - Wednesday, November 27, 2013 - link

    What's there to complain about? Most people who buy the XB1 already know its graphical capability is less than the PS4's, based on the released specifications. I am more interested in the XB1 because it has entertainment capabilities that better match my use case. I'll wait for Rev 2 of the XB1 before jumping in.

    Honestly, having a console sitting there idling doesn't make sense to me. What Ryse is able to do at 900p/30fps with 85,000 polys without LOD is amazing, so the hardware appears to me, at least, to be more than capable for the next 7-10 years. And that's for an at-launch game, so I am sure more amazing graphics will come next year.
  • andrewaggb - Wednesday, November 27, 2013 - link

    I bought an XB1. I knew it was slower than the PS4, but quite frankly both are much slower than my PC, so that wasn't really a factor.
    If I can get a game on PC I always pick PC. I got the XB1 for the things I can't get on PC, which is essentially Kinect, some exclusives (Halo), some console-only titles (like Skylanders), stuff like that.

    So far I find my power brick too loud, so I put my system in low-power mode (longer startup time, no updating when off), which is a downer.
    I also found the TV integration kinda lame because, unless I missed something, it doesn't seem to have Canadian channels listed, so I ended up not bothering with it after trying it out for a couple of days.
    Live TV also froze a couple of times and wouldn't work again until I unplugged the HDMI and plugged it back in. This may have been a cable issue, as it felt like it only went in 95%, but I didn't bother diagnosing it further.
    Kinect voice controls work fairly well for me, but for my kids they're hit or miss. Image recognition is great for me, so-so for my younger kids.

    Skype was OK, but Kinect's camera really washes out with lights in the background. I've moved it to a different TV now and might try Skype again.

    So... I'm a bit disappointed. Killer Instinct is fun, the new controller is substantially lighter than the previous one, and Kinect does seem to be quite a bit better, but it's not awesome by any stretch.

    I played a Blu-ray movie; it worked OK, though I think there was the occasional hiccup (like a dropped frame or something).

    I've contemplated trying to send my power brick back; I called in on day 2 of having it and it sounded like I'd have to return the entire console.

    Anyways, in my opinion, it's ok... but not awesome.
  • andrewaggb - Wednesday, November 27, 2013 - link

    And to be clear about my power brick issue: there is a fan in the power brick, and if you have your console in quick start/instant-on mode, the fan seems to stay on 24/7. In a quiet room I could hear it very clearly and found it completely unacceptable. Some other people report the same issue, but others don't, so I could just have a loud fan/bad power brick.

    Something to be aware of anyways.
  • Andromeduck - Wednesday, November 27, 2013 - link

    What, really? There's a fan in the power brick?!
  • bill5 - Wednesday, November 27, 2013 - link

    I would like to comment on one of Anand's points: that MS likely couldn't both enable the 2 extra CUs AND hit 853MHz.

    I guess this could well be true, but I also think it's a very strong possibility MS just blanched at the yield expense of enabling the 2 redundant CUs (whereas presumably a 53MHz upclock cost them next to nothing). Which, IMO, was a very shortsighted move if they did so.

    I really, really wish MS had enabled those 2 CUs. At that point you'd have a 1.5-teraflop system (rough math at the end of this comment), with more CPU speed (due to the higher clock) and more peak bandwidth than the PS4. In other words, you'd probably have essentially a hardware "tie" with the PS4 at worst.

    I'd also, of course, have preferred a larger GPU upclock, considering AMD GPUs with around 10-14 CUs are easily sold at stock clocks of 1GHz and up.

    I have a hunch MS prioritized a quiet, living-room-friendly machine over accepting any noise increase at all in exchange for more performance, and I would disagree with that priority.

    I also heard a possibly technically knowledgeable source tell me he thinks MS may not have clocked the GPU higher because it would have "destroyed the timing windows" on the ~doubling of ESRAM bandwidth that was "discovered" by MS a few months ago. I have no idea if that idea has any technical validity whatsoever. But if that's the case, then I can understand why MS wasn't more aggressive on the GPU clock, as increasing the GPU clock slightly in exchange for halving the ESRAM BW likely would have been a major net negative on performance, obviously.

    Anyways, I have a feeling MS's minor, late CPU and GPU clock increases were probably just enough to put a fly in the PS4's ointment, to where the PS4 won't show major graphical superiority over the Xbox One in the future. Typical devious, smarter-than-Sony Microsoft...

    I would also say there was a lot of generic FUD on NeoGAF about how the PS4 was just going to obliterate the XB1 by even more than the spec gap suggests, such as:

    - The XB1 has effectively only 68GB/s of bandwidth (these people willfully ignored/pooh-poohed the eSRAM) and thus will be destroyed/crippled.

    - The XB1 won't allow "coding to the metal" and therefore will be brutally crippled compared to the PS4.

    - The PS4 would have 7+GB of RAM available for games vs. 5GB for the XB1, and would have only 1 (or 0) CPU cores reserved for the OS vs. 2; the list really went on and on.

    It appears none of that FUD materialized at all, and in real terms XB1 is "keeping up" just fine.
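
    A rough sketch of the peak-FLOPS arithmetic behind the figures in the comment above, using the usual GCN assumption of 64 lanes per CU, each capable of one fused multiply-add (2 FLOPs) per clock; this is peak theoretical math, not measured performance:

        # Peak single-precision throughput for a GCN GPU:
        #   CUs x 64 lanes x 2 FLOPs (FMA) x clock
        def peak_tflops(cus, clock_mhz):
            return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

        print(peak_tflops(12, 853))  # ~1.31 - Xbox One as shipped
        print(peak_tflops(14, 853))  # ~1.53 - with the 2 redundant CUs enabled
        print(peak_tflops(18, 800))  # ~1.84 - PS4, for comparison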
  • Andromeduck - Wednesday, November 27, 2013 - link

    1M sales in NA vs. 1M worldwide.

    You decide.
  • r47 - Wednesday, November 27, 2013 - link

    Really? Do you know the price difference between the US and the rest of the world?
    As an example, the PS4:

    PS4 US -> $399 (= £245)
    PS4 UK -> £349 (= $567)

    And the same goes for the Xbox... I'm not saying that the PS4 is going to sell more or less than the Xbox, but you should take many things into consideration...
    Oh, and by the way, worldwide is not only 13 countries. (A quick conversion check follows below.)
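
    A quick check of those conversions, assuming the roughly $1.63-per-pound exchange rate of late 2013 (the exact rate used here is an assumption):

        # Implied USD/GBP rate behind the prices quoted above.
        US_PRICE_USD = 399
        UK_PRICE_GBP = 349
        RATE = 1.63  # assumed USD per GBP, approximately the November 2013 rate

        print(US_PRICE_USD / RATE)  # ~245 -> the US price expressed in pounds
        print(UK_PRICE_GBP * RATE)  # ~569 -> the UK price expressed in dollars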
  • Zingam - Friday, November 29, 2013 - link

    You Brits shouldn't complain about prices! You are rich and if you need money you can always extort the colonies!

    There are more than 13 countries? Send the marines to destroy them!
  • darkfalz - Saturday, November 30, 2013 - link

    I've got a Sony TV and Blu-ray home theatre system, but the thing is I hate the PlayStation controllers, always have... but I'll sit this round out, as the only reason I really bought an Xbox 360 was to play some SEGA Dreamcast sequels released only for Xbox/360, and SEGA is porting most of its stuff (albeit badly in most instances) to PC. PC is just much cheaper too in terms of game costs, and I don't need to stick a new disc in every time.
  • tipoo - Thursday, November 28, 2013 - link

    I still don't think it would be a tie, so long as the PS4 still had double the ROPs and TMUs. The One could have 40 compute units for all the good it would do while output-limited to 16 ROPs.
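
    A rough sketch of the back-end limit being described, assuming the commonly cited ROP counts (16 on the Xbox One, 32 on the PS4); peak pixel fill rate is roughly ROPs x clock, and extra CUs do nothing to raise it:

        # Peak pixel fill rate in gigapixels per second.
        def fill_rate_gpix(rops, clock_mhz):
            return rops * clock_mhz * 1e6 / 1e9

        print(fill_rate_gpix(16, 853))  # ~13.6 Gpix/s - Xbox One
        print(fill_rate_gpix(32, 800))  # ~25.6 Gpix/s - PS4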
  • TheinsanegamerN - Monday, December 2, 2013 - link

    If the One had 40 compute units it would have roughly 30 ROPs. ROP count scales with CU count.
  • TheinsanegamerN - Monday, December 2, 2013 - link

    I should also point out that if the One had 40 compute units, it would have double the GPU hardware of the PS4.
  • mikk - Wednesday, November 27, 2013 - link

    "I'm not sure if we'll see either company move to 20nm, they may wait until 14/16nm in order to realize real area/cost savings"

    This part doesn't make sense; there is no area scaling or cost saving with 16nm from TSMC, as it's a 20nm FinFET process without area scaling.
  • Gabrielsp85 - Wednesday, November 27, 2013 - link

    Is there any way to point out where the 50 MCUs the architects said the X1 has are? Also, when you scale the X1 and PS4 SoCs, the X1 GPU cores are bigger; does that mean anything?
  • tipoo - Thursday, November 28, 2013 - link

    Different layout optimizations probably. But I think the One CUs were actually shorter when scaled.
  • Gabrielsp85 - Friday, November 29, 2013 - link

    They are actually longer; I'll share a picture when I find it. Could it also mean the X1 uses a newer version of GCN?
  • pliiny - Thursday, December 5, 2013 - link

    Does anyone think the eSRAM on the Xbox One SoC is related to the Mantle API?
