Honestly, if the stats for single GPUs weren't all at about the same level, this would be an issue. It isn't until you get to multiple GPUs that you start to see some differentiation, but that level becomes very expensive very quickly. I'd posit that if you're already into multiple high-end video cards, the price difference between dual and quad core is relatively insignificant anyway, so the point is moot.
I appreciate the review, but it seems like the choice of games and settings makes the results primarily reflect a GPU-constrained situation (1440p max settings for a CPU test?). It would be nice to see some of the newer engines which utilize more cores, as most people will be buying a CPU for future titles. I'm personally more interested in the delta between the CPUs in CPU-bound situations. Early benchmarks of next-gen engines have shown larger differences between 8 threads and 4 threads.
Precisely. Also only 2% of us even own 1440p monitors and I'm guessing the super small % of us in a terrible economy that have say $550 to blow on a PC (the price of the FIRST 1440p monitor model you'd actually recognize the name of on newegg-asus model (122reviews) – and the only one with more than 12reviews) would buy anything BUT a monitor that would probably require 2 vid cards to fully utilize anyway. Raise your hand if you're planning on buying a $550 monitor instead of say, buying a near top end maxwell next year? I see no hands. Since 98% of us are on 1920x1200 or LESS (and more to the point a good 60% are less than 1920x1080), I'm guessing we are all planning on buying either a top vid card, or if you're in the 60% or so that have UNDER 1080p, you'll buy a $100-200 monitor (1080p upgrade to 22in-24in) and a $350-450 vid card to max out your game play.
Translation: These results affect less than 2% of us and are pointless for another few years at the very least. I'm planning on buying a 1440p monitor but LONG after I get my maxwell. The vid card improves almost everything I'll do in games. The monitor only works well if I have the VID CARD muscle ALREADY. Most of the super small 2% running 1440p or up have two vid cards to push the monitors (whatever they have). I don't want to buy a monitor and say "oh crap, all my games got super slow" for the next few years (1440p for me is a purchase once a name brand is $400 at 27in – only $150 away…LOL). I refuse to run anything but native and won't turn stuff off. I don't see the point in buying a beautiful monitor if I have to turn it into crap to get higher fps anyway... :)
Who is this article for? Start writing articles for 98% of your readers, not 2%. Also you'll find the cpu's are far more important where that 98% is running as fewer games are gpu bound. I find it almost stupid to recommend AMD these days for cpus and that stupidity grows even more as vid cards get faster. So basically if you are running 1080p and plan to for a while look at the cpu separation on the triple cards and consider that what you'll see as cpu results. If you want a good indication of what I mean, see the first or second 1440p article here and CTRL-F my nick. I listed all the games previously that part like the red sea leaving AMD cpus in the dust (it's quite a bit longer than CIV 5...ROFL). I gave links to the benchmarks showing all those games. http://www.anandtech.com/comments/6985/choosing-a-... There's the comments section on the 2nd 1440p article for the lazy people :)
Note that even here in this article TWO of the games aren't playable on single cards...LOL. 34fps avg in metro 2033 means you'll be hitting low 20's or worse MINIMUM. Sleeping dogs is already under 30fps AVG, so not even playable at avg fps let alone MIN fps you will hit (again TEENS probably). So if you buy that fancy new $550+ monitor (because only a retard or a gambler buys a $350 korean job from ebay etc...LOL) get used to slide shows and stutter gaming for even SINGLE 7970's in a lot of games never mind everything below sucking even more. Raise your hand if you have money for a $550 monitor AND a second vid card...ROFL. And these imaginary people this article is for, apparently should buy a $110 CPU from AMD to pair with this setup...ROFLMAO.
REALISTIC Recommendations: Buy a GPU first. Buy a great CPU second (and don't bother with AMD unless you're absolutely broke). Buy that 1440p monitor if your single card is above 7970 already or you're planning shortly on maxwell or some such card. As we move to unreal 4 engine, cryengine 3.5 (or cryengine 4th gen…whatever) etc next year, get ready to feel the pain of that 1440p monitor even more if you're not above 7970. So again, this article should be considered largely irrelevant for most people unless you can fork over for the top end cards AND that monitor they test here. On top of this, as soon as you tell me you have the cash for both of those, what the heck are you doing talking about $100 AMD CPUs?...LOL.
And for AMD gpu lovers like the whole anandtech team it seems...Where's the NV portal site? (I love the gpus, just not their drivers): http://hothardware.com/News/Origin-PC-Ditching-AMD... Origin AND Valve have abandoned AMD even in the face of new gpus. Origin spells it right out exactly as we already know: "Wasielewski offered a further clarifying statement from Alvaro Masis, one of Origin’s technical support managers, who said, “Primarily the overall issues have been stability of the cards, overheating, performance, scaling, and the amount of time to receive new drivers on both desktop and mobile GPUs.”
http://www.pcworld.com/article/2052184/whats-behin... More data, nearly twice the failure rate at another vendor confirming why the first probably dropped AMD. I’d call a 5% rate bad, never mind AMD’s 1yr rate of nearly 8% failure (and nearly 9% over 3yrs). Cutting RMAs nearly in half certainly saves a company some money, never mind all the driver issues AMD still has and has had for 2yrs. A person adds a monitor and calls tech support about AMD eyefinity right? If they add a gpu they call about crossfire next?...ROFL. I hope AMD starts putting more effort into drivers, or the hardware sucks no matter how good the silicon is. As a boutique vendor at the high end surely the crossfire and multi-monitor situation affects them more than most who don't even ship high end stuff really (read:overly expensive...heh)
Note one of the games here performs worse with 3 cards than 2. So I guess even anandtech accidentally shows AMD's drivers still suck for triple's. 12 cpus in civ5 post above 107fps with 2 7970's, but only 5 can do over 107fps with 3 cards...Talk about going backwards. These tests, while wasted on 98% of us, should have at the least been done with the GPU maker who has properly functioning drivers WITH 2 or 3 CARDS :)
I may be a little biased here since I'm still rocking a Q6600 (albeit fairly OC'd). But with all the other high end platforms you used, why not use a DDR3 X48/P45 for S775?
I say this because NOBODY who reads this article would still be running a mobo that old with pcie 1.1, especially in multi-gpu configuration.
I know you dealt with this criticism in the intro, and I understand the reasoning (consistency, repeatability, etc) but I'm going to criticize anyways...
These CPU results are to me fairly insignificant and not worth the many hours of testing, given that the majority of cases where CPU muscle is important are multiplayer (BF3/Crysis 3/BF4/etc). As you can see even from your benchmark data, these single player scenarios just don't really care about CPU all that much - even in multi-GPU. That's COMPLETELY different in the multiplayer games above.
Pretty much the only single player game I'm aware of that will eat up CPU power is Crysis 3. That game should at least be added to this test suite, in my opinion. I know it has no built in benchmark, but it would at least serve as a point of contact between the world of single player CPU-agnostic GPU-bound tests like these and the world of CPU-hungry multiplayer gaming.
I am sorry to say this, but I never expected to see a completely useless test at AnandTech. 99% of single-player games are fine with a Core i3, and 99.9% of all games are fine with a stock Core i5, but there is that 0.1% that is not, and it is mostly multiplayer. Go look at the BF4 beta tests, where even a Haswell i7 is a bottleneck. Even BF3 multiplayer performs better with a modern i5 than with a 1156/1366 CPU.
And with next-gen games right around the corner, the situation might change drastically with more and more games needing a very fast quad core CPU.
Agree. What's the point of running timedemos (which need less CPU grunt) in single-player games (which need less CPU grunt) at very high res/settings (GPU-bound max fps) with no min fps (so weak-CPU bottlenecks stay hidden)?
It makes those with weak CPUs feel better and leads to lots of "surprising, my AMD processor is good enough" comments, but for many people it actually isn't.
Personally I'd say that's a load of BS. I work with a lot of different setups, and unless you're an enthusiast the average gamer really can't tell the difference. They're coming off of older setups already, so unless you're cutting a ton of corners you can easily go the AMD route for a good majority of them.
I don't understand the recommendation for "at least quad core" for Civilization V.
Having looked at task manager during the game, it quickly becomes apparent that the game is effectively entirely single threaded. It doesn't even have a separate thread for video rendering vs. AI, or if it does, they completely block each other. Setting the CPU affinity to keep the game on a single core makes absolutely no difference in that game.
We see some of our biggest variations in CPU performance in Civilization V, where it is clear that a modern Intel processor (Ivy/Haswell), at least quad core, is needed to get the job done for the higher frame rates. Arguably any high-end AMD processor will perform >60 FPS in our testing here as well, perhaps making the point moot.
I've got to say, I'm impressed with the common sense approach, both to the setup of a benchmark of this size, and some of the conclusions I'm reading.
I'm interested to see how many AMD processors end up above the "good enough to not bottleneck the GPU setup" line. I wonder if they will be cost effective vs. an Intel setup.
A future experiment of interest to me is whether or not more budget oriented chipsets significantly hinder performance. I guess the question that's on my mind is "Is there any situation in which a faster processor on a board with lesser capabilities would be outperformed by a (somewhat) slower processor on a board with greater capabilities?" Or put a different way, "How much processing power is it logical to sacrifice in pursuit of a better platform (I.E. more PCIe lanes for multiGPU setups)?"
I second this request. From the limited amount of tests I could find so far it seems that saving money on the CPU and investing it into the GPU is the way to go for most games. That seems to include even seemingly unbalanced combinations like a Pentium and a GTX 780 beating a Quad-Core and a GTX 770.
I was quite surprised to see the Sandy Bridge chips hanging in there. There doesn't seem to be much need to upgrade if you have an i5-2500K or i7-2600K, especially if you factor in how easy they are to overclock to 4.5GHz and sometimes beyond.
Wasn't much of a bump for Ivy Bridge or Haswell really. Put all three on a table (stock) with similar hardware and I'd lay money on 99.9% not being able to tell the difference. CPUs have been going sideways in performance rather than upwards (my opinion) for some time now.
What's interesting is Socket1366 cpu's are finally beginning to show some age..
Performance hasn't been increasing (as much) because of the focus on power consumption in laptops. That and AMD's utterly noncompetitive products at the high end.
I could 100% tell you which system was which if I had a Kill-A-Watt, though.
Yep... you should also be able to tell the difference simply by measuring heat. The Sandy Bridge chips tend to run a little cooler than Ivy Bridge, although they must have done something in Haswell since it runs cooler in normal operation, but it heats up rather quickly under load just like Ivy Bridge. On the surface they're all fairly comparable, I think, anyway.
My main system is still rocking an i7-920. These charts help explain rationally what my brain must have somehow known subconsciously: that there's not yet much reason to upgrade. (I'm discounting the +50% gains on the CPU benchmarks, because my i7-920 is overclocked, making the gains much less. And I'm rarely CPU bound for long.)
I would like a 6Gb/s SATA controller some day. My poor SSDs must be very frustrated with their host.
I'm in a similar boat: using an i7-930 @ 4GHz. Seriously, who runs those wonderful Nehalem CPUs at default clocks when they easily overclock 1.5x? And with this overclock, the advantage of the newer CPUs is really underwhelming: far less than the i7-920 line here shows.
As for SSD, I use PCI-E based one and it's probably still faster or at least on par with newest SATA ones.
A5 - was your 920 a C0 stepping? Mine is a D0, which at the time of purchase I remember going way out of my way to check the stepping before pulling the trigger
I too invested in the venerable (speak only in awed, hushed whispers) i7-920, which I promptly overclocked to 3.6GHz. This little jewel has been going strong for, goodness, almost half a decade, stable as a rock, and I notice it holding its own very well even against the latest and greatest. This is a testament to competition and engineering, from when competition in the CPU arena existed. I have long since switched from dual GPUs to a single dual-GPU card on one fat x16 PCIe slot, even though my EVGA X58 SLI board supports more. I'll ride the wave one more year and see what new gear crashes in next year. Hopefully a new Nvidia architecture that will inspire me to upgrade everything.
"our next update will focus solely on the AMD midrange."
Please don't do that. PLEASE include at least 3 Intel CPUs for comparison. It doesn't matter if the FX-8320 does well in benchmarks if for another $40 I can get an i5-4670 that runs 50% faster. These are hypothetical numbers, obviously, but Intel will be faster. By how much matters, once you factor in price and especially energy draw.
It's hard making sense of AMD data in comparison to Intel. As near as I can tell, they're sitting at just beyond i7-920 performance these days, but with all the new features. It gets confusing when you look at the older X4/X6 stuff, though, since some of that is actually faster... yet somehow it only compares favorably to Intel's Core 2 Q9xxx stuff.
Why 3? The entry-level i5-4430 beats out every AMD chip on the market in most instances. Adding in more simply confuses people and adds more fodder for fanboys to fight over... and I think it taxes the patience of most of us who already know what's what in the CPU arena.
Simple rule of thumb: if you're on a budget you may want to go AMD to get all the "other" bells and whistles you're looking to buy, or, if you have more to spend, your starting point will be the i5-4430.
Excellent article Ian, I really like the inclusion of older CPU's. It's a good basis in which to decide if it's "time" to upgrade on that front. Most of the people I know are not on the bleeding edge of technology. Many sit back in 2009 with minor updates to video and Hard Drives. Anyway.. Well done lots to sift thru.
Amazing data. I do wonder whether the testing at max settings is a good idea though. The variation in performance can be extreme. Just watch the Metro 2033 benchmark play out. Does that look like the kind of experience you'd want to play?
Perhaps more importantly though, the arrival of next-gen console changes everything.
Did you see the news that Watch Dogs is x64 only? That's just the tip of the iceberg. Developers need to go wide to make the most out of six available Jaguar cores. Jobs-based scheduling over up to eight cores will become the norm rather than the exception. The gap between i5 vs. i7 will widen. AMD FX will suddenly become a lot more interesting.
In short order, I'd expect to see dual core CPUs and less capable quads start to look much less capable very quickly. i5 vs. i7 will see a much larger gulf in performance.
Check out the CPU data here for the Battlefield 4 beta:
So yeah, it's a very good showing for AMD, but not as good as what you indicate. Also, according to sweclockers, an overclocked i5 is still superior to an overclocked 83xx CPU, so make of that what you wish.
I'm just glad we're seeing games starting to use more than 2-4 threads effectively.
Much more likely is that games will just become less and less reliant on CPU power because of the terrible netbook processors in the consoles and will instead rely more and more on the GPU. The PC versions of games will just be the same game with a high res texture pack and some extra graphics bling to use up GPU cycles while your processor sits around shuffling a little data.
I'm not sure AMD will benefit that much. As soon as consumer CPUs have a reason to have more cores, they'll just release a new chip with more cores. There is absolutely no reason they can't release an 8- or even 12-core desktop processor; they're already selling them for servers.
Forgot to mention, Watch Dogs is probably x64 only because they want to use more than 2GB of RAM (which is the limit for the user-mode memory partition in Win32).
That's pretty much just the games they picked. If you could reliably benchmark large-scale PC games like PlanetSide 2, or other popular large-scale MMOs, you'd pretty much see the exact opposite. The trouble is, it seems like no MMO makers give you reliable benchmarking tools, so you can't use them for tests like these.
I would really like to see a CPU comparison for strategy games. For example, one could have a save game of a far advanced game in Civilization 5 or Total War with many AI players on the largest map and then see how the waiting time varies between the different CPUs. This should be feasible, shouldn't it? I'm running an i5 2500k @4.6ghz and it just isn't cutting it for Civilization 5 on a large map once you're far into the game, it would be nice to see whether getting hyperthreading and more cores would be worth it.
Having waited the ridiculous amounts of time between turns on Civ V, and having dual monitors, I put task manager up on the second monitor while it was running, to see that Civ V *IS NOT MULTITHREADED. AT ALL*. Setting the CPU affinity to make it use only 1 logical core makes absolutely no performance difference at all! The only thing I can think of for why a better result would be seen on quad-core systems would be that it likes having a larger L3 cache.
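For anyone who wants to repeat that affinity experiment without clicking through Task Manager, a minimal sketch using the third-party psutil package (the process name is a placeholder for whatever the game's executable is actually called):

```python
# Pin a running game to a single logical core to test how threaded it really is.
# Requires the third-party psutil package; run it as the same user as the game.
import psutil

GAME_EXE = "CivilizationV.exe"  # placeholder: use the actual process name

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        before = proc.cpu_affinity()      # logical cores the game may currently use
        proc.cpu_affinity([0])            # restrict it to logical core 0 only
        print(f"{GAME_EXE}: affinity {before} -> {proc.cpu_affinity()}")
        break
else:
    print(f"{GAME_EXE} is not running")
```

If the frame rate is unchanged with the game confined to one core, the engine is effectively single-threaded for that workload.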
Thank you for doing this, it's quite informative. I just have one suggestion, perhaps you could get a Lynnfield CPU into these benchmarks. I've been happily using my i5-750 for about 4 years now, but I'm unsure if the performance would be closer to a i7-920 or Q9400. I'm thinking it may be getting close to time to upgrade, but I've still never come across a game or app that seems to choke it.
+1 to Lynnfield. My i5-750 is still running great at a gentle ~3.5GHz, and I haven't really felt the burn in games, but I also am not running multi-GPU. Still I'd love to see how well it stacks up to the competition.
I love articles like this! Excellent stuff, thanks!
I'm still confused why you guys don't have an i5-3570K in your line-up. Of all the processors, that's probably the most crucial to have, given its performance for the price and its popularity for builds. These tests give me little to go on without that processor, as important as it is for the general builder!
Also, CPU benchmarks for gaming aren't as meaningful with single-player games. A necessary contrast for CPU comparison would be MMOs or multiplayer FPS runs. Obviously it's more difficult to get accurate baseline results in such instances, but a large number of runs should at least minimize any variables between each testing instance and give a broader picture of how well each processor will perform.
If you guys could get on the latest MMOs and test out these rigs, that'd be where I see charts for CPU comparison really come into play for gaming.
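On the "large number of runs" point, the aggregation itself is easy to script once the per-run averages are collected; a small sketch with made-up numbers:

```python
# Summarise repeated benchmark runs: the standard error of the mean shrinks
# as more runs are averaged, which is what tames run-to-run MMO variance.
from statistics import mean, stdev

runs_fps = [62.1, 58.4, 64.0, 60.7, 59.3, 61.8]   # hypothetical per-run average fps

avg = mean(runs_fps)
sd = stdev(runs_fps)
sem = sd / len(runs_fps) ** 0.5                    # standard error of the mean

print(f"{len(runs_fps)} runs: {avg:.1f} fps +/- {sem:.1f} (run-to-run sd {sd:.1f})")
```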
Final Fantasy XIV has a benchmark, but I don't know if it fits their needs. Maybe worth checking, although I'm not sure it represents real-world gameplay.
Guys and gals: I have started using video editing software (Corel) and photo software (ACD17), and need the best out of my i7-960. So I have spent considerable time fitting a new Syba x4 SATA 3 controller (50 bucks) and a better CPU cooler, a double-fan, big aluminum/copper beast. Outcome: it works. The SSDs are working like they should in RAID 0, with a 2.5" HDD backup using hot swap. I went back to the literature ("stories") on overclocking and settled on a 4.2GHz overclock. Letting you all know it works really, really well. I am almost ashamed of myself, being a techie, that it took me so long to get practical. I am now resigned to waiting for DDR4 and PCIe 3.x for future considerations. Good article, thanks.
Update: I really parked the beast 960 at 3.8GHz; 4.2 is too HOT, the fan is NOISY, and it is unstable, and I was left wondering whether my hard drive crashes were malware or just instability. None of that at 3.8. I will adjust my PCIe rate up from 133MHz, say towards 137 or even 140 if stable, on all add-in cards; it only works up to your least stable card. I have settled on a Marvell HDD controller card as cheap and cost effective. Bottom line, I like this article, and after 3 days of work I am on to doing work with my computer. Thanks all. Cheers, Thomas Gx, yvr.ca, Vancouver, Canada.
Every time an article of this sort is written, the conclusion is the same: In the vast majority of cases, due to GPU bottlenecks the differences between Cpus are so minute that no-one would ever notice the performance difference in game. Yawn.
Every time an article of this sort is written, the conclusion is the same. In the vast majority of cases, due to GPU bottlenecks, the performance differences between CPUs are so minimal that no-one would notice the difference in game. Yawn.
Every time an article of this sort is written, the conclusion is the same. In the vast majority of cases, due to GPU bottlenecks, the performance differences between CPUs are so minimal that no-one would notice the difference in game. Yawn.
This is the 3rd time I am posting this comment as it seems to be continually removed. Yet it is a legitimate and non offensive comment. What happened to freedom of expression at Anandtech?
Would be nice to see how the Haswell Pentiums (like the G3420) do as low-budget, low-power gaming CPUs. Too bad none of the review sites have deemed them worthy of a review so far.
"Of course we would suggest Haswell over Ivy Bridge based on Haswell being that newer platform."
If only Haswell OCs were equal to IB OCs. With Haswell you are STUCK at 4.2-4.6GHz, depending on your luck, and going water won't help. With IB, 4.4-5.0GHz is usual, and the more money you invest in cooling, the better your OC will be. This luck of the draw with Haswell, and the walls in OCing on Z87, should be considered, especially for triple and quad GPU builds aiming at 4K gaming, where a bad overclock is the doom of the entire system.
I have a Dell XPS 420 with Q6600. With the 8800GT (512 MB) card I was getting about 40 fps with medium settings. When I upgraded to a GTX 670, I got about 60 FPS with high settings, a very noticeable improvement. In my experience, a quad core Q6600 is still a pretty competent gamer with a strong graphics card on all but the most extreme games.
I'm one of those with a D0 i7 920 and it's been running at 3.8GHz (19x200bclk with 'only' 1600C9 memory, 12GB) for over 4 years. I suppose I'll just have to wait for a nice native PCIe SSD to avoid the old SATA controller and I'm golden for a good while more. It's just my HD6970 that could use replacement at some point (1920x1200 reso, nothing crazy).
For my X58, I put in one of these Syba SI-PEX40057 PCI-Express 2.0 cards (fifty bucks); it makes the newer SSDs rock. I am still happy with my platform, and my video work and JPG work are in a flash, twice as good. We love our i7-920s. Cheers, good thread this, all power users, good fun.
When I upgraded to my current Intel 520 (due to it being 3x bigger than my previous SSD), I looked into such cards, but they were pretty bad and, except for 128KB sequential reads, slower than the Intel SATA controller. I see this is a new version of the Marvell controller, but is it actually comparable to an Intel SATA 3 controller this time?
A low-cost RAID controller, yes. 64KB and 128KB transfers show the merit of RAID 0 at 6Gb/s each. I was doubtful myself, but took a chance on the device, as I need better video editing performance, and at least it works :) Now we have to watch out for the 12Gb/s devices coming soon. IMO for games there is not much improvement to be seen, but for big data transfers, SATA 3 improvements can be had for low cost. Good luck trying it out; borrow a card to test if you can. Cheers.
In fact you can only set 32KB or 64KB blocks, but it is a true Marvell controller chip in the Syba, and it is on the PCIe bus. Ctrl-M sets up the chipset; it works right off and is quick, but there is a hint that the lanes are only 5Gbit/s. Still, it is a fine patch upgrade for low-cost 6Gbit/s SSDs; I am in for 2x120GB SSDs and the controller for 250.
Results are typical for a variety of games where the resolution is set to 1920 x 1200. Games include Dirt 3, Civilization V, Guild Wars 2, Mechwarrior Living Legends, Diablo 3, Starcraft 2, etc.
I'm really kind of shocked to see how well Nehalem stands up still in many benchmarks. If you adjust the i7 920 benchmarks to make up for the difference in frequency between it and the 4770K, it's not half bad. I used the difference between the i7-920 and i7-950 to determine how the benchmark scaled on Nehalem. If it was close enough to linearly (+/- 1%), I considered it. I saw a 6% - 40% performance advantage for Haswell across the CPU tests, which is actually smaller than I expected for an almost 6 year old chip. (Obviously this includes differences in the platforms too.) Striking that even in 6 years the speed hasn't even doubled.
I'm still on an i7-920@3.6, so this was very relevant to me. If it were 40% across the board, it might be more compelling, but quite a few were more like 15%, 20%, etc. Now I understand Haswell is going to OC a lot further than this one, so in that way you could get the performance diff up there.
I'm just dumbfounded that this Nehalem has lasted me 4.5 years already and it still doesn't feel slow. On the one hand it's great value for the money, but on the other hand it's a little disappointing to see the performance curve drop off like it has over the past 6 years.
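The clock-for-clock adjustment described a couple of comments up can be written out explicitly. A rough sketch, assuming near-linear frequency scaling; the stock clocks are real, the benchmark scores are placeholders:

```python
# Normalise an i7-920 result to i7-4770K clocks, after sanity-checking that the
# benchmark scales roughly linearly with frequency using the 920 vs 950 pair.
clk_920, clk_950, clk_4770k = 2.66, 3.06, 3.5      # stock base clocks in GHz
score_920, score_950 = 100.0, 113.0                # hypothetical benchmark scores
score_4770k = 150.0                                # hypothetical 4770K score

observed = score_950 / score_920                   # gain actually measured
expected = clk_950 / clk_920                       # gain if scaling were purely frequency
print(f"920 -> 950 scaling: observed x{observed:.2f} vs clock ratio x{expected:.2f}")

# If those two ratios are within about 1%, project the 920 to 4770K clocks.
score_920_adj = score_920 * clk_4770k / clk_920
print(f"clock-adjusted i7-920: {score_920_adj:.0f} vs i7-4770K: {score_4770k:.0f}")
```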
That would be a fun project. Make a graph showing average CPU performance increases over the last 30 years.
As interesting as these chips are for getting maximum performance from the high-wattage parts, will you do an article about the low-TDP parts that are the true masterpieces Intel makes? I live in Denmark and energy costs more and more. My PC is running almost nonstop and I am curious how well these chips perform in a gaming environment. How far behind are they on performance, and what kind of power cost per year, based on an average workload/idle split, could you save? I find the low-power chips to be Intel's true stars: do more with less. Maybe even throw in a powerhouse chip from 2 years ago for comparison. That would be an interesting article.
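As a rough illustration of the yearly cost difference being asked about, a back-of-the-envelope sketch in which the power draws, duty cycle and electricity price are all assumptions:

```python
# Rough yearly running-cost difference between a standard and a low-TDP chip.
# Every input here is an assumption; plug in your own measurements and tariff.
watts_standard, watts_low_tdp = 84, 45   # assumed average draw under load (W)
load_hours_per_day = 6                   # assumed hours at load; the rest is idle
idle_delta_watts = 5                     # assumed idle-power difference (W)
price_per_kwh = 0.30                     # assumed electricity price per kWh

load_kwh = (watts_standard - watts_low_tdp) * load_hours_per_day * 365 / 1000
idle_kwh = idle_delta_watts * (24 - load_hours_per_day) * 365 / 1000
total_kwh = load_kwh + idle_kwh
print(f"extra energy: {total_kwh:.0f} kWh/year, "
      f"about {total_kwh * price_per_kwh:.0f} per year at the assumed tariff")
```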
Congrats to those who did testing part. Can't wait to see AMD added.
Too bad the Pentium XE 955/965 (i.e. Presler B1/C1 @ 3.46/3.73GHz) didn't "cut it" for this comparison :( Hyper-Threading and the "last of NetBurst" legacy could be interesting in comparison with low-end, fully-integrated setups like VIA Nano or AMD Fusion. There is also a possibility that today's multithreaded programs would better utilize the 4 threads of this kind of CPU, maybe to the point of matching a Core 2 Duo...
Either way, to sum it up in two words : GREAT WORK.
A good article, and nice to see an update now that new CPU's are out.
Wouldn't it be nice if you could have all the benefits of X79 for multi GPU configurations, but without the added cost over Z87? Well actually you can, if you take in to account the quad-core LGA2011 CPUs.
The i7-4820K is no more expensive than the i7-4770K, and motherboard costs are very similar too. So people seriously considering 3 or 4 GPUs might be very interested in this option, to gain the benefits of extra PCIe lane allocation without the extra cost of a hex core CPU.
Ian, would you please consider adding i7-3820 and/or i7-4820K to the next update? It would be nice to see how well, or how badly, they fare against the competition.
Sorry, but I don't understand this review. What's the point of recommending different CPUs on the sole basis of single/dual/tri/quad GPU?
First, GPU power is not related to the number of GPUs only: with 2x660 you get lower performance than with 1x780, yet if I read the conclusion, for 2x660 you recommend an FX-8350 but an A8-5600K for 1x780?
Second, for example with only a 7970, a small CPU or a big CPU gets you exactly the same performance in Sleeping Dogs at 2560x1440 max settings. But what kind of player will keep a setting that offers 28 fps on such a card? None! They will lower the graphics settings (which affect the GPU only) to the point where they get a higher framerate, like the 80 fps you get with three cards.
Whatever the number or power of the GPUs, as long as it's not a low-end card, the CPU needed to get a playable framerate is the same with a GTX 660 or 2x GTX 780, provided you don't use GPU-side graphics settings that push the framerate the GPU can sustain below the framerate the CPU can sustain.
You can recommend different CPUs to get more than 40/60/80/120 fps in some games (though good luck, since integrated benchmarks generally don't use the most CPU-bound scenes), but recommending different CPUs for single/dual/tri/quad GPU seems like nonsense to me.
Damn. How can you call this a CPU comparison with data like this? The games are run at such extreme settings that they in no way represent the impact of a CPU. Sleeping Dogs is just 4 graphs of 28fps; how can any respected researcher show this data without severe shame? To add insult to injury, the vast majority seem to think this is how CPU tests are done and call it a nice review. Literally, my heart sank as I read through these comments. No one (except a few, ignored), not even the reviewers, has a clue what they're on about. This way of CPU reviewing in games needs to stop. This isn't just uninformative, it's worse: it's completely misleading. Test games at 800x600 low settings, and pay no mind to those people calling for "real-world benchmarks". Stay true to what's real, instead of appealing to the community.
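The complaint can be made concrete with a toy model: per-frame time is roughly whichever of the CPU or GPU work takes longer, so pushing resolution and settings up inflates the GPU term until every CPU posts the same number. A sketch with invented frame times:

```python
# Toy bottleneck model: frame rate is limited by the slower of the CPU and GPU
# work per frame. All numbers are invented purely for illustration.
cpus = {"fast CPU": 4.0, "slow CPU": 8.0}            # ms of CPU work per frame
settings = {"800x600 low": 3.0, "1440p max": 30.0}   # ms of GPU work per frame

for setting, gpu_ms in settings.items():
    for name, cpu_ms in cpus.items():
        fps = 1000.0 / max(cpu_ms, gpu_ms)
        print(f"{setting:12s} {name}: {fps:6.1f} fps")
# At 1440p max both CPUs land on ~33 fps; only at low settings does the 2x gap show.
```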
Forgive me if this is pointed out in the article and I have missed it, but it is worth pointing out: Battlefield 4 will use up to 8 cores/threads. My i7-860 @ 4.0GHz with Hyper-Threading is outperforming a friend's Ivy Bridge (3570K) at 4.4GHz without Hyper-Threading, so much so that my frames are better using a GTX 680 against his GTX 780.
This could be a product of the beta, but I do believe it is a sign of things to come. The new consoles are most likely going to influence multithreaded performance greatly, considering the lower single-thread performance present in those systems.
I have been planning on rebuilding with haswell early next year and was planning on getting a 4670k, but have now changed that decision to going with a 4770k due to this experience. Just my two cents. Cheers!
It doesn't, because it doesn't exactly capture the dynamics of displaying several player models at once. It does a decent job at displaying several preprogrammed models at once.
The FF benchmarks have been a fairly low estimate of actual game performance when it comes to more demanding instances of raids and large crowds. With that said, they do better than most other canned benchmarks for determining the performance of a machine. Given its consistent testing environment, I guess it wouldn't hurt to use it as a go-to benchmark.
But where is the Civilization V end-of-turn benchmark? I don't care about the frame rates, I care about the times I'm staring at the screen waiting for the game to finish its calculations!
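Until such a benchmark exists, a crude manual stopwatch at least gives comparable numbers across machines. A sketch using the third-party keyboard package; the hotkey is arbitrary and on Windows the script usually needs to run as administrator:

```python
# Manual end-of-turn stopwatch: tap F9 when you press "Next Turn" and again
# when control returns; press Esc to finish and print the average.
import time
import keyboard  # third-party package

times, start = [], None
print("F9 = start/stop turn timer, Esc = finish.")
while True:
    event = keyboard.read_event()
    if event.event_type != keyboard.KEY_DOWN:
        continue
    if event.name == "esc":
        break
    if event.name == "f9":
        if start is None:
            start = time.perf_counter()
        else:
            times.append(time.perf_counter() - start)
            print(f"turn {len(times)}: {times[-1]:.1f} s")
            start = None

if times:
    print(f"average over {len(times)} turns: {sum(times) / len(times):.1f} s")
```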
I don't normally comment (as the reviews are generally excellent), but I was actually shocked to see the choice of graphics card(s) for this roundup. Nobody buying a gaming CPU is going to have stuff that slow, right? So many of the tests result in framerates under 60fps, etc.
I'm finally in the process of building a new desktop, mini-ITX. Gonna use a 4570S CPU. Primary duties will be media streaming but I'll game on it too. The computer it's replacing? 650i SLI chipset based computer running an E8400 Core 2 duo. I can still max out Mass Effect games with no issue. Minecraft maxes out the CPU but that's just because Java sucks. So that old 2007 era computer is still a viable gaming machine with the GTX460 in it. Talk of needing to replace a Nehalem CPU soon seems kind of absurd to me. But then again I have no interest in Far Cry or Crysis.
Well damn. It seems like if I am single-GPU gaming at 1080p, the CPU doesn't matter much at all? A 5800K would do the job well enough.
Question: will an AMD 5800K bottleneck a GTX 780? Or a 290X? At 1080p. Or does it not matter at all, since the resolution is so low? I am sure I am staying at 1080p for at least 5 more years, and my current PC parts are really old (C2D E8500 + 460 1GB), so I'm thinking of upgrading. I am sure a 780 or 290X would last 5 or more years, so I kind of want a matching CPU.
This article prompted me to pull my Asus P6T motherboard out and replace it with an MSI 7666, which holds the 1366 i7 chip. I put in a 960 over my 920 and clocked it up to 3.8GHz so far, and with my Nvidia 470 GPU and 2x120GB SSDs in RAID 0, things are rocking along really well, it seems, compared to the high-end stuff presented here. I had to install a cheap Syba controller card on the Marvell chipset, set to 32KB over 64KB blocks (better data storage efficiency over speed), and a 4-channel USB 3.0 card, and it is good to go. Since I bought good, near-new, used parts, I am in it for, say, half price, and it works for me. I had a tough go with Microsoft critical patch updates flooding in on Oct. 8th and 15th, so my system restore points crashed; I am now set on NO automatic downloads and all is good. That was a 3-day experience I don't want to go through again. I enjoyed the article and the comments, good comments, thanks guys and girls. I am looking for the DDR4 stuff and the Haswell successor of '14 or early '15. Now on to doing work with my computer :) Cheers all, have fun with the candies next, and have a good Xmas, buy yourselves something nice. Let's keep America working. rtg, Vancouver, Canada
Your review talks about recommendations based on the number of GPUs, but seems to assume GPU = graphics card. I have a GTX 690 and am looking to possibly upgrade my CPU/mobo; what would your recommendation be, keeping in mind that in the future I may buy another GTX 690 to boost performance? What CPU, and PLX or non-PLX combo, do I need to satisfy two 690s in SLI?
You don't need PLX with dual SLI; you don't even need a second GTX 690 :P I myself would never consider spending so much money on a video card, but I guess you play on multiple 30'' monitors at the maximum available resolution, each with its own card. If you insist, then get the Intel i7-4960X, a socket 2011 X79 Asus motherboard with a PLX chip on it, and 3 Nvidia GTX Titans. That would surely give you at least 150 FPS in any game, except those that are specifically designed not to give more than what the designers want, like Crysis.
I've been on an i7-860 since 2010, and HT was also a deciding factor in buying it. But over the years I don't think HT has helped me much with what I did and do on the PC. So now, after reading this article - which is very helpful - I think an i5-4670(K), with its $100 lower price, will suffice. Unless... upcoming games like The Elder Scrolls Online (which I want to play) will make use of HT, but I don't think so. Does anyone know of any game that makes good use of Hyper-Threading, or at least 4 cores?
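One way to answer that for any particular game on your own machine is to log per-logical-core load while playing and count how many cores ever get busy. A minimal sketch using the third-party psutil package:

```python
# Sample per-logical-core CPU load once a second for ~30 seconds while a game
# runs. If only 2-4 entries ever show high load, HT/extra cores are going unused.
import psutil

for _ in range(30):
    per_core = psutil.cpu_percent(interval=1.0, percpu=True)  # blocks for 1 s
    busy = sum(1 for load in per_core if load > 50)
    print(f"{busy} logical cores above 50%: {per_core}")
```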
Goddammit. I spent 6 years without upgrading my rig; now I come back to AnandTech and I can't understand a single one of these benchmarks.
Hell, WHERE ARE THE CPU CLOCK SPEEDS? How do Intel and AMD expect me to understand the gibberish they use to name their processors? I want to compare IPC in every bench I see; I want to see in every test how the GHz of one CPU compares to another. I'm not going to read these benches with a CPU dictionary, trying to interpret every name on the list, nor do I have a good enough memory to remember which CPU has more cache or a higher clock speed than the other, as described on the first page.
I stayed away from the hardware scene for 6 years; now I've come back and I can't understand anything.
tim851 - Thursday, October 3, 2013 - link
You know, once you go Quad-GPU, you're spending so much money already that not going with Ivy Bridge-E seems stupid. In the same vein I'd argue that a person buying 2 high end graphics cards should just pay 100 bucks more to get the 4770K and some peace of mind.
Death666Angel - Thursday, October 3, 2013 - link
I'd gladly take a IVB-E, even hex core, but that damned X79 makes me throw up when I just think about spending that much on a platform. :/
von Krupp - Thursday, October 3, 2013 - link
It's not that bad. I picked up an X79 ASRock Extreme6 for $220, which is around what you'll pay for the good Z68/Z77 boards and I still got all of the X79 features.
cpupro - Sunday, October 6, 2013 - link
"I'd gladly take a IVB-E, even hex core, but that damned X79 makes me throw up when I justthink about spending that much on a platform. :/"
And be screwed.
"von Krupp - Thursday, October 03, 2013 - link
It's not that bad. I picked up an X79 ASRock Extreme6 for $220, which is around what you'll pay
for the good Z68/Z77 boards and I still got all of the X79 features."
Tell that to owners of original not so cheap Intel motherboards, DX79SI. They need to buy new motherboard for IVB-E cpu, no UEFI update like other manufacturers.
HisDivineOrder - Thursday, October 3, 2013 - link
Not if they actually bought one when it was more expensive, then waited until these long cycles allowed you to go and buy a second one on the cheap (i.e., a 670 when they were $400, then another when they were $250).
althaz - Thursday, October 3, 2013 - link
Except that you might need the two or four graphics cards to get good enough performance, whereas there's often no real performance benefit to more than four cores (for gaming). Take Starcraft 2, a game which can bring any CPU to its knees: the game is run on one core, with AI and some other stuff offloaded to a second core. This is a fairly common way for games to work, as it's easier to make them this way.
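As a rough illustration of that "one main loop plus a helper" structure (not Blizzard's actual code, and a real engine would use native threads rather than Python ones):

```python
# Toy version of the common pattern: one thread runs the entire game loop,
# while a second thread handles offloaded work such as AI decisions.
import queue
import threading
import time

ai_requests, ai_results = queue.Queue(), queue.Queue()

def ai_worker():
    while True:
        unit = ai_requests.get()
        time.sleep(0.005)                       # stand-in for an expensive AI decision
        ai_results.put((unit, "move north"))

threading.Thread(target=ai_worker, daemon=True).start()

for frame in range(3):                          # the single main loop does everything else
    ai_requests.put(f"unit-{frame}")            # hand slow work to the helper thread
    time.sleep(0.016)                           # pretend input/sim/render took ~16 ms
    while not ai_results.empty():
        print(f"frame {frame}, AI result: {ai_results.get()}")
```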
Jon Tseng - Thursday, October 3, 2013 - link
<sigh> it was so much easier back in the day when you could just overclock a Q6600 and job done. :-pJlHADJOE - Thursday, October 3, 2013 - link
You can still do the same thing today with the 3/4930k. Back in the day the Q6600 was basically the 2nd tier HEDT SKU, much like the 4930k is today, perhaps even higher considering the $851 launch price.
rygaroo - Thursday, October 3, 2013 - link
I still run an O.C. Q6600 :) but my GPU just died (8800GTS 512MB). Do you suspect that the lack of fps on Civ V for the Q9400 is due more to the motherboard limitations of PCIE 1.1 or more caused by the shortcomings of an old architecture? I don't want to spend a lot of money on a new high end GPU if my Q6600 would be crippling it... but my mobo has PCIE 2.0 x16 so it's not a real apples to apples comparison w/ the shown Q9400 results.
JlHADJOE - Friday, October 4, 2013 - link
I tested for that in the FFXIV benchmark. Had PrecisionX running and logging stuff in the background while I ran the benchmark. Turned out the biggest FPS drops coincided with the lowest GPU utilization, and that pretty much nailed the fact that my Q6600 @ 3.0 was severely bottlenecking the game.
Tried it again with CPU-Z, and indeed the FPS drops aligned with high CPU usage.
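For reference, once the logger has written a CSV, that cross-check is easy to automate. A sketch assuming hypothetical column names ("fps" and "gpu_util"); adjust them to whatever the logging tool actually writes:

```python
# Flag benchmark samples where the frame rate dropped while the GPU sat idle,
# which is the signature of a CPU bottleneck. Column names are assumptions.
import csv

with open("hardware_log.csv", newline="") as f:
    rows = [(float(r["fps"]), float(r["gpu_util"])) for r in csv.DictReader(f)]

worst = sorted(rows)[: max(1, len(rows) // 20)]      # slowest 5% of samples by fps
util_worst = sum(u for _, u in worst) / len(worst)
util_all = sum(u for _, u in rows) / len(rows)

print(f"GPU utilisation overall: {util_all:.0f}%")
print(f"GPU utilisation during the slowest samples: {util_worst:.0f}%")
# If the second number is much lower, the GPU was waiting on the CPU.
```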
rygaroo - Sunday, October 6, 2013 - link
thanks for the info!
Flunk - Friday, October 4, 2013 - link
I upgraded from a Q6600 last year and it really did make a difference. If you're not looking to upgrade your CPU I'd get something like a Radeon 7850 and save the rest for a full rebuild in a year or two.
rygaroo - Sunday, October 6, 2013 - link
That sounds like a pretty decent plan. Thanks for the recommendation!
Felix_Ram - Sunday, October 6, 2013 - link
You mean overclock an i5-2500k and job done.
Scarier - Thursday, October 3, 2013 - link
I'm surprised more people don't use Starcraft 2 or Heart of the Swarm to benchmark CPUs. I've noticed a much bigger increase in that particular game going from an i7-920 to a 3770K.
Jaguar36 - Thursday, October 3, 2013 - link
I'd love to see some more SC2 benchmarks. Single player may not be that demanding, but 4v4 with big armies will crush any CPU.
Dustin Sklavos - Thursday, October 3, 2013 - link
The problem is that StarCraft II is threaded HORRIBLY. It's single-threaded performance or bust, and that's really easy to quantify. HotS may have been released this year, but its architecture is from 2003.
althaz - Thursday, October 3, 2013 - link
This is absolutely correct. It can murder any CPU, but the game engine runs entirely on one core, with part of another used for a few extra things (networking, AI, etc).
Flunk - Friday, October 4, 2013 - link
This is why some people who are really into Starcraft 2 are configuring their desktops with low turbo settings on 3 cores and one very-high setting on the fourth to get that extra tiny bit of performance. I'm not too sure how well it works but some people swear by it.
cbrownx88 - Thursday, October 3, 2013 - link
Starcraft2 and BF3/4 pleeeaase
Democrab - Friday, October 4, 2013 - link
It's not really representative of most games, everyone knows it's highly CPU limited... Most games are GPU limited as proven by this, yet a lot of people seem unaware of that.
Spoelie - Thursday, October 3, 2013 - link
Another choice I was considering for gaming: the i5 3350P. This is the cheapest i5 available on this side of the pond and still Ivy, so it still allows the 4-bin overclock. Since Haswell, Intel no longer allows any overclocking on non-K parts.
In addition, Z77 motherboards are quite a bit cheaper than Z87 for the moment.
So for $30+ less than the 4430, you get 3.7/3.6/3.6/3.5GHz Ivy vs. 3.2/3.0GHz Haswell.
The platform isn't that upgradeable but with Intel moving to 2-year cadences for desktop upgrades, the performance should stay relevant for at least 4 years...
mrdude - Thursday, October 3, 2013 - link
Amazing article, Ian. Thanks a ton. It's shocking to see how well the dual core Intel parts and the two-module AMD chips fare, even at 1440p with a single GPU. With respect to single-GPU gaming, opting to pull some $ out of the CPU/MB fund in order to buy a better GPU is certainly more advisable.
Those who invested in the X58/1366 platform certainly got their money's worth. Frankly, even buying a secondhand 1366 platform is a good idea if it's cheaper than a new quad-core 1155/1150 + mobo. Going from an SSD running on SATA 3Gb/s to 6Gb/s really isn't noticeable. I've done this twice on two separate platforms and the only difference I've seen is with respect to bootup speed.
You also have to figure that this graph will change with the newer generation of console ports. I have an inkling that 2/4 threads might stumble a bit with some more demanding titles. We might even see AVX play a more significant role as well.
cbrownx88 - Thursday, October 3, 2013 - link
BF4 is gonna bump me from X58 I believe... way more CPU bound than BF3 (1920x1200@4.2ghz)
snouter - Wednesday, October 30, 2013 - link
BF4 makes my 4930k work more than I thought it would.
BOMBOVA - Friday, October 4, 2013 - link
I am reviving my X58 MSI board with a Syba SATA 3 controller, and I really notice a difference on my long video editing files. That was my whole point of modding up. Cheers, good discussion.
chizow - Thursday, October 3, 2013 - link
I really appreciated what this article tried to accomplish, and I think it does shed some light on some aspects of what you were trying to test...but someone at AnandTech couldn't throw you a bone and get you a pair of higher-end GPUs to test? 580s are a bit long in the tooth to garner any meaningful results. Maybe Gigabyte could have kicked a pair of 680s to you?
Also, it would have been nice to see some Battlefield 3 results, since it is widely touted as a title that scales extremely well with both CPU (and shows big differences with HT) and GPU, especially in MultiPlayer, and will be especially relevant in the next few months as Battlefield 4 launches.
dusk007 - Thursday, October 3, 2013 - link
I find testing for CPU performance is not a strong suit of reviewers. The test has lots of data, but it is missing the situations gamers end up in that do require CPU performance.
Starcraft 2: just run a replay of an 8-player map at 4x-8x speed and most dual-core notebooks practically break down.
Total War: set unit size to epic and run some huge battle. That is where this game is great, but it drains CPU resources like crazy.
Shooters or racing games are examples where the CPU has to do nothing but feed the GPU, which is really the least CPU-intensive work. Multiplayer adds quite a bit of overhead, but it is still not something you need to worry much about your CPU for.
When testing CPU performance, kick all those shooters to the curb and focus on RTS games with huge unit sizes.
It is the minimum frames in these games that require CPU performance. The annoying situations are the biggest battles, where the CPU cannot keep up. Starcraft 2 on medium runs on almost any GPU, but it can bring slower CPUs quickly to their limits.
IanCutress - Thursday, October 3, 2013 - link
COH2 is planned for our next benchmark update to cover the RTS side. I know Rome II is also a possibility, assuming we can get a good benchmark together. As I mentioned, if possible I'd like to batch file it up so I can work on other things rather than moderate a 5 minute benchmark (x4 for repetitions for a single number, x4-8 for GPU configs to complete a CPU analysis, x25+ CPUs for scope).
If you have any other suggestions for the 2014 update, please let me know - email address is in the article (or click on my name at the top).
Ian
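[Editor's note: for anyone curious what "batching it up" could look like in practice, here is a minimal Python sketch of an unattended benchmark loop. The game paths, command-line flags, and repetition count are placeholder assumptions, not the actual AnandTech setup; a real harness would also parse each game's own log for FPS numbers rather than just timing the run.]

# A minimal sketch of the kind of batch harness described above: loop over
# repetitions and launch each game's built-in benchmark unattended.
# Executable paths and arguments are hypothetical placeholders.
import csv
import subprocess
import time

GAMES = {
    # hypothetical command lines that start a game's canned benchmark
    "metro2033": ["C:/Games/Metro2033/benchmark.exe", "-preset", "veryhigh"],
    "dirt3": ["C:/Games/Dirt3/dirt3.exe", "-benchmark", "default.xml"],
}
REPETITIONS = 4

with open("results.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["game", "run", "elapsed_s"])
    for game, cmd in GAMES.items():
        for run in range(REPETITIONS):
            start = time.time()
            subprocess.run(cmd, check=True)  # blocks until the benchmark exits
            writer.writerow([game, run, round(time.time() - start, 1)])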
anubis44 - Saturday, October 5, 2013 - link
"COH2 is planned for our next benchmark update to cover the RTS side." Excellent. This is one of my most-played games. In addition, I wouldn't be surprised if subsequent patches noticeably improve its multi-threaded performance, so the older results will be nice to have once those patches are released, in order to track the improvements.
CrystalBay - Thursday, October 3, 2013 - link
"Watch Dogs" coming to PC requires an octo-core or better for Ultra settings.
jimhsu - Saturday, February 22, 2014 - link
Yes to strategy games. Supreme Commander (the original, not the horrible "2" version), 80km maps, 8 players, 2000 unit limit, replay. Stuff like that. You could also throw in some sandbox games; TES is a good choice for its many CPU-constrained situations, GTA5 possibly, ... (results may vary depending on threading tweaks).
romrunning - Thursday, October 3, 2013 - link
Why is the i3-3225 missing from most of the CPU benchmarks? From the beginning of that webpage, it doesn't appear until "Explicit Finite Difference Grid Solver (2D)".
IanCutress - Thursday, October 3, 2013 - link
It was one of the first CPUs I tested and I only focused on the GPU results at that time - I ran my SystemCompute benchmark just to see what it was like. I have not gone back to retest as of yet, though on the platform refresh I'll make sure to add the numbers.
Ian
crimson117 - Thursday, October 3, 2013 - link
> It is possible to consider the non-IGP versions of the A8-5600K, such as the FX-4xxx variant or the Athlon X4 750K. But as we have not had these chips in to test, it would be unethical to suggest them without having data to back them up. Watch this space, we have processors in the list to test.
I think you should make this a priority - one could save ~$20 with the 750K, which can make a big difference on a low budget.
Kai Robinson - Thursday, October 3, 2013 - link
Why choose the P965 chipset for an LGA775 motherboard, instead of the P35 or P45 chipsets? And why no mention of the Q9650?
DanNeely - Thursday, October 3, 2013 - link
For something that old, testing came down to what Ian and the hardware vendors were able to scavenge up. A 965 and a 9400 on the shelf somewhere beat a P45 and a 9650 that would need to be bought.
cosminmcm - Thursday, October 3, 2013 - link
No, it doesn't. That processor (low frequency, half the cache) on that motherboard (no PCIe 2.0) really doesn't do Core 2 Quad justice. A top model would probably beat (in my opinion it would certainly beat) similarly clocked Phenom II processors and be higher on the ladder.
beggerking@yahoo.com - Thursday, October 3, 2013 - link
That's surprising... after all these years, the i5-2500K is still a beast of a CPU...
dishayu - Friday, October 4, 2013 - link
"all these" = 2.
BrightCandle - Thursday, October 3, 2013 - link
So again we see tests with games that are known not to scale with more CPU cores. There are games, however, that show clear benefits; your site simply doesn't test them. It's not universally true that more cores or HT make a difference, but maybe it would be a good idea to focus on the games we know do benefit, like Metro Last Light, Hitman Absolution, Medal of Honor Warfighter and some areas of Crysis 3. The problem is that it's those games that support more multithreading, so to give a true impression you need to test a wider and more modern set of games. To do otherwise is pretty misleading.
althaz - Thursday, October 3, 2013 - link
To test only those games would be more misleading, as the vast majority of games are barely multithreaded at all.
erple2 - Thursday, October 3, 2013 - link
Honestly, if the stats for single GPUs weren't all at about the same level, this would be an issue. It isn't until you get to multiple GPUs that you start to see some differentiation. But that level becomes very expensive very quickly. I'd posit that if you're already into multiple high-end video cards, the price difference between dual and quad core is relatively insignificant anyway, so the point is moot.
Pheesh - Thursday, October 3, 2013 - link
Appreciate the review, but it seems like the choice of games and settings makes the results primarily reflect a GPU-constrained situation (1440p max settings for a CPU test?). It would be nice to see some of the newer engines which utilize more cores, as most people will be buying a CPU for future titles. I'm personally more interested in the delta between the CPUs in CPU-bound situations. Early benchmarks of next-gen engines have shown larger differences between 8 threads and 4 threads.
cbrownx88 - Friday, October 4, 2013 - link
amen
TheJian - Sunday, October 6, 2013 - link
Precisely. Also, only 2% of us even own 1440p monitors, and I'm guessing the super small % of us in a terrible economy that have say $550 to blow on a PC (the price of the FIRST 1440p monitor model you'd actually recognize the name of on Newegg - the Asus model (122 reviews) – and the only one with more than 12 reviews) would buy anything BUT a monitor that would probably require 2 vid cards to fully utilize anyway. Raise your hand if you're planning on buying a $550 monitor instead of, say, a near top end Maxwell next year? I see no hands. Since 98% of us are on 1920x1200 or LESS (and more to the point a good 60% are less than 1920x1080), I'm guessing we are all planning on buying either a top vid card, or if you're in the 60% or so that have UNDER 1080p, you'll buy a $100-200 monitor (1080p upgrade to 22in-24in) and a $350-450 vid card to max out your game play.
Translation: These results affect less than 2% of us and are pointless for another few years at the very least. I'm planning on buying a 1440p monitor but LONG after I get my Maxwell. The vid card improves almost everything I'll do in games. The monitor only works well if I have the VID CARD muscle ALREADY. Most people of the super small 2% running 1440p or up have two vid cards to push the monitors (whatever they have). I don't want to buy a monitor and say "oh crap, all my games got super slow" for the next few years (1440p for me is a purchase once a name brand is $400 at 27in – only $150 away…LOL). I refuse to run anything but native and won't turn stuff off. I don't see the point in buying a beautiful monitor if I have to turn it into crap to get higher fps anyway... :)
Who is this article for? Start writing articles for the 98% of your readers, not the 2%. Also, you'll find the CPUs are far more important where that 98% is running, as fewer games are GPU bound there. I find it almost stupid to recommend AMD these days for CPUs, and that stupidity grows even more as vid cards get faster. So basically, if you are running 1080p and plan to for a while, look at the CPU separation on the triple cards and consider that what you'll see as CPU results. If you want a good indication of what I mean, see the first or second 1440p article here and CTRL-F my nick. I listed all the games previously that part like the Red Sea, leaving AMD CPUs in the dust (it's quite a bit longer than Civ 5...ROFL). I gave links to the benchmarks showing all those games.
http://www.anandtech.com/comments/6985/choosing-a-...
There's the comments section on the 2nd 1440p article for the lazy people :)
Note that even here in this article TWO of the games aren't playable on single cards...LOL. A 34fps average in Metro 2033 means you'll be hitting the low 20's or worse MINIMUM. Sleeping Dogs is already under 30fps AVG, so not even playable at average fps, let alone the MIN fps you will hit (again, TEENS probably). So if you buy that fancy new $550+ monitor (because only a fool or a gambler buys a $350 Korean job from ebay etc...LOL), get used to slide shows and stutter gaming even for SINGLE 7970's in a lot of games, never mind everything below sucking even more. Raise your hand if you have money for a $550 monitor AND a second vid card...ROFL. And these imaginary people this article is for apparently should buy a $110 CPU from AMD to pair with this setup...ROFLMAO.
REALISTIC Recommendations:
Buy a GPU first.
Buy a great CPU second (and don't bother with AMD unless you're absolutely broke).
Buy that 1440p monitor if your single card is above a 7970 already, or you're planning shortly on Maxwell or some such card. As we move to Unreal Engine 4, CryEngine 3.5 (or CryEngine 4th gen…whatever) etc. next year, get ready to feel the pain of that 1440p monitor even more if you're not above a 7970. So again, this article should be considered largely irrelevant for most people unless you can fork over for the top end cards AND that monitor they test here. On top of this, as soon as you tell me you have the cash for both of those, what the heck are you doing talking about $100 AMD CPUs?...LOL.
And for AMD GPU lovers like the whole AnandTech team, it seems... Where's the NV portal site? (I love the GPUs, just not their drivers):
http://hothardware.com/News/Origin-PC-Ditching-AMD...
Origin AND Valve have abandoned AMD even in the face of new GPUs. Origin spells it right out, exactly as we already know:
"Wasielewski offered a further clarifying statement from Alvaro Masis, one of Origin’s technical support managers, who said, “Primarily the overall issues have been stability of the cards, overheating, performance, scaling, and the amount of time to receive new drivers on both desktop and mobile GPUs.”
http://www.pcworld.com/article/2052184/whats-behin...
More data: nearly twice the failure rate at another vendor, confirming why the first probably dropped AMD. I'd call a 5% rate bad, never mind AMD's 1-year rate of nearly 8% failure (and nearly 9% over 3 years). Cutting RMAs nearly in half certainly saves a company some money, never mind all the driver issues AMD still has and has had for 2 years. A person adds a monitor and calls tech support about AMD Eyefinity, right? If they add a GPU, they call about CrossFire next?...ROFL. I hope AMD starts putting more effort into drivers; otherwise the hardware sucks no matter how good the silicon is. As a boutique vendor at the high end, surely the CrossFire and multi-monitor situation affects them more than most who don't even ship high-end stuff really (read: overly expensive...heh).
Note one of the games here performs worse with 3 cards than 2. So I guess even AnandTech accidentally shows AMD's drivers still suck for triples. 12 CPUs in Civ 5 post above 107fps with 2 7970's, but only 5 can do over 107fps with 3 cards...Talk about going backwards. These tests, while wasted on 98% of us, should have at the least been done with the GPU maker who has properly functioning drivers WITH 2 or 3 CARDS :)
CrispySilicon - Thursday, October 3, 2013 - link
What gives, Anand? I may be a little biased here since I'm still rocking a Q6600 (albeit fairly OC'd). But with all the other high-end platforms you used, why not use a DDR3 X48/P45 for S775?
I say this because NOBODY who reads this article would still be running a mobo that old with PCIe 1.1, especially in a multi-GPU configuration.
dishayu - Friday, October 4, 2013 - link
I share your opinion on the matter, although I myself am still running a Q6600 on an MSI P965 Platinum with an AMD HD6670. :P
ThomasS31 - Thursday, October 3, 2013 - link
Please also add Battlefield 4 to the game tests in the next update(s)/2014. I think it will be very relevant, based on the beta experience.
tackle70 - Thursday, October 3, 2013 - link
I know you dealt with this criticism in the intro, and I understand the reasoning (consistency, repeatability, etc.), but I'm going to criticize anyway... These CPU results are to me fairly insignificant and not worth the many hours of testing, given that the majority of cases where CPU muscle is important are multiplayer (BF3/Crysis 3/BF4/etc). As you can see even from your benchmark data, these single player scenarios just don't really care about the CPU all that much - even in multi-GPU. That's COMPLETELY different in the multiplayer games above.
Pretty much the only single player game I'm aware of that will eat up CPU power is Crysis 3. That game should at least be added to this test suite, in my opinion. I know it has no built in benchmark, but it would at least serve as a point of contact between the world of single player CPU-agnostic GPU-bound tests like these and the world of CPU-hungry multiplayer gaming.
Harry Lloyd - Thursday, October 3, 2013 - link
I am sorry to say it, but I never expected to see a completely useless test at AnandTech. 99% of singleplayer games are fine with a Core i3, 99.9% of all games are fine with a stock Core i5, but there is that 0.1% that is not, and it is mostly multiplayer. Go look at the BF4 beta tests, where even a Haswell i7 is a bottleneck. Even BF3 multiplayer performs better with a modern i5 than with a 1156/1366 CPU.
And with next-gen games right around the corner, the situation might change drastically with more and more games needing a very fast quad core CPU.
Dribble - Thursday, October 3, 2013 - link
Agree. What's the point of running time demos (which need less CPU grunt) on single player games (which need less CPU grunt) at very high res/settings (GPU-bound max fps) with no min fps (so weak CPU bottlenecks are hidden)? It makes those with weak CPUs feel better and leads to lots of "surprisingly, my AMD processor is good enough" comments, but for many people it actually isn't.
just4U - Thursday, October 3, 2013 - link
Personally I'd say that's a load of BS. I work with a lot of different setups, and unless you're an enthusiast, the average gamer really can't tell the difference. They're coming off of older setups already, so unless you're cutting a ton of corners you can easily go the AMD route for a good majority of them.
glugglug - Thursday, October 3, 2013 - link
I don't understand the recommendation for "at least quad core" for Civilization V. Having looked at task manager during the game, it quickly becomes apparent that the game is effectively entirely single threaded. It doesn't even have a separate thread for video rendering vs. AI, or if it does, they completely block each other. Setting the CPU affinity to keep the game on a single core makes absolutely no difference in that game.
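[Editor's note: for anyone who wants to repeat this affinity experiment, here is a rough Python sketch using the third-party psutil package; the process name is a placeholder and would need to match whatever the game's executable is actually called on your system.]

# Pin an already-running game process to one logical core and watch whether
# the frame rate (or turn time) changes. Uses the third-party psutil package.
import psutil

TARGET = "CivilizationV.exe"  # hypothetical process name

for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == TARGET:
        print("before:", proc.cpu_affinity())
        proc.cpu_affinity([0])  # restrict the process to logical core 0 only
        print("after: ", proc.cpu_affinity())
        break
else:
    print(f"{TARGET} not found")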
JPForums - Thursday, October 3, 2013 - link
I've got to say, I'm impressed with the common sense approach, both to the setup of a benchmark of this size, and some of the conclusions I'm reading.
I'm interested to see how many AMD processors end up above the "good enough to not bottleneck the GPU setup" line. I wonder if they will be cost-effective vs. an Intel setup.
A future experiment of interest to me is whether or not more budget oriented chipsets significantly hinder performance. I guess the question that's on my mind is "Is there any situation in which a faster processor on a board with lesser capabilities would be outperformed by a (somewhat) slower processor on a board with greater capabilities?" Or put a different way, "How much processing power is it logical to sacrifice in pursuit of a better platform (i.e., more PCIe lanes for multi-GPU setups)?"
KaarlisK - Thursday, October 3, 2013 - link
Could you please test a Haswell Pentium? In comparison to the i3, it has only slightly lower frequency, but no HT, and is way cheaper.
ShieTar - Thursday, October 3, 2013 - link
I second this request. From the limited number of tests I could find so far, it seems that saving money on the CPU and investing it into the GPU is the way to go for most games. That seems to include even seemingly unbalanced combinations, like a Pentium and a GTX 780 beating a quad-core and a GTX 770.
Flunk - Thursday, October 3, 2013 - link
I was quite surprised to see the Sandy Bridge chips hanging in there. There doesn't seem to be much need to upgrade if you have an i5-2500K or i7-2600K, especially if you factor in how easy they are to overclock to 4.5GHz and sometimes beyond.
just4U - Thursday, October 3, 2013 - link
Wasn't much of a bump for Ivy Bridge or Haswell really... Put all three on a table ("STOCK") with similar hardware and I'd lay money on 99.9% not being able to tell the difference. CPUs have been going sideways in performance rather than upwards (my opinion) for some time now. What's interesting is that Socket 1366 CPUs are finally beginning to show some age...
A5 - Thursday, October 3, 2013 - link
Performance hasn't been increasing (as much) because of the focus on power consumption in laptops. That and AMD's utterly noncompetitive products at the high end. I could 100% tell you which system was which if I had a Kill-A-Watt, though.
A5 - Thursday, October 3, 2013 - link
To finish that thought, I do wish Intel still had some mainstream (aka cheaper) 130W CPUs on their normal platform.
just4U - Thursday, October 3, 2013 - link
Yep... you should also be able to tell the difference simply by measuring heat. The Sandy Bridge chips tend to run a little cooler than Ivy Bridge, although they must have done something in Haswell since it runs cooler in normal operation... but it heats up rather quickly under load, just like Ivy Bridge. But on the surface they're all fairly comparable, I think, anyway.
brucek2 - Thursday, October 3, 2013 - link
My main system is still rocking an i7-920. These charts help explain rationally what my brain must have somehow known subconsciously: that there's not yet much reason to upgrade. (I'm discounting the +50% gains on the CPU benchmarks, because my i7-920 is overclocked, making the gains much less. And I'm rarely CPU bound for long.) I would like a 6Gb/s SATA controller some day. My poor SSDs must be very frustrated with their host.
Senti - Thursday, October 3, 2013 - link
I'm in a similar boat: using an i7-930 @ 4GHz. Seriously, who runs those wonderful Nehalem CPUs at default clocks when they easily overclock 1.5x? And with this overclock, the advantage of the newer CPUs is really underwhelming: far less than the i7-920 line here shows. As for SSDs, I use a PCIe-based one and it's probably still faster than, or at least on par with, the newest SATA ones.
A5 - Thursday, October 3, 2013 - link
My 920 refused to go over 3 GHz after I updated the BIOS one day. Before that I still only got 3.5 or so. My 4770K is a crappy overclocker, too. Maybe it's just me :-p
cbrownx88 - Friday, October 4, 2013 - link
A5 - was your 920 a C0 stepping? Mine is a D0, which at the time of purchase I remember going way out of my way to check before pulling the trigger.
BOMBOVA - Saturday, October 26, 2013 - link
I put in a value PCIe 6Gb/s Syba controller card, only capable of 32K or 64K blocks, but it's a good value at less than fifty bucks. Works well.
ninjaquick - Thursday, October 3, 2013 - link
Isn't Win7 old? Benchmarks like these should be run on the latest Windows, at least IMHO.
brucek2 - Thursday, October 3, 2013 - link
Hasn't Win8 been rejected by large numbers of desktop enthusiasts & gamers? Its adoption rate on older platforms like many included here is pitiful. Fortunately, my sense from other articles is that it's not likely to have made a significant difference either way?
DanNeely - Thursday, October 3, 2013 - link
The Steam HW survey has W8 at 16.4% vs 66.8% for W7. I suspect W7 is being used in order to keep results directly comparable to historic results.
warezme - Thursday, October 3, 2013 - link
I too invested in the venerable (speak only in awed, hushed whispers) i7-920, which I promptly overclocked to 3.6GHz. This little jewel has been going strong for, goodness, almost half a decade, stable as a rock, and I notice it holding its own very well even up against the latest and greatest. This is a testament to competition and engineering, from when competition in the CPU arena existed. I have long since switched from dual GPUs to a single card with two GPUs on one fat x16 PCIe slot, even though my EVGA X58 SLI board supports more. I'll ride the wave one more year and see what new gear crashes in next year. Hopefully a new Nvidia architecture that will inspire me to upgrade everything.
Hrel - Thursday, October 3, 2013 - link
"our next update will focus solely on the AMD midrange." Please don't do that. PLEASE include at least 3 Intel CPUs for comparison. It doesn't matter if the FX-8320 does well in benchmarks if for another $40 I can get an i5-4670 that runs 50% faster. These are hypothetical numbers, obviously, but then Intel will be faster. By how much matters, once you factor in price and especially energy draw.
A5 - Thursday, October 3, 2013 - link
The old numbers will still be there for comparison. The next update is just *adding* more AMD data.
just4U - Thursday, October 3, 2013 - link
It's hard making sense of AMD data in comparison to Intel. As near as I can tell, they're sitting at just beyond i7-920 performance these days, but with all the new features. It gets confusing when you look at the older X4/X6 stuff though, since some of that is actually faster... yet somehow only compares favorably to Intel's 9X Core 2 stuff.
just4U - Thursday, October 3, 2013 - link
Why 3? The entry-level i5-4430 beats out every AMD chip on the market in most instances. Adding in more simply confuses people and adds more fodder for fanboys to fight over... and I think it taxes the patience of most of us that already know what's what in the CPU arena. Simple rule of thumb: if you're on a budget, you may want to go AMD to get all the "other" bells and whistles you're looking to buy, or if you have more to spend, your starting point will be the i5-4430.
just4U - Thursday, October 3, 2013 - link
Excellent article Ian, I really like the inclusion of older CPUs. It's a good basis on which to decide if it's "time" to upgrade on that front. Most of the people I know are not on the bleeding edge of technology. Many sit back in 2009 with minor updates to video and hard drives. Anyway... well done, lots to sift through.
Jackie60 - Thursday, October 3, 2013 - link
At last AnandTech is doing some meaningful second-decade-of-the-21st-century testing. Well done and keep it up ffs!
SolMiester - Thursday, October 3, 2013 - link
Can someone please tell me why we are using 2+ year old GPUs?
A5 - Thursday, October 3, 2013 - link
You could read the article.
OrphanageExplosion - Thursday, October 3, 2013 - link
Amazing data. I do wonder whether testing at max settings is a good idea, though. The variation in performance can be extreme. Just watch the Metro 2033 benchmark play out. Does that look like the kind of experience you'd want to play? Perhaps more importantly though, the arrival of the next-gen consoles changes everything.
Did you see the news that Watch Dogs is x64 only? That's just the tip of the iceberg. Developers need to go wide to make the most out of six available Jaguar cores. Jobs-based scheduling over up to eight cores will become the norm rather than the exception. The gap between i5 vs. i7 will widen. AMD FX will suddenly become a lot more interesting.
In short order, I'd expect to see dual-core CPUs and less capable quads start to fall behind very quickly. i5 vs. i7 will see a much larger gulf in performance.
Check out the CPU data here for the Battlefield 4 beta:
http://gamegpu.ru/action-/-fps-/-tps/battlefield-4...
The dual cores are being maxed out, and the FX-8350 is up there with the 3930K (!)
tackle70 - Thursday, October 3, 2013 - link
The 8350 is with the 2600K, not the 3930K... So yeah, it's a very good showing for AMD, but not as good as what you indicate. Also, according to SweClockers, an overclocked i5 is still superior to an overclocked 83xx CPU, so make of that what you wish.
I'm just glad we're seeing games starting to use more than 2-4 threads effectively.
Traciatim - Thursday, October 3, 2013 - link
Much more likely is that games will just become less and less reliant on CPU power because of the terrible netbook processors in the consoles, and will instead rely more and more on the GPU. The PC versions of games will just be the same game with a high-res texture pack and some extra graphics bling to use up GPU cycles while your processor sits around shuffling a little data.
Flunk - Friday, October 4, 2013 - link
I'm not sure AMD will benefit that much. As soon as consumer CPUs have a reason to have more cores, they'll just release a new chip with more cores. There is absolutely no reason they can't release an 8- or even 12-core desktop processor; they're already selling them for servers.
Flunk - Friday, October 4, 2013 - link
Forgot to mention, Watch Dogs is probably x64-only because they want to use more than 2GB of RAM (which is the limit for the user-mode memory partition in Win32).
Nirvanaosc - Thursday, October 3, 2013 - link
Looking just at the gaming results, does this mean that almost any CPU is capable of feeding the GPU at 1440p, and that it is always GPU-limited?
Nirvanaosc - Thursday, October 3, 2013 - link
I mean in a single GPU config.
Traciatim - Thursday, October 3, 2013 - link
That's pretty much just the games they picked. If you could reliably benchmark large-scale PC games like Planetside 2, or other popular large-scale MMOs, you'd pretty much see the exact opposite. The trouble is, it seems like no MMO makers give you reliable benchmarking tools, so you can't use them for tests like these.
ryccoh - Thursday, October 3, 2013 - link
I would really like to see a CPU comparison for strategy games. For example, one could have a save game of a far-advanced game in Civilization 5 or Total War with many AI players on the largest map, and then see how the waiting time varies between the different CPUs. This should be feasible, shouldn't it?
I'm running an i5-2500K @ 4.6GHz and it just isn't cutting it for Civilization 5 on a large map once you're far into the game; it would be nice to see whether getting hyperthreading and more cores would be worth it.
glugglug - Thursday, October 3, 2013 - link
Having waited through the ridiculous amounts of time between turns on Civ V, and having dual monitors, I put task manager up on the second monitor while it was running, to see that Civ V *IS NOT MULTITHREADED. AT ALL*. Setting the CPU affinity to make it use only 1 logical core makes absolutely no performance difference at all! The only thing I can think of for why a better result would be seen on quad-core systems would be that it likes having a larger L3 cache.
glugglug - Thursday, October 3, 2013 - link
P.S. If my "Civ V just likes cache" theory is right, an Iris Pro laptop should be the ultimate Civ V machine.
konondrum - Thursday, October 3, 2013 - link
Thank you for doing this, it's quite informative. I just have one suggestion: perhaps you could get a Lynnfield CPU into these benchmarks. I've been happily using my i5-750 for about 4 years now, but I'm unsure if its performance would be closer to an i7-920 or a Q9400. I'm thinking it may be getting close to time to upgrade, but I've still never come across a game or app that seems to choke it.
teiglin - Saturday, October 5, 2013 - link
+1 to Lynnfield. My i5-750 is still running great at a gentle ~3.5GHz, and I haven't really felt the burn in games, but I also am not running multi-GPU. Still, I'd love to see how well it stacks up to the competition.
cbrownx88 - Thursday, October 3, 2013 - link
BF3/4 pleeeease. The BF4 beta seems very CPU-bound on an i7-920 at 4.2GHz at 1920x1200... a very different story than BF3 (where I currently didn't feel the need to update).
pandemonium - Friday, October 4, 2013 - link
I love articles like this! Excellent stuff, thanks! I'm still confused why you guys don't have an i5-3570K in your line-up. Of all of the processors, that's probably the most crucial to have, given its performance for the price and popularity for builds. These tests give me little to go on without that processor, as important as it is for the general builder!
pandemonium - Friday, October 4, 2013 - link
Also, CPU benchmarks for gaming aren't as necessary with single-player games. A necessary contrast for CPU comparison will be MMOs or multiplayer FPS runs. Obviously it's more difficult to get accurate baseline results for such instances, but a large number of runs should at least minimize any variables between each testing instance and give a broader picture of how well each processor will perform. If you guys could get on the latest MMOs and test out these rigs, that'd be where I see charts for CPU comparison really come into play for gaming.
Nirvanaosc - Friday, October 4, 2013 - link
Final Fantasy XIV has a benchmark, but I don't know if it fits their needs. Maybe worth checking, although I'm not sure it represents real-world gameplay.
BOMBOVA - Friday, October 4, 2013 - link
Guys and gals: I have started using video editing software (Corel) and photo software (ACD17), and need the best out of my i7 960, so I have spent considerable time fitting a new Syba x4 SATA 3 controller (50 bucks) and a better CPU cooler (double fan, big aluminum/copper beast). Outcome: it works. The SSDs are working like they should in RAID 0, with a 2.5" HDD backup using hot swap. I went back to the literature / "stories" on overclocking and settled on a 4.2 overclock. Letting you all know it works really, really well. I am almost ashamed of myself, "being techie", that it took me so long to get practical. I am now resigned to waiting for DDR4 and PCIe 3.x for future considerations. Good article, thanks.
BOMBOVA - Saturday, October 26, 2013 - link
Update: I really parked the beast 960 CPU at 3.8GHz; 4.2 is too HOT, the fan is NOISY, and it is unstable, and I am wondering if my hard drive crashes are malware or just instability. NONE of that at 3.8. I will adjust my PCIe rate up from 133MHz, say towards 137 or even 140 if stable, on all add-in cards; it only works up to your least stable card. I have settled on a Marvell HDD controller card: cheap, cost effective. Bottom line, I like this article, and after 3 days of work I am on to doing work with my computer. Thanks all. Cheers, Thomas Gx, yvr.ca, Vancouver, Canada
Ranger101 - Friday, October 4, 2013 - link
Every time an article of this sort is written, the conclusion is the same: in the vast majority of cases, due to GPU bottlenecks, the differences between CPUs are so minute that no-one would ever notice the performance difference in game. Yawn.
Ranger101 - Friday, October 4, 2013 - link
Every time an article of this sort is written, the conclusion is the same. In the vast majority of cases, due to GPU bottlenecks, the performance differences between CPUs are so minimal that no-one would notice the difference in game. Yawn.
Ranger101 - Friday, October 4, 2013 - link
Every time an article of this sort is written, the conclusion is the same. In the vast majority of cases, due to GPU bottlenecks, the performance differences between CPUs are so minimal that no-one would notice the difference in game. Yawn.This is the 3rd time I am posting this comment as it seems to be continually removed. Yet it is a legitimate and non offensive comment. What happened to freedom of expression at Anandtech?
Flunk - Friday, October 4, 2013 - link
I'm seeing all three of your posts.
dingetje - Friday, October 4, 2013 - link
Would be nice to see how the Haswell Pentiums (like the G3420) do as low-budget, low-power gaming CPUs. Too bad none of the review sites have deemed them worthy of a review so far.
geok1ng - Friday, October 4, 2013 - link
"Of course we would suggest Haswell over Ivy Bridge based on Haswell being that newer platform." If only Haswell OCs were equal to IB OCs. With Haswell you are STUCK at 4.2-4.6GHz, depending on your luck, and going water won't help. With IB, 4.4-5.0GHz is usual, and the more money you invest in cooling, the better your OC will be. This luck of the draw with Haswell, and the walls in OCing on Z87, should be considered, especially for triple and quad GPU builds aiming at 4K gaming, where a bad overclock is the doom of the entire system.
coachingjoy - Friday, October 4, 2013 - link
Good job, I like your work.
meliketrolls - Friday, October 4, 2013 - link
Of course AMD CPUs will have better scores. It's just that... AMD is WAAAAY better than Intel.
R-Type - Friday, October 4, 2013 - link
I have a Dell XPS 420 with a Q6600. With the 8800GT (512 MB) card I was getting about 40 fps with medium settings. When I upgraded to a GTX 670, I got about 60 fps with high settings, a very noticeable improvement. In my experience, a quad-core Q6600 is still a pretty competent gamer with a strong graphics card on all but the most extreme games.
R3dox - Friday, October 4, 2013 - link
I'm one of those with a D0 i7-920 and it's been running at 3.8GHz (19 x 200 BCLK with 'only' 1600C9 memory, 12GB) for over 4 years. I suppose I'll just have to wait for a nice native PCIe SSD to avoid the old SATA controller and I'm golden for a good while more. It's just my HD6970 that could use replacement at some point (1920x1200 res, nothing crazy).
BOMBOVA - Friday, October 4, 2013 - link
For my X58, I put in one of these "SYBA SI-PEX40057 PCI-Express 2.0" cards for fifty bucks; it makes the newer SSDs rock. I am still happy with my platform, and my video work and JPG work on flash are twice as good. "We love our i7-920s." Cheers, good thread this, all power users, good fun.
R3dox - Sunday, October 6, 2013 - link
When I upgraded to my current Intel 520 (due to it being 3x bigger than my previous SSD), I looked into such cards, but they were pretty bad and, except for sequential 128KB reads, slower than the Intel SATA controller. I see this is a new version of the Marvell controller, but is it actually comparable to an Intel SATA 3 controller this time?
BOMBOVA - Sunday, October 6, 2013 - link
A low-cost RAID controller, yes. 64KB and 128KB blocks show the merit of RAID 0 at 6Gb/s each. I was doubtful myself, but tested the device, for I need better video editing performance, and at least it works :) Now we have to watch out for the 12Gb/s devices coming soon. IMO for games there's not much improvement to be seen, but for big data transfers, SATA 3 improvements can be had for low cost. Good luck trying it out; borrow a card to try if you can. Cheers.
BOMBOVA - Saturday, October 26, 2013 - link
Fact is you can set it to 32K or 64K blocks 'only', but it is a true Marvell controller chip in the Syba, and it is on the PCIe bus. Ctrl+M sets up the chipset; it works right off and is quick, but there is a hint that the lanes are only 5Gbit/s. Still, it is a fine patch upgrade for low-cost 6Gbit/s SSDs; I am in for 2x 120GB SSDs and the controller for 250.
R-Type - Friday, October 4, 2013 - link
Results are typical for a variety of games where the resolution is set to 1920 x 1200. Games include Dirt 3, Civilization V, Guild Wars 2, Mechwarrior Living Legends, Diablo 3, Starcraft 2, etc.
augiem - Friday, October 4, 2013 - link
I'm really kind of shocked to see how well Nehalem stands up in many benchmarks. If you adjust the i7-920 results for the difference in frequency between it and the 4770K, it's not half bad. I used the difference between the i7-920 and i7-950 to determine how each benchmark scaled on Nehalem. If it was close enough to linear (±1%), I considered it. I saw a 6% - 40% performance advantage for Haswell across the CPU tests, which is actually smaller than I expected for an almost 6-year-old chip. (Obviously this includes differences in the platforms too.) Striking that even in 6 years the speed hasn't even doubled. I'm still on an i7-920 @ 3.6, so this was very relevant to me. If it were 40% across the board, it might be more compelling, but quite a few results were more like 15%, 20%, etc. Now, I understand Haswell is going to OC a lot further than this one, so in that way you could get the performance difference up there.
I'm just dumbfounded that this Nehalem has lasted me 4.5 years already and it still doesn't feel slow. On the one hand it's great value for the money, but on the other hand it's a little disappointing to see the performance curve flatten out like it has over the past 6 years.
That would be a fun project. Make a graph showing average CPU performance increases over the last 30 years.
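[Editor's note: the frequency adjustment described above boils down to a one-line calculation. A small sketch follows, with purely illustrative scores and clocks rather than the article's actual numbers.]

# Per-clock comparison of two chips from benchmark scores and frequencies.
# The scores and clock speeds below are placeholders for illustration only.
def normalized_speedup(old_score, old_ghz, new_score, new_ghz):
    """Per-clock (IPC-ish) speedup of the new chip over the old one."""
    return (new_score / new_ghz) / (old_score / old_ghz)

# e.g. an i7-920 at 2.66 GHz scoring 100 vs a 4770K at 3.5 GHz scoring 180
print(f"{normalized_speedup(100, 2.66, 180, 3.5):.2f}x per-clock improvement")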
Genericuser1234 - Saturday, October 5, 2013 - link
As interesting as these chips are for getting maximum performance from the high-wattage parts, will you do an article about the low-TDP parts that are the true masterpieces Intel makes? I live in Denmark and energy costs more and more. My PC is running almost nonstop and I am curious how well these chips perform in a gaming environment. How far behind are they in performance, and what kind of power cost per year, based on an average workload/idle time, could you save? I find the low-power chips to be Intel's true stars. Do more with less. Maybe even throw in a powerhouse chip from 2 years ago for comparison. That would be an interesting article.
agent_x007 - Saturday, October 5, 2013 - link
Congrats to those who did testing part.Can't wait to see AMD added.
Too bad the Pentium XE 955/965 (i.e. Presler B1/C1 @ 3.46/3.73GHz) didn't "cut it" for this comparison :(
Hyper-Threading and the "last of NetBurst" legacy could be interesting in comparison with low-end, fully-integrated setups like VIA Nano or AMD Fusion.
+ There is also a possibility that today's multithreaded programs would better utilize the 4 threads of this kind of CPU, maybe to the point of matching a Core 2 Duo...
Either way, to sum it up in two words: GREAT WORK.
khanov - Saturday, October 5, 2013 - link
A good article, and nice to see an update now that new CPUs are out. Wouldn't it be nice if you could have all the benefits of X79 for multi-GPU configurations, but without the added cost over Z87? Well, actually you can, if you take into account the quad-core LGA2011 CPUs.
The i7-4820K is no more expensive than the i7-4770K, and motherboard costs are very similar too. So people seriously considering 3 or 4 GPUs might be very interested in this option, to gain the benefits of extra PCIe lane allocation without the extra cost of a hex core CPU.
Ian, would you please consider adding i7-3820 and/or i7-4820K to the next update? It would be nice to see how well, or how badly, they fare against the competition.
MarcHFR - Sunday, October 6, 2013 - link
Hi all, sorry but I don't understand this review. What's the point of recommending different CPUs solely on the basis of single/dual/tri/quad GPU?
First, GPU power is not related only to the number of GPUs: with 2x 660 you get lower performance than with 1x 780, but if I read the conclusion correctly, for 2x 660 you recommend the FX-8350 but the A8-5600K for 1x 780?
Second, for example, with only a 7970 you get exactly the same performance in Sleeping Dogs at 2560x1440 max settings whether you have a small CPU or a big CPU. But what kind of player will keep a setting that offers 28 fps on such a card? None! They will lower the GPU-related graphics settings to the point where they get a higher framerate, like the 80 fps you get with three cards.
Whatever the number or power of the GPUs, as long as it's not a low-end card, the CPU needed to get a playable framerate is the same with a GTX 660 or 2x GTX 780, provided you don't use GPU-related graphics settings that push the framerate the GPU can sustain below the framerate the CPU can sustain.
You can recommend different CPUs to get more than 40/60/80/120 fps in some games (but good luck, since integrated benchmarks generally don't use the most CPU-bound scenes), but recommending different CPUs for single/dual/tri/quad GPU seems like nonsense to me.
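[Editor's note: the argument above reduces to a simple min() model of framerate. A tiny sketch with made-up numbers, purely for illustration:]

# Observed framerate is roughly the minimum of what the CPU can feed and
# what the GPU can render at the chosen settings. Numbers are illustrative.
def observed_fps(cpu_fps_ceiling, gpu_fps_ceiling):
    return min(cpu_fps_ceiling, gpu_fps_ceiling)

# At max settings the GPU ceiling dominates, so every CPU looks the same:
print(observed_fps(cpu_fps_ceiling=90, gpu_fps_ceiling=28))   # -> 28
# Lower the settings and the CPU ceiling is what separates the chips:
print(observed_fps(cpu_fps_ceiling=90, gpu_fps_ceiling=140))  # -> 90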
Majesticii - Sunday, October 6, 2013 - link
Damn. How can you call this a CPU comparison with data like this? The games are run at such extreme settings that they in no way represent the impact of the CPU. Sleeping Dogs is just 4 graphs at 28fps; how can any respected researcher show this data without severe shame? To add insult to injury, the vast majority seem to think this is how CPU tests are done and call it a nice review. Literally my heart sank as I read through these comments. No one (except a few who were ignored), not even the reviewers, has a clue what they're on about. This way of CPU-reviewing in games needs to stop. This isn't just uninformative, it's worse: it's completely misleading. Test games at 800x600 low settings, and pay no mind to those people calling for "real-world benchmarks". Stay true to what's real, instead of appealing to the community.
3Ball - Monday, October 7, 2013 - link
Forgive me if this is pointed out in the article and I have missed it, but it is worth pointing out: Battlefield 4 will use up to 8 cores/threads. My i7-860 @ 4.0GHz with Hyper-Threading is outperforming a friend's Ivy Bridge (3570K) at 4.4GHz without Hyper-Threading, so much so that my frames are better using a GTX 680 against his GTX 780. This could be a product of the "beta", but I do believe it is a sign of things to come. The new consoles are most likely going to influence multithreaded performance greatly, considering the lower single-thread performance present in those systems.
I had been planning on rebuilding with Haswell early next year with a 4670K, but have now decided to go with a 4770K due to this experience. Just my two cents. Cheers!
pandemonium - Tuesday, October 8, 2013 - link
It doesn't, because it doesn't exactly capture the dynamics of displaying several player models at once; it does a decent job at displaying several preprogrammed models at once. The FF benchmarks have been a fairly low estimation of actual game performance when it comes to more demanding instances like raids and large crowds. With that said, they do better than most other canned benchmarks for determining the performance of a machine. Given its consistent testing environment, I guess it wouldn't hurt to use it as a go-to benchmark.
Tormeh - Tuesday, October 8, 2013 - link
But where is the Civilization V end-of-turn benchmark? I don't care about the frame rates, I care about the time I'm staring at the screen waiting for the game to finish its calculations!
defiler99 - Thursday, October 10, 2013 - link
I don't normally comment (as the reviews are generally excellent), but I was actually shocked to see the choice of graphics card(s) for this roundup. Nobody buying a gaming CPU is going to have stuff that slow, right? So many of the tests result in framerates under 60fps, etc.
DPOverLord - Thursday, October 10, 2013 - link
It'd be great to see this with the new 4930K and a Titan @ 1600p.
dennphill - Thursday, October 10, 2013 - link
Learn to write in the English language - or at least use the grammar checker. I wince reading this article. (But thanks for the effort. Content is OK.)
Hrel - Monday, October 14, 2013 - link
I'm finally in the process of building a new desktop, mini-ITX. Gonna use a 4570S CPU. Primary duties will be media streaming, but I'll game on it too. The computer it's replacing? A 650i SLI chipset-based computer running an E8400 Core 2 Duo. I can still max out Mass Effect games with no issue. Minecraft maxes out the CPU, but that's just because Java sucks. So that old 2007-era computer is still a viable gaming machine with the GTX 460 in it. Talk of needing to replace a Nehalem CPU soon seems kind of absurd to me. But then again, I have no interest in Far Cry or Crysis.
markthema3 - Tuesday, October 15, 2013 - link
What about The Witcher 2 for a benchmark? I have yet to see anything more intense than that game's Ubersampling option.
SeriousTodd - Tuesday, October 15, 2013 - link
What are the disadvantages of buying a 4770K?
Enterprise24 - Saturday, October 19, 2013 - link
Wanna see Total War: Rome II in real-time tactical mode (probably the most CPU-intensive game).
boozzer - Thursday, October 24, 2013 - link
Well damn. It seems like if I am single-GPU gaming at 1080p, the CPU doesn't matter much at all? A 5800K would do the job well enough. Question: will an AMD 5800K bottleneck a GTX 780? Or a 290X? At 1080p, or does it not matter at all since the resolution is so low? I am sure I am staying at 1080p for at least 5 more years, and my current PC parts are really old (C2D E8500 + 460 1GB), so I'm thinking of upgrading. I am sure a 780 or 290X would last 5 or more years, so I kinda want a matching CPU.
BOMBOVA - Saturday, October 26, 2013 - link
This article prompted me to pull my Asus P6T motherboard out and replace it with an MSI 7666, which holds the 1366 i7 chip. I put in a 960 over my 920 and clocked it up to 3.8 so far, and with my Nvidia 470 GPU and RAID 0 2x 120GB SSDs, things are rocking along really well, it seems, compared to the high-end stuff presented here. I had to install a cheap Syba controller card on the Marvell chipset, set to 32K over 64K blocks (better data storage efficiency over speed), and a 4-channel USB 3.0 card, and it is good to go. Since I bought good, near-new, used parts, I am in it for, say, half price, and it works for me. I had a tough go with Microsoft critical patch updates, flooded on Oct. 8th and 15th, so my system restore points crashed; I am now set on NO automatic downloads and all is good. This is a 3-day experience I don't want to go through again. Enjoyed the article and comments - good comments, thanks guys and girls - and I am looking for the DDR4 stuff and the Haswell super processor of '14 or early '15. Now on to doing work with my computer :) Cheers all, have fun with the candies next, and have a good Xmas, buy yourselves something nice. Let's keep America working. rtg, Vancouver, Canada
WHISP - Tuesday, November 5, 2013 - link
Your review talks about recommendations based on # of GPUs but seems to make the assumption GPU = graphics card. I have a GTX 690 and am looking to possibly upgrade my CPU/mobo; what would your recommendation be, keeping in mind that in the future I may buy another GTX 690 to boost performance? What CPU and PLX or non-PLX combo do I need to satisfy two 690s in SLI?
Gastec - Tuesday, November 5, 2013 - link
You don't need PLX with dual SLI, you don't even need a second GTX 690 :P I myself would never ever consider spending so much money on a video card, but I guess you play on multiple 30'' monitors at the maximum available resolution, each with its own card. If you insist, then get the Intel i7-4960X, a socket 2011 X79 Asus motherboard with a PLX chip on it, and 3 Nvidia GTX Titans. That would surely give you at least 150 FPS in any game, except those that are specifically designed not to give more than what the designers want, like Crysis.
Gastec - Tuesday, November 5, 2013 - link
I've been on an i7-860 since 2010, and HT was also a deciding factor in buying it. But over the years I don't think HT has helped me that much with what I did and do on the PC. So now, after reading this article - which is very helpful - I think an i5-4670(K), with its $100 lower price, will suffice. Unless... upcoming games like The Elder Scrolls Online (which I want to play) make use of HT, but I don't think so. Does anyone know of any game that makes good use of Hyper-Threading, or at least 4 cores?
BlackOmega - Friday, November 8, 2013 - link
Goddammit. 6 years I spent without upgrading my rig; now I come back to AnandTech and I can't understand a single one of those benchmarks. Hell, WHERE ARE THE CPU CLOCK SPEEDS? How the hell do Intel and AMD expect me to understand this gibberish they use to name their processors? I want to compare IPC on every bench I see, I want to see in every test how the GHz of one CPU compares to another. I'm not going to read those benchmarks with a CPU dictionary, trying to interpret every name on this list, nor do I have a good enough memory to remember which CPU has more cache or clock speed than the other, as described on the first page.
6 years I stayed away from the hardware scene; now I come back and I can't understand anything.
/frustrated
BlackOmega - Friday, November 8, 2013 - link
PS: the AMD and Intel naming schemes suck; give us back clock speeds.
The 2500K: best value gaming processor of all time :)