70 Comments
anirudhs - Monday, April 23, 2012 - link
I can barely notice the difference between 720P and 1080I on my 32" LCD. Will people notice the difference between 1080P and 4K on a 61" screen?
It seems we have crossed the point where improvements in HD video playback on Sandy Bridge and post-Sandy Bridge machines are discernible to normal people with normal screens.
I spoke to a high-end audiophile/videophile dealer, and he tells me that the state of video technology (Blu-ray) is pretty stable. In fact, it is more stable than it has ever been in the past 40 years. I don't think "improvements" like 4K are going to be noticed by anyone other than consumers in the top 1%. This seems like a first-world problem to me - how to cope with the arrival of 4K?
digitalrefuse - Monday, April 23, 2012 - link
... Anything being discussed on a Web site like Anandtech is going to be "a first-world problem"...
That being said, there's not much of a difference between 720 lines of non-interlaced picture and 1080 lines of interlaced picture... If anything a 720P picture tends to be a little better looking than 1080I.
The transition to 4K can't come soon enough. I'm less concerned with video playback and more concerned with desktop real estate - I'd love to have one monitor with more resolution than two 1080P monitors in tandem.
ganeshts - Monday, April 23, 2012 - link
OK, one of my favourite topics :)
Why does an iOS device's Retina Display work in the minds of consumers? What prevents one from wishing for a Retina Display in the TV or computer monitor? The latter is what will drive 4K adoption.
The reason 4K will definitely get a warmer welcome compared to 3D is the fact that there are no ill-effects (eye strain / headaches) in 4K compared to 3D.
Exodite - Monday, April 23, 2012 - link
We can certainly hope, though with 1080p having been the de-facto high-end standard for desktops for almost a decade I'm not holding my breath.
Until there's an affordable alternative for improving vertical resolution on the desktop I'll stick to my two 1280*1024 displays.
Don't get me wrong, I'd love to see the improvements in resolution made in mobile displays spill over into the desktop but I'd not be surprised if the most affordable way of getting a 2048*1536 display on the desktop ends up being a gutted Wi-Fi iPad blu-tacked to your current desktop display.
aliasfox - Monday, April 23, 2012 - link
It would be IPS, too!:-P
Exodite - Monday, April 23, 2012 - link
Personally I couldn't care less about IPS, though I acknowledge some do.
Any trade-off in latency or ghosting just isn't worth it, as accurate color reproduction and better viewing angles just don't matter to me.
ZekkPacus - Monday, April 23, 2012 - link
Higher latency and ghosting that maybe one in fifty thousand users will notice, if that. This issue has been blown out of all proportion by the measurable stats at all costs brigade - MY SCREEN HAS 2MS SO IT MUST BE BETTER. The average human eye cannot detect any kind of ghosting/input lag in anything under a 10-14ms refresh window. Only the most seasoned pro gamers would notice, and only if you sat the monitors side by side.
A slight loss in meaningless statistics is worth it if you get better, more vibrant looking pictures and something where you CAN actually see the difference.
SlyNine - Tuesday, April 24, 2012 - link
I take it you've done hundreds of hours of research and documented your studies and methodology so we can look at the results.
What if Anand did video card reviews the same way you're spouting out these "facts"? They would be worthless conjecture, just like your information.
Drop the "but it's a really small number" argument. Until you really document what the human eye/brain is capable of, all you're saying is that it's a really small number.
Well, a THz is a really small number too, and the human body can pick up things at 700 terahertz - it's called the EYE!
Exodite - Tuesday, April 24, 2012 - link
Look, you're of a different opinion - that's fine.
I, however, don't want IPS.
Because I can't appreciate the "vibrant" colors, nor the better accuracy or bigger viewing angles.
Indeed, my preferred display has a slightly cold hue and I always turn saturation and brightness way down because it makes the display more restful for my eyes.
I work with text and when I don't do that I play games.
I'd much rather have a 120Hz display with even lower latency than I'd take any improvement in areas that I don't care about and won't even notice.
Also, if you're going to make outlandish claims about how many people can or cannot notice this or that you should probably back it up.
Samus - Tuesday, April 24, 2012 - link
Exodite, you act like IPS has awful latency or something.
If we were talking about PVA, I wouldn't be responding to an otherwise reasonable argument, but we're not. The latency between IPS and TN is virtually identical, especially to the human eye and mind. High frame rate (1/1000) cameras are required to even measure the difference between IPS and TN.
Yes, TN is 'superior' with its 2ms latency, but IPS is superior with its <6ms latency, 97.4% Adobe RGB accuracy, 180 degree bi-plane viewing angles, and lower power consumption/heat output (either in LED or cold cathode configurations) due to less grid processing.
This argument is closed. Anybody who says they can tell a difference between 2ms and sub-6ms displays is being a whiny bitch.
Exodite - Tuesday, April 24, 2012 - link
Anyone that says they can tell any difference between a 65% and 95% color gamut is a whiny bitch.
See, I can play that game too!
Even if I were to buy your "factual" argument, and I don't, I've clearly stated that I care nothing about the things you consider advantages.
I sit facing the center of my display, brightness and gamma is turned down to minimum levels and saturation is low. Measured power draw at the socket is 9W.
It's a 2MS TN panel, obviously.
All I want is more vertical space at a reasonable price, though a 120Hz display would be nice as well.
My friend is running a 5ms 1080p eIPS display and between that and what I have I'd still pick my current display.
End of the day it's personal preference, which I made abundantly clear in my first post.
Though it seems displays, and IPS panels in general, are starting to attract the same amount of douchiness as the audiophile community.
Old_Fogie_Late_Bloomer - Tuesday, April 24, 2012 - link
Oh, I know I shouldn't--REALLY shouldn't--get involved in this. But you would have to be monochromatically colorblind in order to not see the difference between 65% and 95% color gamut.
I'm not saying that the 95% gamut is better for everyone; in fact, unless the 95% monitor has a decent sRGB setting, the 65% monitor is probably better for most people. But to suggest that you have to be a hyper-sensitive "whiny b---h" to tell the difference between the two is to take an indefensible position.
Exodite - Tuesday, April 24, 2012 - link
Yeah, you shouldn't have gotten into this.
Point being that whatever the difference is I bet you the same can be said about latency.
Besides, as I've said from the start it's about the things that you personally appreciate.
My preferred settings absolutely destroy any kind of color fidelity anyway, and that doesn't even slightly matter as I don't work with professional imagery.
But I can most definitely appreciate the difference between TN and even eIPS when it comes to gaming. And I consider the former superior.
I don't /mind/ higher color fidelity or better viewing angles, I'm just sure as hell not going to pay any extra for it.
Old_Fogie_Late_Bloomer - Wednesday, April 25, 2012 - link
I agree completely that, as you say, "it's about the things you personally appreciate." If you have color settings you like that work on a TN monitor that you can stand to deal with for long periods of time without eye strain, I would never tell you that you should not use them because they don't conform to some arbitrary standard. Everybody's eyes and brain wiring are different, and there are plenty of reasons why people use computers that don't involve color accuracy.
But as it happens, you picked a poor counterexample, because I defy you to put a Dell U2412M (~68% of aRGB) next to a U2410 set to aRGB mode (somewhere close to 100% of aRGB) and tell me you can't see a difference.
For that matter, I challenge you to find me someone who literally can't see the difference between the two in terms of color reproduction. That person will have something seriously wrong with their color vision.
Exodite - Wednesday, April 25, 2012 - link
To be fair, the counterexample wasn't about being correct, because the poster I replied to wasn't, but rather about showing what an asshat argument he was making.
That said, it's about the frame of reference.
Would you be able to tell the difference working with RAW images pulled from your DSLR or other high-quality imagery?
Sure, side-by-side I have no doubt you would.
Would you be able to tell the difference when viewing the desktop, a simple web form or an editor where the only colors are black, white, two shades of blue and grey?
Especially once both displays are calibrated to the point I'm comfortable with them. (Cold hue, 0% brightness, low saturation, negative gamma, high contrast.)
I dare say not.
DarkUltra - Monday, April 30, 2012 - link
I'd like to see a "blind" test on this. Is there a perceived difference between 6 and 2ms? Blind as in the test subjects (nyahahaa) do not know which ms they are looking at.
Test with both a 60 and 120Hz display. I would guess the moving object, an explorer window, for instance, would simply be easier to look at and look less blurred as it moves over the screen. People used to fast-paced gaming on CRT monitors or "3D ready" 120Hz monitors would see more of a difference.
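For a rough sense of scale in the 2ms-versus-6ms debate, here is a small back-of-the-envelope comparison (my own sketch, not from any of the posters above) of the quoted pixel response times against the refresh window they have to fit inside:

# Rough comparison of quoted pixel response times against the time one frame
# stays on screen at common refresh rates. Purely illustrative arithmetic.
for refresh_hz in (60, 120):
    frame_ms = 1000 / refresh_hz
    for response_ms in (2, 6):
        share = response_ms / frame_ms
        print(f"{refresh_hz}Hz (frame = {frame_ms:.1f}ms): "
              f"{response_ms}ms response = {share:.0%} of the frame")
# 60Hz: 2ms is ~12% of a 16.7ms frame, 6ms is ~36%
# 120Hz: 2ms is ~24% of an 8.3ms frame, 6ms is ~72%

The point of the arithmetic is only that any difference should be easier to spot at 120Hz, which is consistent with the suggestion to test both refresh rates.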
Origin32 - Saturday, April 28, 2012 - link
I really don't see any need for improvement in video resolution just yet. I myself have nearly perfect eyesight and can be extremely annoyed by artifacts, blocky compression, etc, but I find 720p to be detailed enough even for action movies which rely solely on the special effects. In most movies 1080p appears too sharp to me, add to that the fact that most movies are already oversharpened and post-processed and the increased bitrate (and therefore filesize) of 1080p and I see more downside than upside to it.
This all goes double for 4K video.
That being said, I do still want 4K badly for gaming, viewing pictures, reading text, there's tons of things it'll be useful for.
But not for film, not for me.
Old_Fogie_Late_Bloomer - Monday, April 23, 2012 - link
Another advantage of a 4K screen (one that has at least 2160 vertical resolution) is that you could have alternating-line passive 3D at full 1080p resolution for each eye. I'm not an expert on how this all works, but it seems to me that the circular polarization layer is a sort of afterthought for the LCD manufacturing process, which is why vertical viewing angles are narrow (there's a gap between the pixels and the 3D polarizing layer).
In my opinion, it would be pretty awesome if that layer were integrated into the panel in such a way that vertical viewing angles weren't an issue, and so that any monitor is basically a 3D monitor (especially high-quality IPS displays). But I don't really know how practical that is.
peterfares - Thursday, September 27, 2012 - link
A 2560x1600 monitor (available for years) has 1.975 times as many pixels as a 1920x1080 screen.
4K would be even better, though!
nathanddrews - Monday, April 23, 2012 - link
4K is a very big deal for a couple reasons: pixel density and film transparency.
From the perspective of pixel density, I happily point to the ASUS Transformer 1080p, iPad 3, and any 2560x 27" or 30" monitor. Once you go dense, you never go... back... Anyway, as great as 1080p is, as great as Blu-ray is, it could be so much better! I project 1080p at about 120" in my dedicated home theater - it looks great - but I will upgrade to 4K without hesitation.
Which leads me to the concept of film transparency. While many modern movies are natively being shot in 4K using RED or similar digital cameras, the majority are still on good ol' 35mm film. 4K is considered by most professionals and enthusiasts to be the baseline for an excellent transfer of a 35mm source to the digital space - some argue 6K-8K is ideal. Factor in 65mm, 70mm, and IMAX and you want to scan your original negative in at least 8K to capture all the fine detail (as far as I know, no one is professionally scanning above 8K yet).
Of course recording on RED4K or scanning 35mm at 4K or 8K is a pointless venture if video filtering like noise reduction or edge enhancement are applied during the mastering or encoding process. Like smearing poop on a diamond.
You can't bring up "normal" people when discussing the bleeding edge. The argument is moot. Those folks don't jump on board for any new technology until it hits the Walmart Black Friday ad.
MGSsancho - Monday, April 23, 2012 - link
While I agree with most everything, there is something I would like to nitpick: when making a digital copy of old film, whatever format you use, more often than not a lot of touching up needs to be done. The Wizard of Oz and all the 007 films are examples. (I am ignoring the remastering of Star Wars and Lucas deciding to add in 'features' instead of giving us a cleaned-up remaster sans bonuses.) Still, when you're spending millions on a remaster, I expect you at least not to muddy the entire thing up.
However, I feel we need to bring in higher bitrates first. I will not apologize for this: yes, encoders are great, but a 4Mbps 1080p stream still is not as nice as a 20-60Mbps VBR Blu-ray film. My feeling is that a craptastic 4K or even 2K bitrate will ruin the experience for the uninformed. Also note I am ignoring an entirely different debate about whether the current infrastructure can handle true HD streaming to every household, at least in the US.
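To put those bitrates into perspective, here is a quick size calculation (my own sketch; the 2-hour runtime and the specific rates are illustrative examples, not taken from any particular disc or service):

# Approximate file sizes for a 2-hour film at the average bitrates mentioned above.
def size_gb(bitrate_mbps, hours=2.0):
    seconds = hours * 3600
    return bitrate_mbps * 1_000_000 * seconds / 8 / 1_000_000_000  # bits -> GB

for mbps in (4, 20, 40, 60):
    print(f"{mbps:2d} Mbps over 2 hours ~= {size_gb(mbps):4.1f} GB")
# 4 Mbps ~= 3.6 GB (streaming territory), 20-60 Mbps ~= 18-54 GB (Blu-ray territory)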
nathanddrews - Monday, April 23, 2012 - link
Higher bit rates will be inherent with 4K or 2K over 1080p, but bit rates aren't the be-all end-all. 4K will likely use HEVC (H.265), which offers double the compression with better quality than H.264.
Fixing scratches, tears, or other issues with film elements should never be a reason for mass application of filtering.
SlyNine - Tuesday, April 24, 2012 - link
H.264 doesn't even offer 2x the compression over MPEG-2. I doubt H.265 offers 2x over 264.
"This means that the HEVC codec can achieve the same quality as H.264 with a bitrate saving of around 39-44%."
Source http://www.vcodex.com/h265.html
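The quoted 39-44% saving can be translated into a compression ratio to see how far it falls short of a true 2x ("double the compression" would be a 50% saving); the 8 Mbps baseline below is just an assumed example, not a figure from the linked source:

# Convert a bitrate saving into an equivalent compression ratio vs. H.264.
h264_bitrate_mbps = 8.0  # hypothetical 1080p H.264 encode, for illustration only

for saving in (0.39, 0.44, 0.50):
    hevc_bitrate = h264_bitrate_mbps * (1 - saving)
    ratio = h264_bitrate_mbps / hevc_bitrate
    print(f"{saving:.0%} saving -> {hevc_bitrate:.2f} Mbps, i.e. {ratio:.2f}x compression")
# 39% -> 1.64x, 44% -> 1.79x, 50% -> 2.00x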
Casper42 - Monday, April 23, 2012 - link
I LOL'd at "Walmart Black Friday" Nathan :)
And for the OP, 32", really?
It's completely understandable you don't see the difference on a screen that size.
Step up to a 60" screen and then go compare 720p to 1080p. (Who uses 1080i anymore? Oh, that's right, crappy 32" LCDs. Don't get me wrong, I own 2, but they go in the bedroom and my office, not my Family Room.)
I think 60" +/- 5" is pretty much the norm nowadays for the average middle class family's main movie-watching TV.
anirudhs - Monday, April 23, 2012 - link
Cable TV maxes out at 1080i (I have Time Warner). My TV can do 1080P.
nathanddrews - Monday, April 23, 2012 - link
1080i @ 60 fields per second, when deinterlaced, is the same as 1080p @ 30 frames per second. The picture quality is almost entirely dependent upon your display's ability to deinterlace. However, cable TV is generally of a lower bit rate than OTA or satellite.
SlyNine - Tuesday, April 24, 2012 - link
Yeah, but because of shimmering effects progressive images almost always look better.
If the video has a 2:2 or 3:2 cadence, many TVs can build the fields into a progressive frame these days.
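A minimal sketch of the "weave" idea the comments above are describing - interleaving two 540-line fields back into one 1080-line progressive frame. It assumes both fields were sampled at the same instant (a film or 30fps source carried in 1080i); it is not how any particular TV or renderer actually implements deinterlacing:

import numpy as np

def weave(top_field, bottom_field):
    """Interleave a top and a bottom field (each 540x1920) into one 1080p frame.

    Only valid when both fields come from the same moment in time; true
    60-fields-per-second motion needs motion-adaptive deinterlacing instead.
    """
    height, width = top_field.shape[:2]
    frame = np.empty((height * 2, width), dtype=top_field.dtype)
    frame[0::2] = top_field      # even lines from the top field
    frame[1::2] = bottom_field   # odd lines from the bottom field
    return frame

top = np.zeros((540, 1920), dtype=np.uint8)
bottom = np.ones((540, 1920), dtype=np.uint8)
print(weave(top, bottom).shape)  # (1080, 1920)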
Exodite - Tuesday, April 24, 2012 - link
In the US, possibly, but I dare say 55-60" TVs are far from the norm everywhere.
peterfares - Thursday, September 27, 2012 - link
2560x 27" and 30" monitors are NOT very pixel dense. 27" is slightly more dense (~12.5% more dense) than the standard display but the 30" is only about 4% more dense than a standard displaya 1920x1080 13.3" display is 71.88% more dense than a standard display.
dcaxax - Tuesday, April 24, 2012 - link
On a 32" you will certainly not see a difference between 720p and 1080p - it is barely visible on a 40". Once you go to 52"+ however the difference becomes visible.On a 61" screen as you suggest the difference will be quite visible.
Having said that I am still very happy with the Quality of properly mastered DVD's which are only 576p on my 47" TV.
It's not that I can't tell the difference, its just that it doesn't matter to me that much, which is why I also don't bother with MadVR and all that, and just stick to Windows Media Center for my HTPC.
Everyone's priorities are different.
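The screen-size thresholds mentioned a few comments up are really viewing-distance thresholds in disguise. A rough calculation (my own sketch, assuming roughly one arcminute of acuity for 20/20 vision and a 16:9 panel; real eyesight and content vary) of how close you need to sit before the extra detail of 1080p over 720p is resolvable:

import math

ARCMINUTE = math.radians(1 / 60)  # detail a 20/20 eye can just resolve

def max_useful_distance_ft(diagonal_in, horizontal_px):
    """Farthest distance (feet) at which one pixel of a 16:9 panel still spans >= 1 arcminute."""
    width_in = diagonal_in * 16 / math.hypot(16, 9)
    pixel_pitch_in = width_in / horizontal_px
    return pixel_pitch_in / math.tan(ARCMINUTE) / 12

for size in (32, 40, 52, 61):
    print(f'{size}": 1080p detail resolvable within ~{max_useful_distance_ft(size, 1920):.1f} ft, '
          f'720p within ~{max_useful_distance_ft(size, 1280):.1f} ft')
# e.g. ~4.2 ft vs ~6.2 ft on a 32" set, and ~7.9 ft vs ~11.9 ft on a 61" set -
# which is why the difference only shows up at couch distances on large screens.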
Midwayman - Wednesday, April 25, 2012 - link
Have you ever seen a 4K display on an uncompressed signal? The clarity is just astounding.
I'm more concerned about the ability to deliver content with that kind of bandwidth requirement. We already get HDTV signals that are so compressed that they're barely better than a really, really good SDTV signal.
MobiusStrip - Friday, April 27, 2012 - link
Most of what you see labeled "HD" (or variants thereof) is marketing bullshit. You're not getting HD when the bitrate is 3 megabits per second, especially when anything on the screen is moving.
You can blow a VHS picture up to "HD" resolution, and it won't be HD. That's exactly what's happening in most consumer devices today.
"4K" is rapidly emerging as the next fraud. We'll see the same crap blown up to 3840 x whatever (barely even 4K by any standard), but containing 1K of real resolution if you're lucky.
The era of increasing quality is over, as consumers prove over and over that they don't care about quality.
A5 - Monday, April 23, 2012 - link
"Otherwise, the Ivy Bridge platform has everything that a HTPC user would ever need."I'd like an open-source (or at least) free encoder that supports QuickSync and not having to be picky with my DRAM purchase to use GPU-accelerated decoders before I say that.
Other than that, it seems to be good enough for the basic HTPC functionality - can't wait for the new i3s and Pentiums to see if the low-end parts are good enough!
ganeshts - Monday, April 23, 2012 - link
You can use GPU accelerated decoders even with DDR3-1333 DRAM. You need to go high speed / low latency only if you want rendering through madVR.
Use QuickSync Decoder or DXVA2 Native in LAV or MPC Video Decoder + EVR-CP to get full decode and rendering acceleration without worrying about the DRAM.
babgvant - Monday, April 23, 2012 - link
DVRMSToolbox (DTB) has included a QS capable transcoding solution for over a year. The main benefit to using it vs. the other retail options is that it supports EDL files during transcoding.
DTB is FOSS, the QS dlls are just FSS.
A5 - Monday, April 23, 2012 - link
Cool stuff. Hadn't heard of your tool before today, I'll make sure to check it out when I get my HDHR Prime from Woot.
shawkie - Monday, April 23, 2012 - link
I've experimented with madVR a bit but in the end the problems with playing back DVDs and Blu-rays with menus have so far stopped me from using it seriously. However, I've seen reports claiming that Ivy Bridge includes higher quality upscaling within Windows Media Player (as part of the EVR I suppose). Any evidence of this?
ganeshts - Monday, April 23, 2012 - link
You can take a look at the PowerDVD chroma upscaling screenshots linked in the text. I was really surprised at the quality (until I zoomed to 300%, I couldn't actually decipher the difference between PowerDVD and madVR!). Similar behavior with MPC-HC using MPCVideoDec.
Btw, can you link me to the reports that you mention?
shawkie - Monday, April 23, 2012 - link
http://www.intel.com/content/www/us/en/architectur...
This was one of them. I also found a comment from one of the engineers that explained that they were using a higher quality upsampling algorithm too but I can't find it now.
ganeshts - Monday, April 23, 2012 - link
Andrew @ MissingRemote just refreshed my memory about this post by Eric Gur: http://forum.doom9.org/showthread.php?p=1551981#po...
shawkie - Monday, April 23, 2012 - link
Well found! So nothing new in Ivy Bridge then...
shawkie - Monday, April 23, 2012 - link
Also, when we are complaining about 23.976Hz versus something like 23.972, how can you be sure that your measurement is accurate? I would think that for most HTPC users the important thing is that the video clock and audio clock are derived from a common clock. Is there some way you can check for this? I'm also interested to know if automatic lip-sync over HDMI is working properly - it doesn't seem to work on my AMD E-450.
ganeshts - Monday, April 23, 2012 - link
Whether the clock is accurate or not, what matters is the number of frames dropped or repeated by the renderer because of this. madVR clearly indicates this in the Statistics.
Yes, you are right about the video and audio clocks being derived from a common clock, but I am not sure how to check for this.
Does lip sync not work for you on the E-450, but does work on some other machine? I have played with the E-450 only briefly in the Zotac Zbox Nano XS, and I did watch one movie completely. I didn't have lip sync issues to warrant digging in further. I do agree my sample set is extremely small.
shawkie - Monday, April 23, 2012 - link
I agree that what matters is dropped frames. I'm not absolutely sure how madVR decides when to drop frames. As I see it there are four options:
1) lock playback to the video clock and drop or repeat audio frames
2) lock playback to the audio clock and drop or repeat video frames
3) lock playback to the video clock and resample the audio
4) lock playback to some other clock (maybe the processor clock) and drop or repeat both video and audio frames.
My guess is it's probably doing 2, which would make the reported dropped frames a good measurement. If it was doing 1 or 3 then it wouldn't drop frames. If it's doing 4 then I'd argue that it's a faulty renderer.
Regarding the lip sync, it's difficult to be very scientific about it because I don't have any suitable test material. My TV definitely introduces a significant delay and for some reason I haven't had much luck correcting it with manual adjustment on my AV receiver. Maybe it varies with frame rate or maybe the delay is outside the range I can set manually. When I enable automatic lip sync it does seem to correct things for the set top box and standalone DVD player, but for my E-450 (an ASUS mini-ITX motherboard) it seems to be way off. It's quite possible it's a bug in PowerDVD, or that it depends on the format of the audio track, or I don't know what else.
I do have machines that I could try but it would really help to have some test material in a range of frame rates and audio formats.
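A minimal sketch of option 2 from the list above - the audio/master clock drives the refresh and video frames are dropped or repeated to follow it. This is only an illustration of the bookkeeping such a renderer would report, not madVR's actual implementation:

def count_drops_and_repeats(source_fps, refresh_hz, duration_s):
    """Simulate an audio-master renderer: at each refresh tick, show whichever
    source frame is due; count the frames that get skipped or shown twice."""
    last_shown = 0
    dropped = repeated = 0
    for tick in range(1, int(duration_s * refresh_hz)):
        t = tick / refresh_hz            # refresh times follow the master clock
        due = int(t * source_fps)        # source frame that should be on screen
        if due == last_shown:
            repeated += 1                # master clock running fast: hold the frame
        elif due > last_shown + 1:
            dropped += due - last_shown - 1  # master clock running slow: skip frames
        last_shown = due
    return dropped, repeated

# 23.976fps material refreshed by a clock at 23.972Hz, over 10 minutes:
print(count_drops_and_repeats(24000 / 1001, 23.972, 600))  # -> a couple of dropped frames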
ghost6007 - Monday, April 23, 2012 - link
This article is great commentary on the video aspects of an Intel HTPC setup; however, nowhere in either the processor discussions or the Z77 motherboard articles was any attempt made to actually review the audio portion of HTPC setups, which is still a major part of any Home Theater.
IMO if you want a complete, comprehensive look at the HTPC capabilities of any platform, addressing such things as audio decoding, audio passthrough over HDMI and audio quality is a must; until then it is not a complete review.
ganeshts - Monday, April 23, 2012 - link
HDMI Audio Passthrough has now become a 'commodity' feature. It is an issue only in media players now.
Yes, I agree there are some other audio tests that could be done, but we had to operate within time constraints. I apologize for the same.
ghost6007 - Monday, April 23, 2012 - link
I hope you guys do a more comprehensive review once these chips are available via retail, or even an Ivy Bridge HTPC build.
This new platform seems like an excellent candidate for a powerful low power/noise HTPC setup.
Southernsharky - Monday, April 23, 2012 - link
Has there been some kind of study on HTPC users to find out what the average is?
To me the big problem with this article is that it makes too many assumptions, the biggest of which is that we are all just watching videos on our TV.
I do recognize that there is a market for that, but I'm sure that I speak for most of us when I say that I hope that is just the beginning of the HTPC and not the goal.
When an integrated GPU can game at 1080p (or hopefully better), let me know. Until then my own "HTPC" will have a graphics card.
aliasfox - Monday, April 23, 2012 - link
I kind of have to agree. Video/audio playback may be the *primary* function, but as my HTPC is hooked up to the biggest screen in the apartment, I wouldn't mind throwing the odd game on there.
My current HTPC does (very) light gaming, overnight video transcoding, light Photoshop, and the (very rare) video edit. Oh, and it plays video and audio. Please don't ask what it is.
Marlin1975 - Monday, April 23, 2012 - link
Why are you testing with the HD4000? The 4000 only comes in the higher and more costly chips; most lower/mid Ivy chips will use HD2500 video.
The price difference is enough to buy a cheaper chip and get a full separate video card that has its own memory, or wait for Trinity.
jwilliams4200 - Monday, April 23, 2012 - link
Not really. I think HD4000 is just about right for an HTPC. Later, when the Ivy Bridge Core i3's come out, I think the i3-3225, with HD4000, will be the first choice for HTPCs.
shawkie - Monday, April 23, 2012 - link
If the i7-3770T is actually ever available to buy then from a power consumption point of view it would also be a good choice (with plenty of CPU headroom for the times where GPU decoding doesn't work). From a cost point of view it might be a bit on the high side I suppose.
flashbacck - Monday, April 23, 2012 - link
As one of the few people still running a dedicated HTPC, I appreciate the article.
anirudhs - Tuesday, April 24, 2012 - link
You mean use your HTPC for all media, including HD-DVR and Blu-Ray? I am just getting into it now.
jwcalla - Monday, April 23, 2012 - link
Getting an i7 for an HTPC is like getting a Mustang GT500 to drive Miss Daisy. Come on now, is AT a review site for Daddy Warbucks types?
Ok, serious question though. What's the Intel IVB driver / HW acceleration situation on Linux? I couldn't imagine dropping $100 on Windows 7 for something as simple as HTPC functionality. For nvidia we're talking $10 Geforce card + VDPAU + lowest end CPU + least amount of RAM possible + linux = HTPC solution. Or a Zotac box. Can Intel compete with that?
ExarKun333 - Monday, April 23, 2012 - link
This review is really testing the HD4000 implementation. When the dual-cores are released with the HD4000, the GPU will be exactly the same, so almost everything will be directly applicable there too.
anirudhs - Tuesday, April 24, 2012 - link
If you plan on getting cable onto your PC you have no choice but Windows due to DRM issues. Some channels will not be recorded by MythTV.
CoffeeGrinder - Monday, April 23, 2012 - link
With that P8H77-M config, if you use a double-slot GPU in one PCIe x16 slot (and so lose one PCIe x1 slot) and use TV tuners in both of the remaining slots (PCIe x1 and PCIe x16), does using the second PCIe x16 slot result in the first PCIe x16 running at x8?
ganeshts - Tuesday, April 24, 2012 - link
If the second PCIe slot is occupied, then it will cause the first x16 to run at x8. Both these slots are electrically connected, so when you need even one lane, it takes eight away from the first PCIe slot.
Bluestraw - Tuesday, April 24, 2012 - link
I see you didn't test madVR in Full Screen Exclusive mode - can you elaborate on the reason for this please? I read over at missingremote that FSE improved the situation significantly for madVR with the HD4000?
ganeshts - Tuesday, April 24, 2012 - link
The FSE mode performed visibly worse for me compared to FSW in the few cases that I tried. I have got the rest of the settings that Andrew @ MR used. I may try it and see if it improves things. My aim was to get madVR to render without any dropped frames, and I was able to get that at DDR3-1600 (which is what Andrew used too) for almost all the clips I had (except 720p60, which I didn't try till yesterday).
satish0072001 - Tuesday, April 24, 2012 - link
Video decoding and rendering benchmarks:
Can you provide a guide to how you got those scores? It will be very helpful for some of us... I know about the HQV score, but this one is new to me. Kindly help :)
From where can I get these benchmarks if I have to compare my existing system with the IVB results?
LuckyKnight - Tuesday, April 24, 2012 - link
In the article there is a promise of a BIOS update to fix the 23.97Hz issue. Wasn't something similar also promised for Sandy Bridge in the same article over a year ago? That never happened, did it? I want to build a HTPC already!
ganeshts - Tuesday, April 24, 2012 - link
Well, something did happen with SNB... they got it to 23.972 Hz :) If you think about it, video cards with AMD and NVIDIA GPUs also end up in the 23.974 - 23.978 range, and only very rarely do I actually see a GPU outputting exactly 23.976023976 Hz.
If Intel gets between 23.974 - 23.978 in a stable manner, I will consider the matter closed.
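For context on why that 23.974 - 23.978 window is considered close enough, the interval between forced frame drops or repeats is just the reciprocal of the clock error. A simple sketch using the values mentioned above (my own arithmetic, not a measurement):

# How often a frame must be dropped or repeated for a given refresh clock error
# relative to 24000/1001 (~23.976024) Hz.
target_hz = 24000 / 1001

for refresh_hz in (23.972, 23.974, 23.976, 23.978):
    error = abs(refresh_hz - target_hz)  # frames of drift accumulated per second
    interval_s = 1 / error
    print(f"{refresh_hz:.3f} Hz: one drop/repeat every {interval_s:,.0f} s (~{interval_s / 60:.0f} min)")
# 23.972 -> every ~4 min, 23.974/23.978 -> every ~8 min, 23.976 -> every ~11.6 hours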
Shaggie - Thursday, April 26, 2012 - link
Is there still the problem, like with SB, where the driver sets the color space to limited range when connecting to a TV over HDMI, and resets it with every refresh rate switch/reboot with the integrated graphics?
Stabgotham - Monday, April 30, 2012 - link
Is there a point to getting an H77 board with Ivy Bridge if all you are using it for is as an HTPC (sans overclock)? I can't tell what the benefit would be to justify the price increase.
crisliv - Wednesday, June 13, 2012 - link
Nice article! As always.
About the note:
"The good news is that Intel is claiming that this issue is fully resolved in the latest production BIOS on their motherboard. This means that BIOS updates to the current boards from other manufacturers should also get the fix. Hopefully, we should be able to independently test and confirm this soon."
What does it mean exactly? Does it mean that this BIOS update should get the refresh rate closer to 23.976 than it was in your test? And "on their motherboard" - does it mean that this BIOS update is for Intel motherboards only?
True, with AMD and nVidia the out-of-the-box refresh rate for 23 is never precisely 23.976, but the custom timings on nVidia allow you to get closer to it. There are no custom timing settings on the HD4000, right?
LuckyKnight - Thursday, June 14, 2012 - link
Do we have an update regarding 23.976Hz?
theboyknowsclass - Tuesday, July 24, 2012 - link
It's been a while, and I couldn't find any follow-up.
Hdale85 - Thursday, August 23, 2012 - link
I've been looking at an Ivy Bridge setup with the H77/Z77 chipset but I can't find any information about the audio support. Can it bitstream TrueHD and DTS-HD tracks? The older chipsets do it, so I would find it strange that the new ones don't, but I don't see it mentioned on any of the new boards or in the Intel information.