Very interesting article! I wonder what 600 PPI would look like, I am very pleased with 440 PPI already though; then again I was very happy with my qHD 4.3" screen too.
qHD is workable on 4.3", as long as it's not PenTile we're talking about. Of course we're used to sharper displays in the wake of the 1080p devices of the last year, but considering efficiency and thus battery life on our mobile companions, I wonder if it wouldn't be more productive if we were content with 720p on displays up to 5 inches.
You are confusing pixels per inch (PPI) with pixels per degree (PPD). PPI is independent of the viewing distance; it is simply a measure of pixels per unit length. PPD has the viewing distance built into its number.
No, I'm not confusing the two, I understand they are different. dyc4ha specifically asked what 600 PPI would look like. Holding the iPhone at twice the distance should double the pixels per degree by halving the apparent pixel size. I admit I didn't actually do the math; it might be that you can double the apparent PPI at much less than twice the distance by factoring in inherent limitations of human vision.
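To make the PPI/PPD distinction concrete, here is a small sketch (my own back-of-the-envelope formula, not from the article) showing that apparent density in pixels per degree scales linearly with viewing distance:

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Pixels spanned by one degree of visual angle.

    One degree subtends 2 * d * tan(0.5 deg) inches at distance d,
    so PPD grows linearly with both pixel density and distance.
    """
    return ppi * 2 * distance_in * math.tan(math.radians(0.5))

near = pixels_per_degree(326, 12)  # iPhone 4 at a typical 12 inches
far = pixels_per_degree(326, 24)   # the same panel at twice the distance
# far is exactly twice near, which is the equivalence being described
```

So a 326 PPI panel at 24 inches really does present the same angular density as a ~652 PPI panel at 12 inches, as claimed.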
Such a good point to remember. Also, I notice when reading at reduced brightness that as the device gets dimmer it's harder to discern letters from further away, and easier when it's brighter. Of course the implication of interest is really viewing the display in daylight.
As for light levels affecting legibility, remember that with greater luminance the iris constricts, which leads to increased depth of field and acuity, as with a pinhole camera. I believe this is the reason for the difference in discernment.
Love these types of articles. The reasoning behind some of the manufacturers' decisions helps me understand why certain choices are made. Please keep them coming.
Most of this info has been peppered throughout previous articles, but I love seeing a clear and concise compendium of pixel density knowledge. It's a primer of sorts.
What if we combined larger batteries with all these improvements in SoCs, RF components, and panel technology, and paired them with a lower-resolution display to get the best possible battery life?
We just have to look at mobile gaming - a sector that is advancing at a high pace - to see the downside of the high-res craze. Rendering at 720p is much more power efficient than rendering at 1080p. Current mid-to-high-end desktop graphics cards consuming hundreds of watts struggle with the arrival of 4K displays, yet we still demand our mobile games be rendered at 1080p - a quarter of that resolution - on a platform that uses only about one hundredth of the power.
You don't even have to go to gaming. Even rendering your daily slew of apps with only half the pixels will be much more efficient and smooth than at 1080p, or even higher resolutions. And that doesn't take into account the efficiency of the lower-res screen itself, as stated above.
I'm not saying that I want displays to stay "720p forever", but seeing smartphone battery life still leaving much to be desired, even though newer smartphones rival my 2006 desktop and my 2010 laptop in some performance metrics, I think we've got our priorities just a little bit out of order.
THIS! Given the choice between two screens of equal quality under 5", one 720p and the other 1080p, I would far prefer the former, provided it comes with noticeably better battery life. What really irks me is that the high end seems to be all about maxing out every possible spec, while cheaper devices scale everything down (contrast, brightness, battery capacity, etc.), not just the resolution. It's a frustrating market if you are looking for a high-end device that isn't just about the "biggest" specs, but about the most reasonable trade-offs.
Great article. Above a certain practical level, it really is more a marketing p*ssing contest than anything else. Most people want longer battery life and lower cost. The Apple vs Android "extreme PPI" rat race is a bit like arguing over whose HD audio is better, 96 kHz vs 192 kHz, in a world where almost everyone continues to fail 96 kHz vs 44 kHz CD tests (once you strip out all the placebo and emotional "superman" wishful thinking rife in the audiophile world under controlled double-blind ABX testing conditions...)
If you require a perfectly dark room and better-than-20/20 vision for a difference to be seen, then it's mostly wasted. Human pupil diameter varies from 2-7 mm. At 2 mm you're diffraction limited to only about 1 arc-minute (roughly 300 PPI from 12" away), so I'm pretty sure that even if there were a marginal difference past 300 PPI, most people still wouldn't want to sacrifice 20% battery life so that 1% of people can benefit from it only while sitting in a photographic darkroom. :-)
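The diffraction claim can be checked with the Rayleigh criterion. A sketch, assuming green light at 550 nm and the 2 mm bright-light pupil from the comment:

```python
import math

WAVELENGTH_M = 550e-9  # green light, near the eye's peak sensitivity
PUPIL_M = 2e-3         # constricted pupil diameter in bright light

# Rayleigh criterion: smallest resolvable angle for a circular aperture
theta_rad = 1.22 * WAVELENGTH_M / PUPIL_M
theta_arcmin = math.degrees(theta_rad) * 60  # ~1.15 arc-minutes

# Finest resolvable pixel pitch at a 12 inch viewing distance
max_ppi = 1 / (12 * math.tan(theta_rad))     # ~250 ppi
```

That lands at roughly 1.15 arc-minutes and ~250 PPI at 12 inches, the same ballpark as the "about 1 minute, roughly 300 PPI" figure above (the exact number depends on the wavelength and criterion chosen).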
On the HD audio thing, at least there the physics can categorically state that anything higher than a 44.1/48 kHz sampling rate is completely useless to humans. Nyquist's theorem states you only need to sample at double the maximum frequency you need to encode (20 kHz for human hearing), so sampling at 96 kHz is wasting bandwidth and storage space. Unless you happen to be a dog and can hear sounds up to 48 kHz.
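The flip side of Nyquist is aliasing, and it is easy to demonstrate numerically. A sketch: a 30 kHz ultrasonic tone sampled at 44.1 kHz produces exactly the same samples as a phase-inverted 14.1 kHz tone, which is why content above Nyquist must be filtered out rather than recorded:

```python
import math

def sample_sine(freq_hz, rate_hz, n=64):
    """First n samples of a unit sine at the given sample rate."""
    return [math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

rate = 44100
ultrasonic = sample_sine(30000, rate)
# 30 kHz folds around Nyquist (22.05 kHz) down to 44.1 - 30 = 14.1 kHz:
alias = [-s for s in sample_sine(rate - 30000, rate)]
# ultrasonic and alias are identical, sample for sample
```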
The 20 Hz - 20 kHz range has one flaw - two actually: infrasonics and ultrasonics. Just because the brain can't identify those frequencies doesn't mean they don't affect your ear/brain/body or other frequencies. Studies have shown that some people - a small percentage - can tell the difference. So why limit ourselves to the average when we have the technology to record live audio with its entire range of frequencies? Why not collect them all?
As to the PPI race, I care less about PPI from the perspective of what the human eye can see and more about how it will force GPU makers to step up their game.
-- so why limit ourselves to the average when we have the technology to record live audio with its entire range of frequencies, why not collect them all?
Sounds like another 1%-er demanding that the other 99% pay for its toys. Bah.
No one can hear these ultrasonics. Their presence can certainly degrade the audio experience, though, in more ways than one. Zero benefit. Anyone arguing against this is simply wrong.
Like most things in life, you have to be careful with absolutes. Most of the available "HD music" hawked by websites is in fact not HD at all, but rather upsampled from dated masters. The fact is that in a recording workflow that starts and ends at 24/96 (or higher), there is a measurable difference compared to 16/44. To what extent it can be heard depends on the playback setup, the playback environment, and the human doing the hearing. Ultrasonics can be a negative, but only in improper setups.
One problem with the testing of music tracks in the Meyer/Moran study is that the content used was sourced from older formats that lack the dynamic range of a modern high-resolution master. You can't take an old tape master from the 70s and get more out of it than what's there.
For your consideration: live orchestras can exceed 150 dB. Many instruments (and noises) operate outside the average human hearing window: pipe organs can get down below 10 Hz and trumpets above 40 kHz. These are things that can be recorded and played back if done appropriately. And no, 192 kbps MP3s and Beats™ earbuds won't cut it.
While I know for a fact that I can't hear them, I sure as hell can FEEL the bass under 20Hz in movies. War of the Worlds, with freqs down to 5Hz (my theater room in its current setup is only good for 12Hz) always serves as a good method for loosening one's bowels. LOL
"As to the PPI race, I care less about PPI from the perspective of what the human eye can see and more about how it will force GPU makers to step up their game."
This logic is somewhat backwards. So we have high-res displays and framerates aren't as good as they could be. So GPU makers "step up their game" and implement more powerful, but at the same time more power-hungry, graphics solutions. So now we get the same framerates we would have gotten if we had only stuck to slightly lower-res displays, with the added benefit of having a hand warmer included in our smartphones. Mhm.
You know as well as I do that demand drives innovation. Consumers want devices that last all day (low power), look great (high DPI), and operate smoothly (high FPS). They aren't going to get all three until SoC/GPU makers release better GPUs. The display tech has been here for a while; it's time to play catch-up.
While it's pointless to source higher than 44 kHz, it's worth appreciating the merits of upsampling during playback. Some receiver DACs do it on their own, but most onboard codecs from the past two years support 24-bit/192 kHz. Upsampling does a more accurate interpolation before the signal runs through the DAC and generally results in lower THD. You can go one step further and do the upsampling in software with hefty algorithms like sinc interpolation in programs that support it; two common ones that come to mind are foobar2000 and MPC-HC. Granted, doing it in software probably doesn't buy you much - I can't hear the difference, and I don't have the equipment to test it either way.
You are right, and although you probably already know, what I've heard is that higher sampling rates are primarily used in production, so that after mixing and adding effects and distortion and whatnot you still have a relatively "clean" signal you can simply downsample to e.g. 44.1 kHz and sell that way.
Even funnier is an article I read once on some site, where an alleged music professional explained the physics of why 192 kHz can even sound worse than your good ol' CD. But I guess we're getting OT ;)
Not wanting to add to the OT conversation, but the filters required when sampling close to the Nyquist limit at the lowest sample rates can be pretty harsh, and they add phase distortion too (even though the human ear isn't particularly sensitive to phase). So sampling at a much higher rate with more bits makes the signal processing much easier, allowing you to reproduce the original signal when it's fed through your DAC and final-stage analogue filters.
P.S. My point is that you only want to accurately reproduce the 10 Hz - 20 kHz spectrum, and drop everything that's not actually audible, but do it in a way that prevents aliasing, distortion, intermodulation, etc.
The problem is that human eyesight can exceed its theoretical physical limits through interpolation. It isn't so much about meeting the needs of the actual eye as those of the brain, which has evolved "software" that exceeds the theoretical limits of the eye.
What surprises me is that the assumption that a phone is always at 12 inches, or 30 cm, is accepted without any criticism. There are multiple reasons why printers go way beyond 300 dpi. One of them is that paper is not always at 30 cm distance.
Forget phones and go back 10 years. When looking at a high quality print or photo on paper there was a very efficient way to zoom in: you bring the print closer to your eyes, like 10cm. It is quite a lot more efficient than pinching on a screen. And suddenly 1200dpi makes a lot of sense.
Nobody with a low-res phone (say 300dpi or less) thinks of bringing a phone closer to zoom, because it is no use at all. But with high-dpi screens things look a lot clearer in close up.
Yes, but those are only photos. They generally do not need high resolution. It also depends on the technology: actual chemical photo paper (developed, not printed) combines the dots differently than inkjet or laser printers do. I work with high-resolution printers, and while most "pictures" come in at a relatively low resolution (200 or 300 dpi), we print them at 1200x1200 because it makes the individual pixels less visible when dithering. Also, diagonal lines really do not look great in a photo, so that rules it out for precise maps, for example... Anyway, there are a lot of factors to take into account beyond simple source resolution.
The "dpi" or PPI in a digital picture is meaningless. "Per inch" means something only in the context of physical size. In a digital picture it's just metadata telling layout programs at what size the picture should be laid out on the page; you can change these values without even touching the image data, and it will be exactly the same. The DPI in a printer's specs is the physical ink dots the printer will lay down on the paper. They're different things.
Laser printers go past 300 dpi because toner is black. That means each "dot" is either black or white, nothing in between. You need 600-1200 dpi to do half-toning with black and white dots to simulate a greyscale "pixel" at 300 dpi.
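That halftoning trade can be sketched with a classic ordered-dither (Bayer) matrix - a minimal illustration of the idea, not any particular printer's actual screening algorithm:

```python
# 4x4 Bayer matrix: each entry is a threshold rank from 0 to 15
BAYER4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def halftone_cell(gray):
    """Render one 0-255 grayscale pixel as a 4x4 cell of binary ink dots.

    A dot is inked (1) when the gray level falls below its threshold, so
    darker input means more ink. Simulating 16 gray levels this way costs
    4x the linear resolution: a 300 ppi grayscale image needs 1200 dpi
    of black-or-white dots.
    """
    return [[1 if gray < (rank + 0.5) * 16 else 0 for rank in row]
            for row in BAYER4]

black = halftone_cell(0)    # solid black: all 16 dots inked
mid = halftone_cell(128)    # mid gray: half the dots inked
white = halftone_cell(255)  # white: paper left bare
```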
Text is much easier to read at resolutions higher than 300 DPI, even though it is monochrome. 30 years ago, when I was trying to print the final copy of my thesis, the best printers could only do 300 DPI, whereas professional books were being printed at 2400 DPI. Guess which was easier to read, and by a huge margin? There is still a noticeable difference between a text document printed 600 DPI and one printed at 2400 DPI, even though you can't see the individual pixels in either document.
That's definitely true, but there are limits to how close the display can get based upon the size and the minimum distance that an object can be from the eye before it can't be focused upon.
So you're saying we should implement even higher-res screens, which probably aren't cheap and also use considerably more power, just so we don't have to pinch-zoom on Scarlett Johansson's nudes and can instead bring the phone up to 10 cm from our eyes?
It doesn't seem very sensible to me to let all that additional resolution and processing power go to waste in every standard use case, just so we can zoom in like that again.
I was thinking more or less the same thing. I don't know why 12 inches is treated as some sort of magic number.
I often hold my phone closer to my eyes than 12 inches, especially if I have my glasses off, for any of many reasons. I imagine I am far from the only person who does this. My phone has about 320 PPI and I can easily see the pixelation. Even looking at some of the 450 PPI phones in stores, it's not hard for me to see the pixelation. I always thought Apple's "retina" claim was so demonstrably wrong that, the first time I ever saw one of the screens, it just seemed stupid. Higher resolution would be nice.
If the question is what could make a difference in practical real-world use, then the 12 inch assumption seems like a bad one.
Specifically, he says that 8K per eye is enough for imperceptible pixels, but the human eye can still see more detail than that, as the promoted comment goes on to discuss.
I suspect for applications like the Oculus Rift, the future is going to be direct imaging onto the retina. The benefit being that instead of having to render the entire image at ultra-high resolution, you only need to render the point the fovea is aimed at at that resolution. Everything else can be rendered at a much lower resolution because it'll be falling on the peripheral vision.
Where this article falls short is that it tries to single out one spec of the screen. For me a screen has several parameters that cannot be left out when you are talking about its quality:
- PPI. This parameter is probably the easiest, and it is also the one on which the various marketing organizations compete.
- Pixel density. This parameter is a little more difficult to determine, because here you need to look at the size of a single pixel and compare it to how much "black" surrounds it. You could in principle have a very high PPI while almost all of the area is black. Pixel density directly influences the backlighting required for a given luminance.
- Black level. Black is very important for a screen, and some people are not happy when they first notice that the screen is not black but grey.
- Color accuracy. This parameter is again more tricky, and it's where you often see oversaturated screens that give a bit more "pop" to images. For a realistic picture that looks like a view out of a window, we need true colors.
This is not a rant; it's more that when we discuss such topics about screens we need to look at all the parameters together, not take a single one out. A disclaimer explaining this would have been nice.
Can you elaborate on the differences between PPI and pixel density? Super AMOLED 1080p displays have a lot of black area around the actual pixels, but that doesn't affect their viewability. They do lack a little in sharpness, though, thanks to the RGBG-like pattern they use.
PPI is how many pixels per inch. Let's say, for example, that there is 1 pixel per inch, but the pixel itself is only 1/2 inch on each side. That means the pixel itself occupies only 25% of the total 1 inch x 1 inch area.
Just because you have more pixels per inch does not necessarily mean good pixel density if each pixel is that much smaller than its cell.
If you look at close-up pictures of any screen, there is a lot of black. Ideally there should be no black.
What you are describing as "pixel density" is actually something like a fill factor, as in "the area covered by pixels". Taken literally, "pixel density" is just... well, the density of the pixels: x number of pixels per unit length or area. Which is the same thing as PPI.
Your point about the luminous area of a display vs. the supporting circuitry's area is definitely a good one, but it's not easily calculated compared to most other parameters.
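If the pixel aperture were known, the fill factor could be estimated directly. A toy sketch (my own function, using the 25% example from upthread), assuming a square emitting region per pixel:

```python
def fill_factor(ppi, aperture_in):
    """Fraction of screen area that actually emits light.

    ppi: pixels per linear inch; aperture_in: side length of the square
    light-emitting region of each pixel, in inches. The remainder is the
    "black" surround: transistors, wiring, and inter-pixel gaps.
    """
    pitch = 1.0 / ppi  # center-to-center pixel spacing
    return (aperture_in / pitch) ** 2

ratio = fill_factor(1, 0.5)  # the 1-pixel-per-inch example: 25% lit
```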
Contrast is definitely important, as is color accuracy, but there's definitely been a lot more information out there for those aspects, as there is some level of an objective standard for those two.
You are right, of course, but I think that's part of the reason we even have this debate: manufacturers have chosen first screen size and now resolution as their sole marketing argument. Nobody advertises their black levels, their contrast, their color accuracy. They throw around buzzwords like "Super+" and "Quattron" and "BRAVIA", but the only metric that really sticks out is resolution.
It's a fault on the manufacturers' part that even Sony includes somewhat substandard screens on its flagship devices - but hey, at least they're 1080p.
At the same time I applaud AnandTech for once more giving perspective on an area that is presented somewhat askew by the manufacturers. So this article should be only about resolution, so we can once and for all stop worrying about "how much is enough" and maybe - just maybe - concentrate on other, often no less important, aspects of our screens.
What you have here is a focused article. It talks about one issue and clearly states so in the headline. Had he gone on and on about the other important display metrics, it would have been much longer, and some would have rightfully criticized it for missing the topic in large parts.
Apart from that you are obviously right that there's more to good displays than resolution. That's why I wouldn't want a 4k display with a TN panel even as a gift.
I imagine that higher resolution displays will become more prevalent as cost and efficiency continue to improve. Right now, there is a serious cost. In the future, the cost may be negligible. That's why high end devices with high end prices and serious batteries will go up in resolution while lower end devices will stick with the tried and true with smaller batteries.
Just like it is now. In fact, this is so true as to make the point of the article seemingly a "Duh?" moment. The information held in the earlier part of the article is nice, though. It's just the conclusion that seems suspect, given that it's telling us what we already know.
Higher resolution displays are for the rich and are for the most part gratuitous.
You've also got:
- Cost: big high-res displays are quite expensive.
- Glasses, where we kind of do want high res.
- Stretchable screens, which should arrive soon enough (not too long after merely flexible ones), where a very high PPI becomes lower once the screen is expanded.
A very nice 2003 paper from the AFRL, "Capability of the Human Visual System" (http://www.itcexperts.net/library/Capability%20of%...), gives an excellent overview of what specifications a truly indistinguishable-from-reality display would have to fulfil, as well as an overview of the human visual system and various methods of measuring acuity. Besides angular resolution, it covers things like display update rates (dependent on refresh technique), luminance levels and range, colour gamut, etc. For example, a super-Oculus display (covering the entire human binocular visual field at 0.5 arc-seconds per pixel) would require a display on the order of 1.5 TERApixels. However, the paper does not take into account other effects, such as the vestibulo-ocular reflex, which would necessitate even higher update rates in the edge case of a display physically fixed with respect to the head.
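The terapixel figure is easy to sanity-check. A rough sketch, assuming a ~200° x 135° full binocular field of view (my assumption; the paper's exact field dimensions may differ):

```python
ARCSEC_PER_DEG = 3600
H_FOV_DEG, V_FOV_DEG = 200, 135  # assumed binocular field of view
PIXEL_PITCH_ARCSEC = 0.5         # angular pixel size from the comment

h_px = H_FOV_DEG * ARCSEC_PER_DEG / PIXEL_PITCH_ARCSEC  # 1.44 million
v_px = V_FOV_DEG * ARCSEC_PER_DEG / PIXEL_PITCH_ARCSEC  # 0.97 million
terapixels = h_px * v_px / 1e12                         # ~1.4 terapixels
```

That lands right around the paper's "on the order of 1.5 terapixels" figure.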
Very interesting, thank you! I have had many conversations with many differing opinions on this very topic, so it is nice to see a technical explanation that was easy to understand. In my personal opinion, 99.99% of the time you are not using a phone for any sort of critical viewing, so getting up and beyond 400ppi really is not in my interest. I would gladly trade any extra ppi for extra frame rate and battery life by not taxing the gpu so hard. Now, if we are talking desktop monitors, BRING IT ON. Because it has upset me for years now that companies are so gung ho about creating these super high ppi screens for something like a phone, but completely ignore an area where they can actually be utilized and have a noticeable impact...so we've stagnated at this 90-110ppi range in the 24"/27" monitor sizes.
The main goal should be to get all displays to at least 4k. That way they can play back 4k content without scaling (and scaling artifacts are often rendered MORE noticeable by high PPI displays if they don't reach an even scaling multiple).
Kudos on the quality of the article. Some who comment express an element of dissatisfaction, yet the quality of the article has yielded an equally high level of enlightening discussion. And it's generally pretty civil too! I'm sassified!
Thanks Joshua, a ho bunch! (And I promise to NEVER repeat that pathetic word play, which undoubtedly makes you ill with each pedestrian repetition. I'm not striving to alienate you; I'm simply a very, very, disturbed obsessive/compulsive, i.e. Mr. Creosote desiring "just 1 more thin mint!").
I found this article to be rather frustrating. At first I was excited to see the author mention the importance of cognitive processing's effects on the perceptual quality of vision. A good example of this is the perception of a hair follicle. Even in a super high density display you can tell the follicle is slightly off, even if you can't discern individual pixelation.
However, the author then hand-waved the issue away by saying we'll never have 1800 PPD displays in the near future, so it's not important - utterly missing the point.
Research has demonstrated that the closer you get to the ideal 1800 PPD, the more "lifelike" a display becomes. That is, rather than looking at a super-sharp image on a screen, you start having the sensation that you're looking at something *real*. That's why Palmer Luckey of Oculus said 8K per eye wasn't enough. It's not just about making individual pixels indistinguishable; it's about going beyond that and tricking your brain into thinking you're looking through a clear glass window.
That's definitely true, but it's rather unfeasible to achieve 1800 PPD currently, although in the long run it may be possible. For a 5" display at the 12 inch distance mark (which is contrived, but is a generally realistic distance for mobile), it would take 37453x21067 to have enough resolution, which would require square pixels with a side length of around 2.96 microns. For comparison, a 5 inch panel with 1080p resolution would have pixels with a side length of 57.7 microns. Shrinking a display pixel to that size is years away at this point.
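The arithmetic behind those numbers, as a sketch (simple visual-angle geometry, with a 16:9 aspect ratio assumed for the 5 inch panel):

```python
import math

def ppi_for_ppd(ppd, distance_in):
    """PPI required to reach a given pixels-per-degree at a distance."""
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return ppd / inches_per_degree

def panel_resolution(diag_in, aspect_w, aspect_h, ppi):
    """Pixel dimensions of a panel with the given diagonal and aspect."""
    diag_units = math.hypot(aspect_w, aspect_h)
    width_in = diag_in * aspect_w / diag_units
    height_in = diag_in * aspect_h / diag_units
    return round(width_in * ppi), round(height_in * ppi)

ppi = ppi_for_ppd(1800, 12)             # ~8,600 ppi
w, h = panel_resolution(5, 16, 9, ppi)  # ~37,500 x ~21,100
pitch_um = 25400 / ppi                  # ~2.96 micron pixel pitch
```

The results agree with the figures quoted above to within rounding.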
Joshua, why didn't you mention the fact that UHD-1, and later UHD-2, is about more than just pixel density? It's about providing the consumer with the real Rec. 2020 10-bit color space rather than the antiquated 8-bit pseudocolor of today's Rec. 709 (HDTV and below) panels. "Rec. 2020 allows for RGB and YCbCr signal formats with 4:4:4, 4:2:2, and 4:2:0 chroma subsampling.[1] Rec. 2020 specifies that if a luma (Y') signal is made that it uses the R’G’B’ coefficients 0.2627 for red, 0.6780 for green, and 0.0593 for blue.[1]"
As regards using more power, that's what the latest quantum dot displays are for: light is supplied on demand, which enables new, more efficient displays and allows for mobile devices with longer battery life.
That's not so much about pixel density as it is about 24-bit vs 30-bit color. I'm not too well educated on HDTV spec, but sRGB is 24-bit and is the industry standard, while Adobe RGB is 30-bit but is effectively limited to content creation purposes, as only applications like Photoshop are really aware of colorspace.
I think I conflated color depth with color space - they are independent of each other - so my foot is currently firmly in my mouth.
But for the most part, sRGB is the standard, and within that limited gamut, 8-bit color is the standard. In the future 10-bit color or greater will happen, but it doesn't seem likely for now.
:) That's OK. Just remember to always ask and answer this question in all future reviews: "Is this [UHD] panel International Telecommunication Union (ITU) Rec. 2020 10-bit color space ready out of the box?"
For sure, but I think that mostly applies to TV applications. For smartphone and desktop displays, it seems that sRGB will remain the standard, although there may be some movement towards wider color gamuts some day.
Oh, and just to save looking it up: "In coverage of the CIE 1931 color space the Rec. 2020 color space covers 75.8%, the digital cinema reference projector color space covers 53.6%, the Adobe RGB color space covers 52.1%, and the Rec. 709 color space covers 35.9%"
"The NHK [and the BBC] measured contrast sensitivity for the Rec. 2020 color space using Barten's equation which had previously been used to determine the bit depth for digital cinema. 11-bits per sample for the Rec. 2020 color space is below the visual modulation threshold, the ability to discern a one value difference in luminance, for the entire luminance range. The NHK [and the BBC] is planning for their UHDTV system, Super Hi-Vision [UHD-2] , to use 12-bits per sample RGB"
Higher resolution always reduces power efficiency of the system. As the article says: if there are other improvements to power efficiency to offset this loss, they could always be used to reduce power at lower resolutions. Our choice.
Regarding quantum dots: it's hard to pump them electrically; quantum wells are much better for that. But both pretty much require expensive III-V or II-VI compound semiconductors (like LEDs and many lasers), which doesn't bode well for large-area applications. That's why manufacturers are going for OLEDs instead.
And about pumping them optically: well, you don't want to have to put on sunscreen in order to look at your phone for longer than a few minutes, do you? Anyway, UV light sources are neither cheap nor efficient. A cure worse than the disease, so to say.
Good to see some deeper analysis of display resolution, and congratulations on this well-written article. The Snellen eye test is probably a good enough measure for a Westerner, but people tend to forget there are other languages out there. Namely, Chinese and Japanese (and, I would guess, Arabic) readers are the ones who benefit the most from higher pixel densities, as a complex Chinese character can be an extremely complicated drawing that fits in the space of almost a single Latin letter. My guess is that pixel densities around and above 500 PPI would actually make for a tangible improvement in the reading experience, but it'd be interesting to see more research on this.
While I am interested in what a 600 PPI display would look like, I tend to favor a phone with lower resolution and PPI. Since the difference may not be very noticeable, I put more weight on battery life and overall smoothness, which a higher-resolution display would definitely impact.
Joshua, it would have been nice if you had commented more directly on the images you're showing. I recognize them as different subpixel tilings, but apart from that I have no idea what I'm looking at or how the images relate to the text.
Now _this_ is the kind of stuff that makes Anandtech the site that it is. No marketspeak, no dumbing-down, but still accessible to mortals due to no use of overly esoteric terms, rehashes of academic papers, or any "look how smart the author is" insecurity.
A realistic, practical, educated, intelligent look at a significant topic.
3" is the realistic value; the human eye can't realistically bring anything closer into focus.
At 120 PPD, around 2300 PPI is needed at 3" away, so there is capacity to go further for VR purposes. The ST1080 VR display has already achieved such densities, but the price is relatively high compared to a smartphone display.
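The 2300 figure checks out with simple visual-angle geometry, assuming the 3 inch near-focus limit and a 120 PPD target:

```python
import math

distance_in = 3   # near-focus limit from the comment above
target_ppd = 120  # pixels per degree deemed "enough"

inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
required_ppi = target_ppd / inches_per_degree  # ~2290, i.e. "2300 or so"
```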
I guess I was trying to assess the merits from a more technical perspective than previously examined, since much of the discourse hasn't been well fleshed out in light of the existing literature.
It doesn't matter whether the trumpets are putting out 30 kHz or 40 kHz unless you really care what your dog hears - you won't hear the difference. Try playing an 11 kHz signal back as a sine wave and as a square wave (which differ only in their ultrasonic harmonics). You won't hear a single difference (even with 30 kHz+ headphones).
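That sine-vs-square test follows directly from the Fourier series. A sketch: a band-limited square wave keeps only odd harmonics below the cutoff, and for an 11 kHz fundamental the next harmonic sits at 33 kHz, so within a 20 kHz audibility band the "square" wave is literally just a scaled sine:

```python
import math

def bandlimited_square(freq, cutoff, t):
    """Fourier-series square wave, keeping odd harmonics up to cutoff."""
    total, k = 0.0, 1
    while k * freq <= cutoff:
        total += math.sin(2 * math.pi * k * freq * t) / k
        k += 2
    return (4 / math.pi) * total

t = 1.0e-5
square_in_band = bandlimited_square(11000, 20000, t)  # only k=1 survives
pure_sine = (4 / math.pi) * math.sin(2 * math.pi * 11000 * t)
# square_in_band == pure_sine: the audible parts are identical;
# raising the cutoff above 33 kHz is what brings the difference back
```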
Also, don't count on having that 20 kHz hearing. Wikipedia claims "most children and some adults" can hear 20 kHz. Out of ~20 teenagers in my old electronics shop class, exactly *1* student could tell when another student left the amplifier test running at 20 kHz (all we tested for) and told him to "turn it off already". My max hearing was about 16 kHz then, at 16 years old. Now, at [censored], I can "hear" 12 kHz, but at 40 dB lower sensitivity than 10 kHz.
I can't argue the low frequencies, but nobody is claiming that is "hearing".
Instead of going to a silly 2560x1440 on smartphones, Samsung should just finally switch back to RGB. They did PenTile, then RGB, then PenTile again at a higher resolution, and they're still sticking with PenTile. I hate PenTile.
Thank you for this very interesting technical article on PPI. As a person with a keen interest in display quality, I really appreciate your detailed discussion of the issue. After my mother bought a 1080p smartphone (6.4 inch 16:9 screen, nearly 330 PPI) almost 2 years back, I found full HD to be the perfection of visual clarity for a smartphone. This week she bought a new 5.5 inch smartphone with a 720p screen at around 266 PPI. It is a very well designed display, and I found it just as good as the earlier full HD display at 330 PPI. Having used phones smaller than 4.5 inches, I find that amazingly high PPI may not give an advantage on such a small screen for the regular range of activities on a modern smartphone, which is nowadays a powerful mini computer. So my humble personal conclusions are:
1. A minimum 5 inch screen size in 16:9 format is required for a really good user experience.
2. Up to at least 5.5 inches, a 720p HD screen at 267 PPI is at least 90% as good as a full HD screen for routine purposes.
3. Higher resolutions and higher PPI are much more appreciable and useful at screen sizes above 6 inches.
4. A well developed, sophisticated smartphone screen with a well designed user interface can produce amazingly pleasing display results and user satisfaction even at around 275 PPI (720p HD on a 5.5 inch screen), saving the purchaser up to Rs. 7500 (Indian) by opting for the lower resolution.
Thank you.
dyc4ha - Sunday, February 9, 2014 - link
Very interesting article! I wonder what 600 PPI would look like, I am very pleased with 440 PPI already though; then again I was very happy with my qHD 4.3" screen too.
fokka - Sunday, February 9, 2014 - link
qhd is workable on 4,3", as long as it's not pentile we're talking about. of course we are used to sharper displays in the wake of the 1080p devices of the last year, but eyeballing efficiency and thus battery life on our mobile companions i wonder if it wouldn't be more productive if we were content with 720p on displays up to 5 inches.
michael2k - Monday, February 10, 2014 - link
600ppi would look like your 440ppi screen held farther away. Seriously: take an iPhone 4, 326ppi, and hold it at twice the normal distance and you get 652ppi.
Death666Angel - Monday, February 10, 2014 - link
You are confusing pixels per inch (PPI) with pixels per degree (PPD). PPI is independent of the viewing distance, it is simply a measure of # pixels / area. PPD has the viewing distance in its number.
michael2k - Monday, February 10, 2014 - link
No, I'm not confusing the two, I understand they are different. dyc4ha specifically asked what 600ppi would look like. Holding the iPhone at twice the distance should increase the pixels per degree by halving the apparent pixel size. I admit I didn't actually do the math, it might be that you can double the apparent PPI much closer than that by factoring in inherent limitations in human vision.
MrSpadge - Monday, February 10, 2014 - link
No, you're quite correct on that - no further correction needed!
halbhh2 - Wednesday, February 19, 2014 - link
Such a good point to remember. Also, I notice when reading at a reduced display light level that as your device gets dimmer, it's harder to discern the letters further away, and easier when it is brighter. Of course the implication of interest is really looking at the display in daylight.
TonyTheToolGuy - Thursday, February 20, 2014 - link
As to light levels affecting legibility, remember that with greater luminance, the iris constricts and so leads to increased depth of field and acuity, like with a pinhole camera. I believe this is the reason for the discernment difference.
Leftbranch - Sunday, February 9, 2014 - link
Love these types of articles. The reasons behind some of the manufacturers' decisions help me to understand why some of the choices are made. Please keep them coming.
ImSpartacus - Sunday, February 9, 2014 - link
I know! Most of this info has been peppered throughout previous articles, but I love seeing a clear and concise compendium of pixel density knowledge. It's a primer of sorts.
Beautyspin - Sunday, February 9, 2014 - link
There must be some rumors coming out about iPhone 6 so I think Anand is getting us prepared for this. There is a better article here - http://www.cultofmac.com/173702/why-retina-isnt-en...
A5 - Sunday, February 9, 2014 - link
You know Anand doesn't write all the articles (or this one), right?
ImSpartacus - Sunday, February 9, 2014 - link
On that note, who's this Joshua Ho fella? He seems to write about as well as the other writers, so I have no complaints.
DanNeely - Sunday, February 9, 2014 - link
I don't think I've seen him before either. Maybe he's a new hire from the call for writers from a few weeks ago.
iwod - Sunday, February 9, 2014 - link
The article you posted assumes the best eyesight has a 0.3 arcminute value, while this one suggests the upper bound to be 0.5. And that is a huge difference.
blanarahul - Sunday, February 9, 2014 - link
What if we combine larger batteries and all these improvements in SoC, RF components, and panel technology with a lower resolution display to get the best possible battery life? But I guess it will only be a dream.
fokka - Sunday, February 9, 2014 - link
that's exactly what i was thinking about. we just have to look at mobile gaming - a sector that is advancing at a high pace - to see the downside of the high-res craze. rendering 720p is much more power efficient than doing so on a 1080p display. current mid-high-end desktop graphics cards consuming hundreds of watts struggle with the arrival of 4k displays, yet we still demand our mobile games be processed in 1080p - a quarter of the resolution - on a platform that uses only about one hundredth of the power.
you don't even have to go with gaming. even rendering your daily slew of apps with only half the pixels will be much more efficient and smooth than on 1080p, or even higher resolutions. and that doesn't take in context the efficiency of the lower res screen itself, as stated above.
i'm not saying that i want displays to stay "720p forever", but seeing smartphone battery life still leaving much to be desired, although newer smartphones rival my 2006 desktop and my 2010 laptop in some performance metrics, i think we got our priorities just a little bit out of order.
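To put rough numbers on the rendering-cost argument above, here is a quick pixel-count comparison (a Python sketch; the assumption that GPU load for simple rendering scales roughly with pixel count is a simplification, since real workloads also depend on overdraw and shader cost):

```python
# Rough pixel-count comparison for common display resolutions.
resolutions = {
    "qHD":   (960, 540),
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name:>5}: {count:>9,} px  ({count / pixels['720p']:.2f}x 720p)")
# 1080p pushes 2.25x the pixels of 720p; 4K pushes 9x.
```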
MrSpadge - Monday, February 10, 2014 - link
Same for me please: reasonably high PPI and power savings!
jimjamjamie - Monday, February 10, 2014 - link
Sony Xperia Z1 Compact
a5cent - Tuesday, February 11, 2014 - link
THIS! Given the choice between two screens of equal quality under 5", one being 720p and the other being 1080p, I would far prefer the former, provided it comes with noticeably better battery life. What really irks me is that the high end seems to be just about maxing out every possible spec, while cheaper devices scale everything down (contrast, brightness, battery capacity, etc.), not just the resolution. It's a frustrating market if you are looking for a high end device that isn't just about the "biggest" specs, but about the most reasonable trade-offs.
bsim500 - Sunday, February 9, 2014 - link
Great article. Above a certain practical level, it really is more a marketing p*ssing contest than anything else. Most people want longer battery life & lower cost. The Apple vs Android "extreme PPI" rat-race is a bit like arguing over whose HD audio is better: 96khz vs 192khz in a world where almost everyone continues to fail 96khz vs 44khz CD tests (once you strip out all the placebo and emotional "superman" wishful thinking rife in the audiophile world under controlled double-blind ABX testing conditions...). If you require a perfectly dark room in order for people with better than 20/20 vision to see it, then it's mostly wasted. The human eye pupil diameter varies from 2-7mm. At 2mm, you're diffraction limited to only about 1 minute (roughly 300ppi from 12" away), so I'm pretty sure even if there was a marginal difference post 300ppi, most people still don't want to sacrifice 20% battery life for 1% of people to benefit from it only whilst sitting in a photographic dark-room. :-)
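The "1 arcminute is roughly 300ppi from 12 inches" figure above is easy to check with a little trigonometry (a Python sketch; the 12-inch viewing distance is the commenter's assumption):

```python
import math

def ppi_for_acuity(arcmin_per_pixel, distance_in):
    """PPI at which one pixel subtends the given visual angle at that distance."""
    pixel_in = distance_in * math.tan(math.radians(arcmin_per_pixel / 60))
    return 1 / pixel_in

print(round(ppi_for_acuity(1.0, 12)))    # ~286 ppi for 1 arcminute at 12"
print(round(ppi_for_acuity(0.5, 12)))    # ~573 ppi for 0.5 arcminutes
```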
r3loaded - Sunday, February 9, 2014 - link
In the HD audio thing at least the physics can categorically state that anything higher than a 44.1/48kHz sampling rate is completely useless to humans. Nyquist's theorem states you only need to sample at double the maximum frequency you need to encode (20kHz for human hearing), so sampling at 96kHz is wasting bandwidth and storage space. Unless you happen to be a dog and can hear sounds up to 48kHz.
ZeDestructor - Sunday, February 9, 2014 - link
"On the internet, nobody knows you're a dog..."
nathanddrews - Sunday, February 9, 2014 - link
The 20-20K range has one flaw - two actually: infrasonic and ultrasonic. Just because the brain can't identify the frequencies doesn't mean they don't affect your ear/brain/body or other frequencies. Studies have shown that some people - a small percentage - can tell the difference, so why limit ourselves to the average? When we have the technology to record live audio with its entire range of frequencies, why not collect them all? As to the PPI race, I care less about PPI from the perspective of what the human eye can see and more about how it will force GPU makers to step up their game.
FunBunny2 - Sunday, February 9, 2014 - link
-- so why limit ourselves to the average when we have the technology to record live audio with its entire range of frequencies, why not collect them all?
Sounds like another 1%-er demanding that the other 99% pay for its toys. Bah.
nathanddrews - Monday, February 10, 2014 - link
Technically, the 16/44 covers approximately 97% of average human hearing, so let's call it what it is: 3%-er. ;-)
buttgx - Sunday, February 9, 2014 - link
No one can hear these ultrasonics; their presence can certainly be degrading to the audio experience though, in more ways than one. Zero benefit. Anyone arguing against this is simply wrong. I recommend this article by the creator of FLAC.
http://xiph.org/~xiphmont/demo/neil-young.html
nathanddrews - Monday, February 10, 2014 - link
Like most things in life, you have to be careful with absolutes. Most of the available "HD music" hawked by websites is in fact not HD at all, but rather upsampled from dated masters. The facts are that in a recording workflow that starts and ends with 24/96 (or higher), there is a measurable difference compared to 16/44. To what extent this can be heard depends upon the playback setup, the playback environment, and the human doing the hearing. Ultrasonics can be negative, but only in improper setups. One problem with the testing of music tracks in the Meyer/Moran study is that the content used was sourced from older formats that lack the dynamic range of a modern high-resolution master. You can't take an old tape master from the 70s and get more out of it than what's there.
For your consideration, live orchestras can exceed 150dB. Many instruments (and noises) operate outside the average human hearing window: pipe organs can get down below 10Hz and trumpets can get above 40kHz. These are things that can be recorded and played back if done so appropriately. And no, 192kbps MP3s and Beats™ earbuds won't cut it.
While I know for a fact that I can't hear them, I sure as hell can FEEL the bass under 20Hz in movies. War of the Worlds, with freqs down to 5Hz (my theater room in its current setup is only good for 12Hz) always serves as a good method for loosening one's bowels. LOL
fokka - Sunday, February 9, 2014 - link
"As to the PPI race, I care less about PPI from the perspective of what the human eye can see and more about how it will force GPU makers to step up their game."
this logic is somewhat backwards. so we have highres displays and framerates aren't as good as they could be. so gpu makers "step up their game" and implement more powerful, but at the same time more power hungry graphics solutions. so now we get the same framerates as we would have gotten if we had only stuck with slightly lower res displays, with the added benefit of having a hand warmer included in our smartphones. mhm.
nathanddrews - Monday, February 10, 2014 - link
You know as well as I do that demand drives innovation. Consumers want devices that last all day (low power), look great (high DPI), and operate smoothly (high FPS). They aren't going to get it until SOC/GPU makers release better GPUs. The display tech has been here for a while, it's time to play catch-up.
willis936 - Sunday, February 9, 2014 - link
While it's pointless to source higher than 44kHz, it's important to appreciate the merits of upsampling during playback. Some receiver DACs do it on their own, but most onboard codecs in the past two years support 24 bit 192kHz. It does a more accurate interpolation before running it through the DAC and generally results in lower THD. You can go one step further and do the upsampling in software with hefty algorithms like sinc in programs that support it. Two common ones that come to mind are foobar and mpc-hc. Granted, doing it in software probably doesn't buy you much, but I can't hear the difference and I don't have the equipment to test it either way.
extide - Sunday, February 9, 2014 - link
For the most accurate* sound, you do NOT want to upsample.
*Accurate meaning as close to the way it is supposed to be; or as close as possible to the artist's intention.
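For anyone curious what the sinc upsampling mentioned a couple of comments up actually does, here is a toy pure-Python sketch of Whittaker-Shannon interpolation (real players approximate this with windowed, finite filters; the 64-sample 1kHz tone is just an illustrative signal):

```python
import math

def sinc_upsample(samples, factor):
    """Whittaker-Shannon (sinc) interpolation of a uniformly sampled signal.
    Real-time upsamplers approximate this ideal filter with windowed,
    finite-length versions."""
    def sinc(x):
        return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)
    return [sum(s * sinc(i / factor - k) for k, s in enumerate(samples))
            for i in range(len(samples) * factor)]

# A 1kHz tone sampled at 44.1kHz, upsampled 4x toward 176.4kHz:
tone = [math.sin(2 * math.pi * 1000 * t / 44100) for t in range(64)]
up = sinc_upsample(tone, 4)
print(len(up))   # 256 interpolated samples
```

Note the interpolation passes exactly through the original samples; it only fills in the points between them, which is why it cannot add information, only ease the analogue filter's job.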
fokka - Sunday, February 9, 2014 - link
you are right and although you probably already know, what i've heard is that higher sampling rates are primarily used in production, so that after mixing and adding effects and distorting it and whatnot you still have a relatively "clean" signal you can just downsample to eg. 44.1khz and sell this way. even funnier is an article i read once on some site, where some alleged music professional physically explained why 192khz can even sound worse than your good ol' cd. but i guess we're getting ot ;)
speculatrix - Friday, February 14, 2014 - link
not wanting to add to the o.t. conversation, but the filters required for using the Nyquist limit at the lowest sample rate can be pretty harsh, and add phase distortion too (even though the human ear isn't particularly sensitive to phase).. so sampling at a much higher rate with more bits makes the signal processing much easier, so as to allow you to reproduce the original signal when fed through your DAC & final stage analogue filters.
speculatrix - Friday, February 14, 2014 - link
p.s. my point is that you only want to accurately reproduce the 10Hz-20kHz spectrum, and drop everything that's not actually audible, but do it in a way that prevents aliasing and distortion, intermod etc.
The problem is that human eyesight can exceed the theoretical physical limits through interpolation. It isn't so much about meeting the needs of the actual eye as of the brain, which has evolved the "software" to exceed the theoretical limits of the eye.
What surprises me is that the assumption that a phone is always at 12 inches or 30cm is accepted without any criticism. There are multiple reasons why printers go way beyond 300dpi. One of them is that paper is not always at 30cm distance. Forget phones and go back 10 years. When looking at a high quality print or photo on paper there was a very efficient way to zoom in: you bring the print closer to your eyes, like 10cm. It is quite a lot more efficient than pinching on a screen. And suddenly 1200dpi makes a lot of sense.
Nobody with a low-res phone (say 300dpi or less) thinks of bringing a phone closer to zoom, because it is no use at all. But with high-dpi screens things look a lot clearer in close up.
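The zoom-by-moving-closer argument above is just the viewing-distance term in the pixels-per-degree formula. A small sketch (Python; the iPhone 4's 326ppi is used purely as an example figure):

```python
import math

def pixels_per_degree(ppi, distance_in):
    """Apparent pixels per degree of visual angle at a given viewing distance."""
    return ppi * distance_in * math.tan(math.radians(1))

# Moving a 326ppi screen from 12" to 4" cuts the apparent density by 3x,
# which is why pixels that were invisible at arm's length show up close in.
for d in (12, 6, 4):
    print(f'{d} in: {pixels_per_degree(326, d):.0f} px/deg')
```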
haikuginger - Sunday, February 9, 2014 - link
"There are multiple reasons why printers go way beyond 300dpi."
Actually, professional-grade photo printing sits right at 300dpi.
Nick2000 - Sunday, February 9, 2014 - link
Yes but they are only photos. They do not generally need high resolution. It also depends on the technology. Actual chemical photo paper (developed, not printed) combines the dots in a different way than inkjet or laser printers. I work with high resolution printers and while most "pictures" come in at a relatively low resolution (200 or 300dpi) we print them at 1200x1200 because it makes the single pixels less visible when dithering. Also, diagonal lines really do not look great on a photo, so that is out for precise maps for example... Anyway, there are a lot of factors to take into account past simply source resolution.
andy o - Sunday, February 9, 2014 - link
The "dpi" or ppi in a digital picture is meaningless. "Per inch" means something only in the context of physical size. In a digital picture it's just metadata that will tell layout programs at what size the picture should be laid out on the page. You can change these values without even touching the digital picture; it will be exactly the same. The DPI in printers' specs is the physical ink dots the printer will lay out on the paper. They're different things.
Solandri - Tuesday, February 11, 2014 - link
Laser printers go past 300 dpi because toner is black. That means each "dot" is either black or white, nothing in between. You need 600-1200 dpi to do half-toning with black and white dots to simulate a greyscale "pixel" at 300 dpi.
Ktracho - Tuesday, February 11, 2014 - link
Text is much easier to read at resolutions higher than 300 DPI, even though it is monochrome. 30 years ago, when I was trying to print the final copy of my thesis, the best printers could only do 300 DPI, whereas professional books were being printed at 2400 DPI. Guess which was easier to read, and by a huge margin? There is still a noticeable difference between a text document printed at 600 DPI and one printed at 2400 DPI, even though you can't see the individual pixels in either document.
JoshHo - Sunday, February 9, 2014 - link
That's definitely true, but there are limits to how close the display can get based upon the size and the minimum distance that an object can be from the eye before it can't be focused upon.
fokka - Sunday, February 9, 2014 - link
so you say we should implement even higher res screens, which probably aren't cheap and also use considerably more power, just so we don't have to pinch our screens to zoom in on scarlett johansson's nudes, and instead bring them up to 10cm from our eyes? doesn't seem very straightforward to me to let all the additional resolution and processing power go to waste in every standard use case, just so we can zoom in like this again.
gg555 - Tuesday, February 18, 2014 - link
I was thinking more or less the same thing. I don't know why 12 inches is treated as some sort of magic number. I often hold my phone closer to my eyes than 12 inches, especially if I have my glasses off, for any of many reasons. I imagine I am far from the only person who does this. My phone has about 320 ppi and I can easily see the pixelation. Even looking at some of the 450 ppi phones in stores, it's not hard for me to see the pixelation. I always thought Apple's "retina" claim was so demonstrably wrong; the first time I ever saw one of the screens, it was just stupid. Higher resolution would be nice.
If the question is what could make a difference in practical real world use, then the 12 inch assumption seems like a bad one.
Mr Alpha - Sunday, February 9, 2014 - link
What about something like Oculus Rift, where you are strapping the screen to your face?
ZeDestructor - Sunday, February 9, 2014 - link
They use a 7" panel and the founder has gone on record saying that even 8K per eye isn't enough: http://arstechnica.com/gaming/2013/09/virtual-perf...
Specifically, he says that 8K per eye is enough for imperceptible pixels, but the human eye can still see more detail than that, as the promoted comment goes on to talk about.
Solandri - Tuesday, February 11, 2014 - link
I suspect for applications like the Oculus Rift, the future is going to be direct imaging onto the retina. The benefit being that instead of having to render the entire image at ultra-high resolution, you only need to render the point the fovea is aimed at at that resolution. Everything else can be rendered at a much lower resolution because it'll be falling on the peripheral vision.
frakkel - Sunday, February 9, 2014 - link
Where this article fails is that it tries to single out one spec of the screen. For me a screen has several parameters that cannot be left out when you are speaking about the quality of the screen.
- PPI. This parameter is probably the easiest. And this is also the parameter where we are seeing the different marketing organizations compete.
- Pixel density. This parameter is a little bit more difficult to determine, because here you need to look at the single pixel size and compare it to how much "black" is around. You could in principle have very high PPI, but almost all area is black. Pixel density directly influences the required backlighting for a certain luminance.
- Black. Black is very important for a screen. And some, when they first notice that the screen is not black but grey, are not happy.
- Color accuracy. This parameter is again more tricky, and here you often see oversaturated screens to give a bit more "puf" to the images. To have a realistic picture which looks like a view out of the window we need true colors.
This is not a rant; it's more that when we are discussing such topics about screens we need to look at all parameters together and not take a single one out. A disclaimer explaining this would have been nice.
blanarahul - Sunday, February 9, 2014 - link
Can you put your views about the differences between PPI and pixel density more elaborately? Super AMOLED 1080p displays have a lot of black area around the actual pixels. But that doesn't affect their viewability. However they do lack a little in sharpness thanks to the fact that they use an RGBG-like pattern.
frakkel - Sunday, February 9, 2014 - link
PPI is how many pixels per inch. Let us just say for example that there is 1 pixel per inch but the pixel itself is 1/2 inch in each direction. That means the pixel itself only occupies 25% of the total 1 inch x 1 inch area. Just because you have more pixels per inch does not necessarily mean you have any good pixel density if the pixels are that much smaller.
If you look at any close up pictures of any screen there is a lot of black. There should be no black.
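The 1-pixel-per-inch example above works out like this (a Python sketch of the idea, assuming square pixels on a square grid, which is a simplification of real subpixel layouts):

```python
def fill_factor(ppi, pixel_side_in):
    """Fraction of screen area that is emitting pixel rather than black gap,
    assuming square pixels on a square grid."""
    pitch = 1 / ppi                      # center-to-center spacing, inches
    return (pixel_side_in / pitch) ** 2

# The hypothetical screen above: 1 pixel per inch, each pixel 1/2" on a side.
print(fill_factor(1, 0.5))               # 0.25, i.e. 75% of the area is black
```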
MrSpadge - Monday, February 10, 2014 - link
What you are describing as "pixel density" is actually something like a fill factor, as in "the area covered by pixels". Taking just the meaning of "pixel density", it is... well, the density of the pixels: x number of pixels per length or area. Which is just the same as PPI.
JoshHo - Sunday, February 9, 2014 - link
Your point about the luminous area of a display vs the supporting circuitry's area is definitely a good one, but it's not easily calculated compared to most other parameters. Contrast is definitely important, as is color accuracy, but there's definitely been a lot more information out there for those aspects, as there is some level of an objective standard for those two.
fokka - Sunday, February 9, 2014 - link
you are right of course, but i think that's part of the reason we even have this debate: manufacturers have chosen first screen size and now resolution as their sole marketing argument. nobody is advertising their black levels, their contrast, colour accuracy. they throw around buzzwords like "super +" and "quattron" and "bravia", but the only metric that's really sticking out is resolution. it's a fault on the manufacturers' part if even sony includes somewhat substandard screens on its flagship devices, but hey, at least they are 1080p.
at the same time i applaud anandtech for once more giving perspective to an area which is presented somewhat askew by the manufacturers. so let this article be only about resolution, so we can once and for all stop worrying about "how much is enough" and maybe - just maybe - concentrate on other, often no less important, aspects of our screens.
MrSpadge - Monday, February 10, 2014 - link
What you have here is a focused article. It talks about 1 issue and clearly states so in the headline. Had he gone on and on about the other important display metrics it would have been much longer and some would have rightfully criticized it for missing the topic in large parts. Apart from that, you are obviously right that there's more to good displays than resolution. That's why I wouldn't want a 4k display with a TN panel even as a gift.
zodiacfml - Sunday, February 9, 2014 - link
I say, please continue with the progress so that I can afford the 1080p displays in the near future.
HisDivineOrder - Sunday, February 9, 2014 - link
I imagine that higher resolution displays will become more prevalent as cost and efficiency continue to improve. Right now, there is a serious cost. In the future, the cost may be negligible. That's why high end devices with high end prices and serious batteries will go up in resolution while lower end devices will stick with the tried and true with smaller batteries.
Just like it is now. In fact, this is so true as to make the point of the article seemingly a "Duh?" moment. The information held in the earlier part of the article is nice, though. It's just the conclusion that seems suspect, given that it's telling us what we already know.
Higher resolution displays are for the rich and are for the most part gratuitous.
jjj - Sunday, February 9, 2014 - link
You also got:
- cost, big high res displays are quite expensive
- glasses, where we kinda want high res
- we should also have stretchable screens soon enough (not too long after just flexible) and very high PPI becomes less when expanded.
iwod - Sunday, February 9, 2014 - link
I don't want a tech that doubles the PPI to 600. I want a tech that HALVES the current power usage.
thesavvymage - Sunday, February 9, 2014 - link
well PPI and power usage don't even scale together...
edzieba - Sunday, February 9, 2014 - link
A very nice 2003 paper from the AFRL titled "Capability of the Human Visual System" (http://www.itcexperts.net/library/Capability%20of%...) gives an excellent overview of what specifications a truly indistinguishable-from-reality display would have to fulfil, as well as an overview of the human visual system and various methods of measuring acuity. As well as angular resolution, it includes things like display update rates (dependent on refresh technique), luminance levels and range, colour gamut, etc. For example, a super-Oculus display (covering the entire human binocular visual field at 0.5 arc-seconds per pixel) would require a display on the order of 1.5 TERApixels.
However, the paper does not take into account other effects, such as the vestibulo-ocular reflex, that would necessitate even higher update rates for the edge case of the display being physically fixed with respect to the head.
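The ~1.5 terapixel figure can be sanity-checked; note that treating the binocular visual field as a roughly 200° x 135° rectangle is a simplifying assumption for this sketch, not the paper's exact model:

```python
# Sanity check of the ~1.5 terapixel figure at 0.5 arc-seconds per pixel.
# Assumed field of view (a rough rectangle, not the paper's exact model):
width_deg, height_deg = 200, 135
px_per_deg = 3600 / 0.5                    # 7200 pixels per degree
total = (width_deg * px_per_deg) * (height_deg * px_per_deg)
print(f"{total:.2e} pixels")               # ~1.4e12, on the order of 1.5 Tpx
```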
edzieba - Sunday, February 9, 2014 - link
http://www.itcexperts.net/library/Capability%20of%...
Looks like the URL got eaten.
HanzNFranzen - Sunday, February 9, 2014 - link
Very interesting, thank you! I have had many conversations with many differing opinions on this very topic, so it is nice to see a technical explanation that was easy to understand. In my personal opinion, 99.99% of the time you are not using a phone for any sort of critical viewing, so getting up to and beyond 400ppi really is not in my interest. I would gladly trade any extra ppi for extra frame rate and battery life by not taxing the gpu so hard. Now, if we are talking desktop monitors, BRING IT ON. Because it has upset me for years now that companies are so gung ho about creating these super high ppi screens for something like a phone, but completely ignore an area where they can actually be utilized and have a noticeable impact... so we've stagnated at this 90-110ppi range in the 24"/27" monitor sizes.
surt - Sunday, February 9, 2014 - link
The main goal should be to get all displays to at least 4k. That way they can play back 4k content without scaling (and scaling artifacts are often rendered MORE noticeable by high PPI displays if they don't reach an even scaling multiple).
Nick2000 - Sunday, February 9, 2014 - link
Yes but they are only photos. They do not generally need high resolution. It also depends on the technology. Actual chemical photo paper (developed, not printed) combines the dots in a different way than inkjet or laser printers. I work with high resolution printers and while most "pictures" come in at a relatively low resolution (200 or 300dpi) we print them at 1200x1200 because it makes the single pixels less visible when dithering. Also, diagonal lines really do not look great on a photo, so that is out for precise maps for example... Anyway, there are a lot of factors to take into account past simply source resolution.
Nick2000 - Sunday, February 9, 2014 - link
Oops, that was a reply to an earlier comment regarding professional photo printing being good enough at 300dpi.
Bobs_Your_Uncle - Sunday, February 9, 2014 - link
Kudos on the quality of the article. Some who comment express an element of dissatisfaction, yet the quality of the article has yielded an equally high level of enlightening discussion. And it's generally pretty civil too! I'm sassified!
Thanks Joshua, a ho bunch! (And I promise to NEVER repeat that pathetic word play, which undoubtedly makes you ill with each pedestrian repetition. I'm not striving to alienate you; I'm simply a very, very, disturbed obsessive/compulsive, i.e. Mr. Creosote desiring "just 1 more thin mint!").
JoshHo - Sunday, February 9, 2014 - link
I definitely haven't seen that one before, but thanks for your input.
sonicmerlin - Sunday, February 9, 2014 - link
I found this article to be rather frustrating. At first I was excited to see the author mention the importance of cognitive processing's effects on the perceptual quality of vision. A good example of this is the perception of a hair follicle. Even on a super high density display you can tell the follicle is slightly off, even if you can't discern individual pixelation. However the author then hand-waved the issue away by saying we'll never have 1800 PPD displays in the near future, so it's not important - utterly missing the point.
Research has demonstrated the closer you get to the ideal 1800 PPD, the more "lifelike" a display becomes. As in, rather than looking at a super sharp image on a screen, you start having the sensation you're looking at something *real*. That's why Palmer Luckey of Oculus Rift said 8K per eye wasn't enough. It's not just about making individual pixels indistinguishable. It's about going beyond that and tricking your brain into thinking you're looking through a clear glass window.
JoshHo - Sunday, February 9, 2014 - link
That's definitely true, but it's rather unfeasible to achieve 1800 PPD currently, although in the long run it may be possible. For a 5" display at the 12 inch distance mark (which is contrived, but is a generally realistic distance for mobile), it would take 37453x21067 to have enough resolution, which would require square pixels with a side length of around 2.96 microns. For comparison, a 5 inch panel with 1080p resolution would have pixels with a side length of 57.7 microns. Shrinking a display pixel to that size is years away at this point.
BMNify - Monday, February 10, 2014 - link
Joshua, why didn't you mention the fact that UHD-1 and later UHD-2 is about more than just pixel density? It's about providing the consumer with the real Rec. 2020 10-bit color space rather than the antiquated 8-bit Rec. 709 (HDTV and below) pseudocolor of today's panels. "Rec. 2020 allows for RGB and YCbCr signal formats with 4:4:4, 4:2:2, and 4:2:0 chroma subsampling.[1] Rec. 2020 specifies that if a luma (Y') signal is made that it uses the R’G’B’ coefficients 0.2627 for red, 0.6780 for green, and 0.0593 for blue.[1]"
as regards using more power, that's what the latest quantum dot displays are for: light is supplied on demand, which enables new, more efficient displays and allows for mobile devices with longer battery lives.
https://upload.wikimedia.org/wikipedia/commons/thu...
"Colloidal quantum dots irradiated with a UV light. Different sized quantum dots emit different color light due to quantum confinement."
JoshHo - Monday, February 10, 2014 - link
That's not so much about pixel density as it is about 24-bit vs 30-bit color. I'm not too well educated on the HDTV spec, but sRGB is 24-bit and is the industry standard, while Adobe RGB is 30-bit but is effectively limited to content creation purposes, as only applications like Photoshop are really aware of colorspace.
JoshHo - Monday, February 10, 2014 - link
I think I conflated color depth with color space, so my foot is currently firmly in my mouth; they are independent of each other. But for the most part, sRGB is the standard, and within that limited gamut, 8-bit color is the standard. In the future 10-bit color or greater will happen, but it doesn't seem likely for now.
BMNify - Monday, February 10, 2014 - link
:) that's ok, just remember to always ask and answer the question on all future reviews, "is this [UHD] panel International Telecommunication Union (ITU) Rec. 2020 10-bit real color space ready out of the box"
JoshHo - Monday, February 10, 2014 - link
For sure, but I think that mostly applies to TV applications. For smartphone and desktop displays, it seems that sRGB will remain the standard, although there may be some movement towards wider color gamuts some day.
BMNify - Monday, February 10, 2014 - link
oh and just to save looking it up:
"In coverage of the CIE 1931 color space
the Rec. 2020 color space covers 75.8%,
the digital cinema reference projector color space covers 53.6%,
the Adobe RGB color space covers 52.1%,
and the Rec. 709 color space covers 35.9%"
"The NHK [and the BBC] measured contrast sensitivity for the Rec. 2020 color space using Barten's equation which had previously been used to determine the bit depth for digital cinema.
11-bits per sample for the Rec. 2020 color space is below the visual modulation threshold, the ability to discern a one value difference in luminance, for the entire luminance range.
The NHK [and the BBC] is planning for their UHDTV system, Super Hi-Vision [UHD-2], to use 12-bits per sample RGB"
MrSpadge - Monday, February 10, 2014 - link
Higher resolution always reduces power efficiency of the system. As the article says: if there are other improvements to power efficiency to offset this loss, they could always be used to reduce power at lower resolutions. Our choice.
Regarding quantum dots: it's hard to pump them electrically; quantum wells are much better for this. But all of them pretty much require expensive III-V or II-VI compound semiconductors (like LEDs and many lasers), which doesn't bode well for large-area applications. That's why they're going for OLEDs instead.
And about pumping them optically: well, you don't want to have to put on sun screen in order to look at your phone for longer than a few minutes, do you? Anyway, UV light sources are neither cheap nor efficient. A cure worse than the disease, so to say.
victorson - Monday, February 10, 2014 - link
Good to see some deeper analysis on display resolution, and congratulations for this well-written article. The Snellen eye test is probably a good enough measure for a Westerner, but people tend to forget there are other languages out there. Namely, Chinese and Japanese (and I would guess Arabic) readers are the ones who benefit the most from higher pixel densities, as a complex Chinese character can be an extremely complicated drawing that fits in the space of almost a single letter. My guess would be that pixel densities around or over 500ppi would actually make for a tangible improvement in the reading experience, but it'd be interesting to see more research on this.
jerrylzy - Monday, February 10, 2014 - link
While I am interested in what a 600-PPI display would look like, I tend to favor a phone with lower resolution and PPI. Since the difference may not be very noticeable, I put more weight on battery life and overall smoothness, which a higher resolution display would definitely impact.
piroroadkill - Tuesday, February 11, 2014 - link
I agree. I'm loving the fact the Moto X came with a 1280x720 screen, and the Xperia Z1 Compact too. Let's bring some goddamn sanity back into things.
Lower cost and higher battery life is what I care about over jamming my retinas against the glass and complaining if I can see a discernible element.
MrSpadge - Monday, February 10, 2014 - link
Joshua, it would have been nice if you commented more directly on the images you're showing. I recognize them as different subpixel tiles, but apart from that I have no idea what I'm looking at and how the images relate to the text.
JoshHo - Monday, February 10, 2014 - link
Thanks for the feedback, I'll be sure to do this now and for future articles.
MrSpadge - Tuesday, February 11, 2014 - link
Thanks! And I previously forgot to mention I really enjoyed that article :)
Sivar - Tuesday, February 11, 2014 - link
Now _this_ is the kind of stuff that makes Anandtech the site that it is. No marketspeak, no dumbing-down, but still accessible to mortals due to no use of overly esoteric terms, rehashes of academic papers, or any "look how smart the author is" insecurity.
A realistic, practical, educated, intelligent look at a significant topic.
Joshua Ho, please write for Anandtech more often!
JoshHo - Thursday, February 13, 2014 - link
Thanks for the input, I hope to do so. :)
MadAd - Friday, February 14, 2014 - link
So in the same way, what PPI would be necessary for a single eye screen at maybe 1 inch? (as in x2, ala VR headset or screen glasses?)
JoshHo - Wednesday, February 19, 2014 - link
3" is the realistic value. The human eye can't realistically see anything closer that's in focus. At 120 PPD, 2300 DPI or so is needed at 3" away. So there is capacity to go further for VR purposes. The ST1080 VR display has already achieved such densities, but the price is relatively high compared to a smartphone display.
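The arithmetic behind both this comment and the earlier 1800 PPD figure can be sketched in a few lines of Python (the helper name and the use of tan(1°) for one degree of visual angle are my assumptions):

```python
import math

def required_ppi(ppd, distance_in):
    """PPI needed to reach a pixels-per-degree target at a viewing distance.

    One degree of visual angle covers about distance * tan(1 deg) inches of
    the display, so PPI = PPD / (distance * tan(1 deg))."""
    return ppd / (distance_in * math.tan(math.radians(1)))

# 120 PPD at 3 inches -> roughly 2300 DPI, as stated above
print(round(required_ppi(120, 3)))
# 1800 PPD at 12 inches -> roughly 8600 PPI; spread across a 5" 16:9 panel,
# that is on the order of the 37453x21067 figure mentioned earlier
print(round(required_ppi(1800, 12)))
```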
coburn_c - Friday, February 14, 2014 - link
Tangible merits is what you meant. Technical merits are obvious, as it is a technical difference.
JoshHo - Thursday, February 20, 2014 - link
I guess I was trying to say the merit from a more technical perspective than previously examined, as much of the discourse hasn't been very well fleshed-out in light of the existing literature.wumpus - Thursday, February 20, 2014 - link
It doesn't matter if the trumpets are playing 30kHz or 40kHz, unless you really care what your dog hears (you won't hear the difference). Try playing an 11kHz signal back as a sine wave and as a square wave (which are only different due to supersonic harmonics). You won't hear a single difference (even with 30kHz+ headphones).
Also, don't count on having that 20kHz hearing. Wiki claims "most children and some adults" can hear 20kHz. Out of ~20 teenagers in my old electronics shop class, only *1* student could tell when another student left the amplifier test at 20kHz (all we tested for) and told him to "turn it off already". My max hearing was about 16kHz then (at 16 years old. Now at [censored] I can "hear" 12kHz, but at 40dB lower than 10kHz.)
I can't argue the low frequencies, but nobody is claiming that is "hearing".
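The sine-vs-square point is easy to make concrete: an ideal square wave contains only odd harmonics of its fundamental, so with an 11kHz fundamental every component that distinguishes it from a sine wave sits above 20kHz. A small sketch (the function and the nominal 20kHz hearing limit are my framing):

```python
def square_wave_harmonics(fundamental_hz, max_hz):
    """Frequencies of the odd harmonics making up an ideal square wave,
    listed up to max_hz."""
    return [n * fundamental_hz
            for n in range(1, max_hz // fundamental_hz + 1, 2)]

harmonics = square_wave_harmonics(11_000, 100_000)
print(harmonics)                              # [11000, 33000, 55000, 77000, 99000]
print([f for f in harmonics if f <= 20_000])  # only the fundamental is audible
```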
Gottfired - Friday, February 21, 2014 - link
There's one very good reason for small 1440p displays: Oculus Rift.
peterfares - Saturday, May 10, 2014 - link
I'm happy with 1920x1080 RGB. Instead of going to stupid 2560x1440 on smartphones, Samsung should just finally switch back to RGB. They used to do pentile, then RGB, then pentile again at higher resolution, and they're sticking with pentile. I hate pentile.
vnrlifestyle - Saturday, July 18, 2015 - link
Thank you for this very interesting technical article on ppi. As a person with a keen interest in display quality, I really appreciate your detailed discussion of the issue.
After my mother bought a 1080p smartphone (6.4 inch 16:9 screen, around 330 ppi) nearly 2 years back, I have found full HD to be the peak of visual clarity for a smartphone.
This week she bought a new 5.5 inch smartphone of 720p resolution at ppi around 266.
It is a very well designed display and I found it just as good as the earlier full hd phone display of 330ppi.
Having used phones smaller than 4.5 inch, I find that amazingly high ppi may not give an advantage on such a small screen for the regular range of activities of a modern smartphone, which is nowadays a powerful mini computer.
So my humble personal conclusion in this matter is that
1. Minimum 5 inch screen size in 16:9 format is required for a really good user experience
2. Up to at least 5.5 inch size, a 720p HD screen at 267ppi is at least 90% as good as a full HD screen for routine purposes
3. Higher resolutions and higher ppi are much more appreciable and useful at screen sizes above 6 inches
4. And finally, a well developed and sophisticated smartphone screen with a well designed user interface can produce amazingly pleasing display results and user satisfaction even with a ppi of around 275 (720p HD on a 5.5 inch screen), saving up to Rs. 7500 (Indian) for the purchaser by opting for the lower resolution.
Thank you.
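The ppi figures quoted in this comment come from the usual diagonal-resolution formula; a quick Python check (function name mine):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from the panel resolution and the diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(1280, 720, 5.5)))   # 267 -> matches the 5.5" 720p figure above
print(round(ppi(1920, 1080, 6.4)))  # 344 for a true 6.4" 16:9 1080p panel
```

Worth noting the second figure comes out a bit above the "around 330" quoted above for the 6.4" 1080p phone.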