38 Comments
Guspaz - Wednesday, August 7, 2013 - link
"I feel like the more common use case in smartphones is to just lock your phone/display when you're not actively using it."
Wouldn't you benefit from PSR even when you are actively using it? When I'm reading a web page in a browser on my phone, I'm not constantly scrolling. I scroll, read a bit, scroll, read a bit, etc. I might spend 10% of the time scrolling and 90% of the time reading a static screen. It sounds like PSR would benefit there too.
JlHADJOE - Wednesday, August 7, 2013 - link
+1 that.
The majority of my tablet use is actually reading ebooks, so in my case I spend a lot of time looking at one static page until it comes time to read the next one. Whenever I check my battery usage log it's actually something like 90% screen, since I do so little that actually taxes the CPU/GPU.
JPForums - Thursday, August 8, 2013 - link
Does the screen percentage include the GPU power used in sending updates to the screen? If (as I suspect) it doesn't, then your battery life improvements will be limited to <10%. This tech doesn't affect screen power consumption. It lowers the power consumed by the SoC by removing the need to send updates to an idle screen.
JPForums - Thursday, August 8, 2013 - link
I should be clear that the <10% figure comes from the fact that your logs show 90% of battery usage is coming from the screen.
JlHADJOE - Thursday, August 8, 2013 - link
I'm not expecting to see the screen power consumption go down at all. Just maybe see a wee bit more battery life out of my usage model.
The point I was making is that people reading ebooks seems like the ideal scenario where PSR can make a difference.
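To put rough numbers on the scroll/read pattern described above, here is a toy duty-cycle model. The two power figures are invented placeholders for illustration, not measurements from the article:

```python
# Toy model of the "scroll, then read" usage pattern. ACTIVE_MW and PSR_MW
# are hypothetical SoC display-pipeline power levels, not measured values.

ACTIVE_MW = 300.0   # assumed pipeline power while actively scanning out frames
PSR_MW = 50.0       # assumed residual power with the pipeline idled in PSR

def average_power_mw(static_fraction):
    """Average pipeline power when `static_fraction` of the time is spent on
    an unchanging screen (PSR engaged) and the rest actively updating."""
    return static_fraction * PSR_MW + (1.0 - static_fraction) * ACTIVE_MW

# 90% reading / 10% scrolling, as in the ebook scenario above:
avg = average_power_mw(0.9)         # ~75 mW under these assumptions
saving = 1.0 - avg / ACTIVE_MW      # ~0.75, i.e. ~75% of pipeline power
```

Under these made-up numbers a 90%-static workload cuts display-pipeline power by roughly three quarters, but only the pipeline's share of the battery, not the backlight's.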
ikjadoon - Wednesday, August 7, 2013 - link
Nice! Glad to see PSR getting some traction.
What I'm *really* excited about, though, is combining PSR with the IGZO panels coming out in those 3200 x 1800 Ultrabooks. Apparently, IGZO can hold its active state longer, so here you actually ARE decreasing the panel's power consumption. Sharp's marketing says a 67% decrease in power consumption: http://online.wsj.com/ad/article/vision-breakthrou...
SodaAnt - Wednesday, August 7, 2013 - link
Well, I'd think an increasing amount of the energy would go to backlighting, which wouldn't be affected much by this.
kpb321 - Wednesday, August 7, 2013 - link
The key here is going to be how quickly and efficiently it can move in and out of PSR. For many types of active use outside of playing games/watching movies, there are still stretches where the contents of the screen aren't really changing. For example, reading a web page/email/text msg/ebook/etc. results in: scroll, read for a while, scroll again. PSR can easily kick in for that read portion, as it will last several seconds at least. Even for more active things you can still find a couple of seconds here or there, or even parts of a second, where PSR can kick in. It might not seem like much, but it can add up to a significant amount of time spent in PSR and save a lot of power.
MrSpadge - Wednesday, August 7, 2013 - link
Switch latency should be no problem, as the SoC just needs to push an updated image to the display and its memory. At 60 fps it's got 16.7 ms to do this - which is an eternity in the world of electronics.
juvosemi - Wednesday, August 7, 2013 - link
You typically call the part of the SoC driving the display the display controller or the LCD controller. You don't run the whole GPU, which is typically a separate part of the SoC.
MaxPowerTech - Wednesday, August 7, 2013 - link
Maybe this is a stupid question, but I always thought that LCD pixels don't need refreshing (except when they need to change color, of course), and that this is the reason why LCD panels don't flicker like CRTs and plasmas do.
So the only thing you need to do for PSR is not turn off the panel when you don't get new data?
Death666Angel - Wednesday, August 7, 2013 - link
That is correct, but I'm pretty sure that they still expect to get the data, even when it is identical to what is already being displayed.
XZerg - Wednesday, August 7, 2013 - link
You are correct, and I believe there is some logic in the LCD to skip the "update" if there is no change to that pixel. This reduces power consumption.
isnoop - Thursday, August 8, 2013 - link
If you've ever crashed an LCD smartphone or laptop hard enough, you've seen this in action: A ghost of the last image on the screen can be darkly imprinted (sometimes with horizontal lines through it) because the pixels were left in their last state.
Ortanon - Thursday, August 8, 2013 - link
I think the reason they don't flicker is because the pixels don't provide light; there's nothing TO flicker, unless the backlight starts flickering for some reason lol. I don't think it's because they don't need refreshing. As far as I know, all common LCD technologies have an internal refresh that's necessary, which is probably why we need PSR instead of already having GPU drivers that know how to make a GPU wait when it can. The system is the thing that PSR is pausing, but only because the display was forcing it to keep working before.
JDG1980 - Wednesday, August 7, 2013 - link
Can this technology be made to work with discrete GPUs or is it for integrated graphics only?
zyankali - Wednesday, August 7, 2013 - link
PSR is part of the embedded DisplayPort (eDP) spec. eDP is only used in embedded displays because it requires a permanent physical connection. There is no reason why this wouldn't work on a discrete GPU, but the system must have the panel embedded. So computers using wired displays can't take advantage.
zyankali - Wednesday, August 7, 2013 - link
You also have to remember this technology is aimed purely at power savings. It adds extra cost to a panel, so it will probably only be used in places where lower power usage is really important. Which, for now, most likely means systems that already have integrated graphics.
Laptops and AIOs that have discrete GPUs might have support in the future, but I can't see it being a top priority.
Krysto - Wednesday, August 7, 2013 - link
Nexus 10 had PSR too I think, or at least the SoC supported it.
mort32 - Wednesday, August 7, 2013 - link
What's the technical reason to not have the GPU stop sending updates (like what it's doing with PSR) and have the LCD just not update the screen? Why have a memory buffer that holds the static image and still update the screen with it?
MaxPowerTech - Thursday, August 8, 2013 - link
Exactly what I'm wondering.
Why can't you just leave the panel in its current state when no new data arrives?
Why would you need that extra buffer?
Ortanon - Thursday, August 8, 2013 - link
The internal refresh on LCD is probably necessary to maintain the integrity of the pixel state. There are "zero-power" LCD technologies that haven't been able to get out of the lab for years. They don't need to be refreshed, but I think right now the speed at which they can refresh is still too low for prime time.
mort32 - Thursday, August 8, 2013 - link
How fast do the refreshes need to be to maintain a pixel's state? Does PSR refresh the screen at a slower rate that's just enough to not lose the state? Kind of like how DDR memory works, where refreshes are only done when needed.
Ortanon - Thursday, August 8, 2013 - link
I don't think PSR is LCD-specific, so I doubt it. Then again, it may very well be that LCD requires an internal refresh for compatibility purposes -- legacy software and paradigms. Lazy quick Googling has not shed much light on this. This looks like a job for anandtech.com!
Laststop311 - Wednesday, August 7, 2013 - link
Where are the battery life tests and comparisons?
tomba - Thursday, August 8, 2013 - link
I find it odd that PSR is being touted as something new. It's not new. Phones, at least Nokia's, have had self-refreshing panels forever. Both MIPI DBI and MIPI DSI video buses support this. And some phones take this even further, as they only update the changed portion of the display.
Soulwager - Thursday, August 8, 2013 - link
Would PSR make it possible to dynamically modify refresh rate to match frame rate when gaming? This would allow V-Sync to be enabled without limiting frame times to multiples of 1/60 of a second.
DesktopMan - Monday, August 19, 2013 - link
Variable frame rate LCDs with a really high internal refresh rate (from PSR) are one possible solution. It would work very well with OLEDs, as the pixel switching time is very low. Some TVs already do this in one way or another. This would make it possible to vsync to any refresh rate, as you suggest.
Visual - Thursday, August 8, 2013 - link
This does not make sense.
It just moves the framebuffer to memory on the display itself, but that is essentially a no-op, or even adds complexity and power consumption instead of reducing it, in the form of one added framebuffer layer. The link from this new framebuffer to the actual pixels still does essentially the same thing as what the link from the old framebuffer to the pixels should have been doing. If it does it in a more power-efficient way or whatever, why not just optimize the old link...
Visual - Thursday, August 8, 2013 - link
To clarify, I am referring specifically to embedded devices, where "memory on the display itself" has absolutely nothing that can make it any more special than normal system memory, because the display and the rest of the system are literally millimeters from each other.
Rick83 - Thursday, August 8, 2013 - link
It's about the GPU, which requires more power to run just to keep the frame buffer alive on expensive high-bandwidth memory, when a lower-bandwidth dedicated frame buffer memory can do so with less power consumption.
The memory just sits between the high-performance frame buffer and the display. Kind of a big.LITTLE implementation of the frame buffer, where you double the amount of hardware so you can optimize for two different usage profiles.
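That tradeoff can be sketched with a toy energy comparison. The energy-per-bit figures below are invented for illustration only (the general point being that a DRAM access plus a long link transfer typically costs far more energy per bit than a read from a small on-panel buffer):

```python
# Toy comparison: refreshing the panel from system DRAM every frame vs.
# from a small dedicated framebuffer on the panel. The pJ-per-bit values
# are hypothetical, chosen only to illustrate the ratio.

WIDTH, HEIGHT, BPP, HZ = 1920, 1080, 32, 60
bits_per_second = WIDTH * HEIGHT * BPP * HZ   # scan-out traffic in bits/s

DRAM_PJ_PER_BIT = 20.0   # assumed: DRAM read + full-length link transfer
SRAM_PJ_PER_BIT = 2.0    # assumed: short read from a small on-panel SRAM

def refresh_power_mw(pj_per_bit):
    """Continuous refresh power in milliwatts for a given energy per bit."""
    return bits_per_second * pj_per_bit * 1e-12 * 1e3   # pJ/s -> mW

dram_mw = refresh_power_mw(DRAM_PJ_PER_BIT)   # ~80 mW under these assumptions
sram_mw = refresh_power_mw(SRAM_PJ_PER_BIT)   # ~8 mW under these assumptions
```

The absolute numbers are fiction, but the ratio is the point: if the per-bit cost of the panel-side path is much lower, duplicating the framebuffer in cheap dedicated memory can be a net win even though it adds hardware.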
TrackSmart - Thursday, August 8, 2013 - link
Anand, based on the comments (and my own questions), it might be helpful if you added a few words that better explain what this "26% reduction in power" means. Is this a 26% reduction in SoC power consumption? Or a 26% reduction in the GPU portion of the SoC's power consumption? Or something else entirely?
And what proportion of total power consumption, assuming a static state, does the SoC/GPU/whatever use on these thrifty mobile devices? If the display is using 85% of the power, then a 26% reduction in SoC power usage would be small (4% reduction in total power usage), but not insignificant. If we are talking about a 26% reduction in something that uses even less power, this would be a smaller influence still. Possibly to the point of being unmeasurable in a battery life test...
Anyway, some further context would be helpful.
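The arithmetic in the comment above can be checked directly; note the 85%/15% split is the commenter's hypothetical, not a measured figure:

```python
# If the display takes 85% of total power and the SoC the remaining 15%,
# a 26% cut in SoC power translates to this fraction of *total* power:

display_share = 0.85               # hypothetical split from the comment above
soc_share = 1.0 - display_share    # 0.15
soc_reduction = 0.26               # the "26% reduction" figure in question

total_saving = soc_share * soc_reduction   # 0.15 * 0.26 = 0.039
```

That is about 3.9% of total power, matching the ~4% figure above. Whether the article's 26% refers to the whole SoC or only its display block would scale this result down further.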
juhatus - Friday, August 9, 2013 - link
Don't the new Haswell laptops (for example the Sony Vaio Pro) use this?
n4s33r - Wednesday, August 14, 2013 - link
The 2013 Nexus 7 also has this feature.
wumpus - Sunday, August 18, 2013 - link
If you are actively changing the display, but not all of it (say, browsing with an animation in a sidebar), you will likely benefit from the fact that your RAM isn't being grabbed to output a framebuffer all the time (although I doubt you will notice it even on a phone. The last time I heard memory DMA mattered was programming an 8-bit ATARI 400/800 (although this is largely due to the fact that any PC designed remotely for performance has a video card)).
On the other hand, it requires a separate buffer for the video card (think the latest Intel onboard graphics with its own memory bus). 1080 graphics is something like 8M of memory, which is an odd duck (too large for SRAM on a reasonably sized chip, too small for DRAM). I'd guess it will take a few SRAM chips, which will add heat & power issues.
DesktopMan - Monday, August 19, 2013 - link
"The last time I heard memory DMA mattered was programming an 8-bit ATARI 400/800"
DMA is still extremely important for computing; it just mostly happens in the OS and drivers now. By reducing the load on main memory you are not only lowering power consumption but also freeing up system resources. Feeding a 2560*1600 panel at 60 Hz actually requires ~1 GB/sec of memory bandwidth continuously at 32bpp.
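A quick sanity check confirms both bandwidth figures quoted in this thread (the 2560x1600 scan-out number above and the ~8 MB 1080p framebuffer mentioned earlier):

```python
# Scan-out bandwidth for a 2560x1600 panel at 60 Hz, 32 bits per pixel:
bytes_per_frame = 2560 * 1600 * 4     # 32bpp = 4 bytes per pixel
bandwidth = bytes_per_frame * 60      # bytes per second, continuously

# 16,384,000 bytes/frame * 60 = 983,040,000 B/s, i.e. just under 1 GB/sec.

# Framebuffer size for 1920x1080 at 32bpp, as mentioned in the comment above:
fb_1080p = 1920 * 1080 * 4            # 8,294,400 bytes, roughly 8 MB
```

So the "~1GB/sec" and "something like 8M" figures both hold up.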
psyq321 - Wednesday, August 21, 2013 - link
DMA still matters just as much today - if you're a kernel/driver programmer.
You would want to use DMA if it is available, to avoid clogging the CPU with the memory transfer. This is almost universally done in graphics drivers whenever there is a need to upload a chunk of data to VRAM, etc.
Cheiz - Saturday, May 25, 2019 - link
Panel Self Refresh is enabled by default on the Dell XPS 9380 FHD 13 and it works like shit. See https://www.dell.com/community/XPS/XPS-9380-FHD-13...