The Asus ROG Swift PG27UQ G-SYNC HDR Monitor Review: Gaming With All The Bells and Whistles
by Nate Oh on October 2, 2018 10:00 AM EST - Posted in
- Monitors
- Displays
- Asus
- NVIDIA
- G-Sync
- PG27UQ
- ROG Swift PG27UQ
- G-Sync HDR
Delayed past its original late 2017 timeframe, let alone the April and May estimates, NVIDIA’s G-Sync HDR technology finally arrived over the last couple of months courtesy of Asus’ ROG Swift PG27UQ and Acer’s Predator X27. First shown at Computex 2017 as prototypes, the 27-inch displays bring together what are arguably the most desired and visible aspects of modern gaming monitors: ultra-high resolution (4K), high refresh rates (144Hz), and variable refresh rate technology (G-Sync), all in a reasonably-sized quality panel (27-inch IPS-type). On top of that, of course, come the various HDR-related capabilities for higher brightness and a wider color gamut.
Individually, these features are just some of the many modern display technologies, but where resolution and refresh rate (and also input latency) are core to PC gaming, those elements have typically come as tradeoffs against one another, with 1440p/144Hz being a notable middle ground. So by the basic 4K/144Hz standard, we have not yet had a true ultra-premium gaming monitor. But today, we look at one such beast with the Asus ROG Swift PG27UQ.
ASUS ROG Swift PG27UQ G-SYNC HDR Monitor Specifications

| Specification | ROG Swift PG27UQ |
| --- | --- |
| Panel | 27" IPS (AHVA) |
| Resolution | 3840 × 2160 |
| Refresh Rate (OC Mode) | 144Hz (HDR, 4:2:2); 144Hz (SDR, 4:2:2) |
| Refresh Rate (Standard) | 120Hz (HDR, 4:2:2), 98Hz (HDR, 4:4:4); 120Hz (SDR, 4:4:4) |
| Refresh Rate (Over HDMI) | 60Hz |
| Variable Refresh Rate | NVIDIA G-Sync HDR module (actively cooled) |
| Response Time | 4 ms (GTG) |
| Brightness | Typical: 300 - 600 cd/m²; Peak: 1000 cd/m² (HDR) |
| Contrast | Typical: 1000:1; Peak: 50,000:1 (HDR) |
| Backlighting | FALD, 384 zones |
| Quantum Dot | Yes |
| HDR Standard | HDR10 Support |
| Viewing Angles | 178°/178° horizontal/vertical |
| Pixel Density | 163 pixels per inch; 0.155 mm pixel pitch |
| Color Depth | 1.07 billion (8-bit with FRC) |
| Color Gamut | sRGB: 100%; Adobe RGB: 99%; DCI-P3: 97% |
| Inputs | 1 × DisplayPort 1.4; 1 × HDMI 2.0 |
| Audio | 3.5 mm audio jack |
| USB Hub | 2-port USB 3.0 |
| Stand Adjustments | Tilt: +20°~-5°; Swivel: +160°~-160°; Pivot: +90°~-90°; Height: 0~120 mm |
| Dimensions (with stand) | 634 × 437-557 × 268 mm |
| VESA Mount | 100 × 100 |
| Power Consumption | Idle: 0.5 W; Peak: 180 W (HDR) |
| Price | $1999 |
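As a quick sanity check on a couple of the table's derived figures, the pixel density, pixel pitch, and color count all follow directly from the panel geometry and bit depth. The short Python sketch below is purely illustrative (it is not part of our testing tools) and simply reproduces the 163 PPI, ~0.155 mm pitch, and 1.07 billion color numbers:

```python
import math

# Panel geometry: 27" diagonal at 3840 x 2160
width_px, height_px, diagonal_in = 3840, 2160, 27.0

diagonal_px = math.hypot(width_px, height_px)   # ~4406 px along the diagonal
ppi = diagonal_px / diagonal_in                 # ~163 pixels per inch
pitch_mm = 25.4 / ppi                           # ~0.156 mm pixel pitch

# 8-bit native + FRC dithering approximates 10 bits per channel
colors = (2 ** 10) ** 3                         # 1,073,741,824 ≈ 1.07 billion

print(f"{ppi:.0f} PPI, {pitch_mm:.3f} mm pitch, {colors / 1e9:.2f} billion colors")
```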
As an ultra-premium gaming monitor of that caliber, the PG27UQ also carries an ultra-premium price of $1999. For reasons we’ll soon discuss, the pricing very much reflects the panel’s HDR backlighting unit, quantum dot film, and G-Sync HDR module. The full-array local dimming (FALD) backlighting system delivers the brightness and contrast needed for HDR, while the quantum dot film extends the representable colors to a wider gamut, another element of HDR. The new generation G-Sync HDR module handles the variable refresh implementation, but with HDR, high refresh rate, and high resolution combined, DisplayPort 1.4 bandwidth constraints require chroma subsampling beyond 98Hz.
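To put rough numbers on that bandwidth ceiling: DisplayPort 1.4 over HBR3 carries 32.4 Gbps raw, or about 25.92 Gbps of usable payload after 8b/10b encoding. The Python sketch below is a back-of-the-envelope estimate only; the total-pixels-per-frame figure assumes a reduced-blanking timing of roughly 3920 × 2222, which is our approximation rather than the monitor's exact timing. Even so, it illustrates why 4K at 144Hz with 10-bit 4:4:4 color overshoots the link, while 98Hz 4:4:4, 144Hz 4:2:2, and 120Hz 8-bit SDR all squeeze in:

```python
# Approximate link-bandwidth check for 4K high refresh over DisplayPort 1.4.
# Blanking is an assumption (reduced-blanking timing of ~3920 x 2222 total
# pixels per frame), so treat the results as rough estimates.

DP14_PAYLOAD_GBPS = 32.4 * 0.8          # HBR3 x4 lanes minus 8b/10b overhead = 25.92 Gbps
TOTAL_PIXELS_PER_FRAME = 3920 * 2222    # 3840x2160 active + assumed blanking

modes = {
    "144 Hz, 10-bit RGB/4:4:4 (HDR)": (144, 30),
    "144 Hz, 10-bit YCbCr 4:2:2 (HDR)": (144, 20),
    "98 Hz, 10-bit RGB/4:4:4 (HDR)": (98, 30),
    "120 Hz, 8-bit RGB/4:4:4 (SDR)": (120, 24),
}

for name, (refresh_hz, bits_per_pixel) in modes.items():
    gbps = TOTAL_PIXELS_PER_FRAME * refresh_hz * bits_per_pixel / 1e9
    verdict = "fits" if gbps <= DP14_PAYLOAD_GBPS else "exceeds DP 1.4"
    print(f"{name}: {gbps:.1f} Gbps ({verdict})")
```

The same arithmetic is why Display Stream Compression or HDMI 2.1's higher raw rates would relax this tradeoff.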
In terms of base specifications, the PG27UQ is identical to Acer’s Predator X27, as both use the same AU Optronics panel, and the two monitors are essentially flagships for the G-Sync HDR platform, which also includes the curved ultrawide 35-inch models and the 4K 65-inch Big Format Gaming Displays (BFGD). Otherwise, there isn’t anything new here that we haven’t already known about from the long run-up.
NVIDIA G-SYNC HDR Monitor Lineup

| | Acer Predator X27 / ASUS ROG Swift PG27UQ | Acer Predator X35 / ASUS ROG Swift PG35VQ | Acer Predator BFGD / ASUS ROG Swift PG65 / HP OMEN X 65 BFGD |
| --- | --- | --- | --- |
| Panel | 27" IPS-type (AHVA) | 35" VA, 1800R curve | 65" VA? |
| Resolution | 3840 × 2160 | 3440 × 1440 (21:9) | 3840 × 2160 |
| Pixel Density | 163 PPI | 103 PPI | 68 PPI |
| Max Refresh Rates | 144Hz; 60Hz (HDMI) | 200Hz; 60Hz (HDMI) | 120Hz; 60Hz (HDMI) |
| Backlighting | FALD (384 zones) | FALD (512 zones) | FALD |
| Quantum Dot | Yes | Yes | Yes |
| HDR Standard | HDR10 Support | HDR10 Support | HDR10 Support |
| Color Gamut | sRGB, DCI-P3 | sRGB, DCI-P3 | sRGB, DCI-P3 |
| Inputs | 2 × DisplayPort 1.4; 1 × HDMI 2.0 | DisplayPort 1.4; HDMI 2.0 | DisplayPort 1.4; HDMI 2.0; Ethernet |
| Price | $1999 | TBA | TBA |
| Availability | Present | 2H 2018? | 2H 2018? |
Furthermore, Asus has also maintained a rather candid update thread for the ROG Swift PG27UQ on its ROG forums, which gives some insight into the panel-related firmware troubles the company has been working through.
How We Got Here: Modern Gaming Monitors and G-Sync HDR
One of the more interesting aspects of the PG27UQ is its set of headlining features. The 3840 x 2160 ‘4K’ resolution and 144Hz refresh rate are very much in the mix, as is the fact that the monitor is not just G-Sync but G-Sync HDR. Then there is the HDR side itself, with an IPS-type panel backed by localized backlighting and a quantum dot film. G-Sync HDR denotes both a premium tier of HDR monitor and the new generation of G-Sync that works with high dynamic range gaming.
Altogether, the explanation isn’t very succinct for gamers, especially compared to a non-HDR gaming monitor, and that has everything to do with the sheer number of moving parts involved in consumer monitor features, something more thoroughly covered by Brett. For some context, recent display trends include:
- Higher resolutions (e.g. 1440p, 4K, 8K)
- Higher refresh rates (e.g. 120Hz, 165Hz, 240Hz)
- Variable refresh rate (VRR) (e.g. G-Sync, FreeSync)
- Panel size, pixel density, curved and/or ultrawide formats
- Better panel technology (e.g. VA, IPS-type, OLED)
- Color bit depth
- Color compression (e.g. chroma subsampling)
- Other high dynamic range (HDR) relevant functions for better brightness/contrast ratios and color space coverage, such as local dimming/backlighting and quantum dot films
These features obviously overlap, and many of the recent developments are not so much ‘new’ as they are now ‘reasonably affordable’ to the broader public; monitors for professional visualization have offered many of the same specifications, at professional-class prices, for some time. And most elements are ultimately limited by PC game support, even uncapped refresh rates and 4K+ resolutions. This is, of course, not including connection standards, design (i.e. bezels and thinness), or gaming monitor features (e.g. ULMB). All these bits, and more, are served up to consumers in a bevy of numbers and brands.
Why does all of this matter? All of these are points of discussion with the Asus ROG Swift PG27UQ, and especially with G-Sync HDR at the heart of this display. Gaming monitors are moving beyond resolution and refresh rate in their feature sets, especially as games start to support HDR technologies (i.e. HDR10, Dolby Vision, FreeSync 2 tone-mapping). Implementing those overlapping features has much more to do with the panel than with the VRR hardware/specification, which has become the de facto identifier of a modern gaming monitor. The goal is no longer summarized by ‘faster frames filled with more pixels,’ and it becomes more difficult to communicate, let alone market, to consumers. And this has much to do with where G-Sync (and VRR) started and what it is now aspiring to be.
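For a concrete sense of what ‘HDR10 support’ means at the signal level, HDR10 encodes absolute luminance with the SMPTE ST 2084 ‘PQ’ transfer function rather than a traditional gamma curve. The Python sketch below is a minimal, illustrative implementation of the PQ encoding (not anything the monitor or its driver exposes); the sample luminances simply mirror the PG27UQ’s 600 cd/m² typical and 1000 cd/m² peak brightness figures:

```python
# Minimal sketch of the SMPTE ST 2084 "PQ" inverse EOTF used by HDR10.
# Constants come from the ST 2084 specification.

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Map absolute luminance (cd/m², up to 10,000) to a normalized PQ signal."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

for nits in (100, 600, 1000):
    signal = pq_encode(nits)
    code = round(signal * 1023)  # 10-bit code value
    print(f"{nits:>5} nits -> PQ signal {signal:.3f} (10-bit code {code})")
```

Note how 1000 nits already lands around three quarters of the way up the 10-bit signal range, which is part of why HDR pipelines lean on 10-bit depth (and, on this panel, 8-bit + FRC) to avoid banding.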
91 Comments
Ryan Smith - Wednesday, October 3, 2018 - link
Aye. The FALD array puts out plenty of heat, but it's distributed, so it can be dissipated over a large area. The FPGA for controlling G-Sync HDR generates much less heat, but it's concentrated. So passive cooling would seem to be non-viable here.

a5cent - Wednesday, October 3, 2018 - link
Yeah, nVidia's DP1.4 VRR solution is bafflingly poor/non-competitive, not just due to the requirement for active cooling.

nVidia's DP1.4 g-sync module is speculated to contribute a lot to the monitor's price (the FPGA alone is estimated to be ~$500). If true, I just don't see how g-sync isn't on a path towards extinction. That simply isn't a price premium over FreeSync that the consumer market will accept.
If g-sync isn't at least somewhat widespread and (via customer lock in) helping nVidia sell more g-sync enabled GPUs, then g-sync also isn't serving any role for nVidia. They might as well drop it and go with VESA's VRR standard.
So, although I'm actually thinking of shelling out $2000 for a monitor, I don't want to invest in technology it seems has priced itself out of the market and is bound to become irrelevant.
Maybe you could shed some light on where nVidia is going with their latest g-sync solution? At least for now it doesn't seem viable.
Impulses - Wednesday, October 3, 2018 - link
How would anyone outside of NV know where they're going with this tho? I imagine it does help sell more hardware to one extent or another (be it GPUs, FPGAs to display makers, or a combination of profits thru the side deals) AND they'll stay the course as long as AMD isn't competitive at the high end... Just the sad reality.

I just bought a G-Sync display but it wasn't one of these or even $1K, and it's still a nice display regardless of whether it has G-Sync or not. I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones.
a5cent - Wednesday, October 3, 2018 - link
"How would anyone outside of NV know where they're going with this tho?"Anandtech could talk with their contacts at nVidia, discuss the situation with monitor OEMs, or take any one of a dozen other approaches. Anandtech does a lot of good market research and analysis. There is no reason they can't do that here too. If Anandtech confronted nVidia with the concern of DP1.4 g-sync being priced into irrelevancy, they would surely get some response.
"I don't intend to pay this kinda premium without a clear path forward either but I guess plenty of people are or both Acer and Asus wouldn't be selling this and plenty of other G-Sync displays with a premium over the Freesync ones."
You're mistakenly assuming the DP1.2 g-sync is in any way comparable to DP1.4 g-sync. It's not.
First, nobody sells plenty of g-sync monitors. The $200 price premium over FreeSync has made g-sync monitors (comparatively) low volume niche products. For DP1.4 that premium goes up to over $500. There is no way that will fly in a market where the entire product typically sells for less than $500. This is made worse by the fact that ONLY DP1.4 supports HDR. That means even a measly DisplayHDR 400 monitor, which will soon retail for around $400, will cost at least $900 if you want it with g-sync.
Almost nobody, for whom price is even a little bit of an issue, will pay that.
While DP1.2 g-sync monitors were niche products, DP1.4 g-sync monitors will be irrelevant products (in terms of market penetration). Acer's and Asus' $2000 monitors aren't and will not sell in significant numbers. Nothing using nVidia's DP1.4 g-sync module will.
To be clear, this isn't a rant about price. It's a rant about strategy. The whole point of g-sync is customer lock-in. Nobody, not even nVidia, earns anything selling g-sync hardware. For nVidia, the potential of g-sync is only realized when a person with a g-sync monitor upgrades to a new nVidia card who would otherwise have bought an AMD card. If DP1.4 g-sync isn't adopted in at least somewhat meaningful numbers, g-sync loses its purpose. That is when I'd expect nVidia to either trash g-sync and start supporting FreeSync, OR build a better g-sync module without the insanely expensive FPGA.
Neither of those two scenarios motivates me to buy a $2000 g-sync monitor today. That's the problem.
a5cent - Wednesday, October 3, 2018 - link
To clarify the above...If I'm spending $2000 on a g-sync monitor today, I'd like some reassurance that g-sync will still be relevant and supported three years from now.
For the reasons mentioned, from where I stand, g-sync looks like "dead technology walking". With DP1.4 it's priced itself out of the market. I'm sure many would appreciate some background on where nVidia is going with this...
lilkwarrior - Monday, October 8, 2018 - link
Nvidia's solution is objectively better besides not being open. Similarly, NVLINK is better than any other multi-GPU solution hardware-wise.

With HDMI 2.1, Nvidia will likely support it unless it's simply underwhelming.
Once standards catch up, Nvidia hasn't been afraid to deprecate their own previous effort somewhat besides continuing to support it for wide-spread support / loyalty or a balanced approach (i.e. NVLINK for Geforce cards but delegate memory pooling to DX12 & Vulkan)
Impulses - Tuesday, October 2, 2018 - link
If NVidia started supporting standard adaptive sync at the same time that would be great... Pipe dream I know. Things like G-Sync vs Freesync, fans inside displays, and dubious HDR support don't inspire much confidence in these new displays. I'd gladly drop the two grand if I *knew* this was the way forward and would easily last me 5+ years, but I dunno if that would really pan out.

DanNeely - Tuesday, October 2, 2018 - link
Thank you for including the explanation on why DSC hasn't shown up in any products to date.

Heavenly71 - Tuesday, October 2, 2018 - link
I'm pretty disappointed that a gaming monitor at this price still has only 8 bits of native color resolution (plus FRC, I know).

Compare this to the ASUS PA32UC which – while not mainly targeted at gamers – has 10 bits, no fan noise, is 5 inches bigger (32" total) and many more inputs (including USB-C DP). For about the same price.
milkod2001 - Tuesday, October 2, 2018 - link
Wonder if they make native 10bit monitors. Would you be able to output 10bit colours from gaming GPU or only professional GPU?