The Asus ROG Swift PG27UQ G-SYNC HDR Monitor Review: Gaming With All The Bells and Whistles
by Nate Oh on October 2, 2018 10:00 AM EST
Posted in: Monitors, Displays, Asus, NVIDIA, G-Sync, PG27UQ, ROG Swift PG27UQ, G-Sync HDR
From G-Sync Variable Refresh To G-Sync HDR Gaming Experience
The original FreeSync and G-Sync were solutions to a specific and longstanding problem: fluctuating framerates would cause either screen tearing or, with V-Sync enabled, stutter and input lag. The result of variable refresh rate (VRR) technology has been a considerably smoother experience in the 30 to 60 fps range, and an equally important benefit was compensating for dips and peaks across the wider ranges introduced by higher refresh rates like 144Hz. So both technologies were very much tied to a single specification that directly described the experience, even if the numbers sometimes didn't do that experience justice.
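To make the stutter half of that problem concrete, here is a minimal sketch of the underlying timing math (an illustration only, not anything from NVIDIA or this review): with V-Sync on a fixed 60Hz panel, a steady 45 fps feed gets displayed as an uneven mix of 16.7 ms and 33.3 ms frames, while a VRR panel simply refreshes at the 22.2 ms cadence the GPU delivers.

```python
import math

def displayed_frame_times(render_fps, refresh_hz, frames=8):
    """Illustrative model: with V-Sync, each finished frame is held until the
    next fixed refresh boundary; with VRR, the display refreshes (roughly)
    as soon as the frame is done, within the panel's supported range."""
    render_dt = 1000.0 / render_fps    # ms to render one frame
    refresh_dt = 1000.0 / refresh_hz   # ms between fixed refreshes
    vsync_times, vrr_times = [], []
    finish = 0.0
    last_shown = 0.0
    for _ in range(frames):
        finish += render_dt
        # V-Sync: frame appears at the next refresh boundary after it finishes
        shown = math.ceil(finish / refresh_dt) * refresh_dt
        vsync_times.append(shown - last_shown)
        last_shown = shown
        # VRR: frame appears when it finishes
        vrr_times.append(render_dt)
    return vsync_times, vrr_times

vsync, vrr = displayed_frame_times(render_fps=45, refresh_hz=60)
print([round(t, 1) for t in vsync])  # repeating ~33.3/16.7/16.7 ms pattern -> judder
print([round(t, 1) for t in vrr])    # steady ~22.2 ms on a VRR display
```

That repeating 33.3/16.7/16.7 ms cadence is exactly the judder VRR was designed to eliminate.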
Meanwhile, HDR for gaming is a whole suite of capabilities that essentially allows for greater brightness, blacker darkness, and better/more colors. More importantly, it requires developer support in applications and the production of HDR content. The end result is not nearly as fixed as VRR, as much depends on the game's implementation, and in NVIDIA's case sometimes on Windows 10's implementation. Done properly, even simply increasing brightness can bring perceived enhancements in colorfulness and contrast, known as the Hunt effect and the Stevens effect, respectively.
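For reference on the "greater brightness" side, HDR10 signals use the SMPTE ST 2084 Perceptual Quantizer (PQ) transfer function, which maps code values onto absolute luminance up to 10,000 nits. Below is a minimal sketch of that EOTF using the standard ST 2084 constants; this is generic HDR10 math, not anything specific to G-Sync HDR or the PG27UQ.

```python
# SMPTE ST 2084 (PQ) EOTF: normalized signal value E' in [0, 1] -> luminance in nits.
# Constants come straight from the ST 2084 specification.
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal: float) -> float:
    """Convert a normalized PQ code value (0..1) to display luminance in cd/m^2."""
    e = signal ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

for code in (0.25, 0.50, 0.75, 1.00):
    print(f"PQ {code:.2f} -> {pq_eotf(code):8.1f} nits")  # ~5, ~92, ~983, 10000 nits
```

Notably, a 1000-nit panel like the one in this monitor only ever reaches about the bottom three quarters of the PQ signal range; the rest of the curve is headroom for brighter displays.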
So we can see why both AMD and NVIDIA are pushing the idea of a 'better gaming experience', though NVIDIA is explicit about this with G-Sync HDR. The downside is that the required specifications for both FreeSync 2 and G-Sync HDR certification are closed off and only discussed broadly, deferring to VESA's DisplayHDR standards. The two situations, however, are very different. AMD's explanations are a little more open, and outside of HDR requirements, FreeSync 2 also has a lot to do with standardizing SDR VRR quality through mandated LFC, a wider VRR range, and lower input lag. AMD has also stated that FreeSync 2's color gamut, maximum brightness, and contrast ratio requirements are broadly comparable to those of DisplayHDR 600, though the HDR requirements do not overlap completely. And with FreeSync/FreeSync 2 support on Xbox One models and upcoming TVs, FreeSync 2 appears to be the more straightforward specification.
NVIDIA's push, by contrast, is much more general and holistic with respect to feature standards, focusing instead on the specific products. At the same time, the company talks about the need for consumer education on the spectrum of HDR performance. While there are specific G-Sync HDR standards as part of the G-Sync certification process, those specifications are known only to NVIDIA and the monitor manufacturers. Nor was much detail provided on minimum requirements beyond HDR10 support, 1000 nits peak brightness, and unspecified DCI-P3 coverage for the 4K G-Sync HDR models; NVIDIA cites its certification process and defers detailed capabilities to whatever other certifications a G-Sync HDR monitor may carry, in this case UHD Alliance Premium and DisplayHDR 1000 for the Asus PG27UQ. Which is to say that, at least for the moment, the only G-Sync HDR displays are those that adhere to some very stringent standards; there aren't any monitors under this moniker that offer limited color gamuts or subpar dynamic contrast ratios.
The UHD Premium certification, at least, is specific to 4K resolution, so while the announced 65” 4K 120Hz Big Format Gaming Displays will almost surely carry it, the 35” curved 3440 × 1440 200Hz models won't. Practically speaking, all the capabilities of these monitors are tied to the AU Optronics panels inside them, and we know that NVIDIA worked closely with AUO as well as the monitor manufacturers. As far as we know, those AUO panels are only found in G-Sync HDR displays, and vice versa. NVIDIA disclosed no other standardized specification, referring back only to its own certification process and the 'ultimate gaming display' ideal.
As much as NVIDIA mentions consumer education on the HDR performance spectrum, the consumer is hardly any better educated about a monitor's HDR capabilities by the G-Sync HDR branding. Detailed specifications are left to monitor certifications and manufacturers, which is the status quo. There is no dedicated G-Sync HDR page; NVIDIA lists G-Sync HDR features under the G-Sync page, and while those features are labeled as G-Sync HDR, there is no explanation of the full differences between a G-Sync HDR monitor and a standard G-Sync monitor. The NVIDIA G-Sync HDR whitepaper is primarily background on HDR concepts plus a handful of generalized G-Sync HDR details.
For all intents and purposes, G-Sync HDR is presented not as a specification or a technology but as branding for a premium product family, and right now it is more useful for consumers to think of it that way.
91 Comments
lilkwarrior - Monday, October 8, 2018 - link
OLED isn't covered by the VESA HDR standards; it has far superior picture quality and contrast. QLED cannot compete with OLED at all in such things. I would much rather get a Dolby Vision OLED monitor than an LED monitor with an HDR 1000 rating.
Lolimaster - Tuesday, October 2, 2018 - link
You can't even call it HDR with a pathetic low-contrast IPS.
resiroth - Monday, October 8, 2018 - link
Peak luminance levels are overblown because they're easily quantifiable. In reality, if you've ever seen a recent LG TV, which can hit about 900 nits peak, that is too much. https://www.rtings.com/tv/reviews/lg/c8 It's actually almost painful.
That said, I agree OLED is the way to go. I wasn't impressed by any LCD (FALD or not) personally. It doesn't matter how bright the display gets if it can't highlight stars in a night sky, etc., without significant blooming.
Even 1000 nits is too much for me. The idea of 4000 is absurd. Yes, sunlight is way brighter, but we don't frequently change scenes from nighttime to day like television shows do. It's extremely jarring. Unless you like the feeling of being woken up repeatedly in the middle of the night by a flood light. It's a hard pass.
Hxx - Saturday, October 6, 2018 - link
The only competition is the Acer, which costs the same. If you want G-Sync you have to pony up; otherwise, yeah, there are much cheaper alternatives.
Hixbot - Tuesday, October 2, 2018 - link
Careful with this one, the "whistles" in the article title refers to the built-in fan whine. Seriously, look at the Newegg reviews.
JoeyJoJo123 - Tuesday, October 2, 2018 - link
"because I know"I wouldn't be so sure. Not for Gsync, at least. AU Optronics is the only panel producer for monitor sized displays that even gives a flip about pushing lots of high refresh rate options on the market. A 2560x1440 144hz monitor 3 years ago still costs just as much today (if not more, due to upcoming China-to-US import tariffs, starting with 10% on October 1st 2018, and another 15% (total 25%) in January 1st 2019.
High refresh rate G-Sync pricing isn't set to come down anytime soon, not as long as NVIDIA has a stranglehold on the GPU market and not as long as AU Optronics is the only panel manufacturer that cares about high refresh rate PC monitor displays.
lilkwarrior - Monday, October 8, 2018 - link
Japan Display plans to change that in 2019. IIRC Asus is planning to use their displays for a portable professional OLED monitor. I would not be surprised if they or LG create OLED gaming monitors from Japan Display panels in 2020; that would be a win-win for gamers, Japan Display, and monitor manufacturers.
Alternatively, they could surprise us with MLED monitors, which Japan Display has also invested in, along with Samsung and LG.
That's way better to me than any Nano-IPS/QLED monitor. They simply cannot compete.
Impulses - Tuesday, October 2, 2018 - link
I would GLADLY pay the premium over the $600-1,000 alternatives IF I thought I was really going to take advantage of what the display offers in the next 2 or even 4 years... But that's the issue. I'm trying to move away from SLI/CF (2x R9 290 atm, about to purchase some sort of 2080), not force myself back into it.
You're gonna need SLI RTX 2080s (Ti or not) to really eke out frame rates fast enough for the refresh rate to matter at 4K, and chances are it'll be the same with the next gen of cards unless AMD pulls a rabbit out of a hat and quickly gets a lot more competitive. That's 2-3 years easy where SLI would be a requirement.
HDR support seems to be just as much of a mess... I'll probably just end up with a 32" 4K display (because I'm yearning for something larger than my single 16:10 24" and that approaches the 3x 24" setup I've used at times)... But if I wanted to try a fast refresh rate display I'd just plop down a 27" 1440p 165Hz next to it.
Nate's conclusion is exactly the mental calculus I've been doing; those two displays are still less money than one of these and probably more useful in the long run as secondary displays or hand-me-down options... As awesome as these G-Sync HDR displays may be, the vendor lock-in around G-Sync and the active cooling make 'em seem like poor investments.
Good displays should last 5+ years easy IMO, and I'm not sure these would still be the best solution in 3 years.
Icehawk - Wednesday, October 3, 2018 - link
Grab yourself an inexpensive 32" 4K display, decent ones are ~$400 these days. I have an LG and it's great all around (I'm a gamer btw); it's not quite high end but it's not a low end display either - it compares pretty favorably to my Dell 27" 2K monitor. I just couldn't see bothering with HDR or any of that other $$$ BS at this point, plus I'm not particularly bothered by screen tearing and I don't demand 100+ FPS from games. Not sure why people are all in a tizzy about super high FPS; as long as the game runs smoothly I am happy.
WasHopingForAnHonestReview - Saturday, October 6, 2018 - link
You don't belong here, plebeian.