Flunk - Friday, April 15, 2016 - link
These monitors are all pretty compelling. Do we have any idea when Nvidia is going to support VESA adaptive-sync?
G0053 - Friday, April 15, 2016 - link
It is silly of them not to support adaptive sync. It would just be another checkbox for them.
JoeyJoJo123 - Friday, April 15, 2016 - link
It actually adds cost. The G-Sync module is used in place of (or bypasses any use of) the typical monitor's display scaler, and that goes for adaptive sync handling too. So for a monitor to support both G-Sync and Adaptive Sync, it would need both the usual display scaler AND the G-Sync module.
Some laptop models have been shown to support G-Sync despite not having a G-Sync module, so I suppose it is possible to get both G-Sync and Adaptive Sync on the same monitor without the expensive G-Sync module, but you'd need some kind of mode switching between the two.
In my opinion, buying a G-Sync monitor today, or a video card that doesn't support VESA Adaptive Sync (i.e. Nvidia cards), feels like a dead-end investment in the long run. Only Adaptive Sync displays and video outputs will really have mileage five years down the road, once a wider range of electronics has adopted Adaptive Sync.
Polaris can't come soon enough.
BurntMyBacon - Friday, April 15, 2016 - link
@JoeyJoJo123: "It actually adds cost. The G-Sync module is used in place of (or bypasses any use of) the typical monitor's display scaler ..."
I'm pretty sure the OP was talking about nVidia supporting VESA adaptive-sync on their video cards. There is no technical reason that they couldn't. There wouldn't be anything added to the Bill Of Materials, just a relatively short (trivial?) development time to support it in software.
@JoeyJoJo123: "Some laptop models have been shown to support G-Sync despite not having a G-Sync module ..."
Because they already know how to support "Sync" technology without their G-Sync module, it shouldn't be hard to support Adaptive Sync with their video cards. It probably isn't all that difficult (now that much of the work is done) to support G-Sync on most "Sync"-capable monitors without the module, but there are probably some tradeoffs.
Pantsu - Friday, April 15, 2016 - link
There might be a couple of reasons for that. Their current-gen graphics cards might not have the necessary hardware features in their DisplayPort implementation for Adaptive-Sync support. That would mean Nvidia could only support it with Pascal or newer, so it's understandable that they would not announce support before a good portion of their user base has supported cards.
The more cynical view is that they have the capability in current cards, but see it as competition and try to stifle it in favor of their G-Sync implementation. In that case they'll probably wait and support it only if the rest of the industry settles on Adaptive-Sync. Intel's support would essentially doom G-Sync. At the moment Adaptive-Sync panels seem to have bigger restrictions on refresh rate range, and it'll likely take a year or two before TCON manufacturers start offering boards that aren't hobbled with tight dynamic refresh rate ranges.
plopke - Friday, April 15, 2016 - link
I would love to see this on Nvidia cards, but I guess for Nvidia it isn't that easy. Not sure if the G-Sync premium pricing is higher than AMD's FreeSync/Adaptive Sync, but since they went the exclusive route they might be stuck with some tough choices and questions:
- Can they run FreeSync on G-Sync hardware? The monitors in particular might be a problem.
- How do they promote it to the market if they drop G-Sync, and what about their existing G-Sync customers?
- Try to improve/adapt G-Sync so they can still offer it as a premium over FreeSync?
plopke - Friday, April 15, 2016 - link
Anyway, long live competition :P
Flunk - Friday, April 15, 2016 - link
They don't need to stop supporting G-Sync to support adaptive sync.
valinor89 - Friday, April 15, 2016 - link
Even better, just rename VESA adaptive-sync as G-Sync Lite or something and offer the original G-Sync as a premium product. Assuming the original G-Sync is better; I have not seen them both in action. This way they don't have to abandon the sunk costs of G-Sync marketing.
Alexvrb - Saturday, April 16, 2016 - link
This idea gets my vote. But good luck selling it to Nvidia. :D
schizoide - Friday, April 15, 2016 - link
Exactly right, and G-Sync would still be a selling point-- G-Sync will show the same image for several refreshes, which adaptive sync/FreeSync doesn't support.
Both AMD and Intel support adaptive sync, and it's part of the VESA spec. This spec is obviously going to win. NVidia won't support it because, well... they're dicks.
Rexolaboy - Saturday, April 16, 2016 - link
Incorrect, AMD does have Low Framerate Compensation (LFC). If the FreeSync range's maximum is at least 2.5 times its minimum rate, it allows frame doubling... just like G-Sync.
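As a rough illustration of that frame-doubling behavior, here is a minimal sketch (not AMD's actual driver logic; the 40-144 Hz panel range below is just a hypothetical example):

# Minimal LFC sketch: when the game's frame rate falls below the panel's
# minimum variable refresh rate, each frame is shown two or more times so
# the effective refresh rate lands back inside the supported range.
PANEL_MIN_HZ = 40  # hypothetical FreeSync range of 40-144 Hz; LFC needs max >= ~2.5x min

def lfc_refresh(game_fps):
    """Return (repeats per frame, resulting panel refresh rate)."""
    if game_fps >= PANEL_MIN_HZ:
        return 1, game_fps            # already inside the VRR window, no repeats
    repeats = 2
    while game_fps * repeats < PANEL_MIN_HZ:
        repeats += 1                  # double, triple, ... until back in range
    return repeats, game_fps * repeats

for fps in (24, 30, 45, 90):
    n, hz = lfc_refresh(fps)
    print(f"{fps} fps -> show each frame {n}x, panel runs at {hz:.0f} Hz")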
yannigr2 - Friday, April 15, 2016 - link
The same day they will replace PhysX with Havok and CUDA with OpenCL. We are talking about Nvidia here. They love proprietary; they hate open standards.
Flunk - Friday, April 15, 2016 - link
Replacing CUDA with OpenCL/DirectCompute has functionally already happened, except for a few specialized situations in science/math/modeling. PhysX is only used in games that Nvidia has literally paid the developers to include, and they're not really developing it further.
R7 - Friday, April 15, 2016 - link
Boring monitors, really. The reason being that for some reason FreeSync users are treated as second-class citizens, with most FreeSync-supporting monitors using TN panels. I don't even dream of 120Hz 4K IPS. Just give me a proper 27" 144Hz+ 1440p IPS with good quality control, DP 1.3/1.4, HDR capability, and support for both sync techs, under $1000. I'd be happy with that for years to come, until OLED finally gets better pricing.
JoeyJoJo123 - Friday, April 15, 2016 - link
And all three will cost more than the competition and have terrible backlight bleed and quality control, as is typical from SNSV!
creed3020 - Friday, April 15, 2016 - link
I'm in the market for a monitor upgrade and really like the third option, the MG248UQ, but pricing will be the final decision point. Bummer that it isn't available yet.
vanilla_gorilla - Friday, April 15, 2016 - link
Still waiting for 120Hz 4K IPS monitors to land. I assume there just aren't enough people willing to shell out the cash on GPUs to play games at those specs yet.
Sworp - Friday, April 15, 2016 - link
You might need to wait for a while. DP 1.2 does not support 4K resolution at 120Hz. The 1.3 standard has been out for a while; the only problem is that none of the current high-end graphics cards support it. Maybe the next-gen cards will...
madwolfa - Friday, April 15, 2016 - link
There's no current high-end graphics card that could pull off 120 FPS in 4K resolution anyway, so...
Sworp - Friday, April 15, 2016 - link
4x 980 Ti might just make it... might.
vanilla_gorilla - Friday, April 15, 2016 - link
Depends on the game. And many cards could play AAA titles in SLI, which, like I said, I don't think many people would be willing to shell out for.
JoeyJoJo123 - Friday, April 15, 2016 - link
I really dislike that people tout completely ambiguous claims about hardware not being capable of certain framerates at certain resolutions.
Yes, current-gen hardware can do 120fps at 4k resolution, but it all depends on the graphical fidelity of the game you're trying to play.
I can play TF2 at 120fps at 4k resolution on my i5-4690k + GTX 970 on a 24" 3840x2160 IPS monitor (Acer K242HQK) just fine. I disable AA (as the pixel pitch of a 4k 24" monitor is so tiny that enabling AA offers very little benefit while cutting a lot into performance), but keep all other settings at the highest settings (sans bloom, hdr, and other lighting nonsense that gets in the way of identifying players on the enemy team).
Why anyone would need or want to go beyond possibly 2x MSAA in a 4K game is beyond me. At a certain point, higher graphical settings offer incredibly little benefit while cutting into performance, sometimes even taking you below 60 fps in intense scenes. You should always aim for your minimum FPS in gameplay to be at least 10% higher than your monitor's refresh rate.
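For what it's worth, that rule of thumb works out to numbers like these (just the arithmetic, applied to a few example refresh rates):

# "Minimum FPS ~10% above the refresh rate" rule of thumb from above.
def target_min_fps(refresh_hz, headroom=0.10):
    return refresh_hz * (1 + headroom)

for hz in (60, 120, 144):
    print(f"{hz} Hz monitor -> aim for a minimum of ~{target_min_fps(hz):.0f} fps")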
schizoide - Friday, April 15, 2016 - link
Not in gaming obviously, but higher refresh rates are really sweet for desktop work too.
vanilla_gorilla - Friday, April 15, 2016 - link
The R9 380X doesn't support 1.3?
JoeyJoJo123 - Friday, April 15, 2016 - link
Not going to happen for a long while, bud.
Even DisplayPort 1.3 doesn't support that kind of bandwidth, and nobody's done anything toward supporting that hardware. Then, DisplayPort 1.4 was just published by VESA like a month ago, which does support it, so it'll be a good year or two before you see graphics cards supporting DP 1.4 AND displays supporting DP 1.4.
vanilla_gorilla - Friday, April 15, 2016 - link
https://en.wikipedia.org/wiki/DisplayPort#1.3
DisplayPort 1.3 (approved Sept 2014): "This bandwidth is enough for a 4K UHD display (3840×2160) at 120 Hz, a 5K display (5120×2880) at 60 Hz, or an 8K UHD display (7680×4320) at 30 Hz, with 24-bit RGB color."
zepi - Friday, April 15, 2016 - link
24-bit color is unacceptable with HDR and 10-bit HEVC encodes becoming available.
Nintendo Maniac 64 - Friday, April 15, 2016 - link
And no HDR or 10-bit HEVC video is 4K @ 120fps with 4:4:4 chroma.
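A quick back-of-the-envelope check of the bandwidth figures being discussed here (effective post-8b/10b link rates; blanking overhead is ignored, so the real requirements are somewhat higher):

# Uncompressed video bandwidth vs. DisplayPort effective data rates.
# Link rates are the usable payload after 8b/10b coding: HBR2 = 17.28 Gbit/s,
# HBR3 = 25.92 Gbit/s.
LINK_RATES_GBPS = {"DP 1.2 (HBR2)": 17.28, "DP 1.3/1.4 (HBR3)": 25.92}

def video_gbps(width, height, refresh_hz, bits_per_pixel):
    return width * height * refresh_hz * bits_per_pixel / 1e9

for label, bpp in (("24-bit RGB", 24), ("30-bit RGB (10-bit/HDR)", 30)):
    need = video_gbps(3840, 2160, 120, bpp)
    print(f"4K @ 120 Hz, {label}: ~{need:.1f} Gbit/s")
    for link, rate in LINK_RATES_GBPS.items():
        verdict = "fits" if need <= rate else "does not fit"
        print(f"  {link} ({rate} Gbit/s): {verdict}")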
I wish Asus would make a ~42" 4k monitor. The Korean ones make me nervous.vanilla_gorilla - Friday, April 15, 2016 - link
Why would you want a 4K 42" monitor? The pixel density would be awful (~104ppi). Unless you plan on sitting 10+ feet from it?Impulses - Friday, April 15, 2016 - link
lol What? 10 feet is a gross exaggeration...
It's not any more awful than the 1080/1200 24" & 1440 27" screens many are already used to, which also means zero scaling issues in Windows or games. I actually find it pretty appealing; I spent a while looking at that Philips 40" last year...
It'd be roughly equivalent to my 3x 24" when they're in portrait mode, but without bezels in between. I find that a lot more enticing than phone-like ppi on a screen that is a couple feet from me, and going down to a single 24/27" is just not gonna happen for me work-wise.
schizoide - Friday, April 15, 2016 - link
4k at 42" is roughly the same pixel density as standard non-"retina" monitors. It's like four 20" 1080p monitors in a 2x2 configuration with no bezels-- pretty sweet.javishd - Monday, April 18, 2016 - link
Thanks guys, exactly. 100-110 ppi means no scaling, which also means a normal face-to-monitor distance of 16"-30". Also, if desirable, scoot the monitor back to about 38" from your face and set scaling to 150% for high dpi.
In my opinion hi-dpi is nice when you have size constraints, like on a laptop, so you can get that crisp text and images. However, in an office, I would always trade density for unscaled real estate.
I currently use a Wasabi Mango 43", it sits 28 inches from my face and has a PPI of about 103. Like schizoide says it's the same as bezel-less 2x2 of 21" 1080p screens. Everything about it is amazing. I have full unscaled 4k real estate, or I have 1440p effective high dpi when I scoot it all the way back on my desk. However, I think it's unlikely it will last me very long and I'm sure I can't collect on warranty.
Now if I could have the same screen size at hi dpi I would do it, but that would be a 7k-8k screen... Someday I'm sure!
ToTTenTranz - Friday, April 15, 2016 - link
If the maximum refresh rate is 60Hz for the 28" model, why not use an IPS panel?
It's like Asus is dead set on introducing caveats into every single monitor in their line-up.
willis936 - Friday, April 15, 2016 - link
Remember the price point. You could get monitors with all the boxes ticked (within reason; no wide gamut or high-refresh-rate 4K yet) months ago, but as soon as you pick a price you have to shave features.
There are no 28" 4k IPS panels in production, AFAIK.27" 4K IPS, yes, but not 28".
Henry Tobias - Saturday, April 16, 2016 - link
I'm writing this comment on an ASUS X501 laptop. The first and last ASUS product I will buy. It did not even last the 3-year guarantee period without problems.
Rishi100 - Saturday, April 16, 2016 - link
I think reviewers or the news item writer must also indicate:
1. Whether the display supports HDR with the Rec. 2020 games spec?
2. Whether HD I.2 supports HDCP 2.2 or not?
Please ask the manufacturer for these answers when you review these babies.
Rishi100 - Saturday, April 16, 2016 - link
Not games but gamut.
Murloc - Sunday, April 17, 2016 - link
This is a press release, so they don't know, but I'm sure they'll write about it and test it when doing reviews.
Khenglish - Saturday, April 16, 2016 - link
I like seeing a 4k monitor at a reasonable 24" size. While 27"+ size screens are good for work, for gaming that's just too big to take in the whole screen all at once.