16 Comments
Vyvian07 - Tuesday, June 6, 2017 - link
Wow, the Strix XG32V sounds amazing. Too bad it will most likely be WAY out of my price range.
tarqsharq - Wednesday, June 7, 2017 - link
AOC is releasing the AGON AG322QCX or something along those lines that I suspect is based on the exact same panel for probably less money.
Jax Omen - Monday, June 12, 2017 - link
I've been waiting for that monitor since February, and I can't find ANYTHING on it since it was previewed. It was supposed to come out in May.
I just want a >27" 144hz 1440p non-TN monitor to replace my 30" 60hz IPS! V_V
waltsmith - Tuesday, June 6, 2017 - link
Awfully nice to see more Freesync monitors hitting the market, now if only they had a decent refresh range... Amazing how many are out there with a pitifully small range that is practically unusable.
DanNeely - Tuesday, June 6, 2017 - link
I think a lot of it is the other half of Freesync being based on an optional (if rarely used) part of the spec. While it could be turned on via a firmware update to existing controllers, if the controller wasn't designed around it, the ability to usefully support it was rather limited. Presumably by now any new monitors are designed as a full hardware package where everything was intended to support variable sync from the start, and all future Freesync monitors should have refresh ranges comparable to what G-Sync monitors can do.
edzieba - Tuesday, June 6, 2017 - link
"Amazing how many are out there with a pitifully small range that is practically unusable. "It's an artifact of Freesync using a regular panel controller (hence the reduced cost) rather than a dedicated controller as with G-Sync. Freesync allows a panel controller manufacturer to take an existing 'normal' fixed refresh rate controller, that already has the capability to output a wide range of pixel clock timings (so you can sell one controller to drive a wide range of panels) and 'unlock' that existing variation to be changed per-frame. The downside is there are not that massive a range of panels serviced by each controller (e.g. one controller may happily do 1920x1080 @100Hz, or 2540x1440 @ 60Hz, but won't do 3840x2160 @ 120Hz) which limits the available VRR range to what the controller can do.
It is of course possible to design a controller that will operate over a very large range of refresh rates, but it's expensive to do so. Monitor assemblers are taking the view of "if Nvidia have already made the investment to design such a panel controller, why would we not just buy the controller that exists rather than funding development of one that doesn't?".
Alexvrb - Tuesday, June 6, 2017 - link
Your statements are grounded in bias, rather than fact. Freesync allows a broad range of implementations and controllers with various price levels and capabilities. It's all about giving people choice, a wide range of products. You can have affordable low-refresh panels with a smaller range like 40-75, which is better than similarly priced non-adaptive sync displays with NO adaptive range. Or you can implement a premium gaming monitor with high refresh rates and very large FreeSync ranges. I've seen 144hz displays with a 30-144 range. There's no reason a manufacturer couldn't implement even larger ranges should they so choose. FreeSync version 2 has been out for a while and adds features like LFC, too. G-Sync was conceived as a feature for the elite, high-end displays. The peasants among us could only afford regular old non-adaptive sync monitors... until FreeSync came along.

Also the number of FreeSync monitors on the market is greater than the number of G-Sync models. That seems to run contrary to your assertion that manufacturers choose the Nvidia solution because they already designed such a controller. This is despite G-Sync's head start. Must be because you didn't factor in the high cost of Nvidia's proprietary solution.
edzieba - Wednesday, June 7, 2017 - link
"Freesync allows a broad range of implementations and controllers with various price levels and capabilities."I never stated otherwise (in fact, I explicitly stated this). What is allowed for and what is available are two different things though. Asus, Acer, etc do not design panel controllers, nor to AMD. Both are dependant on what panel controller manufacturers can implement for the least cost.
"Also the number of FreeSync monitors on the market is greater than the number of G-Sync models. That seems to run contrary to your assertion that manufacturers choose the Nvidia solution because they already designed such a controller. This is despite G-Sync's head start. Must be because you didn't factor in the high cost of Nvidia's proprietary solution."
Cost is the reason for the lack of high-end Freesync monitors, and the proliferation of more basic ones. More capable controllers cost more, regardless of who is designing them. Somebody may pony up the funds to make a high-end controller compatible with DP Adaptive Sync, but thus far nobody has.
"FreeSync version 2 has been out for a while and adds features like LFC, too."
Note that the majority of features added to Freesync 2 are host-side ones done in software (e.g. HDR) rather than ones dependent on changes in monitor hardware.
Alexvrb - Thursday, June 8, 2017 - link
You said "It's an artifact of Freesync using a regular panel controller". This is simply not true. They could have very easily mandated that manufacturers meet various specs, including sync range. The real cause is them allowing OEMs free reign to implement any number of solutions from cheap to premium. Even the lowest end (or entry-level IPS displays) 40-60hz display is still better than a conventional 60hz display (all else remaining equal), and the adaptive sync range is still beneficial when your framerate fluctuates - dipping into the 50s or 40s wouldn't present a problem nearly as severe as with a conventional monitor. But there are plenty of examples of monitors with wider ranges, yes even with "regular" (non-proprietary) controllers."Cost is the reason for the lack of high-end Freesync monitors, and the proliferation of more basic ones. More capable controllers cost more, regardless of who is designing them. Somebody may pony up the funds to make a high-end controller compatible with DP Adaptive Sync, but this far nobody has." Like I said, cost. Trying to paint the spread of affordable adaptive sync displays as a bad thing smacks of elitism. Adaptive sync? Not for you plebs with your cheap displays. These days even gamers whose rig is $600 (arguably the ones who benefit the most from adaptive sync) can afford a FreeSync monitor now. Also there are high-end FreeSync solutions for those who want and can afford them. Or I guess monitors like the XG270HU are using a low-end controller, weird, wonder how they pull it off.
"Note that the majority of featured added to Freesync 2 are host-side ones done in software (e.g. HDR) rather than ones dependant on changes in monitor hardware."
I don't think it much matters as long as it works. For example, windowed fullscreen FreeSync support. The graphics card (and its drivers) are the other part of the equation anyway. LFC does have hardware requirements, however.
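For anyone wondering what those LFC hardware requirements amount to, here's a minimal sketch of the frame-repetition idea, assuming the commonly cited rule of thumb that the panel's maximum refresh must be at least roughly twice its minimum (simplified; the real driver heuristics are more involved):

```python
def lfc_refresh(frame_rate_hz, vrr_min_hz, vrr_max_hz):
    """Return (panel_refresh_hz, times_each_frame_is_shown).

    Inside the VRR window the panel just refreshes at the render rate.
    Below it, each rendered frame is repeated an integer number of times so
    the effective refresh lands back inside the window. That only works for
    every below-window rate when vrr_max_hz >= 2 * vrr_min_hz, which is why
    narrow ranges like 40-60 Hz can't support LFC.
    """
    if frame_rate_hz >= vrr_min_hz:
        return min(frame_rate_hz, vrr_max_hz), 1
    repeats = 2
    while frame_rate_hz * repeats < vrr_min_hz:
        repeats += 1
    return frame_rate_hz * repeats, repeats

print(lfc_refresh(24, 30, 144))  # (48, 2): 24 fps shown as 48 Hz, each frame twice
print(lfc_refresh(90, 30, 144))  # (90, 1): inside the window, no repetition needed
```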
JoeyJoJo123 - Tuesday, June 6, 2017 - link
While the Strix XG258 looks interesting, I'm leaning more towards the XL2540 as a FreeSync 24.5" 1080p TN 240hz monitor myself.
Cleaner aesthetics, likely more competitive price, and adjustable motion blur settings which can allow for a "60Hz ULMB" mode for plugged in 1080p 60hz consoles and devices.
https://www.blurbusters.com/benq/strobe-utility/
>Q: Can I use Blur Reduction for Game Consoles and for Television?
>A: Yes! You just pre-configure Strobe Utility at 1920×1080 60Hz via computer first, then switch input to the HDMI input for gaming/television. You can even unplug your monitor and move it to a different room, after pre-configuring it with Strobe Utility. The computer is no longer needed after configuring. Currently, Strobe Utility is the only way to get “LightBoost effect” at 60Hz. This is CRT-clarity 60fps at 60Hz with no motion blur!
0siris - Tuesday, June 6, 2017 - link
@Patrick MacMillan
Did you get any information about the strobing on the monitors? ASUS referred to them as "Extreme Low Motion Blur" in their press release, so I'm wondering if this means something beyond the normal 120Hz strobing modes we've had for years. In particular, I'm wondering if XG258 will have a ~180Hz strobed mode that is factory calibrated to avoid crosstalk. Thank you in advance.
JoeyJoJo123 - Wednesday, June 7, 2017 - link
It's unlikely. ULMB isn't a "thing" on FreeSync monitors, as ULMB is categorically an improvement on Nvidia's Lightboost and is a feature defined by Nvidia to ship alongside Gsync on Gsync-enabled monitors. FreeSync monitors aren't Gsync enabled, and therefore don't have ULMB.
http://www.144hzmonitors.com/g-sync-ulmb/
Extreme Low Motion Blur is likely an equivalent that ASUS has created to pair with FreeSync, in the absence of AMD FreeSync monitors having any ULMB option.
Furthermore, as you already know, ULMB is defined at 120hz. If you're interested in 180hz "ULMB" + Freesync, it's possible that the XL2540 (24.5" 240hz 1080p TN FreeSync) might be able to adjust for that, but I can't make any guarantees. Essentially you'd set that monitor to that refresh rate, then run the strobe utility and adjust the settings until the image is tuned for the refresh rate you want, and it's done.
https://www.blurbusters.com/benq/strobe-utility/
Unlike other "ULMB" modes shipped by other monitor manufacturers, BenQ/Zowie has both a custom implementation that allows for adjustment of the strobe phase, strobe duty, and single strobing, so you can adjust the trade-off between reduced input lag versus reduced ghosting.
0siris - Wednesday, June 7, 2017 - link
I am aware, but seeing as they used a new term for it (which you could be absolutely right in saying is just their new name for the Freesync equivalent of ULMB) I was hoping they might have taken note of the community experimentation that's been going on with the XL2540 and XL2546 and decided to include an out-of-the-box menu setting for it on the XG258. I mean... someone has got to do it at some point, right? That would be a lot more appealing to the general consumer than "go fiddle with your refresh rate and vertical totals for half an hour until it looks right".
Samus - Wednesday, June 7, 2017 - link
How hard is it to make a monitor that just looks like a monitor? What the hell is going on with that styling?
TEAMSWITCHER - Wednesday, June 7, 2017 - link
Ugly.
vision33r - Wednesday, June 7, 2017 - link
Freesync is a must-have today, and G-Sync is great too. I think the dilemma is that AMD GPUs just aren't competitive today and are hard to find. So it's better just to pay for the higher-priced G-Sync monitors and pair them with Nvidia GPUs. I've gamed on several G-Sync monitors and it's just super stable, lag-free gaming at 4K. How could anyone game without it?