
Nvidia bullying monitor manufacturers?

In the last year, Nvidia has added support for FreeSync (VESA Adaptive-Sync) to their drivers, since after all it's an open standard. The part I'm not clear on is whether it can actually be enabled on *any* FreeSync monitor, or only on supported ones. Nvidia also released a list of "G-Sync Compatible" monitors. These are just FreeSync monitors that passed Nvidia's testing and are supposed to work well with Nvidia cards using the FreeSync protocol.

For supported GPUs with a new enough driver:
If the connected monitor is certified G-sync compatible, it "just works" and is activated by default.
If the connected monitor is FreeSync compatible but not certified G-Sync compatible, the user can manually turn it on.
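A rough way to picture that default-on vs. manual-enable behaviour, as a little Python sketch (the function and parameter names here are my own shorthand, not anything from Nvidia's driver):

Code:
# Sketch of the enablement logic described above (not Nvidia's actual driver code).
def vrr_active(certified_gsc, supports_adaptive_sync, user_toggled_on):
    if certified_gsc:
        return True                 # certified G-Sync Compatible: on by default
    if supports_adaptive_sync:
        return user_toggled_on      # plain FreeSync/Adaptive-Sync: user must enable it
    return False                    # no VRR support at all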

Second question: do the panel manufacturers need to pay for the "G-Sync Compatible" testing or for the use of the marketing language? Is this a licensing issue? In that case, how can Nvidia charge to license compatibility with an open standard that was developed by their competition (again, going on memory here)? If so, that also seems wrong.

G-Sync Compatible is not equivalent to simply saying it works with FreeSync. I'm calling it GSC for short from now on, as I'm not typing that every time. GSC means it not only works, but that the monitor itself meets a standard of image quality comparable to native G-Sync displays.

When GSC was introduced, Nvidia was testing all available monitors out of their own (very deep) pockets. Certified monitors would be enabled in the driver by default. The list was very short. I don't know if the monitor manufacturers were involved in this process, at least for displays already on the market.

For new monitors that were being developed, it is conceivable the manufacturers would work with nvidia to try for certification before launch. I don't know what that process is, or if anyone needs to pay anyone. I doubt it would work out to be a significant revenue stream for nvidia, so they may choose not to charge for it in the hopes of increasing adoption.

It is fair to assume that, as G-Sync is Nvidia-owned IP, there would at the least be some legal agreement between Nvidia and the display manufacturers in order to use the logo.
 
I would stick with NVIDIA's list, honestly. You'll have more bad luck 'trying' this than you will trying memory that isn't on the QVL.

I'm sure there will be cases like yours, but I wouldn't bet on it to the point of ignoring the list and getting an FS monitor with 'from a trusted brand' as the only barometer.

I believe folks are safe going G-Sync Compatible with newly released name-brand monitors from ASUS, LG, or Dell. The reason I say folks are safe is this: how many new-model, reputable-brand FreeSync monitors out in the wild are really going to not work normally?

I found this good reading at techspot.

"As soon as we saw this, we called BS. And that’s because the issues they showed off are not issues with FreeSync or the VESA Adaptive Sync standard; they are not issues inherent to the technology. Instead, they are issues with monitor manufacturers producing a crappy product. It’s no secret that some FreeSync monitors – especially earlier models – aren’t very good and do indeed have issues like flickering even on AMD GPUs.

But those monitors are just rubbish. In our opinion, if you receive a monitor that flickers or has blanking issues, it’s a defective product that should be returned. Of course, there is a possibility that adaptive sync monitors that work perfectly on AMD GPUs, will have issues on Nvidia GPUs. That would be Nvidia’s fault for not implementing support for adaptive sync properly, but as with all implementations, bugs and other issues are possible." https://www.techspot.com/article/1779-freesync-and-nvidia-geforce/


I have the newer nvidia driver and my FreeSync just works by default.
 
To add to mackerel's definition:

Adaptive-Sync is a protocol in the VESA standard, picked up by DisplayPort initially. HDMI will support a version of adaptive sync too, but built off G-Sync and FreeSync.

G-Sync and FreeSync are additional layers on top of the Adaptive-Sync protocol. They are performance packages, as others have stated. G-Sync and FreeSync specify what refresh range they work over, and that's about it. After that it's just driver layers interacting with the VESA protocol. If I'm not mistaken, if you are using DP 1.2a or newer, you will always use Adaptive-Sync even when you set a game to V-sync.
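To make the "thin layer over a refresh range" idea concrete, here is a toy Python model of roughly what that negotiation boils down to (my own simplification, not any real driver API):

Code:
# Toy model of Adaptive-Sync: inside the panel's VRR window the refresh rate simply
# tracks the game's frame rate; outside the window you lose the benefit.
def effective_refresh_hz(frame_rate_hz, vrr_min_hz, vrr_max_hz):
    if vrr_min_hz <= frame_rate_hz <= vrr_max_hz:
        return frame_rate_hz            # panel refreshes exactly when a frame arrives
    return max(min(frame_rate_hz, vrr_max_hz), vrr_min_hz)   # clamped to the window

print(effective_refresh_hz(90, 48, 144))   # 90 -> refresh follows the game
print(effective_refresh_hz(30, 48, 144))   # 48 -> below range, stuck at the minimum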
 
I would think NVIDIA would want to certify as many monitors as they could to get 'G-Sync' out there even more. What benefit would it be to them NOT to certify a monitor when it works without issue? Now, clearly they cannot test ALL monitors, just like motherboard makers cannot test every single stick of memory on every board throughout its life, so there are likely many out there, like yours, that work... but why take the chance? The list grows weekly and pricing is still reasonable on all those certified. On the memory side, the chance of complete incompatibility seems lower than with monitors (talking G-Sync not working).
 

If the monitor works fine with FreeSync on an AMD video card, it should work with Nvidia's G-Sync Compatible mode. If the monitor does not work without problems, it should be returned. The reason I would have to take a chance on monitor quality is that there is only one 27" 1920x1080 IPS 240Hz FreeSync monitor for sale: the new Alienware 27 Gaming Monitor, AW2720HF.

This is a good read from techspot
"So of the seven monitors we tested, six worked flawlessly. The one monitor that didn’t was never going to because it required FreeSync over HDMI, which Nvidia doesn’t support. It’s also good to verify that low framerate compensation and HDR work in conjunction with adaptive sync on Nvidia GPUs, just like they do on AMD GPUs.

We expect what we found here will be the case for the vast majority of FreeSync monitors. If the monitor is known to work perfectly with AMD GPUs over DisplayPort – so it doesn’t have inherent flickering issues – it should also work perfectly with Nvidia GPUs when you enable the toggle. If the monitor has issues on an Nvidia GPU, it will also likely have issues on an AMD GPU, and should be returned." https://www.techspot.com/article/1779-freesync-and-nvidia-geforce/
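For anyone wondering what the "low framerate compensation" mentioned there actually does: when the game drops below the panel's minimum VRR rate, the driver shows each frame more than once so the effective refresh stays inside the supported window. A rough Python sketch of the idea (my own simplification; the made-up numbers assume a 48-144Hz panel):

Code:
# Rough illustration of low framerate compensation (LFC).
def lfc_refresh_hz(frame_rate_hz, vrr_min_hz, vrr_max_hz):
    if frame_rate_hz <= 0:
        return vrr_min_hz
    multiplier = 1
    while frame_rate_hz * multiplier < vrr_min_hz:
        multiplier += 1                 # show each frame one more time
    return min(frame_rate_hz * multiplier, vrr_max_hz)

print(lfc_refresh_hz(25, 48, 144))   # 50 -> every frame displayed twice
print(lfc_refresh_hz(20, 48, 144))   # 60 -> every frame displayed three times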
 
If the monitor works fine with FreeSync on an AMD video card, it should work with Nvidia's G-Sync Compatible mode.
Of course... it's certified (you don't get to call it compatible if it isn't). That was never in question.

I appreciate the link; however, I'm not going to hang my hat on it. There are too many "shoulds" and assumptions there for me. If a person wants this to work properly the first time, guaranteed, they should get one that is certified compatible.

Edit: a good read from nvidia...
https://www.nvidia.com/en-us/geforce/news/g-sync-compatible-validation/
G-SYNC Compatible Testing, Phase 1 Complete: Only 5% of Adaptive-Sync Monitors Made The Cut

To date, 503 VRR monitors have passed through our lab, and 28 (5.56%) have received G-SYNC Compatible validation, meaning 475 monitors failed.

Why? 273 failed for lacking a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz), meaning you were unlikely to get any of the benefits of VRR as your framerate wasn’t within the tight range offered.

And while some may have had a sufficient VRR range, 202 of those also failed due to image quality (flickering, blanking) or other issues. This could range in severity, from the monitor cutting out during gameplay (sure to get you killed in PvP MP games), to requiring power cycling and Control Panel changes every single time.

And in 33 other cases, we couldn’t get hold of a monitor to test as they were no longer manufactured.

Now, who knows what's happened since then, or if they only picked potatoes to test, but... it's what they said in May. ;)
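For reference, the 2.4:1 figure in that quote is just the top of the VRR range divided by the bottom. A quick check in Python (60-144Hz is the example Nvidia gives; the 48-75Hz range is a hypothetical narrow one of my own):

Code:
# The 2.4:1 requirement is simply max refresh / min refresh of the panel's VRR range.
def vrr_ratio(vrr_min_hz, vrr_max_hz):
    return vrr_max_hz / vrr_min_hz

print(vrr_ratio(60, 144))   # 2.4   -> meets the requirement
print(vrr_ratio(48, 75))    # ~1.56 -> too narrow, fails on range alone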
 
Of course...it's certified. That was never in question.

Just because my monitor says it is G-sync compatible in Nvidia control panel does not mean it is certified.

TechSpot called BS on Nvidia's testing, and a lot of the old FreeSync monitors did not work correctly even on AMD cards.

Nvidia is not going to test all the monitors out there; it's just like the QVL for memory.
 
Last edited:
Just because my monitor says it is G-sync compatible in Nvidia control panel does not mean it is certified.
Oh, interesting. I took anything labeled as G-Sync Compatible to be certified, as that is what Nvidia calls it (see edit). In other words, I don't call it G-Sync Compatible if Nvidia doesn't call it that. Just because the option is there doesn't mean it will work on all monitors.

That said, it doesn't change my point in the least. See my edit (bottom part). :)

Edit: in the end, we have a difference of opinion on the subject. I'd still bet good money that a higher percentage of monitors fail the test than memory not on the QVL fails. :)
 

In the Nvidia control panel it says G-Sync Compatible. However, it says my display is not validated as G-Sync Compatible. It only takes a return if it doesn't work properly. I just have to agree with TechSpot: if FreeSync works fine on AMD, it should work on Nvidia; otherwise return it, like folks do with XMP memory. :)
 
If you want to make the effort to return it... either bringing it back to the store or shipping it... that's on you. Me, I prefer to get things right the first time. Less effort in the long run. :)

I'm a gambler, and this is simply a bet I wouldn't take when I have a sure win already. :)
 
Now, who knows what's happened since then, or if they only picked potatoes to test, but... it's what they said in May. ;)

During that phase, they tested almost every VRR monitor that existed. I think there were only a handful of old ones they said they couldn't get any more, so no great loss in not testing them. They did try to test everything that existed at the time.

Again, nvidia only certifies what they consider gives a great experience. Not being certified doesn't mean it will suck, but there will be limitations you may or may not care about.
 
Making sure people know which monitors are G-Sync Compatible hurts Nvidia in some ways, since Nvidia doesn't get a large amount of money from the monitor companies for supplying some of the parts the way they do with G-Sync monitors. Nvidia might get a small kick-back for allowing monitor companies to use the G-Sync Compatible logo, which would probably be the main reason to do it, since those monitors undercut the price of the G-Sync monitors. There is also the benefit of getting the G-Sync logo everywhere, especially if they might also be suppressing the use of the FreeSync logo, which was the main point the video was implying. People looking for monitors will see there are few FreeSync monitors and many G-Sync monitors, since the FreeSync brand is being referred to as Adaptive Sync.
 
What Nvidia seems to be doing is similar to the DRM you have with DVDs, but in this case for hardware. While AMD is using a more 'open' approach, thereby not having to make it proprietary in any way, Nvidia is literally taking the 'pay to play' approach. Nvidia seemingly is geared towards the gaming industry only, at least by their advertising, and AMD has taken, IMO, the better (cheaper for me), more open-architecture mindset. For my money I will always prefer AMD's hardware over Intel's and Nvidia's, simply because AMD is more open to change and, better yet, they now make both the cards and the CPUs.
There really should be an open architecture to begin with; if for no other reason, it would make computing infinitely more practical in all ways.
 
I'm not buying that NVIDIA is bullying monitor makers. They may be trying to bully them but I just don't see NVIDIA bullying getting them anywhere. NVIDIA is big, but Samsung for example is HUGE!

I see this as similar to the 80 PLUS program for power supplies. If a PSU does not display the 80 PLUS logo, that does NOT mean the PSU is not at least 80% efficient. It might be of the highest quality and be 95% or better efficient from 0 - 100% load!
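For anyone unfamiliar with what those efficiency figures mean in practice, here is a quick Python example with a hypothetical 500 W load (the numbers are mine, just for illustration):

Code:
# Efficiency = DC power delivered / AC power drawn from the wall (hypothetical load).
dc_load_w = 500.0
for efficiency in (0.80, 0.95):
    ac_draw_w = dc_load_w / efficiency
    print(f"{efficiency:.0%} efficient: draws {ac_draw_w:.0f} W from the wall, "
          f"wastes {ac_draw_w - dc_load_w:.0f} W as heat")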

The lack of an 80 PLUS logo on the box simply means the PSU maker did not pay Plug Load Solutions to test and certify the PSU! And since they did not pay this fee, they did not earn (or pay for) the right and license to display the 80 PLUS logo.

So what happens is "the competitive market" ends up, in effect, being the bully, not Plug Load Solutions. Because consumers, right or wrong, have learned (Pavlov's dog theory?) to look for that 80 PLUS logo, manufacturers who want their PSUs to sell well end up being forced to pay Plug Load Solutions to have their PSUs certified just so they can legally display that 80 PLUS logo.

BTW, some see such logo licensing as stifling competition as it may put further financial strain on struggling new startup companies trying to get their new innovative designs out there.
 

Bluntly, that's a load of BS. UL and CE and whatever other organizations do electrical testing require testing for their logos, too. Are you going to claim they're preventing "innovative new designs", too?

It costs $1,000 to get your brand listed on the 80 PLUS web site, then a couple to a few thousand more (depending on whether you test 120 volt, 240 volt, or both), plus two sample units, to actually get a test done. Unless you're planning to only sell a dozen units (in which case you're not on Newegg or Fry's or any computer store anywhere anyway), that's not really a lot.
 
We have to separate legal requirements from voluntary schemes. While I'm not 100% up to speed on how UL fits in, CE marking is not optional: unless your product is exempt, you need it if you want to sell in the EU. To gain a CE mark, there are various paths; you're not limited to a single organisation.

As far as I'm aware, 80+ is not a legal requirement. Just a popular item to look for when shopping for PSUs, but if it gains enough momentum, not having it will put you at a competitive disadvantage. Back in the CRT days you might also remember TCO certification. Again, not a legal requirement. TCO also tried to be relevant in the flat panel era and also in other office products, but they seem to be largely irrelevant nowadays.
 
We have to separate legal requirements from voluntary schemes. While I'm not 100% up to speed on how UL fits in, CE marking is not optional: unless your product is exempt, you need it if you want to sell in the EU. To gain a CE mark, there are various paths; you're not limited to a single organisation.

Because as usual the EU is way ahead of the US on making sane safety laws. Here you can buy a 60 amp 240 volt EV charging station with no certifications on it at all and just hope those 14.4 kilowatts don't get shorted directly to TDP flames in your garage, because there's always some factory somewhere in the world willing to run off with a quick buck for a ****ty cheap product.

80+ is not a legal requirement. Just a popular item to look for when shopping for PSUs, but if it gains enough momentum, not having it will put you at a competitive disadvantage. Back in the CRT days you might also remember TCO certification. Again, not a legal requirement. TCO also tried to be relevant in the flat panel era and also in other office products, but they seem to be largely irrelevant nowadays.
Unless and until trash-sellers from crappy, weirdly named Asian companies stop selling products that don't actually meet the specs on their labels, an external organization getting paid to certify means those who don't get certified *SHOULD* be at a competitive disadvantage. If you're not one of those crappy companies with a name like EXPLODAPOW that sells crappy products, then too bad for you, but you need to be blaming the crappy Asian companies and the buyers that continue to import them, not the company that does the certification.
 
This discussion has been very helpful to me, and I think I have a clearer understanding of what course of action Nvidia has taken and why. I did not realize that true G-Sync monitors actually have a proprietary module inside, while there are also monitors without that module that Nvidia has certified as G-Sync Compatible because they work properly with the updated Nvidia drivers. But there are also monitors without the G-Sync module that do not work properly with those drivers. That's not to say Nvidia did not also see a marketplace competitive advantage in pushing their branding on monitor manufacturers.

By the way, most of you are saying that FreeSync is synonymous with adaptive sync, but I have read in several places that that is not entirely true. Most, but not all, adaptive sync monitors will work with FreeSync, so apparently there is a proprietary element in FreeSync.
 
pettyg359 said:
Bluntly, that's a load of BS. UL and CE and whatever other organizations do electrical testing require testing for their logos, too. Are you going to claim they're preventing "innovative new designs", too?
:( Gee whiz! Talk about a load of BS! Read what I actually said.

First, NOWHERE did I claim they were preventing anything - let alone innovative new designs! Making such false, made-up statements about what posters said is simply not cool! :mad:

I said, "some" see such logo licensing as stifling as it "may" put additional financial strain on struggling new companies.

Second, UL and CE are about "safety" standards. The NVIDIA and 80 PLUS logos have nothing to do with safety. Plus, the NVIDIA and 80 PLUS programs are voluntary. In many places, UL and CE are mandated by government laws and regulations. That's a HUGE difference. So you need to get your facts straight and stop telling falsehoods.
 

My short understanding:
G-Sync and G-Sync Ultimate: have the additional hardware module and meet Nvidia's other requirements.
G-Sync Compatible: Nvidia-certified Adaptive-Sync monitors without the additional hardware. This also has quality requirements beyond just supporting Adaptive-Sync.
FreeSync was AMD's answer to G-Sync, with no extra hardware required. If memory serves correctly, they "gave" this to VESA to be implemented in the DP standard as Adaptive-Sync. I'm not aware of any process for a monitor to call itself FreeSync, as I think that is an AMD trademark. Anyone know?
FreeSync 2 was an update/extension of that, but I don't know if it is similarly free/open.
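Summarising that as a little lookup table in Python (my own rough wording of the tiers above, not official definitions):

Code:
# Rough summary of the tiers described above (my own wording, not Nvidia's or AMD's).
TIERS = {
    "G-Sync / G-Sync Ultimate": "Nvidia hardware module in the monitor, plus Nvidia's full requirements",
    "G-Sync Compatible":        "no module; Adaptive-Sync monitor validated by Nvidia for quality",
    "FreeSync / FreeSync 2":    "AMD branding on top of Adaptive-Sync; no extra hardware required",
    "Adaptive-Sync":            "the underlying VESA/DisplayPort protocol both vendors build on",
}
for tier, meaning in TIERS.items():
    print(f"{tier}: {meaning}")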
 