
G-sync and Freesync: will it be a battle of new standards?


magellan (Member, joined Jul 20, 2002)
Are G-sync and Freesync similar enough that monitor manufacturers could easily incorporate both into a monitor?
 
I just asked that same question here:

http://www.overclockers.com/forums/showthread.php/764181-FreeSync-and-G-Sync-Monitor

Here's what I've learned since then. The component that implements both G-Sync and FreeSync in a monitor is the scaler chip. Nvidia supplies the scaler module to monitor companies, and that module is what enables a monitor to do G-Sync. AMD lets the monitor company pick whatever scaler chip they want, as long as it can do FreeSync. Scaler chips are fairly expensive, so it is unlikely that a monitor would incorporate two different scalers, one for G-Sync and one for FreeSync. It is also extremely unlikely that Nvidia would modify the scaler it supplies for G-Sync to do FreeSync as well. I've also heard that AMD has figured out a way to get FreeSync working over HDMI.
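To picture what that scaler logic actually does, here's a rough conceptual sketch in Python (not any vendor's real firmware, and the 40-144 Hz range is just an example panel, not a real spec): instead of refreshing on a fixed tick, the display refreshes whenever the GPU delivers a frame, clamped to the panel's supported range.

Code:
# Rough conceptual sketch (not real firmware): how a variable-refresh display
# paces its refresh to the GPU's frame times. Panel range is a made-up example.

PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144

MIN_INTERVAL_MS = 1000.0 / PANEL_MAX_HZ   # fastest the panel can refresh (~6.9 ms)
MAX_INTERVAL_MS = 1000.0 / PANEL_MIN_HZ   # longest it can hold a frame (25 ms)

def refresh_interval(frame_time_ms):
    """Refresh in step with the frame time, clamped to the panel's range.

    Fixed-rate vsync refreshes on a rigid tick whether or not a new frame is
    ready (stutter), and no-vsync scans out mid-frame (tearing); adaptive sync
    just tracks the frame, as long as it lands inside the supported range.
    """
    return min(max(frame_time_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

print(refresh_interval(13.0))  # 13.0  -> a 13 ms frame is shown ~77 Hz, no waiting
print(refresh_interval(4.0))   # ~6.94 -> capped at the panel's 144 Hz maximum
print(refresh_interval(30.0))  # 25.0  -> panel can't hold the image past 40 Hz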
 
They both work, and they both work well. IMO, G-Sync is a smoother experience, but what I tend to tell users is simple: if you have an AMD card and don't plan to move to NV, grab a FreeSync panel; if you're with NV and don't plan to go AMD in the future, grab a G-Sync panel.
 
Chances are that prices on scalers capable of FreeSync will drop over time, while Nvidia will probably hold its pricing on the proprietary module. That would make it cheaper for monitor companies to offer FreeSync, so those monitors would sell better. After a while G-Sync will probably die because manufacturers won't move enough monitors at the higher price, and Nvidia will switch over to the free standard...
 

Keep in mind Freesync is proprietary just like G-Sync is.

It is just based on the open standard, Adaptive-Sync, which one can only hope NV will move to as well.
 
So both AMD and NV are selling proprietary hardware to monitor manufacturers for G-Sync/Freesync? Or are they licensing the technology to the monitor manufacturers?

So now monitors are going to be tied to big red or mean green. This doesn't sound like a good development for the consumer.
 
I think AMD has complete control of the FreeSync standard, so they can ensure that FreeSync works with AMD cards and that FreeSync monitors meet the minimum FreeSync requirements. That said, monitor companies don't pay anything to AMD to make FreeSync monitors and can use any scaler chip they want. Nvidia makes the scaler module used in every G-Sync monitor, which is why those monitors are very similar in specs, performance and interface. Monitor companies can only buy the G-Sync scaler from Nvidia.
 
Nvidia is looking more and more like Apple. They spend a stupid amount of money to "invent" stuff that already exists and claim it is superior. That will eventually catch up with them. As for which is better, there really is no difference between the two other than the minimum supported refresh rate.
There are some tests around that show FreeSync having a little bit less input/frame latency.

EDIT: G-Sync tends to cause a couple-FPS drop in average frame rate, while FreeSync really doesn't have any effect at all.
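On the minimum refresh rate point: when a game's frame rate falls below the panel's floor, the usual trick is to repeat the same frame so the effective refresh stays in range (AMD markets its version as Low Framerate Compensation, and the G-Sync module does something similar). A rough sketch of the idea, again with made-up panel numbers rather than any vendor's actual algorithm:

Code:
# Rough sketch (not any vendor's actual algorithm): keep the refresh inside the
# panel's range when frames arrive slower than the panel's minimum refresh rate,
# by scanning out the same frame more than once. Panel range is a made-up example.

PANEL_MIN_HZ = 40
PANEL_MAX_HZ = 144
MIN_INTERVAL_MS = 1000.0 / PANEL_MAX_HZ
MAX_INTERVAL_MS = 1000.0 / PANEL_MIN_HZ

def plan_refresh(frame_time_ms):
    """Return (times to scan out this frame, interval between scans in ms)."""
    repeats = 1
    interval = frame_time_ms
    while interval > MAX_INTERVAL_MS:          # frame is below the panel's floor
        repeats += 1
        interval = frame_time_ms / repeats     # show the same frame again, sooner
    interval = max(interval, MIN_INTERVAL_MS)  # still capped by the max refresh
    return repeats, interval

print(plan_refresh(40.0))  # (2, 20.0) -> a 25 FPS frame is shown twice, at 50 Hz
print(plan_refresh(13.0))  # (1, 13.0) -> normal adaptive-sync behaviour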
 
It's too bad CRTs are dead. Aside from their outrageous power usage, I never had problems with ghosting, input latency, blacks that weren't black, or monitors tied to a particular video card manufacturer.
 

Agree. It really was a good technology. People just want thin now. Granted, CRT never could have been practical at current TV sizes, but for computer monitors it might have been possible. The issue was that the glass had to be very thick to get it flat. My parents owned a 40" Sony HD CRT back in the day; it weighed a little over 300 pounds, with almost all of that weight in the glass, so it was awkward to carry or move.

Modern LCD/plasma screens are more power efficient, but in pretty much every other respect they are inferior. Every new innovation is meant to improve black levels, ghosting, latency, etc., none of which were problems with the "old" technology.
 
What's the advantage to using an LED HDTV instead of a monitor?

Well, for me, it was that I can watch TV on the same device I'm using as a monitor.
I did buy this TV before I bought my AVR with HDMI pass-through, though. So switching between devices is just a button press on the remote now.

Also, the TV's price point at the time was a plus.
Currently my video card precludes 4k output, so I'm good where I am.

Next upgrade will likely be a 980 or 970, and a 4k monitor.
 