Last October, Nvidia announced G-Sync, a new display technology that aims to virtually eliminate screen tearing and input lag, side effects most prominent in SLI-enabled systems. According to Nvidia, this is achieved by building a G-Sync module into the monitor, which synchronizes the monitor's refresh to the output of the GPU rather than the other way around, resulting in a tear-free, faster, smoother experience compared to conventional monitors. OEM monitors with the built-in G-Sync module will be available later this year.
While everyone was busy applauding Nvidia for promising gamers stutter-free gameplay, rival AMD was busy looking for a solution of its own, and it may have just found one, dubbed "FreeSync".
Demoed on a laptop at last week's CES, the company's engineers claimed they can replicate much of the advantage of Nvidia's G-Sync through what they call "dynamic refresh rates". AMD graphics cards already have the ability to alter refresh rates on the fly, a feature originally intended to save power on mobile displays, but it is not yet supported by many panel makers because it is not standardized. Dynamic refresh rates would, at least in theory, work like G-Sync by specifying how long the display remains blank on a frame-by-frame basis, providing a smoother gameplay experience.
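To see why matching the blanking interval to each frame helps, here is a minimal, purely illustrative sketch (not AMD or Nvidia code, and the numbers are made up): with a fixed 60 Hz refresh, a finished frame must wait for the next scheduled refresh, while a dynamic refresh can show it the moment it is ready.

```python
import math

def fixed_refresh_display_times(frame_ready_ms, interval_ms=1000 / 60):
    """Each frame waits for the next fixed vblank boundary after it is ready."""
    return [math.ceil(t / interval_ms) * interval_ms for t in frame_ready_ms]

def dynamic_refresh_display_times(frame_ready_ms):
    """The display refreshes as soon as each frame is ready (panel limits ignored)."""
    return list(frame_ready_ms)

# Hypothetical frames finishing at irregular times (ms since start):
ready = [14.0, 36.0, 50.0, 75.0]

fixed = fixed_refresh_display_times(ready)
dynamic = dynamic_refresh_display_times(ready)

# With fixed refresh, frames are pushed to the next ~16.7 ms boundary,
# which shows up as judder; with dynamic refresh they appear immediately.
for r, f, d in zip(ready, fixed, dynamic):
    print(f"ready {r:5.1f} ms -> fixed {f:5.1f} ms, dynamic {d:5.1f} ms")
```

The delays under the fixed schedule are exactly the stutter both G-Sync and FreeSync try to eliminate, just by different means.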
So if this is so simple, why hasn't AMD brought the tech to the mainstream already? AMD says it was a lack of demand, and claims that if gamers are interested in G-Sync, it will provide an equivalent for its own customers.
Lastly, AMD couldn't resist taking a swipe at its green competitor's product, claiming that "Nvidia built an expensive hardware solution for this problem because Nvidia isn't capable of supporting G-Sync in any other fashion."
As expected, Nvidia wasn't impressed by "FreeSync". TechReport spoke to Nvidia's Tom Petersen, who said the main difference between the two is that AMD's solution works well with laptop displays because they are typically driven directly over DisplayPort or LVDS. PC monitors, on the other hand, have their own internal scaler chips, and those chips usually don't support dynamic refresh rates.
Nvidia's solution is therefore more of an investment: you buy a G-Sync kit and can then use the same monitor for years to come. However, if LCD makers decide to implement dynamic refresh rates in their monitors at any point in the future, Nvidia's G-Sync may become obsolete before it even goes mainstream.
Do you think these two technologies can co-exist side by side, or can there be only one winner? Let us know in the comments.