G-Sync vs. FreeSync FAQ: How variable refresh rate displays make PC games super-smooth

August 28, 2015
Variable refresh rate monitor. That jumble of words isn't rocketing to the top of any "sexiest tech phrases" list anytime soon. Nevertheless, it's quite literally game-changing technology that's just beginning to seep into mainstream awareness and adoption. You may know this technology by another, slightly more memorable pair of names from the two companies driving your PC gaming experience: Nvidia's G-Sync and AMD's FreeSync.

Despite being relatively new, dozens of G-Sync and FreeSync monitors are available to satisfy a broad range of cravings. You'll find models priced from $199 to north of $1000, encompassing 1080p, 4K, and even gorgeous curved 1440p UltraWide displays.

So what's the big deal? Do you need G-Sync or FreeSync in your life? Does it cost more? Are there any deal-breaking drawbacks? Is this tech restricted to desktop use? Is your current video card compatible? Sit back, grab a beverage, and let's tackle these pressing questions.

Ever since we began manipulating onscreen gaming graphics with a keyboard and mouse, the two crucial pieces of hardware in that equation have been butting heads. Your video card is impatient to push image frames to your monitor. But if your monitor's refresh rate is fixed at something like 60Hz, a freshly rendered frame can arrive while the monitor is still in the middle of drawing the previous one. You only see part of what's happening: a portion of the current frame, and a portion of the next frame. It looks as if the picture were trying to split itself in two and take off in different directions, and it only gets worse the more your game's frame rate fluctuates.
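To make that timing mismatch concrete, here's a rough back-of-the-envelope sketch in Python. It's purely illustrative and isn't tied to any real graphics API; the frame times in it are made up. It just models a 60Hz panel starting a new scanout every ~16.7ms while the GPU finishes frames on its own schedule.

```python
# Toy model only: a 60Hz panel starts drawing (scanning out) a new image every
# ~16.7 ms, while the GPU finishes frames on its own irregular schedule. With no
# synchronization, a frame that lands mid-scanout is swapped in immediately, so
# the top of the screen shows the old frame and the bottom shows the new one.
# That seam is the tear.

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ     # ~16.67 ms per refresh
TOLERANCE_MS = 0.5                 # "close enough" to a refresh boundary

# Hypothetical GPU frame times (ms) for a game bouncing between roughly 40 and 90 fps
frame_times_ms = [14.0, 19.3, 11.0, 21.0, 25.0]

now = 0.0
for i, frame_ms in enumerate(frame_times_ms, start=1):
    now += frame_ms                            # moment the GPU finishes this frame
    offset = now % SCANOUT_MS                  # how far the panel is into its current scanout
    near_boundary = offset < TOLERANCE_MS or offset > SCANOUT_MS - TOLERANCE_MS
    if near_boundary:
        print(f"frame {i}: ready at {now:.1f} ms, lands near a refresh boundary, no visible tear")
    else:
        print(f"frame {i}: ready at {now:.1f} ms, lands {offset / SCANOUT_MS:.0%} of the way "
              f"into a refresh, so it tears roughly that far down the screen")
```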

Another name for this is screen tearing, an ugly artifact that PC gamers have grudgingly come to accept as reality. But it's more than an annoyance: it can be the difference between in-game life and death. Say you're playing Battlefield 4 and a sniper camping on some mountain peak takes aim at you. The glint of his scope against the sunlight would give him away, except you didn't see it because it took place on that fragment of a frame your monitor rejected. Sure, it's an extreme case, but it underscores a very real problem.

The existing workaround is the V-Sync setting on your graphics card. Sadly, solving one problem introduces another: a scenario where your monitor is calling the shots. Now when your GPU is ready to deliver a frame, the monitor says "wait a few more milliseconds! This silly gamer doesn't want screen tearing." With V-Sync on, this manifests itself as "stutter," with frames lingering onscreen a touch longer than they're supposed to. It can be a little jarring, and make the game you're playing feel sluggish.

Ready for yet another symptom of V-Sync? The dreaded "lag." Let's go back to Battlefield 4 and imagine you just pulled the trigger during a gunfight. Guess what happens if you do it right before the monitor "accepts" the corresponding onscreen visual? That precious bullet doesn't fire at the exact millisecond you need it to. It can be infuriating.
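Here's a quick sketch of both V-Sync side effects, again just a toy model with made-up numbers rather than anything from Nvidia or AMD. It assumes classic double-buffered V-Sync on a 60Hz panel and a GPU that needs about 20ms per frame, and shows how each frame gets held until the next refresh boundary (stutter) while the wait itself becomes added lag.

```python
import math

# Toy model only: classic double-buffered V-Sync on a 60Hz panel. A finished
# frame has to wait for the next refresh boundary before it's shown, and the
# GPU stalls until that swap happens before it can start the next frame.

REFRESH_MS = 1000 / 60    # one 60Hz refresh interval, ~16.67 ms
RENDER_MS = 20.0          # hypothetical render time per frame (about 50 fps of GPU work)

start = 0.0
shown_times = []
for i in range(1, 5):
    finish = start + RENDER_MS                            # GPU finishes the frame
    shown = math.ceil(finish / REFRESH_MS) * REFRESH_MS   # ...then waits for the next refresh
    shown_times.append(shown)
    print(f"frame {i}: ready at {finish:.1f} ms, displayed at {shown:.1f} ms "
          f"({shown - finish:.1f} ms of added lag)")
    start = shown          # the GPU sits idle until the swap, then starts the next frame

gaps = [b - a for a, b in zip(shown_times, shown_times[1:])]
print("time between displayed frames:", [f"{g:.1f} ms" for g in gaps],
      "so a GPU capable of ~50 fps ends up showing only ~30 fps")
```

Drivers have tricks (triple buffering, frame rate caps) that soften this, but the underlying mismatch remains as long as the refresh rate itself is fixed.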

G-Sync and FreeSync elegantly eradicate these problems by giving your video card complete control over the display. If your game is bouncing between 40 and 75 frames per second, for example, then your monitor is going to follow suit, its refresh rate constantly changing to keep pace with the video card. Screen tearing, stutter, and input lag all go away. 
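In the same toy-model terms, that behavior looks like the sketch below. The 40Hz-75Hz window is just a hypothetical panel range (real FreeSync and G-Sync displays publish their own ranges), and the frame times are made up; the point is simply that the panel refreshes when the frame arrives, so the refresh interval tracks the frame time as long as it stays inside the supported window.

```python
# Toy model only: with a variable refresh rate display, the panel refreshes the
# moment the GPU delivers a frame, so the effective refresh rate simply tracks
# the game's frame rate while it stays inside the panel's supported window.

PANEL_MIN_HZ, PANEL_MAX_HZ = 40, 75    # hypothetical VRR window for this display

# Hypothetical frame times (ms) for a game bouncing between roughly 35 and 70 fps
frame_times_ms = [22.0, 14.5, 18.0, 25.0, 28.5, 13.8]

for i, ft in enumerate(frame_times_ms, start=1):
    fps = 1000 / ft                     # instantaneous frame rate for this frame
    if PANEL_MIN_HZ <= fps <= PANEL_MAX_HZ:
        print(f"frame {i}: {ft:.1f} ms -> panel refreshes at ~{fps:.0f}Hz, no tearing, no waiting")
    else:
        print(f"frame {i}: {ft:.1f} ms -> ~{fps:.0f} fps falls outside the "
              f"{PANEL_MIN_HZ}-{PANEL_MAX_HZ}Hz window, so the driver has to step in "
              f"(this is where the two camps differ)")
```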

Nvidia's G-Sync deserves credit for being the first solution on the scene. Aside from the bragging rights, however, a couple of key differences distinguish this variable refresh rate technology from AMD's.

Nvidia invented G-Sync to address both sides of the problem: the GPU and the monitor. Every monitor box emblazoned with a G-Sync logo packs a proprietary module. Nvidia understandably won't divulge too many details, but the module lets the company fine-tune the experience to each panel's characteristics, such as maximum refresh rate, IPS or TN panel type, and voltage. Even when your frame rate gets super low or super high, G-Sync can keep your game looking smooth.

Nvidia points to ghosting as a key advantage G-Sync has over AMD’s FreeSync. Its G-Sync module prevents ghosting by customizing the way it operates on each and every monitor. With AMD, these adjustments are made within the Radeon driver itself, while the display's firmware is in charge of other parts of the mix. One of Nvidia’s loudest arguments is that AMD may or may not keep pace with those changes on the driver level. With Nvidia’s G-Sync module, because each monitor is physically tweaked and tuned, keeping up with all the panel variations is part of the job. Only time will tell if that argument rings true.

I have seen ghosting in AMD FreeSync panels like the Acer XG270HU, but never in a G-Sync monitor, though the ghosting issues in some earlier FreeSync displays have since been corrected via monitor firmware updates. PC Perspective produced a video comparing the ghosting effects in early FreeSync monitors against the Asus ROG Swift, a G-Sync monitor.

AMD based FreeSync on a royalty-free, industry-standard spec known as DisplayPort Adaptive-Sync. The indisputable fact here is that monitor manufacturers don't need to implement a proprietary hardware module, which keeps their costs down. Ideally, those savings get passed on to you, the consumer. In fact, across the board, FreeSync price tags trend a bit lower.

Let’s take a quick look at two gorgeous FreeSync monitors from Acer. Both of them are sexy, curved UltraWide displays with an IPS panel at 3440x1440 resolution. Both have a 4ms response time, and both include HDMI and DisplayPort inputs. They’re nearly identical, except that the XR341CK supports FreeSync and costs $1099. The G-Sync version—the Predator X34—costs $200 more. Granted, it rocks a slightly higher 100Hz refresh rate, but the G-Sync markup is obvious. That’s an expensive example, but it doesn’t hurt AMD’s argument that FreeSync is the more affordable solution.

As you can imagine given the frequently bitter rivalry between AMD and Nvidia, your GeForce GTX video card won't support FreeSync, and your Radeon video card won't give you that buttery smooth experience on a G-Sync monitor. Yes, Nvidia has the option of adopting the FreeSync/Adaptive-Sync standard, as Intel plans to do one day, but if you'd invested millions into developing a technology to exclusively benefit your users, would you?

Worries about brand lock-in aside, let's say you want to take the plunge. Is your beloved graphics card compatible? On the Nvidia side the answer is simple: every GeForce GTX card from the 650 Ti Boost onward will do the trick, including every 700 series and 900 series desktop graphics card. With AMD the support is a bit scattershot, because some of the company's offerings are based on older GPUs. For example, the Radeon R7 360 is FreeSync compatible but the R7 370 isn't. The R7 260, R7 260X, R9 285, R9 290, and R9 290X are ready for FreeSync, but the R9 270 and 270X aren't. And so it goes.

Here's something cool, though: a strong handful of AMD's affordable APUs (all-in-one CPU-and-GPU chips) also support FreeSync, which opens up the possibility of building a cheap 1080p gaming box and still getting a smooth gaming experience courtesy of FreeSync.

Can you get variable refresh rate on a notebook? Yes, but only from Nvidia. A handful of G-Sync-powered notebooks are on the market right now from Asus and MSI, with more on the way from popular manufacturers like Clevo and Gigabyte.

Unlike desktop monitors, notebook displays won’t require that proprietary G-Sync module, but to ensure quality, Nvidia is pretty stingy with its approval process. For example, all of the current G-Sync-enabled laptop displays top out at 75Hz, not the standard 60Hz. As for supported mobile GPUs, right now it’s just the 965M, 970M, and 980M. Nvidia is dedicated to the G-Sync cause, so expect to see a proliferation of G-Sync gaming notebooks in the near future.

Are there any deal-breaking drawbacks? In our experience, no. Both will greatly improve your gameplay experience.

A minor niggle, however: As things stand right now, both technologies work only with DisplayPort inputs, meaning the vast majority of TVs are locked out of the equation. That’s bad news for folks rocking an HTPC or perhaps an upcoming Steam Machine in their living rooms over HDMI. AMD may come to the rescue in the near future, as the company recently demonstrated an early version of FreeSync working over HDMI. Nvidia currently has no announced plans to incorporate HDMI into G-Sync.

Both FreeSync and G-Sync work exceptionally well at combating the decades-old visual problems plaguing PC gaming. Neither has an exclusive feature compelling enough to warrant switching camps, however. Nvidia has a slight advantage at the very low and very high end of the frame rate spectrum, and its G-Sync does a better job with ghosting, but these are what we’d call edge cases that won’t affect the vast majority of gamers. On the other hand, AMD has a price advantage with comparable FreeSync monitors clocking in at an average of $100 to $150 less.

Whichever you choose, we enthusiastically encourage you to jump on the variable refresh rate train, as long as you’re okay with the fact that you’re locking yourself into purchasing graphics cards from the same brand for the life of the display. If you’re rocking Radeon, make FreeSync your next monitor upgrade. If you’re gaming with GeForce, take a good hard look at the crop of G-Sync options. It can’t be stressed enough how dramatically they improve a game’s immersion, and how effectively they eliminate nasty screen tearing, stutter, and input lag, all without introducing new problems into the mix.

I’ll go on record saying that if given the choice between a non-G-Sync/FreeSync 4K monitor and a smaller, G-Sync/FreeSync-enabled 1440p monitor, I’ll choose the latter every single time. It’s just that awesome. Find a way to witness it for yourself and you’ll be convinced.

(www.pcworld.com)

Jason Evangelho
