Hands-on with AMD's FreeSync: The technology that could kill Nvidia's G-Sync

March 19, 2015
If there's one thing the tech market doesn't need, it's another standards cat fight. But you survived FireWire vs. USB, HD-DVD vs. Blu-ray, and RDRAM vs. DDR, so get ready for the battle between Nvidia's G-Sync and AMD's FreeSync to kick into high gear.

That war formally kicked off this morning, when AMD announced that no fewer than four monitors supporting its sync technology are finally available for sale in the U.S., with another seven expected soon. By the end of the year, the company says, expect 20 monitors supporting FreeSync to be available.

FreeSync and G-Sync, if you didn't know, are technologies from the two leading graphics companies that synchronize a monitor's refresh with the frames the graphics card actually outputs. Both promise to eliminate tearing and stuttering in games, but the two aren't compatible with each other, and they take different routes to get there.
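If you want a more concrete picture of why that matters, here's a minimal, purely illustrative sketch: a toy timing model (not code from AMD or Nvidia, and the 60Hz/45-fps figures are just assumptions) showing why a fixed-refresh monitor ends up repeating frames when the GPU can't keep pace, while a variable-refresh display simply waits for each finished frame.

    # Toy timing model: compare a fixed 60Hz display (with V-sync) against an
    # adaptive-refresh display when the GPU only manages ~45 fps.
    FIXED_REFRESH_MS = 1000 / 60   # a traditional monitor scans out every ~16.7 ms
    RENDER_TIME_MS = 1000 / 45     # assumed GPU render time per frame (~22.2 ms)

    def fixed_refresh_with_vsync(refreshes=10):
        """Return which finished frame the monitor shows at each fixed refresh.
        Repeated numbers mean a frame was displayed twice -- visible as stutter."""
        shown = []
        completed = 0
        next_frame_done = RENDER_TIME_MS
        t = 0.0
        for _ in range(refreshes):
            t += FIXED_REFRESH_MS
            while next_frame_done <= t:      # frames the GPU finished before this refresh
                completed += 1
                next_frame_done += RENDER_TIME_MS
            shown.append(completed)          # the display can only repeat the newest frame
        return shown

    def adaptive_refresh(frames=10):
        """With adaptive sync the display refreshes the moment a frame is ready,
        so every frame appears exactly once, evenly spaced at the render rate."""
        return [round(n * RENDER_TIME_MS, 1) for n in range(1, frames + 1)]

    print("Fixed 60Hz + V-sync, frame shown per refresh:", fixed_refresh_with_vsync())
    print("Adaptive refresh times in ms, one per frame: ", adaptive_refresh())

Run it and the first list shows duplicate frame numbers (the stutter), while the second shows one refresh per frame at an even cadence, which is the whole pitch behind both technologies.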

Nvidia's G-Sync was out of the chute first, having been announced way back in October of 2013. From all accounts it was impressive. The problem was actually getting G-Sync monitors. The first ones were based on existing stocks of high-refresh-rate 3D monitors, modified by OEMs, and it took months for even the press to touch the panels, let alone the public. G-Sync puts actual hardware inside the monitor that communicates with Nvidia's modern GPUs. The hardware, naturally, is sold only by Nvidia.

Within months of Nvidia announcing G-Sync, AMD introduced FreeSync. Instead of having panel makers add modules to their monitors, AMD's version would rely on an idea already being kicked around: varying refresh rates in laptops to save power. And unlike Nvidia's tack of getting hardware out as soon as possible by using proprietary components, AMD proposed putting variable refresh support directly into the spec, so future monitors would support it. AMD was successful, and VESA, the group that blesses monitor standards, baked Adaptive Sync into DisplayPort 1.2a last April.

G-Sync actually works

My experience with Nvidia's G-Sync, aside from trade shows and demos, has been limited to an Acer 4K G-Sync panel. Outside the control of any company, and without someone peering over my shoulder while I mucked with it, I'd have to say G-Sync is exceedingly easy to set up. You just need a compatible GeForce GPU (which covers most of Nvidia's modern GeForce 600-, 700-, and 800-series and Titan GPUs) and a driver with G-Sync support. Switch it on in the Nvidia control panel and you're done.

On the Acer 4K panel I drove, the impact is fairly impressive. Not only are stutter and tearing reduced to almost nothing, you can actually play games at lower frame rates you'd normally turn your nose up at. That was one of the early arguments for G-Sync, too: Even though it added cost to the monitor, you could ease back a bit on your GPU budget to fund the purchase, since a frame rate of 45 frames per second would be acceptable with G-Sync.

As it's the more mature technology, there's really no mystery here. G-Sync works and it works great if you have an Nvidia GPU. PCWorld's GPU reviewer Brad Chacos, for example, ran into a problem trying to get a Radeon R9 290X to work with an Acer 4K G-Sync panel and had to abandon it for a standard Dell 4K monitor to complete his testing of that card.

Firing up FreeSync

AMD's FreeSync isn't quite as easy, but it's by no means difficult. You need a compatible Radeon card or APU. While Nvidia's list of supported cards stretches pretty far back and deep, AMD's is very limited: the Radeon R9 295X2, 290X, 290, and 285, plus the Radeon R7 260X and 260. That's it for discrete GPUs.

You'll also need a compatible driver to switch on the support. For my hands-on, I relied on a beta driver that AMD provided.

I used LG's wide-aspect 34-inch 34UM67 display. This is a 2560x1080 IPS panel with a maximum refresh of 75Hz. To turn on FreeSync, you go into the monitor's OSD, drill into the general settings tab, and flip it on.

Once that's done, you'll also need to make sure it's enabled in the control panel for your Radeon card. I failed on my first attempt, because the machine I used--a Core i7-5960X Haswell-E rig with a pair of Radeon R9 290X cards--had CrossFireX enabled by default when I installed the new driver. Turning CrossFireX off allowed FreeSync to be switched on, but I lost the power of one GPU. Yup, there's no CrossFireX support--yet. Remember, it's a beta. An AMD official said to expect CrossFireX capability with FreeSync next month. We'll see. Did I mention that G-Sync works fine in SLI?

My first test was AMD's own FreeSync demo: a windmill with rotating blades. Tearing was very apparent with both FreeSync and V-sync turned off. With V-sync turned on, I ran into the stuttering that occurs when the timing of rendered frames falls out of sync with the monitor's refresh.

FreeSync isn't just about demos, so I also fired up Codemasters' Grid 2 driving game. I wanted an older game that would hit high frame rates, and this two-year-old title set to medium quality didn't disappoint. Even on the wide-aspect 2560x1080 monitor I could hit in excess of 250 frames per second with V-sync switched off and FreeSync on. Such an excessive frame rate would normally exhibit tearing, but I didn't see any. Unfortunately, I didn't have an identical G-Sync rig and monitor nearby, so I could only rely on my memories of running 4K with G-Sync. But FreeSync seems to work pretty damned well.

That makes this whole standards war even harder to call. If one were clearly superior you might want to root for it, but both technologies seem to work, and for most people the end result will be the same. Yes, there's a chance Nvidia could spring some secret feature, since its technique reaches far deeper into the guts of a monitor than AMD's, but for variable refresh rates, most people won't be able to tell any difference. AMD says FreeSync actually improves frame rates over G-Sync when enabled, but the difference is minimal: With FreeSync on, you might see a 0.16-percent boost, versus a 1.14-percent frame rate hit with G-Sync. Really?

That leaves consumers in a pretty tough bind on a new monitor purchase: Do you pick G-Sync or FreeSync on something you'll probably use for at least the next five years?

And if you think that's hard, imagine being a monitor maker stuck in the middle. Most of the G-Sync panel vendors are also making FreeSync monitors.

PCWorld spoke to one panel maker, who asked not to be identified for fear of wrath from either company, and the vendor confirmed it wishes the fight were over, with Adaptive Sync crowned the winner.

At the same time, the vendor said AMD's claim that FreeSync is completely free isn't necessarily true, since panel makers must retool to bake in FreeSync. The cost still won't be as high as Nvidia's G-Sync module, though: AMD claims those cost $150, but I'm told it's closer to $100. Nvidia officials declined to comment on cost; they also said the company doesn't spill details on its partner relationships.

In the end, the monitor vendor said it would be better for everyone if Nvidia and AMD would compete on GPU technology, drivers, and price rather than this sync war.

So who will win?

But back to the consumer, who will be forced to choose between the two when buying a monitor. If the 11 monitors that support FreeSync all actually appear, AMD will have an advantage in support. Even almost a year and a half after announcing G-Sync, Nvidia's own page lists just six current G-Sync panels. If AMD is right, and we see 20 FreeSync panels by the end of this year, that's a strength in numbers G-Sync has never enjoyed.

Nvidia's strength, on the other hand, is the popularity of its GPUs. Most hardware surveys give Nvidia roughly a 2:1 advantage in discrete graphics market share, which means there's a higher chance of a gamer buying a G-Sync monitor to match his or her Nvidia GPU. 

To balance that out, monitors using FreeSync appear to have a price advantage: A 27-inch G-Sync monitor with a resolution of 2560x1440 is $780, while a competing FreeSync monitor with the same-size panel and resolution is $630. In addition to the cost advantage, AMD points to the fact that since FreeSync is baked into the DisplayPort 1.2a spec, any new monitor introduced going forward will support FreeSync by default.

Some believe that Nvidia could easily end the debate by supporting Adaptive Sync as well, but that's unlikely to happen as long as Nvidia leads the GPU market and can convince gamers to spend more to buy G-Sync displays. An Nvidia spokesman pretty much confirmed that: "We have no plans to support the Adaptive Sync optional protocol," he told PCWorld. "We are only focused on G-Sync."

In other words, hunker down, because this war is going to last longer than you want it to.

(www.pcworld.com)

Gordon Mah Ung
