Acer's $500 AMD FreeSync monitor drastically undercuts Nvidia G-Sync pricing

March 24, 2015
In the months leading up to the launch of monitors compatible with AMD's FreeSync technology, everyone speculated that the Radeon graphics-card-friendly displays would severely undercut comparable Nvidia G-Sync monitors on price. It's easy to see why: While Nvidia's technology eliminates stuttering and screen tearing with the help of a proprietary hardware module inside the display itself, AMD's simply relies on the optional DisplayPort 1.2a Adaptive Sync spec--no fancy-pants extra hardware required.

Well, AMD launched FreeSync last week. And at least one FreeSync display is already drastically undercutting the G-Sync competition, as Acer announced yesterday that its XG270HU FreeSync monitor costs $500, as opposed to the $800 sticker price of Acer's similar XB270HU G-Sync display.

That's not quite a fair comparison, however. Both 27-inch monitors pack 2560x1440 resolution and 144Hz refresh rates, but the FreeSync monitor also has a few features its G-Sync counterpart lacks: built-in speakers, dual-link DVI, and an HDMI 2.0 port. (Both monitors pack a DisplayPort connection.) But more crucially, Acer's FreeSync-compatible monitor packs a twisted nematic (TN) panel, while the G-Sync monitor uses an in-plane switching (IPS) display.

TN panels offer superior response times to IPS panels--as evidenced by the FreeSync monitor's 1ms response time, versus the G-Sync monitor's 4ms--but IPS monitors offer superior color reproduction and viewing angles, so they cost more. By using a TN panel in its FreeSync display rather than matching the XB270HU's IPS panel, Acer not only keeps the price tag down, but also obscures how much of the price gap comes from the G-Sync hardware itself.

The cost of adding Nvidia's G-Sync hardware module to a display is a closely guarded secret, but it's estimated to add an extra $100 to $150 to the final retail sticker price of a monitor.

But Acer's panel choice renders any comparison between its FreeSync and G-Sync monitors an apples-to-oranges affair.

A much better comparison is Acer's $500 XG270HU FreeSync display versus Asus' ROG Swift, a 27-inch G-Sync display that also sports a 2560x1440, 144Hz TN panel and a 1ms response time. Asus' G-Sync monitor also costs a cool $800, just like Acer's IPS G-Sync display.

That's a gargantuan $300 price difference between two closely comparable G-Sync and FreeSync monitors.

Pick your poison

While I've only tried G-Sync--and instantly became a true believer in adaptive refresh technology--my colleague Gordon Mah Ung went hands-on with LG's widescreen FreeSync display and found it just as compelling. But in an interview with Forbes' Jason Evangelho, Nvidia's Tom Petersen defended the continued use of the G-Sync module, saying the hardware gives Nvidia and its partners greater control over how displays handle drastic frame-rate swings and flickering below the screen's minimum refresh rate, among other things.

Petersen pointed to evidence of ghosting in FreeSync displays, which PC Perspective thoroughly documented in its FreeSync impressions piece. (Check out page three for the ghosting discussion.) Evangelho says he saw ghosting effects on the FreeSync monitor he's testing, though the LG 34UM67 we tested here at PCWorld showed no sign of it. AMD told PC Perspective that the components display manufacturers choose can affect ghosting.
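Neither Nvidia nor AMD publishes exactly what its hardware does when the frame rate drops below a panel's minimum refresh rate, but the general idea is straightforward: redraw each frame multiple times so the panel never has to refresh slower than it physically can. The Python sketch below illustrates that frame-repeating concept only; the PANEL_MIN_HZ and PANEL_MAX_HZ values and the refresh_plan function are assumptions for illustration, not either vendor's actual implementation.

# Illustrative sketch of frame repetition below a panel's minimum refresh
# rate. All names and numbers here are hypothetical assumptions, not
# Nvidia's or AMD's actual logic.

PANEL_MIN_HZ = 40   # assumed minimum refresh rate of the panel
PANEL_MAX_HZ = 144  # assumed maximum refresh rate of the panel

def refresh_plan(fps: float) -> tuple[int, float]:
    """Return (repeats per frame, effective panel refresh rate in Hz).

    If the GPU delivers frames slower than the panel's minimum refresh
    rate, the controller can redraw each frame several times so the
    panel stays inside its supported refresh window instead of
    flickering or dropping out of adaptive sync.
    """
    if fps >= PANEL_MIN_HZ:
        # Frame rate is within the panel's window: one redraw per frame.
        return 1, min(fps, PANEL_MAX_HZ)
    repeats = 2
    while fps * repeats < PANEL_MIN_HZ:
        repeats += 1
    return repeats, fps * repeats

for fps in (144, 60, 35, 20):
    repeats, hz = refresh_plan(fps)
    print(f"{fps:>3} fps -> draw each frame {repeats}x, panel runs at {hz:.0f} Hz")

Running the sketch shows, for example, that a hypothetical 20 fps output would be drawn twice per frame so the panel keeps refreshing at 40Hz--the kind of low-end behavior Petersen argues a dedicated hardware module handles more gracefully.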

The story behind the story: Forget 4K--adaptive refresh is the display technology most gamers actually need, and it's wonderful in action. But the competing FreeSync and G-Sync standards introduce a scary new potential for lock-in: A G-Sync display's adaptive refresh only works with a GeForce card, and a FreeSync display's only with a Radeon card, so buying one of these monitors likely means pledging your allegiance to either Nvidia or AMD for years to come. Displays tend to last far longer than graphics cards.

Choose wisely. Or just sit it out and hope an open standard emerges if you don't want to lock yourself into buying Radeon or GeForce cards exclusively for the next decade.

(www.pcworld.com)

Brad Chacos
