Radeon R9 Fury review: The Fury X's little brother is AMD's GeForce GTX 980-slayer

July 10, 2015
AMD's water-cooled, luxuriously designed Radeon Fury X graphics card was supposed to be the star of the show--the technology-packed counterpunch to Nvidia's ferocious GeForce GTX 980 Ti. AMD drip-drop-dripped information about the Fury X in the days ahead of the card's launch, slowly teasing enthusiasts with leaks and glimpses and internal benchmarks. In the end, the $650 Fury X lived up to its name, proving competitive with--if not outright dominant over--Nvidia's beast, and promptly selling out in stores.

But forget about the Fury X.

It's the Fury X's little brother that AMD should be shouting about from the rooftops: the $550 Radeon Fury. Sure, it's not quite as powerful as AMD's liquid-chilled flagship, but the Radeon Fury is nothing less than a stellar card that clearly outpunches its GeForce GTX 980 counterpart--something the Fury X can't quite claim against the 980 Ti.

Let's dig in.

AMD Radeon Fury detailed

AMD originally implied that the Radeon R9 Fury was merely an air-cooled version of the Fury X, but that's not quite true.

The Fury indeed mimics the vast majority of the Fury X's technical features, from its 4GB of cutting-edge high-bandwidth memory (HBM) to its 275W power draw via a pair of 8-pin connectors. All of the software features found in the Fury X also work with the Fury, from Frame Rate Target Control to Virtual Super Resolution.

The differences from the Fury X are fairly major, however. First, and most noticeable: The Radeon Fury is indeed air-cooled, while the Fury X is available only in its liquid-cooled reference design. AMD partners are allowed to slap customized hardware and overclocks on the Fury, which Asus did to full effect with the Strix R9 Fury DirectCU III OC we reviewed.

The card largely replicates the design of Asus' Radeon R9 390X Strix, featuring Asus' vaunted DirectCU III triple-fan cooling system for cooler, quieter running. Those fans actually stay off (and therefore silent) until the card hits 65 degrees Celsius, relying on the beefy heatsinks and pipes underneath to cool things down.

The sleek-looking backplate on the back of the card leaves the area behind the Fiji GPU and HBM exposed for more airflow. (The pulsating, illuminated red-and-white Strix logo on the side of the card is a nice touch, too.) Interestingly, while the reference-only Fury X eschews a DVI port completely, Asus stocked the Strix Fury with DVI-I, HDMI (still 1.4, sadly), and a trio of DisplayPorts.

While the Fury X was a pint-sized powerhouse, measuring in at just 7.6 inches, the air-cooled Strix Fury is a full-length graphics card, measuring 11.8 inches long. Sapphire's competing Tri-X R9 Fury is also a full-length card, though that's because Sapphire extends the cooling assembly beyond its shorter-than-normal circuit board. In other words: Don't expect Fury boards to be as tiny as the Fury X.

More significant than the aesthetics and cooling are the Fury's under-the-hood tweaks, which AMD didn't previously disclose. The Fury sports a cut-down version of the beefy new Fiji GPU found in the Fury X, shedding 32 texture units and 512 stream processors, and trimming 50MHz off the maximum clock speed for a 1000MHz boost clock. The Asus Strix Fury is overclocked to a 1000MHz base clock with a 1020MHz boost clock.
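
To put that trim in perspective, a little back-of-the-envelope math helps. The sketch below uses the full Fiji chip's published specs (4096 stream processors at a 1050MHz boost clock in the Fury X) and the standard theoretical-peak formula of two operations per stream processor per clock; it's a paper comparison, not a measured result.

    # Theoretical peak single-precision throughput, full Fiji vs. the
    # cut-down Fiji in the Fury. Assumes 2 ops (one fused multiply-add)
    # per stream processor per clock: spec-sheet math, not measurement.

    def peak_tflops(stream_processors, clock_mhz):
        return 2 * stream_processors * clock_mhz * 1e6 / 1e12

    fury_x = peak_tflops(4096, 1050)           # full Fiji (Fury X)
    fury = peak_tflops(4096 - 512, 1050 - 50)  # Fury: -512 SPs, -50MHz

    print(f"Fury X: {fury_x:.2f} TFLOPS")      # ~8.60 TFLOPS
    print(f"Fury:   {fury:.2f} TFLOPS")        # ~7.17 TFLOPS
    print(f"Gap:    {1 - fury / fury_x:.0%}")  # ~17% lower on paper

On paper, then, the Fury gives up roughly a sixth of its big brother's shader throughput, which lines up with the modest real-world gap you'll see in the benchmarks below.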

We received our review sample a scant 15 hours before the review embargo lifted, so we were unable to test out the Fury's overclocking capabilities. However, the modest factory OC on the Asus Strix Fury (and on Sapphire's Tri-X R9 Fury as well, at 1050MHz) suggests that this card may not be an overclocking fiend. Remember: Our attempts to overclock the liquid-cooled Fury X resulted in a mere 60MHz boost, good for an extra 2 to 3 frames per second in gameplay. The Strix Fury packs 12-phase Super Alloy Power II components, DIGI+ VRM, and Asus' stellar GPU Tweak II overclocking software to help you squeeze all the performance you can out of the card.

Before we move on to the meat of the review--performance benchmarks!--a few quick notes. The R9 Fury, of course, supports AMD's defunct Mantle API, as well as Mantle's performance-enhancing successor, Vulkan, and the similar DirectX 12 technology coming with Windows 10.

Radeon Fury performance benchmarks

Let's clear the air right off the bat: While the Radeon Fury isn't quite as capable as the Fury X or the GTX 980 Ti, it pummels the GTX 980 in many games, and it fights the GTX 980 to a dead heat in the handful of titles where it doesn't win outright. Heck, you could even feasibly use this card for single-card 4K gaming in today's games, though if you do, you'd definitely want to invest in a FreeSync panel to smooth things out. The GTX 980 and R9 390X simply can't make that claim.

As ever, we tested the Asus Radeon Strix Fury on PCWorld's graphics card testing system; you can read our build guide for the machine if you're interested in the full specifications.

We tested each title using the in-game benchmark provided, and stuck to the default graphics settings unless mentioned otherwise. A mix of both AMD- and Nvidia-leaning titles was used. V-Sync and G-Sync were always disabled.

To see how the Fury stacks up, we've compared it to the $650 Fury X, the $650 GTX 980 Ti, the vanilla $500 GTX 980, Asus' overclocked, custom $469 Strix R9 390X, and AMD's older Radeon R9 290X reference card (atrocious stock cooler and all). The stock Fury price is $550; the Asus Strix Fury is $580.

First up: The long-awaited Grand Theft Auto V. This game is known for using a punishing amount of memory, but the Fury's 4GB of HBM holds up just fine even at 4K resolution with all the graphics options cranked. Enabling MSAA effects at 4K sends the total memory use over the card's 4GB capacity, but doing so doesn't really add any benefit to the visuals, because the graphics already look so damn smooth at 4K.
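
To see why MSAA gets so expensive at 4K, consider some rough render-target math. The sketch below is a simplified estimate: real engines juggle many more buffers, and the byte counts assume generic 32-bit color and depth formats, not GTA V's actual internals.

    # Back-of-the-envelope render-target memory at 4K, with and without
    # 4x MSAA. Generic 32-bit color + 32-bit depth formats are assumed;
    # a real engine allocates many additional buffers on top of this.

    width, height = 3840, 2160
    bytes_per_pixel = 4 + 4  # 32-bit color + 32-bit depth/stencil
    samples = 4              # 4x MSAA stores 4 samples per pixel

    base_mb = width * height * bytes_per_pixel / 2**20
    msaa_mb = base_mb * samples

    print(f"No MSAA:  {base_mb:.0f} MB")  # ~63 MB
    print(f"4x MSAA: {msaa_mb:.0f} MB")   # ~253 MB

Multiply that kind of growth across the dozens of intermediate buffers a modern renderer keeps in flight, plus 4K-scale textures, and it's easy to see how 4x MSAA pushes GTA V past a 4GB card.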

We tested it three ways: at 4K with every graphics setting set to Very High with FXAA enabled, at 2560x1440 with the same settings, and at 2560x1440 with the same settings but with 4x MSAA and 4x reflection MSAA enabled. AMD's new Catalyst 15.7 drivers appear to have caused a big performance jump in the title, but sadly, we weren't able to retest the Fury X's performance for this review due to time constraints.

The Strix Fury and GTX 980 hang neck-and-neck here, which is a huge accomplishment for AMD considering how much better Nvidia cards ran the title at its launch.

Next up: Middle-earth: Shadow of Mordor. We tested it at the default Medium and High graphics presets, then by cranking everything to its highest available option and using the optional (free) HD Textures Pack download, which consumes a big chunk o' memory itself. HBM's sheer speed helped the Fury handle the extra RAM usage just fine.

Dragon Age: Inquisition was one of the best games of 2014, and it's a looker, too, utilizing EA's powerful Frostbite engine.

Sniper Elite III is a blast to play, though not as demanding on graphics cards as most of the other titles. 

Sleeping Dogs: Definitive Edition is a beefed-up remake of the surprisingly awesome sleeper hit, and it can absolutely murder graphics cards at its most extreme graphics settings, no matter the resolution.

Metro: Last Light Redux is another remake of a tremendous game, built using 4A Games' custom 4A engine.

Alien: Isolation, like Dragon Age, is an AMD Gaming Evolved title, and like Dragon Age, it scales well across all hardware. (It's also utterly terrifying and stress-inducing.)

Finally, oldie but goodie BioShock Infinite is our stand-in for Unreal Engine 3. Both AMD and Nvidia have had plenty of time to optimize their drivers for the game by this point.

We also ran the tried-and-true 3DMark Fire Strike and Unigine Valley synthetic benchmarks. 3DMark Fire Strike Ultra is a more robust version of Fire Strike designed to test a graphics card's 4K chops.

To test power consumption and GPU temperature, we run the grueling worst-case-scenario FurMark benchmark for 15 minutes, taking temperature information at the end using the tool's built-in temperature gauge and verifying it with SpeedFan. Power draw is measured during the run on a whole-system basis, not for the GPU individually, by plugging the computer into a Watts Up Pro meter rather than directly into the wall.
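
If you want to estimate how much of a whole-system reading belongs to the graphics card alone, the usual back-of-the-envelope approach is to subtract an idle baseline and discount the power supply's conversion losses. The sketch below uses made-up sample wattages and an assumed 90 percent PSU efficiency, not our actual meter readings.

    # Rough GPU power estimate from wall-socket (whole-system) readings.
    # The wattages below are placeholder examples, not measured data.

    idle_watts = 90.0      # whole system idling, measured at the wall
    load_watts = 390.0     # whole system under FurMark, at the wall
    psu_efficiency = 0.90  # assumed for an 80 Plus Gold unit under load

    # Wall-socket deltas include PSU conversion losses, so scale down to
    # approximate the DC power actually fed to the card. Note the CPU
    # also works harder under load, so this still overestimates a bit.
    gpu_estimate = (load_watts - idle_watts) * psu_efficiency

    print(f"Estimated GPU draw: {gpu_estimate:.0f} W")  # 270 W here

It's a crude estimate at best, which is exactly why we report the whole-system number instead of a per-card figure.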

Note that while the power draw and temperatures appear high for the Strix Fury in the chart, these numbers represent a worst-case scenario. During actual game use, the Fury hit roughly 75 degrees Celsius under the most extreme gaming loads. (Also note how brightly the Fury X's integrated closed-loop water cooling shines when it comes to temperature.)

Bottom line

There you have it: Despite rocking a cut-down version of AMD's hulking new Fiji GPU, the Strix Fury still packs a hell of a punch, generally landing closer to the GTX 980 Ti than to the GTX 980 in terms of performance. And it's definitely a vast improvement over the older R9 290X in performance, power consumption, and sheer heat.

Out of the box, the Fury delivers exactly what you'd want from a card at its price point. For $50 more than the GTX 980, it offers a solid jump in performance. The Radeon Fury is unequivocally the superior 2560x1440 gaming option, period.

Don't expect its release to shake up pricing too hard, though: The GTX 980 and Radeon R9 390X still seem well positioned when it comes to price/performance.

The Fury also delivers a (comparatively) cheap entry into reasonable 4K gaming. There are some concerns that prevent the Fury from being an absolute must-buy for folks looking for a lower-cost, single-card 4K experience, however, no matter how close it comes to the Fury X and GTX 980 Ti.

First of all, with High graphics settings enabled at 4K resolution, it juuust squeaks past the 30fps minimum required for a decent gameplay experience in a few of the titles (Dragon Age, GTA V, Sleeping Dogs). To be honest, 4K gaming on a single GPU is still in its infancy, and games only become more demanding as time goes on. If you're looking to play games at 4K resolution with a single graphics card today, you'd probably be better off spending the extra $100 for higher frame rates and grabbing a Fury X, GTX 980 Ti, or AMD's beastly dual-GPU Radeon R9 295X2, which outpunches both of the others. The slightly higher performance those cards provide would give you a more comfortable level of future-proofing.

You wouldn't want to drop $550 on a graphics card and have to drop down to Medium settings at 4K in next year's hottest games. That would hurt.

Relatedly, while HBM is a powerful new technology delivering an insane amount of memory bandwidth, the 4GB cap on the first-generation version is worrisome for people looking to dip their toes into 4K gaming. Gaming at such a high resolution absolutely chews through RAM, and current-day titles like GTA V and Shadow of Mordor are already skirting the Fury's 4GB capacity at 4K. The Fury can run today's games at 4K without issues, but what about tomorrow's? AMD's engineers think driver optimizations can keep things running smoothly in the future, but that has yet to be proven.

Pushing the card even further is another potential hesitation point for enthusiasts. While we didn't have time to test the Fury's overclocking chops, the GPU itself proved resistant to MOAR POWER when we tested the Fury X. GeForce cards built using Nvidia's supremely power-efficient Maxwell architecture are known for their extreme overclocking prowess, with some chips hitting an insane 20 percent OC. That said, the Strix Fury solidly beats the GTX 980 in most games, so even Maxwell's vaunted overclocking capabilities might not be enough to tip the scales.

Don't let those qualms turn you off the Radeon Fury, though. Out of the box, the air-cooled Fury does everything we'd hoped it'd do--perhaps even a bit more--by blowing the pants off Nvidia's GeForce GTX 980 in most games and delivering a stunning 2560x1440 gaming experience. This card comes highly recommended at that resolution.

And if you're looking for the lowest-cost way to start legitimate 4K gaming today, the Radeon Fury delivers--as long as you keep our caveats about future-proofing in mind, and you buy a FreeSync monitor.

(www.pcworld.com)

Brad Chacos
