Nvidia GeForce GTX Titan X review: Hail to the new king of graphics cards

17.03.2015
Nvidia sure knows how to strike a killer first impression.

The company revealed its new GeForce GTX Titan X not with a massive event, not with a coordinated marketing blitz, but by CEO Jen-Hsun Huang striding unannounced into Epic's Game Developers Conference panel, introducing "the most advanced GPU the world has ever seen," autographing one for Epic's Tim Sweeney, then casually striding back out.

Like a boss.

Nvidia's walking the walk to back up the talk, though. The $1,000 Titan X truly is the bestest, baddest, most firebreathing single-GPU graphics card in all the land--and it's the first one able to play many games on high detail settings at 4K resolution all by its lonesome, with no multi-card setup necessary. It is a beast.

This is going to be fun.

Meet the Titan X

Let's talk about technical design before jumping into raw performance specs. Huang stayed vague on tech specs when he revealed the Titan X at GDC, only teasing that the graphics card contains 8 billion transistors and 12GB of memory. That extreme amount of memory led some to believe the Titan X would be a dual-GPU card, like AMD's Radeon R9 295x2 or Nvidia's own Titan Z.

Nope.

The Titan X's beating heart is the all-new 28nm GM200 graphics processing unit (GPU), which is basically the bigger brother of the GM204 chip found in the GTX 980 and 970. Since it's based on "Big Maxwell" rather than the GTX 960's newer GM206 chip, the Titan X lacks the GTX 960's H.265 decoding abilities, and likely its HDCP 2.2 compliance as well. (We've asked Nvidia but haven't received an answer yet.) GM200 can handle H.265 encoding, however.

Built using the same energy-efficient Maxwell architecture as its GTX 900-series brethren, the Titan X packs a whopping 3072 CUDA cores--compared to the GTX 980's 2048--along with 192 texture units. The card comes clocked at 1000MHz, with a boost clock of 1075MHz. You can see the full list of specifications in the chart at right. For most of the core GPU specs, it's basically a GTX 980 plus 50 percent more.

That 12GB of onboard RAM is clocked at a speedy 7Gbps--just like the GTX 900-series graphics cards--and it utilizes a 384-bit bus. AMD's high-end Radeon GPUs use a wider 512-bit bus, but slower 5Gbps memory, for comparison.
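If you're curious how those two configurations stack up on paper, here's a quick back-of-the-envelope sketch (illustrative Python, using only the bus widths and data rates quoted above rather than vendor-published bandwidth figures): peak memory bandwidth is simply bus width times effective data rate.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Bits moved per transfer times transfer rate, divided by 8 bits per byte.
    return bus_width_bits * data_rate_gbps / 8

print(peak_bandwidth_gbs(384, 7.0))  # Titan X: 384-bit bus at 7 Gbps -> 336.0 GB/s
print(peak_bandwidth_gbs(512, 5.0))  # R9 290X-class: 512-bit bus at 5 Gbps -> 320.0 GB/s

In other words, the narrower bus paired with faster memory actually comes out slightly ahead.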

Physically, the black, aluminum-clad Titan X rocks three DisplayPort connections, a solitary HDMI 2.0 port, and dual-link DVI. The card draws 275 watts of power through an 8-pin and 6-pin power connection. It measures 10.5 inches long in a traditional dual-slot form factor. Unlike Nvidia's GTX 980 reference card, the Titan X has no backplate, ostensibly to better facilitate cooler airflow in multi-card setups.
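For context, a minimal power-budget sketch (using the PCI Express spec limits of 75 watts from the slot, 75 watts from a 6-pin connector, and 150 watts from an 8-pin connector) shows that the 275-watt rating fits with a bit of headroom:

# PCI Express spec maximums: slot 75 W, 6-pin 75 W, 8-pin 150 W.
PCIE_SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

available_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W in total
titan_x_board_power_w = 275                           # rated draw from the text above
print(f"Spec headroom: {available_w - titan_x_board_power_w} W")  # 25 W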

Speaking of cooling, here's how Nvidia describes the Titan X's cooler design:

"A copper vapor chamber is used to cool TITAN X's GM200 GPU. This vapor chamber is combined with a large, dual-slot aluminum heatsink to dissipate heat off the chip. A blower-style fan then exhausts this hot air through the back of the graphics card and outside the PC's chassis."

The card runs extremely quietly even under load, to the point that I'm not sure if I was hearing the case fans or the GPU cooler during intense benchmarking sessions. That's essential, especially since all Titan X designs will rock reference coolers only--there will be no aftermarket cooler options from board vendors. Nvidia claims the Titan X overclocks like a champ, hitting up to 1.4GHz in the company's internal testing. I was unable to OC the Titan X due to time constraints, but given the superb overclocking capabilities of every other Maxwell-based GPU, I heartily believe the claim.

The Titan X features the same basic software features as the GTX 980 and 970, including Voxel Global Illumination (VXGI), which lets developers create better dynamic lighting without invoking a massive performance hit, and VR Direct for virtual reality gaming. (The Titan X was actually used to power many of the VR demos on display at GDC 2015--hence the surprise launch during Epic's panel.)

It also fully supports Nvidia's impressive Multi-Frame Sampled Anti-Aliasing (MFAA) technology, which smooths out jagged edges at a level similar to traditional MSAA, but with much less of a performance hit. This awesome technology works with any DirectX 10 or DX11 title that supports MSAA and basically provides a free--and often substantial--frame rate increase. That's a huge deal at any resolution, but it can mean the difference between a playable game and stuttering garbage at 4K resolution.

If you use Nvidia's GeForce Experience to automatically optimize your games, it'll enable MFAA in place of MSAA by default.

Benchmarking the Titan X's performance

So why does the Titan X rock such a ridiculous amount of RAM? The massive 12GB frame buffer is frankly overkill for today's games, but it helps future-proof one of the Titan X's biggest strengths: Ultra-high-resolution gaming. Higher resolutions consume more memory, especially as you ramp up anti-aliasing to smooth out jagged edges even more.
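To put rough numbers on that, here's a purely illustrative sketch of how just a single 32-bit color buffer grows with resolution and multisampling. Real games allocate many render targets, textures, and geometry on top of this, so treat it as a lower bound rather than a VRAM estimate:

# Illustrative only: footprint of one 32-bit color buffer at 4K,
# with and without multisample anti-aliasing.
def color_buffer_mb(width: int, height: int, bytes_per_pixel: int = 4, samples: int = 1) -> float:
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

for samples in (1, 4, 8):
    print(f"3840x2160 at {samples}x: {color_buffer_mb(3840, 2160, samples=samples):.0f} MB")
# Roughly 32 MB, 127 MB, and 253 MB -- and that's just one buffer.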

The Titan X is the first video card that can play games at 4K resolution and high graphics settings without frame rates dropping down to slideshow-esque rates.

Not at ultra detail settings, mind you--just high. And still not at 60 frames per second (fps) in many cases. But you'll be able to play most games with acceptable smoothness, especially if you enable MFAA and have a G-Sync-compatible monitor.

Nvidia sent a G-Sync panel--Acer's superb, 3840x2160-resolution XB280HK gaming monitor--along with the Titan X for us to test, and it's easy to see why. When enabled in a compatible monitor, Nvidia's G-Sync technology forces the graphics card and the display to synchronize their refresh rates, which makes stuttering and screen tearing practically disappear. (Monitor makers are expected to release displays with AMD's competing FreeSync soon.)

Merely reading the words on a screen doesn't do the technology justice. It rocks. G-Sync makes games buttery smooth. When it's paired with the Titan X at 4K resolution, you won't even care that the games aren't technically hitting 60fps.

That said, I disabled G-Sync and MFAA during our benchmark tests to level the playing field for Radeon cards. For comparison benchmarks, we included AMD and Nvidia's top-end mainstream consumer cards--the R9 290X and GTX 980, respectively--as well as two 980s running in SLI and AMD's Radeon R9 295x2, a single-card solution that packs a pair of the same GPUs found in the 290X. And, of course, the original Titan.

Since most people don't commit GPU specs to memory the same way they do obscure baseball statistics from 64 years ago, here's a quick refresher chart to help. The Radeon R9 295x2 isn't on the chart, but it's essentially two 290X GPUs crammed into one card.

An interesting side-note: The R9 290X refused to play nice on the G-Sync monitor, flickering constantly. A 4K Dell UltraSharp was called in as cavalry. All tests were done on our DIY test bench. (You can find full component details in our build guide for the system.)

First up we have Middle-earth: Shadow of Mordor. While our reviewer wasn't blown away by the game itself, Shadow of Mordor garnered numerous industry awards in 2014 for its remarkable Nemesis system--and with the optional Ultra HD Texture pack installed, it can give modern graphics cards a beating. The add-on isn't even recommended for cards with less than 6GB of onboard RAM, though it'll still run on more memory-deprived cards.

The game was tested by using the Medium and High quality presets, then by using the Ultra HD texture pack and manually cranking every graphics option to its highest setting (which Shadow of Mordor's Ultra setting doesn't actually do). You won't find numbers for the dual-GPU Radeon R9 295x2 here, because every time I tried to change the game's resolution or graphics settings when using AMD's flagship, it promptly crashed the system, over and over and over again. Attempts to fix the problem proved fruitless.

Sniper Elite III was released in the middle of 2014. While it's not the most graphically demanding game, it scales well across various resolutions, and it's an AMD Gaming Evolved title--a counterpoint to Shadow of Mordor's Nvidia-focused graphics. Plus, it's always fun to snipe Nazis in the unmentionables in slow motion.

Next up: Sleeping Dogs: Definitive Edition. This recent remaster of the surprisingly excellent Sleeping Dogs actually puts a pretty severe hurting on graphics cards. Not even the highest-end single-GPU options hit 60fps in Sleeping Dogs: Definitive Edition with detail settings cranked, at 4K or 2560x1600 resolution.

Metro Last Light Redux is a remaster of the intensely atmospheric Metro Last Light, using the custom 4A Engine. Not only is the game gorgeous, it's an utter blast to play. It's tested with SSAA disabled, because SSAA drops frame rates by roughly 50 percent across the board. 

Alien Isolation is the best, most terrifying Aliens experience since the original Ridley Scott movie. The game scales well across all hardware, but looks especially scrumptious in 4K.

Bizarrely, we couldn't coax Bioshock Infinite, a regular in our test suite, into offering a 4K resolution option in its benchmarking utility, despite being able to actually play the game in 4K. Here's how the Titan X stacks up to the competition at lower resolutions, though.

I also tested the systems using two off-the-shelf benchmarking tools: 3DMark's Fire Strike, and Unigine's Valley. Both are synthetic tests but well respected in general.

Finally, here's the power usage and thermal information. For thermals, we run the Furmark stress test for 15 minutes and record the GPU temperature using SpeedFan. Power usage is the total power consumed by the PC at the wall socket, measured with a Watts Up meter during a Furmark run.
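If you want to replicate the thermal side of this at home, here's a minimal sketch of the idea. We used SpeedFan for the article; as an illustrative alternative, the same peak-temperature reading can be approximated by polling nvidia-smi (assuming an Nvidia card with the driver's nvidia-smi utility on the PATH) while a stress test runs:

# Poll GPU temperature once per second for 15 minutes and report the peak.
import subprocess
import time

def gpu_temperature_c() -> int:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader,nounits"],
        text=True,
    )
    return int(out.strip().splitlines()[0])

peak = 0
for _ in range(15 * 60):
    peak = max(peak, gpu_temperature_c())
    time.sleep(1)
print(f"Peak GPU temperature: {peak} C")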

All the various Nvidia reference cards run hotter than the Radeon R9 295x2, which uses an integrated closed-loop water-cooling solution, but none of them ever generated much noise or began throttling back performance. No surprise, our Radeon R9 290X--which is known for running hot on account of its atrocious reference cooler--runs the hottest of the bunch.

Nvidia's GeForce GTX Titan X: The final verdict

Nvidia was right: Single-GPU graphics cards don't come more powerful than the Titan X. It's no contest. The Titan X truly is the first solo GPU card capable of playing 4K games at reasonable detail settings and frame rates. And that ferocious power pushes even further if you're playing with MFAA enabled, especially if you're lucky enough to have a G-Sync monitor to match.

Still, that doesn't mean the Titan X is for everybody.

If you're in the market for a graphics card this expensive, raw power is obviously a major concern. And when it comes to raw power, both the Radeon R9 295x2 and dual GTX 980s running in SLI outpunch the Titan X. While a pair of 980s is fairly equal in price ($1,100 total) to a $1,000 Titan X, the cooler-running 295x2 is far cheaper, starting at $700 on the street today, and available even cheaper with rebates. Monitors bearing AMD's FreeSync technology will also likely cost less than competing G-Sync displays when they hit the market, given that G-Sync requires the use of a costly, proprietary hardware module where FreeSync simply works over DisplayPort 1.2a.

But!

Dual-GPU solutions require compromise. For one thing, they suck up a ton of case space--two full-length cards in the case of a pair of GeForce 980s in SLI, and a long, heavy card with a sizeable water-cooling setup if you go with AMD's flagship 295x2. Drivers and optimizations for multi-GPU setups also tend to be slower to appear and much more finicky, as evidenced by the Shadow of Mordor wonkiness with the 295x2. (Nvidia has pushed to deliver day-one SLI support for major titles, but lower-tier games don't get the same commitment.)

Single-GPU graphics cards can also fit in tighter spaces than multi-GPU solutions. You could, for example, squeeze the Titan X into a relatively small form factor PC, which would be downright impossible with the Radeon 295x2 or dual 980s. Dual-GPU solutions consume more power and tend to spit out more waste heat than single cards, too.

Further reading: Tested: Nvidia GeForce and AMD Radeon graphics cards for every budget

Because of all that, our standard recommendation is to rock the most powerful single-GPU graphics card you can buy. If you're looking for pure, unadulterated, price-is-no-concern single-GPU power, that card is clearly the Titan X, and the 12GB frame buffer helps guarantee the card will continue to be relevant as we move deeper into the 4K resolution era. Hail to the new GPU king, baby--though AMD has a new generation of Radeon cards barreling down the pipeline soon.

And if you're made of cash and aren't scared of running multiple graphics cards, can you imagine how potent two Titan Xs in SLI would be? I'm quivering just thinking about it....

(www.pcworld.com)

Brad Chacos
