Windows 10's DirectX 12 graphics performance tested: More CPU cores, more oomph

August 17, 2015
If you’re buying or building a new gaming PC for Windows 10 and DirectX 12, your priority should be as many “real” CPU cores as you can afford, running at high clock speeds.

At least, that’s the conclusion after playing with the first DirectX 12-based test that uses an actual game engine. After giving Oxide Games' new Ashes of the Singularity DirectX 12 pre-beta benchmark a quick whirl, it’s clear that having more CPU cores matters most for a potential DX12 performance boost, though clock speed contributes too.

That validates—but also slightly contradicts—my findings from March using preview copies of 3DMark and Windows 10. That testing indicated that core count, including Hyper-Threading, was the biggest factor in a potential DirectX 12 performance increase, while sheer clock speed mattered less.

But while 3DMark is a synthetic benchmark and will never be a game, Stardock and Oxide’s upcoming Ashes of the Singularity will indeed ship sometime next year, which makes it more “real world.” 

For my tests, I used the same Core i7-4770K with 16GB of DDR3/1333 that I used as a baseline last time, but rather than bringing back the GeForce Titan X from my previous test, this time I used an Nvidia GeForce GTX 980 Ti card that was recommended by Oxide and Nvidia.

The Ashes benchmark is powerful and flexible, and specifically designed to let the user tailor it to a scenario. Rather than spit out a single score, it provides granular details on each load. The idea, Oxide told me, is to avoid emphasizing a single score that can be taken out of context. Depending on how you load the test up, it can be tailored to stress either the GPU’s DX12 performance or the CPU’s.

First, to get it out of the way: DirectX 12 performance can indeed be significantly better than DirectX 11 performance in Ashes of the Singularity. Running the same video card and setup mentioned above in DirectX 11 mode, I found DirectX 12 to be about 30 percent faster. Others reported an even larger gulf, depending on PC component configurations.

Nvidia’s lab ran the new benchmark on a six-core Intel Core i7-5820K, at both stock and lowered clocks, paired it with a Titan X, and saw even heftier improvements going from DirectX 11 to DirectX 12.

Nvidia saw very hefty performance increases of up to 82 percent for DirectX 12 over DirectX 11 on a multi-core, low-clock-speed chip. The reason? At lower clock speeds, DirectX 11’s inability to use more of the Core i7-5820K’s cores held performance back, since the API is mostly single-threaded. DirectX 12 spreads the load across those cores, so even at low clock speeds you see a significant performance increase.

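To picture what that means in practice, here's a minimal, hypothetical sketch of the pattern DirectX 12 enables. It is not Oxide's code, and the helper names (RecordSceneChunk, SubmitFrame) are made up for illustration, but the Direct3D 12 calls themselves are the real API: each CPU thread records its own command list in parallel, and the finished lists are handed to the GPU queue in one batch. Under DirectX 11, most of that recording work ended up serialized on a single thread.

// Hypothetical illustration only, not Oxide's engine code: how a DirectX 12
// renderer can record GPU work on several CPU cores at once.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>

using Microsoft::WRL::ComPtr;

// Each worker thread records its share of the frame into its own command list,
// using its own allocator, so no locking is needed while recording.
void RecordSceneChunk(ID3D12Device* device,
                      ID3D12CommandAllocator* allocator,
                      ComPtr<ID3D12GraphicsCommandList>& outList)
{
    device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                              allocator, nullptr, IID_PPV_ARGS(&outList));
    // ... set pipeline state, root signature, and issue draw calls here ...
    outList->Close();
}

void SubmitFrame(ID3D12Device* device, ID3D12CommandQueue* queue,
                 std::vector<ID3D12CommandAllocator*>& allocators)
{
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(allocators.size());
    std::vector<std::thread> workers;

    // One recording thread per core: this is the work DirectX 11 largely
    // forced onto a single thread.
    for (size_t i = 0; i < allocators.size(); ++i)
        workers.emplace_back(RecordSceneChunk, device, allocators[i],
                             std::ref(lists[i]));
    for (auto& w : workers)
        w.join();

    // Submission itself is cheap; the expensive recording ran in parallel.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists)
        raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}

That parallel recording is why a low-clocked chip with plenty of cores can close the gap Nvidia measured: the per-frame CPU work scales out across cores instead of piling up on one of them.
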
My own tests reflected that, but also showed a little more about the CPU’s impact.

After talking with Stardock and Oxide, I determined the best benchmark to run would be the Heavy batch load with graphics set to the default “low” value, so as not to make the GPU the bottleneck. My rationale was not to test the GPU specifically, but to try to replicate my previous 3DMark tests and find out the impact of core counts on gaming in DX12.

I ran the benchmark with the CPU at its default clock speed of 3.5GHz (boosting to 3.9GHz), with all CPU cores and Hyper-Threading on. I then twisted knobs in the BIOS and ran the test with fewer cores active, switching Hyper-Threading on and off, and turning the clock speed down to 1.7GHz.

In many ways, the Ashes of the Singularity benchmark validates my tests of five months ago: With DirectX 12, the more CPU cores, the better. But unlike that early March preview of DX12, clock speed also clearly helps in the Ashes of the Singularity benchmark. Going from four cores at 1.7GHz to four cores at 3.9GHz gives you a nice bump from 36 fps to 51 fps, an improvement of more than 40 percent.

That’s very significant.

Contrast that with what happened in my earlier 3DMark DirectX 12 feature test. I simulated a Pentium G3258 with two cores and no Hyper-Threading, overclocked to 4.9GHz. The result in 3DMark was only slightly faster than when I simulated a 3.5GHz Core i3-4330 with two cores and Hyper-Threading turned on. In the synthetic 3DMark, Hyper-Threading made big contributions to performance. You can read the original story or just peep this chart of my 3DMark testing.

In the Ashes of the Singularity benchmark, clock speed has far more impact than it did in 3DMark's feature test, while Hyper-Threading has a minimal performance impact. Oxide’s developers told me they suspect the reason Hyper-Threading didn’t knock it out of the ballpark in their new game engine is the shared L1 cache design of Intel’s CPUs.

With Hyper-Threading a yawner and high clock speeds a big bonus, AMD’s budget-priced chips are pretty much set up as the dark horse CPUs for DirectX 12—if this single benchmark is indicative of what we can expect from DirectX 12 overall, of course.

In fact, Oxide’s developers said their internal testing showed AMD’s APUs and CPUs having an edge since they give you more cores than Intel for the money. AMD’s design also doesn’t share L1 cache the way Intel’s chips do. 

The numbers really add up when you factor in AMD’s cost per core. An AMD FX-8350 gives you eight cores (with some shared resources) for $165. That doesn’t even net you a quad-core from Intel. The cheapest quad-core from Intel is the 3.2GHz Core i5-4460 for $180—and that quad-core Haswell CPU doesn’t even have Hyper-Threading turned on. Nor can it be overclocked.
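Do the division and the gap is stark: the FX-8350 works out to roughly $21 per core, while the Core i5-4460 comes in at $45 per core.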

Oxide developers told me their internal testing with the Ashes of the Singularity benchmark showed 8-core AMD CPUs giving even the high-end Core i7-4770K a tough time. 

But don’t take this to mean AMD’s suddenly in pole position. When I asked Oxide and Stardock officials what the ultimate CPU is for Ashes of the Singularity, the choice was Intel’s 8-core monster, the Core i7-5960X.

Still, this is finally some good news for AMD’s CPU division, which all but the most die-hard fanboy would agree has been running a distant second to Intel for years now. Intel’s CPUs, to be frank, have been so good that they compete more with each other than with AMD's counterparts. AMD’s CPUs have failed to trounce their Intel equivalents even when they outnumber them in cores.

As a consumer, you won’t be able to use Ashes of the Singularity to test DX12 performance until the game is released sometime next year—but there is one option if you want to try it sooner. Oxide and Stardock say those who want to play with the test early can buy the $50 Founder's Edition of the game, which will grant early access to the benchmark in a week or so.

All this would seem like a solid first step toward a DirectX 12 test built on a real game. Oxide officials said the reason they let the media demo the test first was to celebrate multi-core CPU gaming, which is finally supported in the new Windows 10-exclusive API.

Nvidia officials didn’t seem to think the test was all that, though, and in an unexpected move pretty much trashed it as a measurement tool.

“We do not believe (Ashes of Singularity) is a good indicator of overall DirectX 12 gaming performance,” the company said in its guidance on using the new test as a benchmark.

Nvidia said the pre-beta benchmark has a bug that affects MSAA performance, which makes it unreliable. To put an even finer point on it, the company’s spokesman Brian Burke told PCWorld: “We believe there will be better examples of true DirectX 12 performance.”

Translated for gamers: That’s pretty much a “shots fired!” moment.

Nvidia went on to say that gamers should expect the lead its drivers have held over AMD’s Radeon drivers in DirectX 11 to carry over to DirectX 12.

“Gamers and press have seen GeForce DX11 drivers are vastly superior to Radeon’s. We’ve worked closely with Microsoft for years on DirectX 12 and have powered every major DirectX 12 public demo they have shown,” Nvidia said. “We have the utmost confidence in our DX12 drivers and our architecture’s ability to perform in DX12. When DX12 games arrive, the story will be the same as it was for DX11.”

Oxide officials soon fired back in a blog post, saying the alleged MSAA bug is no bug at all, and that Ashes of the Singularity is a perfectly valid DirectX 12 benchmark.

“We assure everyone that is absolutely not the case,” said Oxide’s Dan Baker in a blog post. “Our code has been reviewed by Nvidia, Microsoft, AMD and Intel. It has passed the very thorough D3D12 validation system provided by Microsoft specifically designed to validate against incorrect usages. All IHVs have had access to our source code for over a year, and we can confirm that both Nvidia and AMD compile our very latest changes on a daily basis and have been running our application in their labs for months. Fundamentally, the MSAA path is essentially unchanged in DX11 and DX12. Any statement which says there is a bug in the application should be disregarded as inaccurate information.”

If you can’t read between the lines, let me do it for you: Nvidia just launched a preemptive missile to let anyone who sees a Radeon outperforming a GeForce card by even a little know that it’s the test that’s busted, not Nvidia’s drivers. With DirectX 12 as the new undiscovered country for gamers, the company doesn’t want to get off on the wrong foot with the notion that AMD has better drivers.

For its part, Oxide said there’s no reason for anyone to freak out—hardware vendors or gamers. DirectX 12 is a new API and everything is in flux.

“Immature drivers are nothing to be concerned about. This is the simple fact that DirectX 12 is brand-new and it will take time for developers and graphics vendors to optimize their use of it. We remember the first days of DX11,” Baker wrote in the blog. “Nothing worked, it was slower than DX9, buggy and so forth. It took years for it to be solidly better than previous technology. DirectX 12, by contrast, is in far better shape than DX11 was at launch. Regardless of the hardware, DirectX 12 is a big win for PC gamers.”

I’d like to point out that Baker is right in some regard. DirectX 12 is a reset for all hardware parties, and it’s going to take at least a few months—if not years—to determine what hardware will be best for Ashes of the Singularity and other DirectX 12 games going forward.

(www.pcworld.com)

Gordon Mah Ung
