Intel Arc Graphics Performance Revisited: DX9 Steps Forward, DX12 Steps Back

Intel’s Arc Alchemist GPUs launched toward the latter part of 2022, vying for a spot among the best graphics cards. You can see where they land on our GPU benchmarks hierarchy as well, but that’s only part of the story. Intel has been busily updating drivers on a regular basis since launch, and rarely does a fortnight pass without at least one new driver to test.

And that’s the crux of the story: How much better are Intel’s latest drivers compared to the original launch drivers? The company made a lot of noise about its improved DirectX 9 performance, claiming 43% higher performance on the A750 with recent drivers compared to the original 3490 launch drivers — and at the same time dropping the price of the A750 to $249, down from $289. What about newer games, though?

We set about testing (and retesting) every Arc graphics card, including the Arc A770 8GB using an ASRock model that we haven’t quite got around to reviewing just yet, due to all the retesting that’s been going on.

We began our testing a couple of weeks back, right after the 4123 Intel drivers became available. Last week, the 4125 drivers came out, but a quick check on one of the Arc cards indicated that the only changes relate to some recent game launches; performance in our test suite otherwise remains the same as with the 4123 drivers.

Plugging the Holes in the Arc

Arc arrived with what were arguably the best GPU drivers Intel has ever created – but that’s not saying much. A few years ago, a lot of games simply refused to work at all on Intel’s integrated graphics solutions — and even those games that worked would often perform poorly. Intel’s DG1 helped pave the way for more frequent driver updates, but things were still iffy back in 2021.

Since the first dedicated Arc A380 cards started showing up in China — which meant they also started getting shipped to individuals around the world — Intel has been cranking out updated drivers on a regular basis. There are presently eleven different versions of Arc drivers available from Intel: five WHQL (Windows Hardware Quality Labs) certified and six beta drivers, but there were other “hotfix” or beta drivers that are no longer listed.

Many of the updates have been targeted at one or two specific games — there were two different drivers that addressed problems with Spider-Man Remastered performance on the Arc A380, for example. Other updates have had much further-reaching ramifications, with the biggest change being the DirectX 9 optimizations.

DirectX 9, 20 Years Later

Microsoft’s DirectX 9 API became publicly available back in 2002, but it continues to see quite a bit of use even today. Some of that goes back to the venerable Valve Source engine used for Half-Life 2, which saw widespread adoption for various mods and spinoff games like Counter-Strike: Global Offensive, Left 4 Dead 2, and Team Fortress 2; StarCraft 2 and League of Legends likewise continue to use DX9.

Considering how old the API is, it’s easy to understand Intel’s decision to “focus on modern [DX12 and Vulkan] APIs” for the launch of the Arc graphics cards. Most recently released games use DirectX 12 or DirectX 11, or sometimes Vulkan. Why put a bunch of effort into trying to optimize the drivers for older games? For DX9 compatibility, Intel opted for support via DX12 emulation, which probably sounded like a good idea at some point.

Dedicated Graphics, Round Two

Arc isn’t Intel’s first discrete graphics rodeo. It announced its intention to create dedicated graphics cards back in November 2017, when it hired Raja Koduri shortly after his departure from AMD. This wasn’t going to be a quick and easy task, though the initial plan almost certainly called for delivering something competitive before 2022. And there was a precursor to Arc, even if most people never saw it.

First shown off at the start of 2020 during CES, Intel’s DG1 test vehicle — Discrete Graphics 1 — felt more like a way to drum up hype than something substantive. The hardware finally started shipping, in limited fashion, in mid-2021, with cards from Asus and Gunnir. The hardware basically consisted of the integrated graphics used in Intel’s 11th Gen Tiger Lake mobile processors, minus the CPU and with dedicated VRAM. But Intel opted to stick with LPDDR4x, the same memory used by the laptop processors.

How did it perform? In a word, poorly. In our DG1 benchmarks, it was roughly on par with Nvidia’s GT 1030 GDDR5 graphics card, an anemic solution that launched in 2017, cost $69, and wasn’t fit for much more than 720p gaming at low to medium settings. DG1 was also about as fast as AMD’s Vega 8 integrated graphics found in Ryzen 4000 U-series (15W) processors. But it was a start.

The problem is that a lot of older games remain immensely popular. Token support for DX9 via DX12 emulation might have sounded fine on paper, but in practice it was a glaring indication that Intel’s GPUs and drivers weren’t on the same level as AMD’s and Nvidia’s. And even though most DX9 games aren’t particularly demanding, especially compared with recent DX11/12 releases, performance in DX9 is only as good as the emulation.

Simply put, at launch, the DX9-on-DX12 emulation for Intel Arc wasn’t very good. It was prone to frametime fluctuations, leading to stuttering in games. People complained, and Intel decided to rethink its DX9 strategy. The solution Intel came up with was to leverage additional translation layers that turn DX9 API calls into either DX12 or Vulkan calls. The DX12 emulation still comes from Intel’s internal driver work, or perhaps Microsoft’s D3D9On12 mapping layer, but now with a bit more tuning.

The bigger change appears to be the use of DXVK — the DirectX-to-Vulkan translation layer popularized by Valve’s Proton — for “some cases.” Intel hasn’t detailed exactly which games use DXVK and which stick with DX12 emulation, but whatever the mix, frametime consistency and average performance have improved substantially. This was all rolled out with Arc’s December driver update.
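
If you’re curious which path a particular DX9 game ends up on, one rough heuristic (purely our own sketch, not anything Intel documents) is to peek at which DLLs the running game process has mapped. Below is a minimal Python example using psutil. The DLL names it checks for are assumptions: d3d9on12.dll ships with Windows as Microsoft’s mapping layer, while a DXVK-style path would pull in vulkan-1.dll, and a translation layer baked into the driver itself may not be visible this way at all.

```python
# Sketch: guess which DX9 translation path a running game is using by
# looking at the DLLs mapped into its process. The specific DLL names are
# assumptions about how each layer shows up; treat the result as a hint only.
import psutil

def guess_dx9_backend(exe_name: str) -> str:
    for proc in psutil.process_iter(["pid", "name"]):
        if (proc.info["name"] or "").lower() != exe_name.lower():
            continue
        try:
            mapped = {m.path.lower() for m in proc.memory_maps()}
        except (psutil.AccessDenied, psutil.NoSuchProcess):
            return "access denied (try running elevated)"
        if any(p.endswith("d3d9on12.dll") for p in mapped):
            return "DX9 -> DX12 via Microsoft's D3D9On12 mapping layer"
        if any(p.endswith("vulkan-1.dll") for p in mapped):
            return "DX9 -> Vulkan (DXVK-style translation)"
        return "no obvious translation layer DLLs found"
    return f"{exe_name} not running"

if __name__ == "__main__":
    print(guess_dx9_backend("csgo.exe"))
```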

This is great news if you want to buy an Arc GPU for a modern PC to run old games. Remember: PCIe Resizable BAR is basically required for Arc GPUs, which means you generally need a 10th Gen or later Intel CPU, or a Ryzen 5000 or later AMD CPU. Otherwise, you’ll lose about 20 percent of the potential performance based on our testing, at which point you should just buy an AMD or Nvidia GPU.

So Intel can’t simply ignore older games, as weak performance in games that millions of people still play on a regular basis makes all Intel GPUs look bad. But how much has really changed?

Arc Performance Update: Test Setup

Given all the changes, we pulled out a collection of fifteen different games, including two DX9-based games, to check how the various cards perform. Note that we didn’t test DX9 games at launch, so we only have the current results.

The fifteen games and their respective APIs consist of Borderlands 3 (DX12), Bright Memory Infinite Benchmark (DXR, or DirectX Raytracing), Control Ultimate Edition (DXR), Counter-Strike: Global Offensive (DX9), Cyberpunk 2077 (DXR), Far Cry 6 (DX12), Flight Simulator (DX11), Forza Horizon 5 (DX12), Horizon Zero Dawn (DX12), Mass Effect 2 (DX9), Metro Exodus Enhanced (DXR), Minecraft (DXR), Red Dead Redemption 2 (Vulkan), Total War: Warhammer 3 (DX11), and Watch Dogs Legion (DX12).

All of the games were tested at “medium” and “ultra” settings, except for Mass Effect 2 — it only has three graphics options, and it’s old enough that we simply enabled all three for the “ultra” testing. The three A7-class cards were tested at 1080p, 1440p, and 4K, except in ray tracing games, where we dropped 4K testing (for reasons that will become obvious in a moment). The A380 skipped 4K testing entirely, and also skipped 1440p testing in ray tracing games.
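
Written out as code, the resulting test matrix looks something like the following sketch — our own illustration of the rules above, not the actual benchmark harness:

```python
# Illustration of the test matrix described above; not the actual harness.
CARDS = ["A770 16GB", "A770 8GB", "A750", "A380"]
RESOLUTIONS = ["1080p", "1440p", "4K"]
SETTINGS = ["medium", "ultra"]

def tested(card: str, resolution: str, ray_tracing: bool) -> bool:
    """Return True if a card/resolution combo is part of the test matrix."""
    if card == "A380":
        if resolution == "4K":
            return False              # A380 skips 4K entirely
        if ray_tracing and resolution == "1440p":
            return False              # ...and 1440p in ray tracing games
    elif ray_tracing and resolution == "4K":
        return False                  # A7-class cards skip 4K in DXR games
    return True

# Example: the A380 runs DXR games only at 1080p.
assert tested("A380", "1080p", ray_tracing=True)
assert not tested("A380", "1440p", ray_tracing=True)
```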

Each game and settings combination was tested at least three times, and we dropped the high and low results. Where there was more variance (looking at you, CSGO), we ran the test five times and took the median result. This is an important point, because sometimes the first run was very poor (again, CSGO).
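
In code form, the aggregation rule reduces to a median — here’s a minimal sketch of the logic just described, with made-up fps values for illustration:

```python
# Sketch of the run-aggregation rule: three runs with the high and low
# dropped leaves the middle value, which is just the median; for noisy
# tests run five times, we likewise take the median. Example fps values
# below are made up for illustration.
from statistics import median

def aggregate_runs(fps_results: list[float]) -> float:
    if len(fps_results) < 3:
        raise ValueError("need at least three runs")
    return median(fps_results)

print(aggregate_runs([344.0, 358.2, 361.9]))                # -> 358.2
print(aggregate_runs([310.5, 355.3, 359.8, 362.0, 365.1]))  # -> 359.8
```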

Arc GPUs Test PC

All these tests were done using the same Core i9-12900K PC that we used for the previously published Arc GPU launch reviews, so that we can look at how performance has changed. How do the current results compare with what the launch drivers delivered back then? That’s where things get a bit messy.

There have been multiple major updates to games in our test suite over the past few months, and we’ve been working to shift all of our testing to a new system with a Core i9-13900K, plus adjusting our test suite and settings as needed. Our older test system is now effectively out of date, and while we’ve retested some cards with the updated games, it was best to just hit the reset button.

What we know for certain is that Cyberpunk 2077, Flight Simulator, Forza Horizon 5, and Total War: Warhammer 3 all had updates that changed performance and/or settings quite a bit — on some GPUs more than others. Cyberpunk now has DLSS 3 and FSR 2.1 support, Flight Simulator added DLSS 3 and DLAA, and Forza Horizon 5 added DLSS 2 and FSR 2.2 along with TAA. All three of those games also added support for Nvidia Reflex. As for Total War: Warhammer 3, a couple of months back there was a major update that improved performance by roughly 20% on virtually all GPUs.

And those are just the changes that were readily visible. Almost all of the other games in our test suite have also received various updates, and keeping track of what has changed and what remains consistent is difficult. Regardless, we’ll have the original launch performance data in our charts, with an asterisk indicating it’s from a different version of the game and may not reflect current performance. Just for good measure, we also applied all the latest Windows 11 updates and flashed our motherboard BIOS.

Intel Arc Graphics Performance: 3490 Launch vs. Current 4123 Drivers

We have up to three results for each of the test cards in the following charts. The 3490 drivers are the launch drivers for the A770 and A750; the Arc A380 arrived several months earlier with 3259 drivers, and 3267, 3268, and 3276 drivers were released before our review was finished, but we’re sticking with 3490 as the “launch” driver for all the Arc cards to keep things consistent. The 4123 drivers are the “current” drivers (the 4125 drivers are merely “Game On” releases for Company of Heroes 3, The Settlers: New Allies, Atomic Heart, and Wild Hearts). Finally, we also have the original “launch” results for the A380, A750, and A770 16GB. Let’s start with the 1080p medium test suite.

Note that the overall performance charts omit the “launch” drivers, since those were only tested on a subset of the games, plus we know that nearly all of the games have received at least one update in the past four months. That’s also why the launch results are marked with an asterisk (*).

We should also discuss some of the anomalies we’ll see before we get to the charts. First, Red Dead Redemption 2 using the Vulkan API consistently crashed our test PC with the A750 card — and it crashed a different test PC as well. It was mostly stable using the original 3490 launch drivers, enough so that we could complete a few test runs before the PC would restart, but every later driver we checked had stability problems. Intel was able to replicate the problem and it’s limited to the A750; a fix should hopefully be available by the time you read this. (We used the same scores for the 4123 drivers on the A750 for the overall chart, to avoid skewing the results by omitting one game.)

Next, Bright Memory Infinite Benchmark, a ray tracing title, has a rather severe VRAM memory leak problem — or at least, that’s how it appears. We could complete one benchmark run at 1080p with the Normal preset on the 8GB cards, but higher settings (and the A380) would go from being somewhat smooth to a stuttering mess partway through the benchmark, with minimum framerates of 1 fps. This is another bug that Intel has confirmed and is working to fix — it’s not clear if this affects the game as well, or only the standalone benchmark (which is far more demanding and better looking than the actual game, if you were wondering).

Finally, Minecraft performs very poorly on all Arc GPUs right now. When the cards first launched, you couldn’t even turn on ray tracing in the game, as apparently it was coded with a “whitelist” of cards that support DXR. An update to the game in December finally allowed the Intel GPUs to run with ray tracing enabled, but the results remain much lower than expected. As a point of reference, in other demanding ray tracing games like Cyberpunk 2077, the A770 is only 10 percent slower than an RTX 3060; in Minecraft, the RTX 3060 currently delivers nearly triple the performance. In fact, even the RX 6600 outperforms the fastest Arc GPU by 25 percent in Minecraft, whereas the A770 is 60 percent faster than that card in Cyberpunk. Intel has not confirmed the issues with poor Minecraft performance – yet. We figure it’s only a matter of time.

Disclaimers aside, here are the results of our testing.

Overall performance changes between the 3490 and 4123 drivers at 1080p medium amount to… not a lot. That’s pretty much expected. The three A7-class cards are 1–2 percent faster, while the A380 is 5% faster.

We don’t have a “medium” result from Mass Effect 2, and CSGO appears to be mostly CPU limited at around 360 fps (with fluctuations, as it’s a non-deterministic test sequence — I played against harmless bots on the Mirage map, running the same route each time, basically a loop through bombsite A as a counter-terrorist). That means two of the biggest potential gains are out of the picture, except on the A380, where CSGO performance was 16% higher with the updated drivers.

Flipping through the individual charts, the only major differences are the larger-than-expected drop in Control performance on the two Arc Limited Edition cards — the ASRock A770 8GB and A380 didn’t show such behavior — with modest gains elsewhere. That’s mostly the same pattern we’ll see at the other test settings.

What’s more surprising comes from looking at the launch performance testing results from last year. Did our PC become slower? Or maybe game updates broke some Intel optimizations that were present? Possibly it’s just Windows 11 updates that are to blame. Regardless, most of the launch testing results show a performance regression. Ouch.

Aggregate performance from the A770 16GB is nearly 10% lower now than at launch across the twelve games tested both times (CSGO and Minecraft weren’t part of the launch suite). The A750 looks a bit better overall, mostly thanks to a huge jump in the Bright Memory Infinite Benchmark result, but that’s because we modified our testing slightly: We set the desktop resolution to 1080p so that we didn’t have to change it in the game, and only ran the test once before exiting, restarting the game, and running the test again. Take out that result and the A750 also dropped about 10% on aggregate. The A380 was the most consistent, only dropping 4% overall, with Control being the only significant regression.

Again, we can’t pinpoint the exact cause of the performance changes. Lots of things have happened in the four months (six for the A380) since we first tested the Arc cards. Game updates certainly play a role, with Cyberpunk 2077 showing a major drop from roughly 50 fps in October down to about 30 fps now. It would certainly be unfortunate if Intel’s Arc driver optimizations were somehow tied to older versions of these games.

Moving the quality setting to “ultra” results in bigger gains overall for the latest drivers, but much of that comes from the two DirectX 9 games. Overall, we’re looking at a 5–10 percent aggregate improvement in performance with the 4123 drivers compared to the 3490 launch drivers. However, if we drop the two DX9 games from the geometric mean calculation, the net improvement is only 3–6 percent.
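
For transparency, here’s how that sort of aggregate works. The per-game scaling factors below are hypothetical placeholders, not our measured results, but they show why excluding two big DX9 outliers pulls the geometric mean back toward the margin of error.

```python
# Why dropping the two DX9 games shrinks the aggregate gain. Values are
# hypothetical per-game performance ratios (new fps / old fps), not our
# actual measurements.
from statistics import geometric_mean

gains = {
    "Mass Effect 2 (DX9)": 1.90,   # huge DX9 win
    "CSGO (DX9)": 1.10,
    "Total War: Warhammer 3": 1.05,
    "Borderlands 3": 1.01,
    "Far Cry 6": 0.99,
    # ...the remaining games would cluster near 1.0
}

with_dx9 = geometric_mean(gains.values())
without_dx9 = geometric_mean(v for k, v in gains.items() if "(DX9)" not in k)
print(f"with DX9 games: {with_dx9 - 1:+.1%}")        # roughly +17.0% here
print(f"without DX9 games: {without_dx9 - 1:+.1%}")  # roughly +1.6%
```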

There are positive and negative changes once again, with some of the biggest deltas coming in DXR games, where they’re often meaningless — the A380 going from 4.5 fps to 6.1 fps is a 34% increase in performance… at completely unplayable framerates.

There are a few cases where we see up to a 10% improvement in games that aren’t using DXR or DX9, like Total War: Warhammer 3, but overall the changes are mostly within the margin of error, or at least close to it. But look at the two DX9 games… or at least the one DX9 game.

Mass Effect 2 performance improved by anywhere from 40% on the A380 to as much as 90% on the A770 8GB. That’s for average fps, but the 99th percentile fps shows even bigger gains. The low fps results are 60% faster with the 4123 drivers on the A380 and up to 135% higher on the A770 8GB.
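
For anyone unfamiliar with how the 99th percentile figure is derived, the sketch below shows the general idea. It assumes a frametime log in the style of Intel’s PresentMon capture tool; the file name and the msBetweenPresents column are assumptions about the capture format, not a description of our exact pipeline.

```python
# Derive average fps and 99th percentile ("1% low") fps from a frametime
# log. Assumes a PresentMon-style CSV with an msBetweenPresents column;
# adjust the column name for your capture tool.
import csv
from statistics import quantiles

def fps_summary(csv_path: str) -> tuple[float, float]:
    with open(csv_path, newline="") as f:
        frametimes_ms = [float(row["msBetweenPresents"])
                         for row in csv.DictReader(f)]
    avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))
    p99_frametime = quantiles(frametimes_ms, n=100)[98]  # 99th percentile
    return avg_fps, 1000.0 / p99_frametime

avg, p99_low = fps_summary("mass_effect_2_a770_run1.csv")  # hypothetical log
print(f"average: {avg:.1f} fps, 99th percentile: {p99_low:.1f} fps")
```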

What about CSGO? The A770 16GB card shows basically no change, which makes sense given the large amount of VRAM. The A770 8GB improved by 9%, and the A750 by a few percent, but the A380 performance was slightly slower. Perhaps Intel focused its DX9 optimizations more on the bigger chips than on the ACM-G11 used in the A380?

Against the launch testing results from last year, things also don’t look substantially improved. Total War: Warhammer 3 is the only game that shows consistently higher performance now than last year, and we know from our AMD and Nvidia GPU testing that this was due to a game update. Nearly everything else shows at best the same performance, and at worst up to 10% lower performance — which might also come from game updates.

Moving to 1440p ultra testing, the A380 is now mostly not a factor to consider outside of playing older DX9 games. Overall, we again see a 5–10 percent improvement in performance for the three A7-class Arc cards, with the A750 showing the biggest overall increase. Of course, if we omit the DX9 games, it goes back to being almost margin of error, 2–3 percent, with the A750 still improving by 9% thanks to a big increase in the Cyberpunk result, from an unplayable 10 fps to a still-unplayable 14 fps.

The gains in CSGO are still pretty muted: 3% on the A770 16GB, 10% on the A770 8GB, and 17% on the A750 — and the A380 performance again dropped, this time by 6%. Mass Effect 2, on the other hand, shows gains of 50% on the Intel Limited Edition cards, and 70% on the ASRock A770 8GB — but only 22% on the A380, which still managed to break 100 fps in the 13-year-old game.

There’s not much more to add, other than to note that again the current results using updated versions of the same games are generally slightly slower than our launch review test results.

Last, we have the 4K testing results, though we didn’t do any DXR tests, so the overall chart is down to 10 rasterization games. The three ACM-G10 based graphics cards show 3–7 percent aggregate gains in performance using the full test suite, but only a 1–5 percent increase if we omit the two DX9 games.

Mass Effect 2 now shows improvements of 10–35 percent, with the 16GB card benefiting the least, and the factory overclocked ASRock card benefiting the most. CSGO does show modest gains as well this time on the two Intel cards, while the ASRock card only shows a 3% improvement. Again, there was a lot of variability between runs in CSGO, and the minimum fps does show much better results with 35–50 percent higher 99th percentile framerates.

Against our launch review testing, the results are still down, however — by 3% on the A770 16GB and by 7% on the A750. Forza Horizon 5 shows the biggest drop, and we can attribute that to game updates, as the AA options changed quite a bit last year when Microsoft added DLSS, FSR 2.2, and TAA support. We switched from 2xMSAA to TAA, but even going back to 2xMSAA we can’t get anywhere near the old levels of performance.

Playing Intel Arc, the Iteration Game

It’s been an interesting six or so months since the Arc A380 first appeared. Given all the claimed performance gains, we were hoping to see more… consistency across our test suite. The latest drivers generally do outperform the launch drivers, but issues remain, and compared to our 2022 testing results the changes aren’t nearly as impressive.

None of our findings are particularly surprising, including the continued oddities. AMD and Nvidia have been playing the drivers game for decades, and even they have occasional problems. Intel has been writing drivers for its integrated graphics for decades as well, but the performance on tap was so low that there often wasn’t any pressure to even try to optimize for newly released games. Creating dedicated graphics solutions changes user expectations, and now Intel is playing catch up.

The most important things to note about Intel’s driver updates are that there’s a regular cadence, and that older DirectX 9 games got some much-needed TLC to get them running more smoothly. It’s rare that more than two weeks passes without a new driver being announced, and Intel has also been doing better at getting day-one drivers out for bigger launches, including “Game On” drivers for the Hogwarts Legacy and Atomic Heart launches over the past couple of weeks.

If Intel can keep that up, plus add the occasional larger overhaul that provides more universal improvements, in a couple of years we hopefully won’t even need to have a serious discussion about Intel’s GPU drivers. And of course, we’re only scratching the surface with our game performance testing.

The games, settings, resolutions, and even test sequences can absolutely affect performance and potential gains from newer drivers. Maybe a different CSGO map would have yielded bigger performance improvements from the newer drivers. Or maybe the 12900K CPU was holding us back a bit more than the 13900K Intel used in its testing.

Regardless, let’s hope that Intel’s GPU and driver teams get the needed time to continue working on future drivers and architectures. Right now, it doesn’t sound like many people are biting on Arc graphics cards. Intel recently dropped the official price of its A750 Limited Edition card to $249, down $40 or 14 percent from the launch price. Big companies don’t do that if parts are flying off the shelves. How long can Intel continue to bleed money on consumer GPUs? Perhaps more importantly, can Intel afford to not invest more money into consumer GPUs?

There’s a lot riding on Arc Battlemage, which is currently slated to arrive in 2024. Intel’s Raja Koduri has also said that the company is more interested in competing with mainstream GPUs than worrying about halo cards, with $200–$300 being the sweet spot. We’ll have to wait and see if Intel can narrow the gap between Arc and its AMD and Nvidia competition come next year — which, incidentally, is when we expect we’ll see the next generation Nvidia Blackwell and AMD RDNA 4 GPUs launch.
