We Tested 250 Games on an Intel Arc GPU: How Did It Go?

Great in-depth article! As someone whose last day-one game was Battlefield 3, these certainly have my attention. If my 2060 were to die today, the A770 would be on the short list of replacements.

I'm not certain my old rig supports Resizable BAR without workarounds, though, so AMD is my other option. I've used Nvidia exclusively for personal builds since I started building PCs 20 years ago (I always had driver issues when using AMD on builds for other people), but Nvidia has given up on providing anything competitive in my price range.
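For anyone else unsure about Resizable BAR support, one way to check on Linux is to read the GPU's capabilities from `lspci -vv`: with ReBAR active, BAR 0 covers (most of) the card's VRAM rather than the legacy 256 MB window. A minimal sketch, assuming `lspci -vv` output as input; the `rebar_active` helper and the sample string are my own illustration, not from the article:

```python
import re

def rebar_active(lspci_vv_output: str) -> bool:
    """Heuristic: Resizable BAR is in effect when BAR 0 of the GPU is
    reported larger than the legacy 256 MB aperture."""
    m = re.search(r"BAR 0: current size: (\d+)([MG])B", lspci_vv_output)
    if not m:
        return False
    size_mb = int(m.group(1)) * (1024 if m.group(2) == "G" else 1)
    return size_mb > 256

# Shortened sample of what `sudo lspci -vv` prints for a ReBAR-enabled GPU.
sample = ("Capabilities: [200 v1] Physical Resizable BAR\n"
          "\tBAR 0: current size: 8GB, supported: 256MB 512MB 1GB 2GB 4GB 8GB")
print(rebar_active(sample))  # True
```

In practice you would feed it `subprocess.run(["lspci", "-vv"], ...)` output for the display controller; on Windows, GPU-Z shows the same information directly.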
 
I bought the A770 at launch… I have no regrets and will likely buy the B770 or whatever it's called.
This first gen of GPUs from Intel has been a clear success. They'll take the lessons learned from its challenges and make Battlemage a much more refined set of products.
 
This first gen of GPUs from Intel has been a clear success. They'll take the lessons learned from its challenges and make Battlemage a much more refined set of products.
Are we looking at the same data here? Sure, Intel's improved the driver situation from a stability standpoint, but there's still a healthy performance gap between Intel's best and the four-year-old RTX 3060 Ti and RX 6700 XT.

Two cards from the established players that demand less of three things:
- Your money
- Your electricity
- Your time troubleshooting
 
Are we looking at the same data here? Sure, Intel's improved the driver situation from a stability standpoint, but there's still a healthy performance gap between Intel's best and the four-year-old RTX 3060 Ti and RX 6700 XT.

Two cards from the established players that demand less of three things:
- Your money
- Your electricity
- Your time troubleshooting

Those cards are/were more expensive than the A770 and use the same power. But hey, 1 outta 3 ain't bad.
 
Are we looking at the same data here? Sure, Intel's improved the driver situation from a stability standpoint, but there's still a healthy performance gap between Intel's best and the four-year-old RTX 3060 Ti and RX 6700 XT.

Two cards from the established players that demand less of three things:
- Your money
- Your electricity
- Your time troubleshooting

For me personally, at the time of purchase, the A770 was the only AV1 hardware encoder/decoder on the market in my price range. The electricity cost is nominal, since I live in the US, where power is cheap compared to Europe, and I'm not using the A770 all day. And while I won't deny I have spent some time troubleshooting… that was kinda half the fun, and I don't spend any time troubleshooting at this point.
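For anyone curious about putting that AV1 encoder to work: Arc's encoder is typically reached through ffmpeg's Quick Sync path. A hedged sketch, assuming an ffmpeg build with QSV support and working Arc drivers; the filenames and bitrate are placeholders of my own:

```shell
# Transcode to AV1 using the Arc's hardware encoder (av1_qsv via Quick Sync).
# input.mp4 / output.mkv are placeholder filenames; pick a bitrate to taste.
ffmpeg -i input.mp4 -c:v av1_qsv -b:v 6M -c:a copy output.mkv
```

`-c:a copy` passes the audio through untouched, so only the video is re-encoded on the GPU.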
 
I love the ambition of this article, but for me it could've been just a single sentence long:

We ran into multi-monitor problems trying to run our Alienware AW3423DW and a second monitor

That's the end of Intel Arc for me. I haven't run just a single monitor in almost a decade.
 
I love the ambition of this article, but for me it could've been just a single sentence long:

We ran into multi-monitor problems trying to run our Alienware AW3423DW and a second monitor

That's the end of Intel Arc for me. I haven't run just a single monitor in almost a decade.
Don't know what to say. The percentage of people who always run with multiple monitors is very, very small. And for someone who never went back to using a single monitor (let's just assume you don't also have a laptop), the percentage is smaller still.

Clearly, if you only run with multiple monitors, chances are you are already running a beefier GPU than this Arc.
 
Don't know what to say. The percentage of people who always run with multiple monitors is very, very small. And for someone who never went back to using a single monitor (let's just assume you don't also have a laptop), the percentage is smaller still.

Clearly, if you only run with multiple monitors, chances are you are already running a beefier GPU than this Arc.

Whatever. Dual-monitor support has been a solved problem for 30+ years. GPUs at any price level almost always come with 2+ display outputs. I get that not everybody will put them to use all the time, but then again there's no good reason to give up that option either. Decent monitors are very affordable these days, and more workspace is excellent for productivity or multitasking, like a game on one and a video on another.
 
I was going to say, "Well, I wonder how many of the games would run well with Wine (or Steam + Proton) and the Mesa Gallium 3D drivers," assuming there would be quite a few failures with the Windows drivers based on past reports. But I'm duly impressed: they've gotten their drivers into pretty good shape! Also, I don't seriously expect you to reinstall 250 games just for that kind of testing.

Good article! Glad to see Intel has gotten their driver issues sorted out!
 

The results are quite disappointing: only 218 games work properly, performance still lags behind the older Nvidia 3060 Ti, and multi-monitor support is unreliable. So, who exactly is the target audience for these cards? It seems to cater mostly to those who enjoy tinkering. I understand that tinkering and problem-solving can be addictive for some. However, if you actually want to play video games, get work done, and trust that your system is stable, why would you choose an Arc card?

The review also makes a great point: if the card struggles to stay within a 30 percent performance gap of a 6650, 7600, or even a 3060 Ti, Arc has to be very cheap to be competitive. Intel should price these at $99 and just push the remaining inventory out there. I wouldn't make any more. The people using them are really your beta testers. Intel's only goal at this point is getting their driver team ready for Battlemage. Intel cannot make money on Arc, so instead of trying, they should make the product cheap enough to attract tinkerers. Tinkerers generally like to exaggerate how good a product is, as part of their joy is solving the problem. They just forget that the problem they solved was their own, because if they had chosen a different product, they wouldn't have had one. As the saying goes, when starting at the bottom, the only way to go is up.
 
About the article itself, there's nothing I can say others haven't already said. Just amazing and very in-depth. It's for this type of content that I keep coming to TS!

Now about the results, like some, I am a bit disappointed.

Out of 250 games, 218 titles working flawlessly without any technical issues and with decent performance doesn't sound bad, but...

1. I'd say it's more like 208 flawless titles out of 250, since needing anything less than High settings to get good framerates @ 1440p isn't acceptable for a company's top-tier GPU priced over $220.

2. If the problematic games that didn't pass with flying colors consisted mostly of niche, obscure and/or indie titles, 218 out of 250 would seem better. But that wasn't the case: many of the problematic games were relatively recent high-profile AAA titles such as Final Fantasy 7 Remake, Alan Wake 2, TLOU Part 1, Starfield and Ghost of Tsushima, as well as older AAA titles that people might still want to play, such as No Man's Sky, Ghost Recon Wildlands (which worked flawlessly but with disappointing performance IMO, requiring medium settings for 60 fps - GTA 5 on high is also very disappointing), The Outer Worlds, The Witcher 3, Bioshock Infinite, Terminator Resistance, Yakuza: Like a Dragon, Dirt Rally, GTA 4, L4D2 and Metro: Last Light.

This shows Intel might have greatly improved their drivers since launch but is only halfway there; they still have a lot of work to do improving drivers and hopefully also better tweaking their Battlemage architecture. I hope some of Intel's GPU engineers and driver devs read this article and the comments.

In its current state the A770 really isn't good bang for the buck at its $270 MSRP. A more reasonable price would be around $180 - $200; even with all the present issues, it would then be a great 1080p card for gamers on a budget.

A few more notes:

- Assetto Corsa classic, Assetto Corsa Competizione, Automobilista 1 and 2, MS Flight Simulator X and American Truck Simulator are six titles I really missed in these tests; I wish you had included them (though at least you tested Euro Truck Simulator 2, so I assume ATS will probably run just as well, since it's by the same developer on the same engine).

- Pacific Drive didn't perform too well on the A770. But to be fair, it's widely known to be a terribly unoptimized game no matter which hardware it runs on, with even people running high-end gaming PCs with the latest i9 CPUs + RTX 4090s experiencing major performance issues. Lowering settings from Ultra/Very High to Medium or even the lowest possible also has very little impact on performance; it only makes the game uglier. At least that was my experience with it. The A770 actually did well on this one if it managed a stable 60 fps at medium settings + 1440p.

- I know they're a grey area, so I understand them being left out; still, I'd love to see plenty of emulators tested too. Mostly as a compatibility/issues check rather than for performance.
 
I found the Intel Xe on my notebook to have acceptable compatibility (performance is another matter; it's the low-end "48 EU" model, paired with a low-end CPU), and the Mesa Gallium 3D drivers are quite good. Based on the level of compatibility I've seen there, I'd consider an Arc. But not at the prices they are at! Agreed, they may have been at least somewhat competitively priced when they came out, but now they are not.

Have to throw in: 250 games? This is comprehensive; this article is amazing.
 
Whatever. Dual-monitor support has been a solved problem for 30+ years. GPUs at any price level almost always come with 2+ display outputs.

This shows Intel might have greatly improved their drivers since launch but is only halfway there; they still have a lot of work to do improving drivers and hopefully also better tweaking their Battlemage architecture.

Intel has had GPUs for many years now, and all their Iris / Xe / Arc parts gave Intel enough time to polish their drivers to at least a very good state.

What you both said boils down to two important points:
- despite all those years to improve the basics (UI, settings, graphical errors, multi-monitor support, etc.), they still have issues

- despite being a hugely powerful company, they didn't manage to improve compatibility/errors to at least AMD's level

If games run on DX11/12 (or another standard), I still can't understand how a GPU/driver stack that complies with it runs into graphical errors.

To summarize: Intel makes decent low/mid-range cards, but the software is still far behind AMD/Nvidia (though it is improving; it's just taking years…)
 
Are we looking at the same data here? Sure, Intel's improved the driver situation from a stability standpoint, but there's still a healthy performance gap between Intel's best and the four-year-old RTX 3060 Ti and RX 6700 XT.

Two cards from the established players that demand less of three things:
- Your money
- Your electricity
- Your time troubleshooting
Hey, if you can't see the forest for the trees, that's not anyone else's problem. All of the Arc cards have been an excellent value and a very admirable first showing from Intel.

Context is important. What do you think you've missed?
 
Those cards are/were more expensive than the A770 and use the same power. But hey, 1 outta 3 ain't bad.
This, and…
A770 was the only AV1 hardware encoder/decoder on the market in my price range.
This.

I love the ambition of this article, but for me it could've been just a single sentence long:

That's the end of Intel Arc for me. I haven't run just a single monitor in almost a decade.
Except that this is not the experience that everyone is having. I've helped people set up dual- and triple-screen setups with Arc cards, no issues at all.
 