News: Intel Reveals First Arc A750 Desktop GPU Benchmarks

This is getting interesting now! The entry-level A380 seemed a little lackluster, but IMO it's decent for what is a first discrete GPU on the low end from Intel (cough... Larrabee!)

This, on the other hand, is more impressive. It's a BIG if that the comparisons were done with the same detail settings etc., but IF so, then this is a great first attempt at a mainstream SKU. Raja Koduri has brought some major culture change to Intel's GPU business. That much is clear to see.

Hopefully this means more competition in the discrete GPU sector, and more FPS for us end users at decent prices, without the gouging that currently exists.
 
This is getting interesting now! The entry-level A380 seemed a little lackluster, but IMO it's decent for what is a first discrete GPU on the low end from Intel (cough... Larrabee!)

This, on the other hand, is more impressive. It's a BIG if that the comparisons were done with the same detail settings etc., but IF so, then this is a great first attempt at a mainstream SKU. Raja Koduri has brought some major culture change to Intel's GPU business. That much is clear to see.

Hopefully this means more competition in the discrete GPU sector, and more FPS for us end users at decent prices, without the gouging that currently exists.
The comparisons were definitely done with the same hardware and settings. That's not the question. Intel says as much in the disclosure slide at the end of the video. The question is how these results apply to the thousands of other games that are currently available. I suspect Intel's driver team specifically optimized performance in these five games for the benchmarks. Other games probably perform okay as well. But even AMD and Nvidia have some games where they underperform, and they've been doing "real" GPU drivers a lot longer than Intel. I wouldn't be surprised if a random sampling of 30 demanding games revealed weaker performance from Intel in 75% of them.
 

cyrusfox

Distinguished
I suspect Intel's driver team specifically optimized performance in these five games for the benchmarks. Other games probably perform okay as well. But even AMD and Nvidia have some games where they underperform, and they've been doing "real" GPU drivers a lot longer than Intel.
Driver development is a marathon, not a sprint, and Intel has quite a way to go, even with all the time spent working on DG1 until now. Gamers Nexus' video on the A750 reveal did a good job talking about drivers: with an effectively infinite number of games, which do you choose to work on?
With Arc on DX12 & Vulkan, expect great performance; anything earlier... not so much. Also, ReBAR is needed for the experience to be playable on most titles.
 
The comparisons were definitely done with the same hardware and settings. That's not the question. Intel says as much in the disclosure slide at the end of the video. The question is how these results apply to the thousands of other games that are currently available. I suspect Intel's driver team specifically optimized performance in these five games for the benchmarks. Other games probably perform okay as well. But even AMD and Nvidia have some games where they underperform, and they've been doing "real" GPU drivers a lot longer than Intel. I wouldn't be surprised if a random sampling of 30 demanding games revealed weaker performance from Intel in 75% of them.

At this point, I'd totally agree with you.

I guess the point I was trying to make was that, with Raja on board and Intel very obviously making a 'real go' at this GPU thing (;)), there's more to come. I'd imagine he is taking the 'Intel' GPU driver issue seriously. It won't be good enough to keep to their current output of drivers. They will step up, and I'm sure that over time they'll have driver certification on par with, and as regular as, AMD's/Nvidia's!

Hopefully, this is an early taste of what's to come with driver optimization and regular rollouts of 'Game ready' drivers.

Edit: @JarredWaltonGPU, in terms of general raster performance, would the performance of this card not roughly translate to the same/similar performance across the majority of games (incidental variances aside)?
I realise you can't be certain, as each game engine might respond differently, along with architecture differences. But in general terms?
 
I think someone else put it best somewhere I can't remember: "it's about Intel just diving into the pool and trying to stay afloat and swim like crazy; otherwise AMD and nVidia will be on their way back by the time Intel feels ready".

I think the best overall news is the tentative pricing of the A380. At ~$120 it's not a terrible deal. Could be better, for sure, but given the features and potential, it certainly makes for an interesting proposition. AMD and nVidia won't have a new low-end card for like a year anyway, at best.

Regards.
 

aetolouee

Prominent
With Arc on DX12 & Vulkan, expect great performance; anything earlier... not so much
It would be interesting to see some benchmarks of DX11 and older games translated into Vulkan games with DXVK.
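
For anyone curious how that kind of test is usually set up on Windows: DXVK ships replacement d3d11.dll and dxgi.dll files that get dropped next to a game's executable so its D3D11 calls are translated to Vulkan. Below is a minimal sketch of that copy step; the folder paths are hypothetical placeholders, not real install locations.

```python
# Minimal sketch: install DXVK's D3D11-to-Vulkan DLLs into a game folder on Windows.
# Both paths are hypothetical placeholders; point them at an extracted DXVK release
# and a real 64-bit DX11 game install before running.
import shutil
from pathlib import Path

dxvk_x64_dir = Path(r"C:\tools\dxvk\x64")   # 64-bit DLLs from a DXVK release (assumed path)
game_dir = Path(r"C:\Games\SomeDX11Game")   # folder containing the game's executable (assumed path)

for dll in ("d3d11.dll", "dxgi.dll"):       # the D3D11 front-end plus the DXGI shim
    shutil.copy2(dxvk_x64_dir / dll, game_dir / dll)
    print(f"copied {dll} -> {game_dir}")

# Setting DXVK_HUD=fps in the game's environment before launch overlays a frame counter,
# which makes quick native-D3D11 vs. DXVK comparisons easier to eyeball.
```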
 
At this point, I'd totally agree with you.

I guess the point I was trying to make was that, with Raja on board and Intel very obviously making a 'real go' at this GPU thing (;)), there's more to come. I'd imagine he is taking the 'Intel' GPU driver issue seriously. It won't be good enough to keep to their current output of drivers. They will step up, and I'm sure that over time they'll have driver certification on par with, and as regular as, AMD's/Nvidia's!

Hopefully, this is an early taste of what's to come with driver optimization and regular rollouts of 'Game ready' drivers.

Edit: @JarredWaltonGPU, in terms of general raster performance, would the performance of this card not roughly translate to the same/similar performance across the majority of games (incidental variances aside)?
I realise you can't be certain, as each game engine might respond differently, along with architecture differences. But in general terms?
I'm not sure why anyone would have a lot of faith in Raja. He was the man behind Vega, more or less, and while it wasn't bad, it also wasn't great. But drivers are always a concern with any GPU, and Intel just has a really poor track record. It's improving, but as recently as the Deathloop FSR 2.0 patch, that game was having problems on all Intel GPUs. In theory, a driver just translates the high-level DirectX calls into code that the GPU can run, but in practice, I guess, with infinite ways of accomplishing certain tasks, shader compilers and such can be a problem.

It's funny, because AMD's CPUs have to be 100% compatible with x86/x86-64, but somehow for graphics it's not quite so clear cut.

On paper, assuming 2.0GHz and higher clocks, Arc A-series GPUs should be pretty competitive. But there are so many architectural nuances that the paper specs can end up being completely meaningless. Like with the ReBAR thing on Arc A380 that people have discovered. I can't see a good reason why lack of ReBAR support on a platform could hurt performance as badly as it does, unless there's just a lot of poor memory management and other stuff going on in the drivers.

Put another way, the A750 that Intel showed benchmarks for has:
• 7% more memory bandwidth than the RTX 3060 (both are 12GB, 192-bit bus, but Intel is clocked at 16Gbps instead of Nvidia's 15Gbps)
• Possibly 3.5% less compute performance than RTX 3060, depending on GPU clocks (and Nvidia's FP32 + FP32/INT32 pipeline split factors in)

If the architectures were perfectly comparable, we'd expect a very slight advantage for Intel if it clocks higher than 2.0GHz, like 2.25GHz for example. But there's absolutely no way that ends up being true, which means drivers and other elements come into play.
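
For anyone who wants to sanity-check those percentages, here's a rough back-of-the-envelope sketch. The bandwidth numbers use the figures quoted above; the A750's shader count and both boost clocks weren't confirmed at this point, so the values below are assumptions that simply reproduce the ~3.5% figure.

```python
# Back-of-the-envelope check of the A750 vs. RTX 3060 paper specs discussed above.
# The A750 shader count and the boost clocks are assumptions, not confirmed specs.

def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    # Peak bandwidth = bus width in bytes * per-pin data rate
    return bus_width_bits / 8 * data_rate_gbps

def fp32_tflops(fp32_lanes: int, clock_ghz: float) -> float:
    # Theoretical FP32 throughput: 2 ops (FMA) per lane per clock
    return fp32_lanes * 2 * clock_ghz / 1000

a750_bw = bandwidth_gb_s(192, 16)       # 384 GB/s, per the figures quoted above
rtx3060_bw = bandwidth_gb_s(192, 15)    # 360 GB/s
print(f"A750 bandwidth advantage: {a750_bw / rtx3060_bw - 1:.1%}")  # ~6.7%, i.e. roughly 7% more

rtx3060_tf = fp32_tflops(3584, 1.777)   # ~12.7 TFLOPS at Nvidia's official boost clock
a750_tf = fp32_tflops(3072, 2.0)        # ~12.3 TFLOPS, ASSUMING 24 Xe cores (3,072 lanes) at 2.0GHz
print(f"A750 compute deficit: {1 - a750_tf / rtx3060_tf:.1%}")       # ~3.5% less
```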
 

jp7189

Distinguished
Let's also not forget that game optimization is a two-way street. Game studios have years of experience optimizing for Nvidia and AMD, which means those games will run well even before driver optimization. Intel has a lot to prove before they merit the same treatment.
 
The question is how these results apply to the thousands of other games that are currently available. I suspect Intel's driver team specifically optimized performance in these five games for the benchmarks. Other games probably perform okay as well. But even AMD and Nvidia have some games where they underperform, and they've been doing "real" GPU drivers a lot longer than Intel. I wouldn't be surprised if a random sampling of 30 demanding games revealed weaker performance from Intel in 75% of them.
Yeah, looking at the A380 review that Gamers Nexus recently posted, that lower-end card performed rather competitively in F1 2021, outperforming the RX 6400 and not being too far behind the GTX 1650 in that game. But in most other games it performed worse than the 6400, and in the case of GTA5 it performed a lot worse, getting almost half the performance. Where the RX 6400 provided a mostly 60fps experience at 1080p, the A380 only managed a mostly 30fps experience in that game. They suggested that the card tended to underperform in games utilizing older APIs like DX11.

F1 2021 was the only game in this chart that GN tested on the A380, but it happened to be the one where it performed the best relative to the competition, which is probably indicative of these games being a best-case scenario, cherry-picked to show the A750 at its most competitive against the 3060. If this card sees similarly unpredictable performance in some games, though, like getting half the performance in GTA5, then that's likely to be a major turn-off. Perhaps that's part of why these cards haven't been released yet, as Intel attempts to get the drivers into a somewhat more optimized state. I question whether that's something they can manage in a matter of months, though.

None of Intel's benchmarks used ray tracing effects, despite Control, Cyberpunk 2077, and Fortnite supporting DXR (DirectX Raytracing). We'd really love to see some ray tracing benchmarks for Arc, if only to satisfy our own curiosity.
Logic would dictate that if Intel is keeping quiet about RT performance, it most likely compares unfavorably against at least Nvidia's implementation. AMD wasn't really talking much about the performance of their RT implementation during the run-up to the launch of the RX 6000 series either, and it predictably ended up under-performing compared to the competition. It's also possible that the cards may perform better at some RT effects than at others though. And perhaps their RT implementation just requires more driver work before they are willing to show it.
 

KyaraM

Admirable
Well, things are getting interesting! Now, if they don't mess up pricing and things continue like this, these cards could actually be great alternatives, especially for people who can make use of the additional features.
 

wpcp

Prominent
Does nobody remember the Intel i740 days? It was a direct competitor to the 3dfx Voodoo and Nvidia Riva TNT. The biggest issue with Intel is driver support. Nvidia releases driver updates every month, whereas with Intel it just dies out. We all know what happens to a GPU when driver support dies out... the card dies with it.