News: New Intel Arc driver improves frame rates by up to 174%, adds support for 'Horizon Forbidden West'

bit_user

Titan
Ambassador
If you don't list the "before" or "after" frame rates, how is this useful as anything but a PR piece for Intel? Just because a game's frame rate improved a lot doesn't mean it's even playable!

@JarredWaltonGPU , don't you guys have "before" data on any of these games that could be used to add even one actual FPS datapoint to the article?
 

bit_user

Titan
Ambassador
To add, I don't mind ads for Intel graphics. We want them to succeed as a third player in the gaming graphics market.
I hope my comment isn't seen as anti-Intel. I'm not. I'm also not cynical enough to believe any good is accomplished by furthering a misleading narrative. If someone buys an Intel GPU, based on false expectations, the damage that could do to their brand/reputation is probably worse than the benefit of that 1 additional sale. People who buy their GPUs should do so eyes-open.

I expect Toms to do better than parrot meaningless press releases, at least when it should be easy enough for them to do so. For a (potential) user, what would make this news meaningful is knowing what the actual frame rates are. There's none of that here.
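To put numbers on why the baseline matters (these are made-up frame rates for illustration, not figures from Intel's release or from any testing):

```python
def apply_gain(old_fps: float, gain_pct: float) -> float:
    """Frame rate after a gain_pct% improvement over old_fps."""
    return old_fps * (1 + gain_pct / 100)

# The same headline "+174%" means very different things in practice:
for baseline in (10, 20, 60):
    print(f"{baseline:>2} fps + 174% -> {apply_gain(baseline, 174):.1f} fps")
# A 10 fps baseline only reaches ~27 fps, still unplayable;
# a 60 fps baseline reaches ~164 fps. Same percentage, opposite story.
```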

The other key piece of information I'm lacking is whether any of these games have been prior targets of substantial optimizations (i.e. does this reflect Intel's driver getting even better, or is it just adding game-specific optimizations it didn't previously contain).
 

bit_user

Titan
Ambassador
For anybody interested in whether they actually improve their drivers or not, this should be interesting.
Note that I didn't say they never improve their drivers. The question I raised was whether this constituted a fundamental improvement, or was just game-specific optimizations. And furthermore, did most of these games even have game-specific optimizations, or was it the first time for any of them (or, at least the games with the biggest improvements)? On that front, your reply was unhelpful.
 
Note that I didn't say they never improve their drivers. The question I raised was whether this constituted a fundamental improvement, or was just game-specific optimizations. And furthermore, did most of these games even have game-specific optimizations, or was it the first time for any of them (or, at least the games with the biggest improvements)? On that front, your reply was unhelpful.
Not everything I post is directed to you...
You raised a question and I thought that this video might be of interest to some people.
 

bit_user

Titan
Ambassador
Not everything I post is directed to you...
Sure, but since you wrote it in a reply to me, I think it was reasonable to conclude that it was meant as a direct response.

You raised a question and I thought that this video might be of interest to some people.
The question I raised was due to seeing a string of announcements touting big gains by Intel's drivers. Many of these gains are game-specific, but that's often not obvious. So, people could get the mistaken impression that round-after-round of driver releases with huge gains mean Intel has caught up to the competition.

In contrast, if we're talking about game-specific optimizations, what AMD and Nvidia often do is provide driver updates on launch day of major titles. If Intel isn't doing launch-day drivers, then they're actually behind, yet news articles like this somehow manage to spin that as a positive!

That's why it's crucial that Toms provide actual FPS data, where possible. Without that, it can be hard to know what to make of such announcements.
 

Shirley Marquez

Honorable
Sep 2, 2019
13
7
10,515
I hope my comment isn't seen as anti-Intel. I'm not. I'm also not cynical enough to believe any good is accomplished by furthering a misleading narrative. If someone buys an Intel GPU, based on false expectations, the damage that could do to their brand/reputation is probably worse than the benefit of that 1 additional sale. People who buy their GPUs should do so eyes-open.

I expect Toms to do better than parrot meaningless press releases, at least when it should be easy enough for them to do so. For a (potential) user, what would make this news meaningful is knowing what the actual frame rates are. There's none of that here.

The other key piece of information I'm lacking is whether any of these games have been prior targets of substantial optimizations (i.e. does this reflect Intel's driver getting even better, or is it just adding game-specific optimizations it didn't previously contain).

A big part of the improvement on older games is that they now have native DX9 support in the driver, rather than using a DX9 to DX12 translation layer. Native DX11 support is a work in progress.

Alas, one of my favorite things to use my GPU for is based on OpenGL, and Intel's support for that API is still horrible. That limits the appeal of an Arc graphics card for me. Arc GPUs will also yield poor performance on older PCs because they need Resizable BAR support to reach their full potential.
 
If you don't list the "before" or "after" frame rates, how is this useful as anything but a PR piece for Intel? Just because a game's frame rate improved a lot doesn't mean it's even playable!

@JarredWaltonGPU , don't you guys have "before" data on any of these games that could be used to add even one actual FPS datapoint to the article?
The only game on the list that I know anyone uses is God of War, and this can give you an example of its current performance:

https://www.techpowerup.com/review/sapphire-radeon-rx-7600-xt-pulse/20.html
(chart: God of War benchmark results at 1920×1080)


I too wish they'd talked about the existing performance, as a lot of these are DX11 titles where Arc performance varies wildly from title to title. Assuming Intel's numbers are comparable, that would put Arc up in the area of the 6600 XT.
 

ilukey77

Reputable
Jan 30, 2021
807
332
5,290
I've been using my 14600KF + Arc A770 16GB since my main 7900 XTX cooked one of its PCIe plugs :(

I've been quite happy with the performance overall.

It's no 7800X3D + 7900 XTX, but it does 70 to 80 fps at 4K ultra in Nightingale, which is all I've been playing for the last few weeks.

I'm really impressed with the Arc GPU. If they keep improving their GPUs, I'll be buying an Intel GPU soon.

When they keep improving the drivers regularly, you know they're looking at the long term.

I think I'm going to go 5090 next gen and sit on it for a few gens now!

Then see where Arc is after that!