News Intel Demos Arc A770 GPU, Leaves Old APIs In The Dust

wifiburger

Honorable
Feb 21, 2016
meh, it's going to take Intel forever to optimize those drivers.
That's mostly why I stick with Nvidia: you're guaranteed consistent performance regardless of whether a game is old or new.

Look at AMD: they still haven't matched Nvidia's massive team that does per-game optimization & testing.

The only good thing about Intel GPUs will be that they force AMD to stop matching Nvidia's GPU prices.
And I'd bet that's the sole reason Intel is going after the GPU market: to cut into AMD's profits.
 

UnCertainty08

Commendable
Sep 9, 2020
You are correct. That's why I stick to Nvidia as well. Nothing is worse than paying top dollar for hardware that doesn't work because of software issues. That's also part of the reason I stick to Intel CPUs.
 
meh, it's going to take Intel forever to optimize those drivers.
That's mostly why I stick with Nvidia: you're guaranteed consistent performance regardless of whether a game is old or new.
But as the article (and video) pointed out, Intel will supposedly be pricing the cards based on their performance in titles using older APIs, where they don't perform as well. So you will theoretically get the performance you are paying for in DX9 and DX11 titles, but get additional performance in DX12/Vulkan. In titles supporting those newer APIs, particularly ones Intel has optimized for, you might get performance comparable to a higher-tier graphics card than you paid for. "Inconsistent" performance isn't bad if the inconsistencies tend to err on the side of giving you additional performance for your money.

Another thing to consider is that if their performance in older, higher-overhead APIs is holding the cards back at launch, that leaves them with a lot of room to potentially improve performance in the future. It's possible that the cards could age a lot better than similarly-priced models from the competition. That is, so long as Intel keeps updating and optimizing the drivers for these first-generation cards for years to come. Even without optimizing much for older games though, as newer titles focus on newer APIs, and game developers begin optimizing specifically for the hardware, the performance situation is likely to improve.

Like I said previously, Intel tends to price their hardware competitively when entering new markets, and the big price mark-ups on competing cards leave them with a lot of flexibility to impress on pricing. Of course, there are still many unknowns, like how raytraced effects will perform, and whether the hardware holds potential for that to improve in the future. And also whether features that have become the norm on AMD and Nvidia cards will be present and function as expected. There could be quirks that make one think twice about trying these cards. So I would fully expect Intel to price them competitively if they hope to establish a presence in the market.
 

LastStanding

Great
May 5, 2022
As soon as Intel (or Apple) releases a GPU that matches, or bests, Nvidia's current flagship on the market, all the gamer complaints about unsupported DX9-11 titles will stop mattering.

It's not like the devs/pubs of all these old relic titles are just going to go back and add support for Intel's (or Apple's M-series) new architecture. 🙄
 

LuxZg

Distinguished
Dec 29, 2007
Actually, as soon as performance in older titles is "good enough", it will stop mattering. Those top-5 Steam titles sound harsh until you realize one is CS:GO (350+ FPS on a 3070) and another is Dota (250+). So even "bad" performance could be good enough for most gamers at 1440p. And even if the issues persist into the Battlemage generation, they will be even less pronounced. Sure, some will check FPS and benchmarks and complain, but many will just play. Especially in OEM builds, since most "real gamers" (geeks, techies, the vocal crowd) won't even look at OEM builds, and casual gamers won't care. I just keep wondering about the prices, because as the article says, they aren't delivering vs. e.g. the 6400. With prices continuing to tumble and the next generation a few months away, will they price A770 cards at $300 or less?

Oh, and BTW, that must be a typo in the article, in the sentence comparing the 770 to the 3070 and 6500 XT (?)
 
Jul 7, 2022
meh, it's going to take Intel forever to optimize those drivers.
That's mostly why I stick with Nvidia: you're guaranteed consistent performance regardless of whether a game is old or new.

Look at AMD: they still haven't matched Nvidia's massive team that does per-game optimization & testing.

The only good thing about Intel GPUs will be that they force AMD to stop matching Nvidia's GPU prices.
And I'd bet that's the sole reason Intel is going after the GPU market: to cut into AMD's profits.
I've been an Nvidia money-giver my entire computing life, starting with a GeForce4 Ti 4400, then an FX 5800, twin 7800 GTs, a GTX 260, GTX 470, twin GTX 680s, GTX 980, and GTX 1080. For this generation, though, I risked buying an AMD 6900 XT, and I've been super happy with the card and the drivers.
 

KyaraM

Notable
Mar 11, 2022
So, I read a pretty interesting article today comparing the A380 to the GTX 1650. Sadly, it's a German article, so I will essentially just summarize it and link it below.

What they did was test both cards in the Tier 1 games and then overclock the A380. In contrast to this article, though, they reported actual performance numbers. Clock speeds improved by a modest 150MHz over stock, and power draw rose from 35W to 55W. All they did was raise the "GPU Performance Boost" slider to 55%. The resulting performance gain was pretty steep: 37%, according to the website.
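As a quick back-of-the-envelope check on those numbers (a sketch assuming the quoted 35W/55W TGP figures and the reported 37% gain; the actual measured draw was reportedly lower), the overclock trades some efficiency for that performance:

```python
# Rough efficiency math for the A380 overclock described above.
# Numbers are the ones quoted from the article; TGP is an upper
# limit, not measured consumption.
stock_power, oc_power = 35, 55   # W, TGP before/after raising the boost slider
perf_gain = 0.37                 # +37% performance reported by the site

power_increase = (oc_power - stock_power) / stock_power            # ~57% more power
perf_per_watt_ratio = (1.0 + perf_gain) / oc_power / (1.0 / stock_power)

print(f"Power up {power_increase:.0%}, performance up {perf_gain:.0%}")
print(f"Perf/W after OC vs stock: {perf_per_watt_ratio:.2f}x")
```

So even with the steep FPS gain, performance-per-watt at the raised limit drops by roughly 13% versus stock.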

View: https://imgur.com/gallery/sxeZCgP


However, as you can see, with the same settings the A380 suddenly matches the 1650 quite well, while supposedly still being more power-efficient. If the estimated release price of $175 is correct... and if the same can be done with the A770...

Link to the article
https://www.notebookcheck.com/Eine-uebertaktete-Intel-Arc-A380-tritt-im-Gaming-Vergleich-gegen-die-Nvidia-GeForce-GTX-1650-an.635575.0.html

Edit: none of the above is my own work; it belongs to the authors of the article and the person who performed the test.
 
However, as you can see, with the same settings the A380 suddenly matches the 1650 quite well, while supposedly still being more power-efficient. If the estimated release price of $175 is correct... and if the same can be done with the A770...
While I don't believe US retail pricing for the A380 has been confirmed yet, supposedly, at least according to the LTT video this article is referencing, prices for the A380 may start at $130. It's possible that information was wrong, but considering Intel's head graphics marketing people were present for the video, it might be accurate. It also lines up with a recent Intel chart showing a vague price bar for the card positioning it roughly within that range. Of course, even if that's the suggested pricing, many partner cards may be priced higher, though I would be surprised if typical US retail pricing for the A380 were above $150, seeing as some RX 6400s can already be had for that.

I just keep wondering about the prices, because as article says, they aren't delivering vs eg 6400. Prices keeping to tumble, and with next gen in a few months, will they price 770 cards at 300$ or less?
Based on that previously-mentioned chart from Intel, it looks like both the A750 and A770 may be priced within the $300-$400 range. Their locations in the chart appear to position the A750 roughly in the $300-$320 range and the A770 in the $350-$370 range, along with an A580 at around $220-$230. But it's hard to say exactly where prices will land, and the exact values could be affected by the prices of competing cards at the time of their announcement.

And while prices have been coming down, I wouldn't expect new generation cards from Nvidia or AMD around this price range this year. A 4080 and 4090 may launch within the next several months or so, but I wouldn't expect 40-series cards in the sub-$400 range until early next year. We could of course see prices adjusted in response to new card releases though.
 

KyaraM

Notable
Mar 11, 2022
Such a hefty price for a card with immature drivers and 50% additional power consumption compared to both AMD's and Nvidia's lower-end tiers.
The 3070 has an MSRP of $500, the RX 6700 XT of $480. Neither is actually going for those prices. The TGP set in the drivers, according to the article, is 190W; the 3070 has a TGP of 230W and the RX 6700 XT around 210W. The article only mentions TGP, though, not actual consumption; the A380 in my link above was shown to consume 30W less than its TGP (75W) even after OC, and we have no idea what this card drew in any of the tests. TGP gives only an upper limit, not actual consumption, and 50W or 70W, respectively, isn't 50% of 230W, 210W, or even 190W. All in all, despite the drivers (which from my experience aren't very mature on AMD either, and sometimes have issues on Nvidia, too), this sounds like a fantastic deal. Especially when taking into account that drivers can, and do, improve. And those cards are mid-tier, maybe even the lowest high-end, not low-end.
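To make the percentage argument concrete, here is the arithmetic on the figures above (a sketch using only the TGP numbers quoted in this thread, which are limits, not measured draw):

```python
# Check the "50% additional power" claim: how big are 50 W / 70 W of
# extra draw relative to the TGP figures discussed above?
deltas = [50, 70]        # W, the extra-consumption figures from the post
tgps = [230, 210, 190]   # W, 3070 / RX 6700 XT / Arc TGP limits

for delta in deltas:
    for tgp in tgps:
        print(f"{delta} W extra on a {tgp} W TGP = {delta / tgp:.0%}")

# Worst case is 70 W over 190 W, about 37% -- nowhere near 50%.
```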

All this info was collected within five minutes. You could have done it yourself; that would have helped.
 
