News Intel Arc A770 GPU Shows RTX 2070-Like OpenCL Performance

InvalidError

Titan
Moderator
The question isn't "is it enough for a flagship", it's how the support will be (i.e. drivers).
Most of us neither need nor want a flagship. We want something reliable that will bring the overall cost per FPS down by a meaningful amount. Something similar to what AMD did with the RX470-580 - relatively solid mainstream performance for its time at a fairly budget-friendly price.

Driver-wise, Intel still has a lot of work left to do in that department if we go by reviews from people who got their hands on Samsung's Galaxy Book Pro 2. Those are definitely a concern.
 
  • Like
Reactions: watzupken

Eximo

Titan
Ambassador
Yeah, not sure why the drivers are poor. It doesn't make a lot of sense.

Supposedly they have had Xe graphics since 2020, and the DG1 was 2021, which required a special motherboard BIOS to work. There was apparently even a desktop Iris Xe you could get; might have to look for that later... and the Iris Xe MAX, which is a discrete GPU for mobile. (Wait, that is misleading, but it is filed under the desktop section, so not really sure.)

So in two years they have lots of hardware out there, various tech demos, etc, and they can't even launch some games. Blows my mind.

I should point out that I have had issues with the i7-1165G7 I have at work. The screen goes black when certain website videos and/or ads try to run. Luckily I have an Nvidia GPU I can force it to use...
 
Last edited:
Jan 3, 2022
67
13
35
How is it going to handle ray tracing on custom engines vs. Unreal/Unity? Through an open API or some other method? Has Intel communicated that?
 

rluker5

Distinguished
Jun 23, 2014
624
376
19,260
Per TFLOP, the shaders in my UHD 770 don't seem slow compared to anything, not even RDNA2 (about 3,500 graphics points per TFLOP in Fire Strike, 1,000 graphics points per TFLOP in Time Spy), and it plays most games I've tried flawlessly, albeit slowly. But I've seen some oddities in some games, like SOTTR, that indicate they need some more driver work (in the built-in bench there were texture issues similar to what a GTX 580 has; those were fixed, but now a static ghost image appears behind everything about 1/3 of the way through the final scene).
If the shaders scale, the high end won't be like a 2070 but like a 3070.
Edit: I do have a 3080 in my system and disable it in Device Manager when I want to test my UHD 770, but maybe there are some driver conflicts present because the Nvidia driver is still installed. Not worth enough to me right now to uninstall it whenever I get the notion to fool around with the iGPU.
 

jacob249358

Commendable
Sep 8, 2021
636
215
1,290
RTX 2070 performance is fine for a flagship. The fact is most people don't need much more than that. If Intel can pump out a bunch of GPUs with decent drivers ranging from 1650-level to 2070-level performance, the market will be great.
 
  • Like
Reactions: bolweval

Eximo

Titan
Ambassador
RTX 2070 performance is fine for a flagship. The fact is most people don't need much more than that. If Intel can pump out a bunch of GPUs with decent drivers ranging from 1650-level to 2070-level performance, the market will be great.

Flagship at what price? If $500, not great in terms of performance per dollar. A Radeon 6600 XT can be had for just over $400 and is on par with an RTX 2070 / Super. Plus it is a known quantity.
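The value argument above reduces to a simple performance-per-dollar ratio. A minimal sketch in Python, with hypothetical prices and performance normalized so RTX 2070-class = 1.0 (the `perf_per_dollar` helper is illustrative, not from any benchmark suite):

```python
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    # Value metric: higher is better.
    return relative_perf / price_usd

# Hypothetical: both cards at RTX 2070-class performance (normalized to 1.0),
# at the prices mentioned in the thread.
arc_at_500 = perf_per_dollar(1.0, 500.0)      # 0.002 perf per dollar
rx6600xt_at_410 = perf_per_dollar(1.0, 410.0)  # ~0.00244 perf per dollar

# At equal performance, the cheaper card wins on value.
```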
 

rluker5

Distinguished
Jun 23, 2014
624
376
19,260
Flagship at what price? If $500, not great in terms of performance per dollar. A Radeon 6600 XT can be had for just over $400 and is on par with an RTX 2070 / Super. Plus it is a known quantity.
I just noticed the A770 is 12GB. That is the cut-down 384 EU version, not the 512 EU, 16GB flagship. Add 1/3 more performance and you get to the 3070.
Edit: just read that 12GB is supposed to be the 448 EU version. Hopefully higher clocks will make up the rest.
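The EU-count math above can be sketched as a naive linear-scaling estimate (a rough assumption that ignores clocks, memory bandwidth, and driver overhead; the baseline score of 100 is hypothetical):

```python
def scaled_perf(base_perf: float, base_eus: int, target_eus: int) -> float:
    # Naive assumption: performance scales linearly with EU count
    # (ignores clocks, memory bandwidth, and driver overhead).
    return base_perf * target_eus / base_eus

# Hypothetical baseline: a 384 EU part scoring 100 (arbitrary units)
full_die = scaled_perf(100.0, 384, 512)  # 512 EUs -> +1/3, i.e. ~133.3
mid_cut = scaled_perf(100.0, 384, 448)   # 448 EUs -> ~116.7
```

In practice, real GPUs scale sublinearly with shader count, so these figures are a best case.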
 

InvalidError

Titan
Moderator
The fact is most people don't need much more than that. If Intel can pump out a bunch of GPUs with decent drivers ranging from 1650-level to 2070-level performance, the market will be great.
The 1650 got bemoaned for being a pretty small improvement over the 1050 Ti and for lacking Turing NVENC. I'd bump my minimum expectation of what an entry-level gaming GPU should be in 2022 to at least the 1650 Super, which got a whole lot more praise.
 
Most of us neither need nor want a flagship. We want something reliable that will bring the overall cost per FPS down by a meaningful amount. Something similar to what AMD did with the RX470-580 - relatively solid mainstream performance for its time at a fairly budget-friendly price.

Driver-wise, Intel still has a lot of work left to do in that department if we go by reviews from people who got their hands on Samsung's Galaxy Book Pro 2. Those are definitely a concern.
Exactly.

Even if it was only 2060-level, as long as they have the driver support, people would be happy.
 
Jan 3, 2022
67
13
35
Based on an OpenCL benchmark, can one conclude what ray tracing performance is like? Nvidia has RT cores and OptiX, and currently has the lead in this area.
 
The results are a little disheartening. Hopefully, they still have some tweaking to do to bring performance up to somewhere around the RTX 3060, at least.
Or maybe this is just an old, early engineering sample. Having a solid third player in the GPU game will only help the consumer.

Based on an OpenCL benchmark, can one conclude what ray tracing performance is like? Nvidia has RT cores and OptiX, and currently has the lead in this area.
No. OpenCL is more about parallel processing, scientific number crunching, AI stuff, etc.
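For context, OpenCL benchmarks time data-parallel kernels along these lines: a SAXPY-style per-element operation, sketched here in plain Python where a loop stands in for the GPU launching one kernel instance per work-item. Nothing in such a workload exercises dedicated ray-tracing hardware, which is why the scores don't predict RT performance:

```python
def saxpy_kernel(gid, a, x, y, out):
    # One "work-item": each instance handles a single element index (gid).
    out[gid] = a * x[gid] + y[gid]

def run_kernel(n, a, x, y):
    # An OpenCL runtime would launch n work-items in parallel on the GPU;
    # here we just loop sequentially to show the semantics.
    out = [0.0] * n
    for gid in range(n):
        saxpy_kernel(gid, a, x, y, out)
    return out

result = run_kernel(4, 2.0, [1, 2, 3, 4], [10, 10, 10, 10])
# -> [12.0, 14.0, 16.0, 18.0]
```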
 
  • Like
Reactions: saltweaver
Jan 3, 2022
67
13
35
The results are a little disheartening. Hopefully, they still have some tweaking to do to bring performance up to somewhere around the RTX 3060, at least.
Or maybe this is just an old, early engineering sample. Having a solid third player in the GPU game will only help the consumer.
It's disheartening that the results are two generations behind. Intel should strive to be better than the current generation.
 

InvalidError

Titan
Moderator
It's disheartening that the results are two generations behind. Intel should strive to be better than the current generation.
It took AMD 3-4 generations of GPUs to catch up with Nvidia after several years of GPU coma while it poured everything into Ryzen in a bid to save itself from bankruptcy. I never expected Intel to be anywhere near competitive in its first somewhat serious attempt at re-entering the market after being mostly out for 20 years.

To me, DG1's limited launch was basically a private beta to gather early field data and sort out a fair selection of the worst bugs ahead of the next generation's launch, so it doesn't get completely destroyed for being junk. DG2 is an open beta to gather more data on something that is closer to production-ready, which the repeat delays tell us is taking longer than expected, and DG3 was always going to be the first serious product in the series.
 
Apr 16, 2022
1
0
10
I think I know what's going on here: there's less memory than there's supposed to be on the desktop version, so I think it's actually just the mobile version being tested on a desktop. If that's the case, these benchmark scores aren't that bad.
 
Jan 3, 2022
67
13
35
I think I know what's going on here: there's less memory than there's supposed to be on the desktop version, so I think it's actually just the mobile version being tested on a desktop. If that's the case, these benchmark scores aren't that bad.
They haven't officially published any desktop specs.
 

watzupken

Reputable
Mar 16, 2020
1,022
516
6,070
It took AMD 3-4 generations of GPUs to catch up with Nvidia after several years of GPU coma while it poured everything into Ryzen in a bid to save itself from bankruptcy. I never expected Intel to be anywhere near competitive in its first somewhat serious attempt at re-entering the market after being mostly out for 20 years.

To me, DG1's limited launch was basically a private beta to gather early field data and sort out a fair selection of the worst bugs ahead of the next generation's launch, so it doesn't get completely destroyed for being junk. DG2 is an open beta to gather more data on something that is closer to production-ready, which the repeat delays tell us is taking longer than expected, and DG3 was always going to be the first serious product in the series.
I kinda agree, but I do want to point out a huge difference between AMD and Intel, and that is the budget. Raja at AMD was probably on a shoestring budget because the company was at its lowest point then. And because Intel badly wants into the dedicated graphics space, I am sure they will not cheap out. Like you, I don't expect Intel's first "proper" dGPU to be competitive, at least not from a software perspective. The concern is the fact that they keep kicking the release can down the road. Each delay is a quarter gone. And if they are still somewhat competitive with Ampere and RDNA2 now, the trend is against them and they can expect to lose even that small opportunity.
 

KyaraM

Admirable
Correct me if I'm wrong, but didn't they announce a 780 as well? If anything, that would be the flagship... Also, it's just the first gen, with loads of software issues etc. at the start, I wager. For me, the interesting generation is the next one, when they have (hopefully) fixed all/most issues. That said, I never expected a 3090/6900 XT competitor on their first shot either way. It will depend on the price.

"presumed flagship SKU" and "Intel", in one sentence.

BTW, I believe in ghosts too.
Looking at the rest of your post, I fully believe you that you do.
 
Last edited:

InvalidError

Titan
Moderator
The concern is the fact that they keep kicking the release can down the road. Each delay is a quarter gone. And if they are still somewhat competitive with Ampere and RDNA2 now, the trend is against them and they can expect to lose even that small opportunity.
The worse the conditions get for Intel's launch, the better the chances we get a flood of cheap 1650- to 3060 Ti-class GPUs when Intel feels Arc is worth launching... unless the reason for Intel's repeat delays is show-stopping bugs that may cause Arc to get scrapped altogether.
 
  • Like
Reactions: KyaraM