Intel's Arc A770 desktop graphics card has emerged in a new benchmark.
Intel Arc A770 GPU Shows RTX 2070-Like OpenCL Performance : Read more
> question isn't "is it enough for a flagship" its how will support be (i.e. drivers)

Very good question. I'd hope that Intel would have the driver side of things down to basically muscle memory by now. Whether they can match the cadence of Nvidia pushing out updates is another question.
> question isn't "is it enough for a flagship" its how will support be (i.e. drivers)

Most of us neither need nor want a flagship. We want something reliable that brings the overall cost per FPS down by a meaningful amount, something similar to what AMD did with the RX 470-580: relatively solid mainstream performance for its time at a fairly budget-friendly price.
RTX 2070 performance is fine for a flagship. The fact is, most people don't need much more than that. If Intel can pump out a bunch of GPUs with decent drivers ranging from 1650-level to 2070-level performance, the market will be great.
> Flagship at what price? If $500, not great in terms of performance per dollar. Radeon 6600 XT can be had for just over $400 and is on par with an RTX 2070 / Super. Plus it is a known quantity.

I just noticed this A770 is the 12GB model, i.e. the cut-down 384 EU version, not the 512 EU, 16GB flagship. Add roughly 1/3 more performance and you get to the RTX 3070.
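The "add roughly 1/3" figure is just execution-unit arithmetic. A minimal sketch, using an assumed benchmark score as a placeholder (the real Geekbench number isn't reproduced here) and optimistically assuming linear scaling with EU count:

```python
# Back-of-envelope scaling from the cut-down 384 EU part to the full
# 512 EU die. The score below is an assumed placeholder, and real GPUs
# rarely scale perfectly linearly with execution units.
tested_score = 80_000                      # hypothetical OpenCL score
eu_tested, eu_full = 384, 512
projected = tested_score * eu_full / eu_tested
print(round(projected))                    # 512/384 = 4/3, i.e. ~33% higher
```

With 512/384 = 4/3, any linear projection lands about a third above the tested part, which is how a 2070-class score ends up in 3070 territory.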
> The fact is most people don't need much more than that. If intel can pump out a bunch of gpus with decent drivers ranging from 1650 level to 2070 level performance the market will be great.

The 1650 got bemoaned for being a pretty small improvement over the 1050 Ti and for lacking Turing NVEnc. I'd bump my minimum expectation of what an entry-level gaming GPU should be in 2022 to at least the 1650 Super, which got a whole lot more praise.
> Most of us neither need nor want a flagship. We want something reliable that will bring the overall cost per FPS down by a meaningful amount. Something similar to what AMD did with the RX470-580 - relatively solid mainstream performance for its time at a fairly budget-friendly price.

Exactly.
Driver-wise, Intel still has a lot of work left to do if we go by reviews from people who got their hands on Samsung's Galaxy Book Pro 2. The drivers are definitely a concern.
> exactly.

Only if it were available at a fair price. Based on historic performance-per-dollar growth, RTX 2060-like performance shouldn't cost over $200 today, despite inflation.
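The cost-per-FPS framing from earlier in the thread can be sketched with simple division. All prices and frame rates below are illustrative assumptions, not measurements:

```python
# Hypothetical cost-per-FPS comparison. Both the prices and the average
# FPS figures are made-up illustrations of the argument, not benchmarks.
cards = {
    "RTX 2060 at launch": (349, 100),   # (price in USD, assumed avg FPS)
    "hoped-for 2022 card": (200, 100),  # same performance, lower price
}
for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f}/FPS")
```

Under these assumed numbers, the same performance at $200 drops cost per frame from $3.49 to $2.00, which is the kind of "meaningful" improvement the thread is asking for.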
Even if it were only RTX 2060 level, as long as they have the driver support, people would be happy.
> Based on OpenCL benchmark can one conclude what is raytracing performance like? Nvidia has RT cores, Optix and has the lead in this area currently.

No. OpenCL is more about parallel processing: scientific number crunching, AI workloads, etc.
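To illustrate the distinction: OpenCL benchmarks measure throughput on data-parallel work, where every "work-item" runs the same small kernel on one element of a buffer, not ray traversal on dedicated RT hardware. A conceptual sketch in plain Python (real OpenCL would dispatch this kernel across GPU compute units rather than loop on the CPU):

```python
# Conceptual model of OpenCL-style data parallelism: the same small
# kernel is applied independently to every element of an input buffer.
def kernel(x):
    return x * x + 1.0        # stand-in for per-element number crunching

buffer_in = [float(i) for i in range(8)]     # "global work size" of 8
buffer_out = [kernel(x) for x in buffer_in]
print(buffer_out[:3])         # [1.0, 2.0, 5.0]
```

Because every element is independent, this kind of work scales with raw compute units; ray tracing performance depends heavily on separate traversal hardware, which is why one score says little about the other.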
> The results are a little disheartening. Hopefully, they still have some tweaking to do to bring performance up somewhere around the RTX 3060, at least.

It's disheartening that the results are two generations behind. Intel should strive to be better than the current generation.
Or maybe this is just an old, early engineering sample. Having a solid third player in the GPU game will only help the consumer.
> It's disheartening results are 2 generation behind. Intel should strive being better than this generation.

It took AMD 3-4 generations of GPUs to catch up with Nvidia after several years of GPU coma while it poured everything into Ryzen in a bid to save itself from bankruptcy. I never expected Intel to be anywhere near competitive in its first somewhat serious attempt at re-entering the market after being mostly out for 20 years.
> I think I know what's going on here: there's less memory than there's supposed to be on the desktop version, so I think it's actually just the mobile version being tested on a desktop. If that's the case, these benchmark scores aren't that bad.

They haven't officially published any desktop specs.
> It took AMD 3-4 generations of GPUs to catch up with Nvidia after several years of GPU coma while it poured everything into Ryzen in a bid to save itself from bankruptcy. I never expected Intel to be anywhere near competitive in its first somewhat serious attempt at re-entering the market after being mostly out for 20 years.

I kinda agree, but I do want to point out a huge difference between AMD and Intel: the budget. Raja at AMD was probably on a shoestring budget because the company was at its lowest point then, and because Intel badly wants into the dedicated graphics space, I am sure they will not cheap out. Like you, I don't expect Intel's first "proper" dGPU to be competitive, at least not from a software perspective. The concern is that they keep kicking the release can down the road. Each delay is a quarter gone, and if they are still somewhat competitive with Ampere and RDNA2 now, they are trending red and expected to lose even that small opportunity.
To me, DG1's limited launch was basically a private beta to gather early field data and sort out a fair selection of the worst bugs ahead of the next generation's launch, so it wouldn't get completely destroyed for being junk. DG2 is an open beta to gather more data on something closer to production-ready, which the repeat delays tell us is taking longer than expected. DG3 was always going to be the first serious product in the series.
> "presumed flagship SKU" and "Intel", in one sentence.

Looking at the rest of your post, I fully believe that you do.

BTW, I believe in ghosts too.
> The concern is the fact that they keep kicking the release can down the road. Each delay is a quarter gone. And if they are still somewhat competitive with Ampere and RDNA2 now, they are trending red and expected to lose even that small opportunity.

The worse the conditions get for Intel's launch, the better the chances we get a flood of cheap 1650-to-3060 Ti class GPUs when Intel feels ARC is worth launching... unless the reason for Intel's repeat delays is show-stopping bugs that may cause ARC to get scrapped altogether.