News Intel Arc Alchemist: Release Date, Specs, Everything We Know

VforV

Respectable
Oct 9, 2019
280
125
1,870
1
The amount of speculation and (?) in the Specification table is funny... :D

If Arc's top GPU is gonna be 3070 Ti level (best case scenario) and it will cost $600 like the 3070 Ti, it's gonna be a big, BIG fail.

I don't care that their name is Intel; they need to prove themselves in GPUs, and to do that they need to either 1) beat the best GPUs of today, the 3090/6900 XT (which they won't), or 2) have a much better price/perf ratio than Nvidia and AMD, and also great software.

So option 2 is all that they have. 3070 Ti performance should not cost more than $450, better yet $400. And anything lower in performance should also be lower in price, accordingly.

Let's also not forget the Ampere and RDNA2 refreshes supposedly coming at almost the same time as Intel Arc, Q1 2022 (or sooner, maybe even Q4 2021). Yikes for Intel.
 

JayNor

Commendable
May 31, 2019
299
42
1,710
0
Intel Graphics is advertising a discussion today at 5:30 ET on YouTube: "Intel Arc Graphics Q&A with Martin Stroeve and Scott Wasson!"
 
Jul 14, 2021
7
9
15
0
If Arc's top GPU is gonna be 3070 Ti level (best case scenario) and it will cost $600 like the 3070 Ti, it's gonna be a big, BIG fail.
Why do people quote MSRPs? There is nowhere you can buy a 3070 Ti for $600.

If I could buy an Intel 3070 Ti equivalent for $600, I would do it in a heartbeat. Availability and "street price" are the only two things that matter. MSRPs for non-existent products are a waste of breath.
 
As I said before, what will make or break this card is not how good the silicon is in theory, but the driver support.

I do not believe the Intel team in charge of polishing the drivers has had enough time; hell, not even 2 years after release is enough time to get them ready!

I do wish for Intel to do well, since it'll mean more competition in the segment, but I have to say I'm also scared because it's Intel. Their strong-arming game is worse than Nvidia's. I'd love to see how Nvidia feels on the receiving end of it on their own turf.

Regards.
 
Reactions: btmedic04

waltc3

Commendable
Aug 4, 2019
207
101
1,760
0
ZZZZz-z-z-z-z-zzzzz....wake me when you have an actual product to review. Until then, we really won't "know" anything, will we?....;) Right now it's just vaporware. It's not only Intel doing it either--there is quite a bit of vaporware-ish writing about currently non-existent Nvidia and AMD products as well. Sure is a slow product year.... If this sounds uncharitable, sorry--I just don't get off on probables and maybes and could-bes...;)
 

InvalidError

Titan
Moderator
I do not believe the Intel team in charge of polishing the drivers has had enough time; hell, not even 2 years after release is enough time to get them ready!
I enabled my i5's IGP to offload trivial stuff and stretch my GTX 1050 until something decent appears for $200. Intel's Control Center appears to get fixated on the first GPU it finds, so I can't actually configure UHD Graphics with it. Not sure how such a bug/shortcoming in the drivers can still exist after two years of Xe IGPs in laptops that often also have discrete graphics.

Intel's drivers and related tools definitely need more work.
 
Reactions: eye4bear
I enabled my i5's IGP to offload trivial stuff and stretch my GTX 1050 until something decent appears for $200. Intel's Control Center appears to get fixated on the first GPU it finds, so I can't actually configure UHD Graphics with it. Not sure how such a bug/shortcoming in the drivers can still exist after two years of Xe IGPs in laptops that often also have discrete graphics.

Intel's drivers and related tools definitely need more work.
It baffles me how people who have an Intel iGPU can claim they've never actually suffered trying to use it for daily stuff and anything slightly more advanced than powering a single monitor (which, at times, it can't even do properly).

I can't even call their iGPU software "barebones", because even basic functionality is sketchy at times. And for everything they've been promising, I wonder how their priorities will turn out. I hope Intel realizes they won't be able to have the whole cake and will have to make a call on either the consumer side (game support and basic functionality) or the "pro"/advanced side of things they've been promising (encoding, AI, XeSS, etc.).

Yes, I'm being a negative Nancy, but that's fully justified. I'd love to be proven wrong, but I don't see that happening :p

Regards.
 
Reactions: eye4bear

btmedic04

Honorable
Mar 12, 2015
221
32
10,740
22
The amount of speculation and (?) in the Specification table is funny... :D

If Arc's top GPU is gonna be 3070 Ti level (best case scenario) and it will cost $600 like the 3070 Ti, it's gonna be a big, BIG fail.

I don't care that their name is Intel; they need to prove themselves in GPUs, and to do that they need to either 1) beat the best GPUs of today, the 3090/6900 XT (which they won't), or 2) have a much better price/perf ratio than Nvidia and AMD, and also great software.

So option 2 is all that they have. 3070 Ti performance should not cost more than $450, better yet $400. And anything lower in performance should also be lower in price, accordingly.

Let's also not forget the Ampere and RDNA2 refreshes supposedly coming at almost the same time as Intel Arc, Q1 2022 (or sooner, maybe even Q4 2021). Yikes for Intel.
I disagree with this. In this market, if anyone can supply a GPU with 3070 Ti performance at $600 and keep production ahead of demand, they are going to sell like hotcakes regardless of who has the fastest GPU this generation. Seeing how far off Nvidia and AMD are from meeting demand currently gives Intel a massive opportunity provided that they can meet or exceed demand.

As I said before, what will make or break this card is not how good the silicon is in theory, but the driver support.

I do not believe the Intel team in charge of polishing the drivers has had enough time; hell, not even 2 years after release is enough time to get them ready!

I do wish for Intel to do well, since it'll mean more competition in the segment, but I have to say I'm also scared because it's Intel. Their strong-arming game is worse than Nvidia's. I'd love to see how Nvidia feels on the receiving end of it on their own turf.

Regards.
This right here is my biggest concern: how good will the drivers be, and how quickly will updates come? Unlike AMD, Intel has the funding to throw at its driver team and developers, but money can't buy experience creating high-performance drivers. Only time can do that.

I remember the days of ATi, Nvidia, 3dfx, S3, and Matrox, to name a few. Those were exciting times, and I hope for all our sakes that Intel succeeds with Arc. We as consumers need a third competitor at the high end. This duopoly has gone on long enough.
 
Reactions: JarredWaltonGPU

ezst036

Reputable
Oct 5, 2018
136
46
4,610
0
In this market, if anyone can supply a GPU with 3070 Ti performance at $600 and keep production ahead of demand, they are going to sell like hotcakes......
In the beginning this may not be possible. AFAIK Intel will source this from TSMC, which simply means more in-fighting for the same floor space in the same fabs that produce video cards.

When Intel gets its own fabs ready to go and can add fab capacity for this specific use case that isn't available today, that's when production can aim to stay ahead of demand, and then the rest of what you said is probably right.

If TSMC is just reducing Nvidia's and AMD's allocations to make room for Intel's on the fab floor, the video card production equation isn't changing, unless TSMC shoves someone else aside in CPUs or some other production area. I suppose that's a thing.
 

VforV

Respectable
Oct 9, 2019
280
125
1,870
1
Why do people quote MSRPs? There is nowhere you can buy a 3070 Ti for $600.

If I could buy an Intel 3070 Ti equivalent for $600, I would do it in a heartbeat. Availability and "street price" are the only two things that matter. MSRPs for non-existent products are a waste of breath.
Yes, I know that, we all know that, but companies still insist on telling us their MSRPs; as you can see, none of them have given up on the practice, even though it has proven to be a fake one.

So for that aspect alone, we (those with more than 2 neurons) are "forced" by them to keep using the fake MSRP scaling while, in parallel, we use real street-price scaling between GPUs to keep them honest.

This is how we determine the degree of scalping and scumminess: comparing their so-called MSRP with the real price. It's not like we make the rules... I'm not a GPU manufacturer, are you? I don't like it either...

So far we only have estimates of the possible MSRPs for Arc GPUs, so imagine how fictional the real street prices are... can you tell me the street price of the top Arc GPU right now? I don't think so...

MSRPs for non-existent products are a waste of breath.
We agree on that one, but since this article insists, I "played" its game.
 
I enabled my i5's IGP to offload trivial stuff and stretch my GTX 1050 until something decent appears for $200. Intel's Control Center appears to get fixated on the first GPU it finds, so I can't actually configure UHD Graphics with it. Not sure how such a bug/shortcoming in the drivers can still exist after two years of Xe IGPs in laptops that often also have discrete graphics.

Intel's drivers and related tools definitely need more work.
What do you mean, "get fixated"? I have the iGPU running alongside a 1050 Ti, and it shows both displays, one with an Intel logo, and if you choose the other one it tells you "some features are not supported on non-Intel adapters".
 

InvalidError

Titan
Moderator
What do you mean, "get fixated"? I have the iGPU running alongside a 1050 Ti, and it shows both displays, one with an Intel logo, and if you choose the other one it tells you "some features are not supported on non-Intel adapters".
When I go into ICC's GPU tab, it shows my GTX 1050 as 'Unknown' and no 11400/UHD 730 IGP. The IGP does show up in Device Manager with drivers loaded, and I can activate display outputs when I plug monitors into the motherboard, so the IGP and drivers are definitely loaded and working; ICC just isn't picking it up.

Another issue with Intel's drivers is that 9805 (still the newest at time of writing) fails to register as a newer version than 9078 in Windows Update, so every time WU does its thing, Windows overwrites the manually installed newest drivers with antique drivers that precede Rocket Lake's launch by four months. Either Intel isn't bothering to update Windows Update, or something is screwed up there.

Edit: WU just overwrote my updated drivers with ancient drivers again, and Intel's update tool failed to update the drivers even after two attempts at letting it auto-install after a reboot. I turned off automatic driver updates in GPEdit and force-updated the drivers by manually picking the INF file from Device Manager. My Intel IGP driver experience so far is hot garbage.
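For anyone wanting to replicate the workaround above: a sketch, assuming a standard Windows 10/11 install. The GPEdit policy being toggled ("Do not include drivers with Windows Updates", under Computer Configuration > Administrative Templates > Windows Components > Windows Update) maps to this registry value:

```reg
Windows Registry Editor Version 5.00

; Registry equivalent of the GPEdit policy
; "Do not include drivers with Windows Updates"
; Stops WU from replacing manually installed GPU drivers.
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\WindowsUpdate]
"ExcludeWUDriversInQualityUpdate"=dword:00000001
```

The manual INF install from Device Manager can also be done from an elevated prompt with `pnputil /add-driver <path-to-driver.inf> /install` (the actual INF path depends on where the Intel driver package was extracted).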
 
Last edited:
Reactions: eye4bear

InvalidError

Titan
Moderator
ZZZZz-z-z-z-z-zzzzz....wake me when you have an actual product to review.
Reviews don't mean much until you can actually buy them for a reasonable price. Wake me up when the presumed $200 model can actually be bought for $200 and at least matches the 1660S, a two-year-old GPU that used to cost $230.

Kind of sad when cheering for the same level of performance at about the same price as years-old models is the best you can realistically hope for in the current market, mainly because the manufacturer has a poor reputation with much of the target audience.
 
The amount of speculation and (?) in the Specification table is funny... :D

If Arc's top GPU is gonna be 3070 Ti level (best case scenario) and it will cost $600 like the 3070 Ti, it's gonna be a big, BIG fail.

I don't care that their name is Intel; they need to prove themselves in GPUs, and to do that they need to either 1) beat the best GPUs of today, the 3090/6900 XT (which they won't), or 2) have a much better price/perf ratio than Nvidia and AMD, and also great software.

So option 2 is all that they have. 3070 Ti performance should not cost more than $450, better yet $400. And anything lower in performance should also be lower in price, accordingly.

Let's also not forget the Ampere and RDNA2 refreshes supposedly coming at almost the same time as Intel Arc, Q1 2022 (or sooner, maybe even Q4 2021). Yikes for Intel.
Intel doesn't need to compete head to head with AMD or Nvidia. Rather than repeating the same mistake AMD has made before (trying to win market share through cheap prices with comparable or better performance), it is much better for Intel to focus on the OEM market first, where its strength is. Gain more software experience and build the brand image slowly from there. They have lots of money, so they should leverage that to strengthen their devrel and sponsor lots of games.
 
Reactions: VforV

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
941
816
1,770
1
The amount of speculation and (?) in the Specification table is funny... :D
Mostly it's to indicate that Intel has not officially confirmed anything. I fully expect clocks to be in the 2.0-2.3 GHz range, possibly a bit higher even. TMUs and ROPs are probably correct. Vector Engine and Matrix Engine are for sure correct in terms of the maximum number on the die, but harvesting could yield various alternatives. Price and power are pure estimates, though, and could be completely off — in either a good or bad way.

Without being able to test the hardware yet, we can't say how much the cards are worth or if the drivers work okay or anything else. In fact, we can't know what the cards should be priced at until we reach the launch date and see if anything has changed in terms of AMD and Nvidia supply and pricing. But if Intel can basically match an RTX 3070 Ti in rasterization performance, best case I think it would put the MSRP at around $500. Worst case, it would go with the same price as Nvidia. If the cards are actually available at MSRP, though, they'll probably still sell just fine.
 
Reactions: VforV

VforV

Respectable
Oct 9, 2019
280
125
1,870
1
Intel doesn't need to compete head to head with AMD or Nvidia. Rather than repeating the same mistake AMD has made before (trying to win market share through cheap prices with comparable or better performance), it is much better for Intel to focus on the OEM market first, where its strength is. Gain more software experience and build the brand image slowly from there. They have lots of money, so they should leverage that to strengthen their devrel and sponsor lots of games.
Sure, that's one strategy, maybe even the better one for them, but if they do that, A LOT of gamers will be very disappointed.

Those who are still holding off on buying a gaming GPU at scalper prices are hoping for Intel to save them, or to make a splash big enough to disrupt the market and force Nvidia and AMD to drop their prices, or at least for prices to fall to the unobtainium-MSRP level.
 

InvalidError

Titan
Moderator
Can Intel's Arc Alchemist compete with AMD and Nvidia GPUs? Does it really matter at this point?
It doesn't matter much: if it can mine, it will get snatched up by miners, and the street price will likely rise proportionally with mining performance. If that doesn't happen, street prices will still get driven up by people who are desperate to have any dGPU in their PC and cannot score anything else.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
941
816
1,770
1
I think Intel has bad driver support... have you noticed their iGPU drivers? They abandon iGPU driver support when a new gen comes.
Intel has widespread driver support for all the Gen9 and later GPUs. Granted, that doesn't mean everything runs fine, but the problem is the Gen9/9.5 stuff just wasn't that great. You can still get up-to-date drivers for all the UHD 630 GPUs, but you're not going to get improved performance from them. If you're running a pre-Skylake CPU with integrated graphics, I'm not sure what you expect from a seven-year-old (or more) integrated solution, especially since DX12 isn't supported on those older Intel solutions. Basically, don't judge Intel's future GPUs and architectures by the drivers for relatively ancient integrated solutions. Even the DG1 drivers aren't a good barometer, since the DG1 was basically a forerunner to Arc, designed to get the driver situation ironed out.

That's not to say I expect Intel's drivers to be as good as AMD or Nvidia drivers, but I suspect they'll be mostly sufficient. If the cards are as fast as a 3070 or 6700 XT, and the price is good, Arc Alchemist should be a decent addition. But I'm more interested in seeing what Intel does with Battlemage and Celestial in the coming years. Alchemist isn't going to be enough to compete with Lovelace and RDNA3 next year, except at the midrange and budget end of the spectrum.
 

InvalidError

Titan
Moderator
Intel has widespread driver support for all the Gen9 and later GPUs. Granted, that doesn't mean everything runs fine, but the problem is the Gen9/9.5 stuff just wasn't that great.
Intel's graphics drivers are usually good as far as the core functionality needed for typical office and embedded use is concerned, since there would be hell to pay in its massive corporate user base otherwise. Gaming is where things fall apart, since the performance wasn't, and still isn't, really there beyond trivial stuff.

The real concern here regarding gaming is that Intel is effectively a virgin, having to build its gaming driver compatibility and stability library from scratch.
 

JarredWaltonGPU

Senior GPU Editor
Editor
Feb 21, 2020
941
816
1,770
1
Intel's graphics drivers are usually good as far as the core functionality needed for typical office and embedded use is concerned, since there would be hell to pay in its massive corporate user base otherwise. Gaming is where things fall apart, since the performance wasn't, and still isn't, really there beyond trivial stuff.

The real concern here regarding gaming is that Intel is effectively a virgin, having to build its gaming driver compatibility and stability library from scratch.
I can say that more modern (Gen11 and Xe Graphics) iGPUs from Intel have been better in terms of gaming support. Having more raw performance definitely helps. When you have an incredibly slow iGPU (Gen9.5), many games basically fail because of timing issues related to running too slowly. I suspect a GPU like Arc with potentially 10X the performance will 'fix' a lot of issues simply by virtue of being so much faster. But we'll see how it goes in a few months!
 

abufrejoval

Prominent
Jun 19, 2020
22
9
515
0
Intel knows perfectly well it's not just about the chips, but the ecosystem.

And that ecosystem isn't just, or perhaps not even mostly, about gamers anymore.

In the current environment, a GPU that was completely unattractive to miners, available in volume and at MSRP levels, could immediately grab 100% of the gamer GPU purchasing market.

If I were Intel, I'd give that some serious thought, because Nvidia and AMD can't easily move to block it (wouldn't we love to see them try?).

But with Xe designed with a primary focus on compute, oneAPI and all the other stuff, those who'll require the least amount of work to adopt this hardware are exactly those miners.

And again, 100% of Intel's initial GPU output could easily land there, with short-term benefits to the shareholders.

That could seriously hurt its adoption in the gaming (& custom game console) ecosystem.

I can't bet on any of these scenarios with confidence... and that feels very odd.
 
