News Intel Arc Branding Announced: 'Alchemist' Discrete GPUs Land In Q1 2022


Deleted member 2731765

Guest
INTEL ARK (the product specifications database) is already a thing. Now it appears it has simply been turned into a proper brand: Intel® Product Specifications

BTW, some gameplay demo footage has also been shared by Intel. The quality of the video is horrible, though. Good to see METRO EXODUS running on DG2, along with CRYSIS Remastered and a few other PC titles. I don't have very HIGH hopes for INTEL's entry into the gaming GPU market, though.

They haven't even mentioned at what screen resolution and game settings this was captured.

https://twitter.com/IntelGraphics/status/1427262530594869258
 

InvalidError

Titan
Moderator
Playing catch-up with GPUs and CPUs seems to be Intel's fate these days.
Intel is fine as far as CPU designs are concerned; they just need the fabs to actually make them as intended instead of having to retrofit them for different processes just to pump product out the door while they wait for the process to catch up with the designs.

On the GPU side of things, numbers from Xe look decent for the parts we have benchmarks on so far, so the only major questions are whether it is going to scale well and how good drivers are going to be. Drivers were a major letdown for the i740 and not a particularly strong point on subsequent IGPs beyond essential functionality.
 

Giroro

Splendid
I definitely confused "Arc" with "Ark", plus I'm not sure what the word "Arc" has to do with GPUs.
Combined with Arc being a short and common word, I don't think this is very good branding. It calls attention to how generally terrible Intel has been at branding and marketing lately.
 

waltc3

Honorable
Intel is fine as far as CPU designs are concerned; they just need the fabs to actually make them as intended instead of having to retrofit them for different processes just to pump product out the door while they wait for the process to catch up with the designs.

On the GPU side of things, numbers from Xe look decent for the parts we have benchmarks on so far, so the only major questions are whether it is going to scale well and how good drivers are going to be. Drivers were a major letdown for the i740 and not a particularly strong point on subsequent IGPs beyond essential functionality.

Uh, if they are "fine", then please explain why firmware patching and literally dozens of Windows microcode security-hole patches (which have to be reinstalled each time Windows is reinstalled) are still required for these "fine" CPUs? Ryzen, otoh, not being a warmed-over architecture from ten years ago, has maybe 2, or at most 3, vulnerabilities and no Windows microcode patches that I know of. Sorry to disagree with you on that score. It's far worse than just the process node for Intel. I think your confidence is misplaced; it takes a lot more than money and people walking around in suits spending it like water to win. Just ask AMD, who have zoomed right past Intel as if Intel were sitting still, on practically peanuts compared with Intel's R&D expenditures. The net result: AMD won over Intel, and Apple dropped Intel CPUs entirely. Those don't sound like "fine" CPUs to me... ;)

Intel is strictly small potatoes on the GPU side of the house. The i7xx, of which I had no fewer than three (all of them returned to the place of purchase), was awful. Intel actually believed AGP texturing was great, and the i7xx bombed hard. Cards from nVidia and 3dfx with 16MB of onboard texture memory ate them alive in performance and image quality. Soon afterwards, Intel dropped completely out of the discrete GPU market, and its IGPs have been behind AMD's APUs for years. With the recent release of the 5600G/5700G APUs, the delta has grown as Intel drops further back.
 
Uh, if they are "fine", then please explain why firmware patching and literally dozens of Windows microcode security-hole patches (which have to be reinstalled each time Windows is reinstalled) are still required for these "fine" CPUs? Ryzen, otoh, not being a warmed-over architecture from ten years ago, has maybe 2, or at most 3, vulnerabilities and no Windows microcode patches that I know of. Sorry to disagree with you on that score. It's far worse than just the process node for Intel. I think your confidence is misplaced; it takes a lot more than money and people walking around in suits spending it like water to win. Just ask AMD, who have zoomed right past Intel as if Intel were sitting still, on practically peanuts compared with Intel's R&D expenditures. The net result: AMD won over Intel, and Apple dropped Intel CPUs entirely. Those don't sound like "fine" CPUs to me... ;)
So a CPU that has maybe 10% of the market share, and has been around for maybe a tenth of the time Core has, has 10% of the vulnerabilities.
What else is new?! How does that make one any better or worse than the other?!
This is the same thing they say about Linux, that it's so much "better" (safer)... because nobody cares enough to write attacks for it.
 

InvalidError

Titan
Moderator
Uh, if they are "fine", then please explain why firmware patching and literally dozens of Windows microcode security-hole patches (which have to be reinstalled each time Windows is reinstalled) are still required for these "fine" CPUs? Ryzen, otoh, not being a warmed-over architecture from ten years ago, has maybe 2, or at most 3, vulnerabilities and no Windows microcode patches that I know of.
Zen architectures have had about as many new vulnerabilities discovered in them as Intel's CPUs have over the same period of time, most of which are extremely similar to Intel's own, and many more affect even non-x86 architectures. In practically every case the exploits are purely hypothetical, and the patches are only necessary if you are running top-secret/mission-critical stuff where even hypothetical exploits are an unacceptable risk. In a real-world environment, any exploit that requires precise latency or other measurements would get screwed up by all of the background task-switching between a bunch of unrelated processes and mixed data.
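
As a rough illustration of what "precise latency measurements" means here (my own minimal sketch in C, not taken from any actual exploit, and assuming an x86-64 machine with GCC or Clang): these attacks ultimately come down to telling a cached load apart from an uncached one by its latency, and that is exactly the signal that background task-switching and unrelated memory traffic tend to drown out.

Code:
#include <stdio.h>
#include <stdint.h>
#include <x86intrin.h>

/* Time a single load of *p in TSC cycles. */
static uint64_t time_load(volatile uint8_t *p)
{
    unsigned aux;
    uint64_t start = __rdtscp(&aux);
    (void)*p;                          /* the load being measured */
    uint64_t end = __rdtscp(&aux);
    return end - start;
}

int main(void)
{
    static uint8_t probe[64];

    _mm_clflush((void *)probe);        /* evict the line: next load comes from DRAM */
    uint64_t cold = time_load(probe);
    uint64_t warm = time_load(probe);  /* line is now cached */

    printf("cold: %llu cycles, warm: %llu cycles\n",
           (unsigned long long)cold, (unsigned long long)warm);
    return 0;
}

On an otherwise quiet core the cold load typically takes a few hundred cycles while the warm one takes only a handful; on a desktop full of background processes bouncing cache lines around, that gap gets noisy fast, which is the point.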
 

husker

Distinguished
I definitely confused "Arc" with "Ark", plus I'm not sure what the word "Arc" has to do with GPUs.
Combined with Arc being a short and common word, I don't think this is very good branding. It calls attention to how generally terrible Intel has been at branding and marketing lately.

To go along with the "Alchemist" moniker, it might be short for "Arcane": understood by few; mysterious or secret.
 
Zen architectures have had about as many new vulnerabilities discovered in them as Intel's CPUs have over the same period of time, most of which are extremely similar to Intel's own, and many more affect even non-x86 architectures. In practically every case the exploits are purely hypothetical, and the patches are only necessary if you are running top-secret/mission-critical stuff where even hypothetical exploits are an unacceptable risk. In a real-world environment, any exploit that requires precise latency or other measurements would get screwed up by all of the background task-switching between a bunch of unrelated processes and mixed data.

Well, the exploits aren't "hypothetical"; they are real. But the circumstances in which some of the vulnerabilities can actually be exploited can at times run into the realm of the improbable, that's for sure.
 

anonymousdude

Distinguished
Uh, if they are "fine", then please explain why firmware patching and literally dozens of Windows microcode security-hole patches (which have to be reinstalled each time Windows is reinstalled) are still required for these "fine" CPUs? Ryzen, otoh, not being a warmed-over architecture from ten years ago, has maybe 2, or at most 3, vulnerabilities and no Windows microcode patches that I know of. Sorry to disagree with you on that score. It's far worse than just the process node for Intel. I think your confidence is misplaced; it takes a lot more than money and people walking around in suits spending it like water to win. Just ask AMD, who have zoomed right past Intel as if Intel were sitting still, on practically peanuts compared with Intel's R&D expenditures. The net result: AMD won over Intel, and Apple dropped Intel CPUs entirely. Those don't sound like "fine" CPUs to me... ;)

Intel is strictly small potatoes on the GPU side of the house. The i7xx, of which I had no fewer than three (all of them returned to the place of purchase), was awful. Intel actually believed AGP texturing was great, and the i7xx bombed hard. Cards from nVidia and 3dfx with 16MB of onboard texture memory ate them alive in performance and image quality. Soon afterwards, Intel dropped completely out of the discrete GPU market, and its IGPs have been behind AMD's APUs for years. With the recent release of the 5600G/5700G APUs, the delta has grown as Intel drops further back.

As much as I want AMD to succeed so we have a major second player in the market, it's ignorant to think that there won't be more vulnerabilities found. Any non-trivial piece of hardware or software is going to have some kind of vulnerability. It just takes time and effort to find them. How severe they end up being is something we can't predict.

If Intel had been on time with 10nm, or had at least released alongside TSMC's 7nm, do you honestly think AMD would have caught up and surged past Intel? Not even sure if surged is the right word here. I think at worst for Intel they would be at performance parity in that scenario. They're technically at performance parity right now if you exclude power usage and heat from the equation.

Apple was likely going towards custom chips regardless of whatever Intel did. I think that was very apparent the minute they decided to make their own ARM processors for their iPhones. The timing, though, would likely have been delayed.

Not gonna argue with you about Intel graphics being historically awful. I am gonna say that they headhunted a bunch of AMD engineers and the final results of that have yet to be seen.
 

spongiemaster

Honorable
Cards from nVidia and 3dfx with 16MB of onboard texture memory ate them alive in performance and image quality. Soon afterwards, Intel dropped completely out of the discrete GPU market, and its IGPs have been behind AMD's APUs for years. With the recent release of the 5600G/5700G APUs, the delta has grown as Intel drops further back.
Performance wasn't that good with the i740, but image quality was universally praised for being top notch. Saying Nvidia ate Intel alive in IQ is comically wrong. Nvidia was perfectly OK with sacrificing IQ for higher framerates, which made their IQ horrendous during this era and usually put it at the bottom of the barrel.

https://www.tomshardware.com/reviews/graphic-chips-review-april-98,64-11.html

"The quality is better than what Voodoo2 has to offer, particularly at a resolution of 1024x768. Anyway, the i740 is showing the way, the MGA-G200 will follow it ... the times when 3Dfx was the leader in 3D quality are definitely over, at least for now."
"Mister Ugly is again NVIDIA's RIVA chip. At least it can deliver pretty decent frame rates, but please what do you think of the tower walls, the corridors and the ladder ? Doesn't it look horrible compared to the i740 picture? "

https://www.anandtech.com/show/202/6

"Compared to the Voodoo2 there is very little room for improvement with the i740's image quality. As you can tell by the above Image Quality Comparison, the i740 wipes the floor with the competition in terms of quality, "
 

VforV

Respectable
BANNED
Q1 2022 is kinda late for Intel GPUs. The Ampere refresh is expected around the same time, and roughly 6 months after that RDNA3 will arrive.

Intel will be very much behind, unless their top Arc GPU reaches 3090/6900 XT performance levels... or they sell at half the MSRPs of Nvidia and AMD.
 

Dragonking_1

Commendable
Q1 2022 is kinda late for Intel GPUs. The Ampere refresh is expected around the same time, and roughly 6 months after that RDNA3 will arrive.

Intel will be very much behind, unless their top Arc GPU reaches 3090/6900 XT performance levels... or they sell at half the MSRPs of Nvidia and AMD.

Yes, but Intel is just starting out. It's probably focused on the entry- and mid-level performance range and cost at best. It is definitely not ready, or planning, to compete with the "top" cards from Nvidia/AMD, which only a handful of buyers on the market actually purchase. So they will target the range that looks for affordable gaming.
And yeah, they do need to offer something more even at the same price, I guess, since they have to prove their product: stability, consistency, etc.
 
Q1 2022 is kinda late for Intel GPUs. The Ampere refresh is expected around the same time, and roughly 6 months after that RDNA3 will arrive.
Well, if Nvidia's and AMD's next launches go the same way as the last ones, then Intel has nothing to fear: consumers will be able to choose between Nvidia and AMD cards at several times MSRP from scalpers, or an Intel GPU at a normal price.
Unless they turn out to be good for mining as well, in which case all three will only be available at a huge markup from scalpers.
Intel will be very much behind, unless their top Arc GPU reaches 3090/6900 XT performance levels... or they sell at half the MSRPs of Nvidia and AMD.
They can't sell at half "the MSRP", because then the FTC will be all up in their business again.
 

anticeon

Distinguished
Just be honest... Intel GPUs will be the best in mining hash rate...
No way a manufacturer wants to miss this GPU mining bandwagon...
Selling GPUs for gaming is hard because the profit is so small.
 

InvalidError

Titan
Moderator
They can't sell at half "the MSRP" because then the FTC will be all the way up their business again.
They can sell GPUs at whatever prices they want, as long as they don't just dump inventory, usually at a significant loss to themselves, for the purpose of running the competition out of business. As long as the net retail price is at least enough for Intel to break even, the FTC will stay out of it, as it is perfectly normal for a new entrant into a market to have much lower margins or even suffer losses while attempting to gain market traction.

I wouldn't buy a first-gen Intel GPU unless it offered drastically better performance per dollar, and I bet I am far from alone.
 

VforV

Respectable
BANNED
Well, if Nvidia's and AMD's next launches go the same way as the last ones, then Intel has nothing to fear: consumers will be able to choose between Nvidia and AMD cards at several times MSRP from scalpers, or an Intel GPU at a normal price.
Unless they turn out to be good for mining as well, in which case all three will only be available at a huge markup from scalpers.

They can't sell at half "the MSRP", because then the FTC will be all up in their business again.
  1. The rumor is that Arc will actually be good for mining (and other pro workloads), so...
  2. I said "at half the MSRP prices of Nvidia and AMD", which means that if Nvidia's and AMD's next-gen MSRP is $500 (low-to-mid tier), then Intel's Arc, being a generation behind at that point, should be $250.
 