News Intel Arc A380 Reviewed, Specs of Full GPU Lineup Revealed

InvalidError

Titan
Moderator
If the A380 performs on par with the 1650S, then that makes it worse than the RX6500 whenever the latter isn't hobbled by its VRAM size, bandwidth and x4 PCIe. I wonder what this is going to translate to in street prices.
 
If the A380 performs on par with the 1650S, then that makes it worse than the RX6500 whenever the latter isn't hobbled by its VRAM size, bandwidth and x4 PCIe. I wonder what this is going to translate to in street prices.

It has to be <$200 to be viable. Intel is unproven when it comes to long-term support for its hardware ventures; Larrabee/Knights Corner, Itanium, and XPoint come to mind. (Yes, yes, I know Itanium had a long life all things considered, but Intel pretty much admitted defeat once it adopted AMD's x64 extensions. Any SERIOUS resource money stopped after x64 was added to x86.)
 

InvalidError

Titan
Moderator
It has to be <$200 to be viable.
The RX6500's festival of cut corners retails for ~$270 while the A380 has 50% more VRAM bandwidth, 50% more VRAM, full encode/decode acceleration and most likely at least 4.0x8 PCIe. I'd say until the RX6500's effective retail price drops, there is plenty of room for the A380 above $200 since it lacks at least four of the RX6500's worst shortcomings.

Given the choice between A380 and the RX6500 at $200, I'd probably pick the A380 mainly for the extra VRAM and not having to worry about 4.0x4 becoming a major bottleneck later. 50W lower TDP doesn't hurt either.
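
For reference, a quick sanity check on where those "50% more" figures come from, using the commonly reported memory configs (6 GB on a 96-bit bus for the A380, 4 GB on a 64-bit bus for the RX 6500 XT); the equal-data-rate assumption behind the bandwidth ratio is mine:

```python
# Rough check of the VRAM claims above. Capacities and bus widths are the
# commonly reported specs; assuming similar per-pin data rates (my
# assumption), the bandwidth ratio tracks the bus-width ratio.
a380 = {"vram_gb": 6, "bus_bits": 96}
rx6500 = {"vram_gb": 4, "bus_bits": 64}

print(f"VRAM capacity: {a380['vram_gb'] / rx6500['vram_gb']:.2f}x")   # 1.50x
print(f"Bus width:     {a380['bus_bits'] / rx6500['bus_bits']:.2f}x") # 1.50x -> ~50% more bandwidth at similar clocks
```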
 
The RX6500's festival of cut corners retails for ~$270 while the A380 has 50% more VRAM bandwidth, 50% more VRAM, full encode/decode acceleration and most likely at least 4.0x8 PCIe. I'd say until the RX6500's effective retail price drops, there is plenty of room for the A380 above $200 since it lacks at least four of the RX6500's worst shortcomings.

Given the choice between A380 and the RX6500 at $200, I'd probably pick the A380 mainly for the extra VRAM and not having to worry about 4.0x4 becoming a major bottleneck later. 50W lower TDP doesn't hurt either.
Well, I'm a cheap bastard; I thought $650 was too much for a 6800XT. And to convince me, Intel really needs to take a loss on the first gen. Their long-term support history is horrid. The 1660S had an MSRP of $229. Undercutting it isn't too much to ask for.

Introducing GeForce GTX 1660 and 1650 SUPER GPUs, and New Gaming Features For All GeForce Gamers | GeForce News | NVIDIA
 
It has to be <$200 to be viable. Intel is unproven when it comes to long-term support for its hardware ventures; Larrabee/Knights Corner, Itanium, and XPoint come to mind. (Yes, yes, I know Itanium had a long life all things considered, but Intel pretty much admitted defeat once it adopted AMD's x64 extensions. Any SERIOUS resource money stopped after x64 was added to x86.)
Xeon Phi and Itanium were niche products anyway. One could argue Intel was late to the party with Phi, since by that point NVIDIA had a ~3-year head start in the GPGPU market. And even though you say Intel gave up on Itanium the moment x64 was out, they still supported it until 2021. For a "dead on arrival" product line, that sure is a long support time.

I don't think Intel threw in the towel with XPoint, considering it still performs much better than flash memory in terms of IOPS, which is a much more useful spec than raw bandwidth. But if it's going to die, it'll be because Intel doesn't want to share the technology.

Intel still has an extensive NIC lineup. And if anything, Intel can "convince" system builders to use their GPUs anyway.
 

InvalidError

Titan
Moderator
Their long-term support history is horrid. The 1660S had an MSRP of $229. Undercutting it isn't too much to ask for.
It is a lot to ask for when GDDR6 currently costs about twice as much as it did back then, shipping costs 6-10X as much, most other input costs have gone up 10-20%, and then you have to add US import tariffs, which didn't exist back then, on top. If Nvidia wanted to re-launch the 1660S today, it would likely need to take a hit to its gross profit margin for those GPUs to hit MSRPs below $300.
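
To make that concrete, here is a purely illustrative re-launch calculation using the multipliers cited above; every dollar split is an assumed placeholder, not a real BOM:

```python
# Illustrative only: apply the cost changes cited above to a card that
# (hypothetically) cost $150 to build and ship in 2019. All splits assumed.
vram, shipping, everything_else = 30, 2, 118   # assumed 2019 cost basis: $150 total
cost_2019 = vram + shipping + everything_else

cost_now = vram * 2 + shipping * 8 + everything_else * 1.15  # "2x", mid of "6-10X", mid of "10-20%"
cost_now *= 1.25                                             # 25% import tariff if built in China

print(f"2019 build cost: ${cost_2019}")      # $150
print(f"Same build now:  ~${cost_now:.0f}")  # ~$265, before anyone's margin
```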
 
Xeon Phi and Itanium were niche products anyway. One could argue Intel was late to the party with Phi, since by that point NVIDIA had a ~3-year head start in the GPGPU market. And even though you say Intel gave up on Itanium the moment x64 was out, they still supported it until 2021. For a "dead on arrival" product line, that sure is a long support time.

I don't think Intel threw in the towel with XPoint, considering it still performs much better than flash memory in terms of IOPS, which is a much more useful spec than raw bandwidth. But if it's going to die, it'll be because Intel doesn't want to share the technology.

Intel still has an extensive NIC lineup. And if anything, Intel can "convince" system builders to use their GPUs anyway.

Itanium budgets disappeared after the x64 extensions took off. The only things keeping it afloat were the likes of Sun SPARC systems and HP, which had clients dependent upon it.
 
It is a lot to ask for when GDDR6 currently costs about twice as much as it did back then, shipping costs 6-10X as much, most other input costs have gone up 10-20%, and then you have to add US import tariffs, which didn't exist back then, on top. If Nvidia wanted to re-launch the 1660S today, it would likely need to take a hit to its gross profit margin for those GPUs to hit MSRPs below $300.

You're claiming prices went up over 50%. Sorry, I'm not buying that. That's like the AIBs claiming they had to raise prices because aluminum went through the roof. They use maybe $3.00 of aluminum on most heatsinks, and that's not an exaggeration. Even if you doubled that price, it would not justify $50 price hikes.
 

InvalidError

Titan
Moderator
You're claiming prices went up over 50%. Sorry, I'm not buying that. That's like the AIBs claiming they had to raise prices because aluminum went through the roof. They use maybe $3.00 of aluminum on most heatsinks, and that's not an exaggeration. Even if you doubled that price, it would not justify $50 price hikes.
The doubling of VRAM prices adds ~$30, VRM component shortages add ~$10, the sextupling of shipping costs adds ~$10, the 20-30% increase in raw material and machining costs for the HSF adds ~$5, and then you have to add the rest of the supply chain's increased sick days and other likely permanent post-pandemic costs, another 10-20%. Finally, you have the 25% import tax on GPUs made in China.

The import tax alone is a $60 price increase for a $220 GPU before accounting for any of the cost increases.
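
Summing those adders and cross-checking the tariff arithmetic (the $180 pre-increase import value is my placeholder, and the 15% is the mid-point of the 10-20% range cited):

```python
# Itemized per-card adders from the post above.
adders = {"VRAM doubling": 30, "VRM shortages": 10, "shipping": 10, "HSF materials": 5}
base = 180                                            # assumed pre-increase import value (placeholder)
import_value = (base + sum(adders.values())) * 1.15   # mid-point of the 10-20% supply-chain bump
tariff = import_value * 0.25                          # 25% import tariff on China-built cards

print(f"Adders alone: ${sum(adders.values())}")                 # $55
print(f"Import value after increases: ~${import_value:.0f}")    # ~$270
print(f"Tariff on top: ~${tariff:.0f}")                         # ~$68
print(f"Tariff on the $220 GPU cited: ${0.25 * 220:.0f}")       # $55, same ballpark as the ~$60 above
```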
 
The doubling of VRAM prices adds ~$30, VRM component shortages add ~$10, the sextupling of shipping costs adds ~$10, the 20-30% increase in raw material and machining costs for the HSF adds ~$5, and then you have to add the rest of the supply chain's increased sick days and other likely permanent post-pandemic costs, another 10-20%. Finally, you have the 25% import tax on GPUs made in China.

The import tax alone is a $60 price increase for a $220 GPU before accounting for any of the cost increases.

Well, the price of GDDR6 varies, but for 8 gigs you aren't looking at more than ~$7.50/chip x 8 last I checked (1GB/16), and that's not bulk pricing. Three years ago pricing was outrageous, but it's coming closer to DDR5 now. This is a low-power device, so assuming 5 VRM phases (1 for memory, 1 for the GPU fed from the PCIe slot, 3 for the GPU fed from the PCIe cables), your $10 add-on is high. That many phases for such a low-end card is... insane.
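
Spelling out the two figures in that paragraph (both use the spot numbers given above, not bulk/contract pricing):

```python
# The figures above, just multiplied out.
vram_cost = 7.50 * 8        # ~$7.50 per GDDR6 chip x 8 chips for 8 GB -> ~$60
per_phase = 10 / 5          # the ~$10 VRM adder spread over 5 assumed phases -> $2/phase
print(f"8 GB of GDDR6 at spot: ~${vram_cost:.0f}")
print(f"VRM cost adder per phase: ${per_phase:.2f}")
```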

Die area is ~1/3 at 8nm, and that will generate some savings. (Yes, I realize 8nm wafers are more expensive, but there are still savings there.)
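
As a rough illustration of the die-size point (the standard dies-per-wafer approximation, with placeholder die areas rather than the actual chips' sizes):

```python
import math

# Standard dies-per-wafer approximation (ignores yield and scribe lines).
def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    r = wafer_diameter_mm / 2
    return math.pi * r**2 / die_area_mm2 - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)

big, small = 300.0, 100.0   # placeholder areas: old die vs. one ~1/3 the size
print(f"{dies_per_wafer(big):.0f} vs {dies_per_wafer(small):.0f} dies/wafer")  # ~197 vs ~640
# More than 3x the dies per wafer, so per-die silicon cost can drop even if the newer wafer costs more.
```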

GPUs partially avoid the import tax by doing what my old company did: import a semi-finished PCB and do the last of the assembly in the USA. Anything assembled in the USA avoids a number of import taxes; only finished products get taxed.
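
Purely illustrative arithmetic for that claim; whether a given board actually qualifies depends on customs classification and origin rules, which this sketch does not try to model, and both declared values are assumptions:

```python
# Illustrative only: the duty base shrinks if the tariff is assessed on a
# semi-finished board instead of the finished card (values assumed).
tariff_rate = 0.25
finished_card_value = 220     # assumed declared value importing the finished card
semi_finished_value = 120     # assumed declared value of the partially built PCB

print(f"Duty on finished card:     ${finished_card_value * tariff_rate:.0f}")   # $55
print(f"Duty on semi-finished PCB: ${semi_finished_value * tariff_rate:.0f}")   # $30
```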

But if you say so, I guess so. It's still insane that a lower-end GPU can cost 50% more than a higher-end predecessor. Quite frankly, it's unacceptable in my book. Decent $150 cards were plentiful at one time.