News Intel Teases Arc Limited Edition Discrete Desktop GPU

The lack of a PCIe aux power connector doesn't mean this is a 75W-only board. After all, this is a marketing render, not a technical one. Especially when you consider the VRM looks too beefy for 75W, and unless Intel is aiming for the world's quietest air-cooled video card, a four-heat-pipe setup is way overkill for a 75W board.
 
More GPU press news, and still no actual device to test or buy.

How about we get the "standard" desktop GPUs on the market first, and then we can talk about what will probably be a tiny, niche collectible product.
Oh, Intel is going to flood the market with GPUs. That doesn't mean they will succeed as a GPU maker, or even that this first gen will succeed, but the GPUs will be available in large numbers.
 
Oh, Intel is going to flood the market with GPUs. That doesn't mean they will succeed as a GPU maker, or even that this first gen will succeed, but the GPUs will be available in large numbers.

Perhaps. I've learned not to believe or trust what hardware makers state until I actually see it.

As you wrote, if they actually can, sorta, "flood" the market, and their product is decent (performance/price compared to Nvidia and AMD), that will be a big relief after the last two years. Even if they can't go beyond RTX 3070 performance.
But we have yet to see it happen.
 

jp7189

Distinguished
Oh, Intel is going to flood the market with GPUs. That doesn't mean they will succeed as a GPU maker, or even that this first gen will succeed, but the GPUs will be available in large numbers.
Nope. As far as I understand, Intel is vying for the same limited supply of TSMC-produced chips that AMD and Nvidia are. They won't be able to bring their foundry might to bear on Arc production. I don't know what the supply agreement looks like, but my crystal ball says they won't outcompete on volume.

Also, with the timeline slipping, it looks like discrete Arc will compete with next-gen GPUs. If they had launched earlier, against the current gen and with insatiable demand, they might have had a chance.

Lastly, redirecting their GPU IP to bitcoin miners seems to me like desperate CYA from Koduri to save his job, and an admission that they can't make a profit in the mainstream GPU market.
 

Soaptrail

Distinguished
My first computer, about 22 years ago, had an Intel GPU. Either Intel is really only going for mobile, or this is going to be another short blip like it was 22 years ago. Limited Edition desktop, ha!
 

Eximo

Titan
Ambassador
I wish that GTX 770 I had to use for most of last year had more than 2GB of RAM. Don't be so narrow-minded and short-sighted; I was definitely hindered by the lack of RAM on that card.

No one was saying that 2GB isn't too low. Saying 16GB is the new minimum is a bit off, which is what that comment was directed at.

4GB is about the minimum for recent titles at high settings 1080p.

All those 12GB cards from Nvidia are there purely for market advantage. Bigger numbers sell. One might ask why a 3080Ti or 3080 12GB and a 3060 12GB would be anywhere close to needing the same amount of VRAM. (Some of that comes down to catering to miners)

AMD has done this in the past, having 8GB cards when 2-4GB was common. (That was about 9 years ago by the way) You don't see people clamoring for R9-280X and R9-390 today. (Though not actually a bad idea for you if you can find one on the cheap)
 
Nope. As far as I understand, Intel is vying for the same limited supply of TSMC-produced chips that AMD and Nvidia are. They won't be able to bring their foundry might to bear on Arc production. I don't know what the supply agreement looks like, but my crystal ball says they won't outcompete on volume.

Also, with the timeline slipping, it looks like discrete Arc will compete with next-gen GPUs. If they had launched earlier, against the current gen and with insatiable demand, they might have had a chance.

Lastly, redirecting their GPU IP to bitcoin miners seems to me like desperate CYA from Koduri to save his job, and an admission that they can't make a profit in the mainstream GPU market.
I don't know if or how much this has changed, but they are using TSMC together with in-house processes, and the external (TSMC) part seems to be smaller than the in-house part.

As far as the timeline goes, Nvidia and AMD are only going to make high-end GPUs, or at least that's what it looks like, so there will be a big part of the market that will jump on Xe if the cards are ok-ish and decently priced.
[Attached image: Intel-4_26.jpg]
 

jp7189

Distinguished
I don't know if or how much this has changed, but they are using TSMC together with in-house processes, and the external (TSMC) part seems to be smaller than the in-house part.

As far as the timeline goes, Nvidia and AMD are only going to make high-end GPUs, or at least that's what it looks like, so there will be a big part of the market that will jump on Xe if the cards are ok-ish and decently priced.
[Attached image: Intel-4_26.jpg]
Thanks for the slide, but even that indicates Arc is fully external... Arc = HPG, no? Which I take to mean the silicon is TSMC while the surrounding card might be made elsewhere; same as the competition.

You make a good point that Arc might have some time against current-gen mainstream (3050, 3060) while we wait for next gen to trickle down the tiers. In that case, I don't think Arc will approach the 60% margin that Intel has historically aimed for.
 

InvalidError

Titan
Moderator
8GB of RAM would be generous for a low-end card
When 6GB has become practically essential to achieve decent frame rates at anything above the lowest details, I wouldn't call 8GB generous. It's kind of necessary if decently fast entry-level SKUs are limited to x8 PCIe, as future titles start relying more heavily on asset streaming from NVMe and a system-memory cache, especially as those assets keep getting bigger and more often push past the 6GB active data set mark even at reduced details.

New low-end gaming GPUs with greater-than-1650S performance really need at least 6GB to achieve the highest details they can reasonably sustain, and 8GB to be reasonably future-proofed against asset bloat within their useful life, where you may get better overall appearance and performance by trading compute-intensive effects for increased raw geometry and texture detail.
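For some rough context on why the link width matters here, a quick back-of-the-envelope sketch; the 6GB working set and the theoretical per-lane rates are illustrative assumptions, and real-world throughput is lower:

```python
# Rough, theoretical PCIe rates (GB/s per lane); real throughput is lower.
PCIE_GBPS_PER_LANE = {"3.0": 0.985, "4.0": 1.969}

def stream_time_s(data_gb: float, gen: str, lanes: int) -> float:
    """Best-case time to move `data_gb` of assets across the link."""
    return data_gb / (PCIE_GBPS_PER_LANE[gen] * lanes)

# Hypothetical 6GB active asset set, as discussed above:
for gen, lanes in [("3.0", 8), ("4.0", 8), ("4.0", 16)]:
    print(f"PCIe {gen} x{lanes}: ~{stream_time_s(6, gen, lanes):.2f} s to move 6 GB")
```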
 

jp7189

Distinguished
No one was saying that 2GB isn't too low. Saying 16GB is the new minimum is a bit off, which is what that comment was directed at.

4GB is about the minimum for recent titles at high settings 1080p.

All those 12GB cards from Nvidia are there purely for market advantage. Bigger numbers sell. One might ask why a 3080Ti or 3080 12GB and a 3060 12GB would be anywhere close to needing the same amount of VRAM. (Some of that comes down to catering to miners)

AMD has done this in the past, having 8GB cards when 2-4GB was common. (That was about 9 years ago by the way) You don't see people clamoring for R9-280X and R9-390 today. (Though not actually a bad idea for you if you can find one on the cheap)
I like to play games with mod support, and felt very limited with a 3080 10GB. I upgraded to a 3090 and routinely use more than 16GB.

Now I know that's not the mainstream, but if developers knew they had more VRAM to work with, they'd surely use it to make better-looking games. Especially now that we have faster SSDs and streamlined APIs coming.
 
We are coming to a gaming and rendering era where 16GB should be the minimum.
Minimum for what tier? High end? I would agree we're approaching this, but it's still going to be a generation away before it becomes an absolute minimum. For anything lower? No.

Baking lightmaps and using 4K textures requires as much VRAM as you can get. Also, if VRAM is as inexpensive as DDR4 system RAM, then all the better to use more of an inexpensive resource. If you plan to game at 4K, you need a GPU with a solid amount of VRAM.
Higher-resolution textures and such also require a higher rendering resolution to actually be taken advantage of, so you'd need a relatively high-end GPU anyway to achieve 60 FPS at 4K, and most of those come with plenty of VRAM anyway.

Besides that, depending on how things are set up, you don't need that much VRAM for every texture you'll see in a given level. The assets will just be streamed in and out as needed.
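As a loose illustration of what "streamed in and out as needed" looks like, here's a toy sketch of a VRAM budget with least-recently-used eviction. Real engines are far more sophisticated (mip streaming, priorities, async copies), and all the names and sizes here are made up:

```python
from collections import OrderedDict

class TextureStreamer:
    """Toy model: keep only recently used textures resident in VRAM,
    evicting the least recently used ones when the budget is exceeded."""

    def __init__(self, vram_budget_mb: int):
        self.budget = vram_budget_mb
        self.resident = OrderedDict()  # texture name -> size in MB

    def request(self, name: str, size_mb: int) -> None:
        if name in self.resident:          # already resident: mark as recently used
            self.resident.move_to_end(name)
            return
        # Evict least recently used textures until the new one fits.
        while self.resident and sum(self.resident.values()) + size_mb > self.budget:
            evicted, _ = self.resident.popitem(last=False)
            print(f"evict {evicted}")
        print(f"upload {name} ({size_mb} MB) over PCIe")
        self.resident[name] = size_mb

# Pretend 4GB card walking through a level with oversized 4K texture sets.
streamer = TextureStreamer(vram_budget_mb=4096)
for tex in [("rock_4k", 1300), ("grass_4k", 1300), ("wall_4k", 1300), ("sky_4k", 1300)]:
    streamer.request(*tex)
```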

I wish that GTX 770 I had to use for most of last year had more than 2GB of RAM. Don't be so narrow-minded and short-sighted; I was definitely hindered by the lack of RAM on that card.
And you were using a card that was almost 8 years old at the time. This is like going back to 2013 and complaining about how your GeForce 7800 or Radeon X1900 sucks in modern titles (if they could run them at all).
 

evdjj3j

Honorable
No one was saying that 2GB isn't too low. Saying 16GB is the new minimum is a bit off, which is what that comment was directed at.

4GB is about the minimum for recent titles at high settings 1080p.

All those 12GB cards from Nvidia are there purely for market advantage. Bigger numbers sell. One might ask why a 3080Ti or 3080 12GB and a 3060 12GB would be anywhere close to needing the same amount of VRAM. (Some of that comes down to catering to miners)

AMD has done this in the past, having 8GB cards when 2-4GB was common. (That was about 9 years ago by the way) You don't see people clamoring for R9-280X and R9-390 today. (Though not actually a bad idea for you if you can find one on the cheap)

It's called future-proofing, and I gave you a real-life example of when having RAM in excess of what is CURRENTLY needed would have been greatly appreciated.
 

spongiemaster

Admirable
It's called future-proofing, and I gave you a real-life example of when having RAM in excess of what is CURRENTLY needed would have been greatly appreciated.
There's no such thing as future-proofing. Eight years from now, I would expect most games to be primarily ray traced. Nothing sold today, regardless of the amount of RAM onboard, is going to be usable for AAA games.
 
Thanks for the slide, but even that indicates Arc is fully external... Arc = HPG, no? Which I take to mean the silicon is TSMC while the surrounding card might be made elsewhere; same as the competition.

You make a good point that Arc might have some time against current-gen mainstream (3050, 3060) while we wait for next gen to trickle down the tiers. In that case, I don't think Arc will approach the 60% margin that Intel has historically aimed for.
Arc is the whole lineup. Sure, HPG is the gaming line, so that's what most people here care about, but still, making most of everything else on 10nm opens up much more volume than if Intel had to make their iGPUs, server parts, and everything else on TSMC as well.
 
It's called future-proofing, and I gave you a real-life example of when having RAM in excess of what is CURRENTLY needed would have been greatly appreciated.
Even if all that was needed to "future-proof" a video card was to add more VRAM, there's a problem: video card manufacturers are at the mercy of what's available, capacity-wise, for VRAM chips.

Take, for instance, that GTX 770. It was using 256MB chips. There was evidence to suggest 512MB chips were available, but given that NVIDIA only put them on the Titan, they were likely more expensive. This is also supported by the fact that AMD had no video cards at the time using 512MB chips. So either the price of the card would've had to be jacked up to use 512MB chips, the number of memory controllers would've had to double, or segmented memory would've had to be used. While the first option would be the cheapest, I'm going to take a WAG and say it would've added at least $100 to the price of the card. The second option would've been too expensive, among other issues that go along with having more lines. The third option would give you the problem the GTX 970 had.
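To put rough numbers on that (standard GDDR5 layout assumed, one 32-bit chip per bus slice; this is an illustration, not a teardown of the actual board):

```python
# Each GDDR5 chip occupies a 32-bit slice of the memory bus.
bus_width_bits = 256                       # GTX 770 memory bus
chips = bus_width_bits // 32               # -> 8 chips

for chip_mb in (256, 512):                 # 2Gb vs 4Gb density chips
    print(f"{chips} x {chip_mb}MB chips on a {bus_width_bits}-bit bus = "
          f"{chips * chip_mb // 1024} GB total")
# More capacity without denser chips means either twice the chips (clamshell),
# a wider bus (more memory controllers), or a segmented arrangement.
```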

And even then, what does that buy you? The performance levels of the GTX 770 now are worse than a GTX 1650.
 

InvalidError

Titan
Moderator
What are you talking about? There are many 4/6GB cards that do fine at 1080p.
Depends on your definition of fine. The RX 6500 on 3.0 x4 vs 4.0 x4 shows that there is a pretty good number of recent games where 4GB just doesn't cut it unless you have somewhat decent access to system memory for asset streaming. It will only get much worse for 4GB cards from here.
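Some rough per-frame arithmetic behind that 3.0 x4 vs 4.0 x4 gap (theoretical link rates only, just to show the scale):

```python
# How much data can cross a x4 link during one 60 FPS frame (~16.7 ms)?
per_lane_gbps = {"3.0": 0.985, "4.0": 1.969}   # theoretical GB/s per lane
frame_time_s = 1 / 60

for gen in ("3.0", "4.0"):
    link_gbps = per_lane_gbps[gen] * 4          # the RX 6500 runs at x4
    print(f"PCIe {gen} x4: ~{link_gbps * 1024 * frame_time_s:.0f} MB per frame")
# When a 4GB card spills assets to system RAM, dropping from 4.0 x4 to 3.0 x4
# roughly halves how much of that spilled data can be pulled back each frame.
```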

4GB for new gaming cards is NFG beyond lightweight games.