News Intel's Xe-HPG DG2 Graphics Card Pictured: Big, Green and Power Hungry

InvalidError

Titan
Moderator
It's a safe bet that the board size will go down considerably for the production model once all of the extra debug hardware is eliminated, like the wafer connector in the middle of the area between the 12V inputs (which are missing standard connectors) and the VRM.
 
Provided this card is a good ETH miner, and I see no reason for it not to be, it is dead in the water on launch day, at least for gamers. GPU news these days tends to be ever more depressing.
 

mfilipow

Honorable
May 18, 2015
2
0
10,510
The article is clearly wrong as far as the power connector layout is concerned. There are options for both rear- and top-mounted connectors, up to 2x 8-pin, and it is the top position that is populated, with 1x 8-pin plus 1x 6-pin!
 

thGe17

Reputable
Sep 2, 2019
70
23
4,535
Additionally, the article ignores the statement that the 512-EU design now seems to target < 235 W, so the "power hungry" label seems incorrect.
And again, this looks more like a dev/prototype board/PCB, so there may be a lot of things present that will not be needed on the final design.
Still, it seems we have to wait another six months before we know for sure. ;-)
 

zodiacfml

Distinguished
Oct 2, 2008
1,228
26
19,310
They should quietly sell this as a mining card as soon as possible, with basic drivers just enough for mining and older-gen games. They would do the industry a huge favor while recouping their R&D early, while the drivers take their time being polished.
 

InvalidError

Titan
Moderator
They should quietly sell this as a mining card as soon as possible, with basic drivers just enough for mining and older-gen games. They would do the industry a huge favor while recouping their R&D early, while the drivers take their time being polished.
Since DG2 will be made on TSMC's 7nm, that won't necessarily help much until more of TSMC's clients move to 5nm and free up 7nm wafers.
 

VforV

Respectable
BANNED
Oct 9, 2019
578
287
2,270
Additionally, the article ignores the statement that the 512-EU design now seems to target < 235 W, so the "power hungry" label seems incorrect.
And again, this looks more like a dev/prototype board/PCB, so there may be a lot of things present that will not be needed on the final design.
Still, it seems we have to wait another six months before we know for sure. ;-)
The author did not even watch the entire video this leak is based on, or he would not have said so many untrue things about this leak.
This is at least the second article where I've seen Anton Shilov perform like an amateur at best (and I won't say the worst)...
I have no idea how one can be so terrible, yet still be employed to write these things.
 

InvalidError

Titan
Moderator
With the current state of supply, anything running at or near Ampere cards at a comparable price point will sell if it is available. Intel has a big advantage here, as they own their fabs.
Nuh-huh, DG2 is on TSMC's 7nm, so availability is going to be just as crap as anything else on TSMC's overbooked 7nm. (At least that's what the rumors have been saying so far.)
 

watzupken

Reputable
Mar 16, 2020
1,145
636
6,070
The burning question in my mind is when the card will be available. Given that this card is expected to compete with the high-end GPUs from AMD and Nvidia this generation, they will be a year behind Nvidia and AMD if they take another five months to release it. That leaves them with around another year before it becomes obsolete with the release of the next-gen AMD and Nvidia GPUs expected in late 2022. That's worse if it's only expected to be widely available in Q1 2022. Even if this product is a viable competitor to Ampere and RDNA2, I feel it's too late and unlikely to sell well.
 

watzupken

Reputable
Mar 16, 2020
1,145
636
6,070
The way things are going, at the price quoted they could be about as fast as a 3060 and it would be a success.
I actually don't have high hopes for it. My best guess is in the ballpark of an RTX 3070. The more Raja beats around the bush like he did with the lackluster Vega release (lackluster compared to Pascal), the more skeptical I am. Also, the product is terribly late, and I don't think it's going to take off in the face of the next-gen AMD and Nvidia GPUs expected in 2022.
 

InvalidError

Titan
Moderator
Exclusively?! Or could they maybe also use Samsung and whatnot?
I mean, I know that they did a deal with TSMC, but does that exclude other deals/sources?
It does not exclude them, but including them would require a process-specific re-spin of the design, which would double the tape-out and qualification work and add tens of millions of dollars in costs. Not worth doing unless you make millions of the thing. Will Intel make and manage to sell millions of DG2s? Maybe if it launched today. By the end of the year, though, most people who wanted a new GPU this generation will likely have managed to get their hands on something from the usual suspects, and most of the rest will be waiting for next-gen.
 

jasonf2

Distinguished
Nuh-huh, DG2 is on TSMC's 7nm, so availability is going to be just as crap as anything else on TSMC's overbooked 7nm. (At least that's what the rumors have been saying so far.)
I had not heard this tidbit. It probably doesn't matter much at this point, with Intel at max capacity too. I would expect in the mid to longer term, as Intel gets its process node mess worked out, that their GPU production at least could be brought back in-house if it is successful in a future generation. If TSMC is the fab, then all this will do is further compound their production issues. So perhaps not. This could simply be a good way to push AMD CPU availability further back by putting more pressure on TSMC production. Intel needs some time here to get back on top of the performance crown, and that will require a competitive process node in production. I knew that Intel had some TSMC production purchased, and I guess it makes as much sense as anything to fill it with something that doesn't directly compete with anything in Intel's current fabs.
 

InvalidError

Titan
Moderator
This could simply be a good way to push AMD CPU availability further back by putting more pressure on TSMC production.
The only pressure Intel can put on TSMC against AMD is on bids for spare capacity in excess of whatever extra capacity options may already be in AMD's and other TSMC clients' contracts. I don't remember Intel being in the list of companies that inherited a significant chunk of Apple's 7nm wafers when Apple decided to move a good chunk of its orders to 5nm early.
 

magbarn

Reputable
Dec 9, 2020
138
127
4,770
The burning question in my mind is when the card will be available. Given that this card is expected to compete with the high-end GPUs from AMD and Nvidia this generation, they will be a year behind Nvidia and AMD if they take another five months to release it. That leaves them with around another year before it becomes obsolete with the release of the next-gen AMD and Nvidia GPUs expected in late 2022. That's worse if it's only expected to be widely available in Q1 2022. Even if this product is a viable competitor to Ampere and RDNA2, I feel it's too late and unlikely to sell well.
I don't think the GPU market normalizes in 1 year. More like 3-4 years. In that case, Intel will sell every one of them.
 

jasonf2

Distinguished
The only pressure Intel can put on TSMC against AMD is on bids for spare capacity in excess of whatever extra capacity options may already be in AMD's and other TSMC clients' contracts. I don't remember Intel being in the list of companies that inherited a significant chunk of Apple's 7nm wafers when Apple decided to move a good chunk of its orders to 5nm early.
I don't have any specifics, only a recollection that Intel had pre-purchased some capacity a while back. As there is really no spare capacity right now (with anyone) and AMD needs to make more chips to gain market share, the ability to keep things at the status quo levels off market-share losses for Intel, even without the technology lead.
 
It does not exclude them, but including them would require a process-specific re-spin of the design, which would double the tape-out and qualification work and add tens of millions of dollars in costs. Not worth doing unless you make millions of the thing. Will Intel make and manage to sell millions of DG2s? Maybe if it launched today. By the end of the year, though, most people who wanted a new GPU this generation will likely have managed to get their hands on something from the usual suspects, and most of the rest will be waiting for next-gen.
But Intel already stated that they would start making process-independent designs when they backported Sunny Cove to 14nm; it would make sense if they did that with their GPU design as well, since it was at a very early stage anyway.
 

InvalidError

Titan
Moderator
I don't think the GPU market normalizes in 1 year. More like 3-4 years. In that case, Intel will sell every one of them.
AMD and Nvidia will move their high-end parts to 5nm next year, which will free up a lot of 7-8nm capacity for lower-end parts. Intel's new GPUs will have a hard brand-recognition battle against the RX 6500-6600 and RTX 3050-3060.

I don't have any specifics, only a recollection that Intel had pre-purchased some capacity a while back. As there is really no spare capacity right now (with anyone) and AMD needs to make more chips to gain market share, the ability to keep things at the status quo levels off market-share losses for Intel, even without the technology lead.
Intel may have purchased some allocation from TSMC but AMD, Mediatek and other long-time TSMC customers have extra capacity options built into their contracts that allow them to call dibs on any capacity that gets freed up before any other companies can bid on it. The only capacity Intel can get on a new contract is the capacity nobody else has claimed by exercising whatever they may have left of their contract's capacity options.

Since AMD is 4X as large a TSMC customer as Intel is, AMD likely has 4X or more as many wafer options as Intel and will be able to grow into whatever capacity gets freed up 4X as fast.

Intel's ability to snatch up TSMC fab capacity to block AMD from expanding is severely limited.

But Intel already stated that they would start making process-independent designs when they backported Sunny Cove to 14nm; it would make sense if they did that with their GPU design as well, since it was at a very early stage anyway.
The architecture isn't process-independent; Intel had to turn Sunny Cove into Cypress Cove to make it workable on 14nm. Porting an architecture between fab processes with different performance characteristics means making process-specific compromises and optimizations. The ports may be based on the same original design, but beyond that they are almost completely different chips (reworking the L1/L2/L3 caches, the differences in primitive cell footprints, wiring density, etc. mean completely redoing the core layout) and require completely different workflows from pre-tape-out qualification through packaging.

Not worth doing unless you know for certain you will sell enough chips to amortize the costs of doing all of this work for Samsung 8nm and TSMC 7nm. My bet is Intel knows most people will pass on DG2 due to Intel's history of crappy graphics drivers so it won't be expecting to sell many of those until it can demonstrate solid drivers. By then, it will be about to launch DG3.
 

spongiemaster

Admirable
Dec 12, 2019
2,311
1,303
7,560
My bet is Intel knows most people will pass on DG2 due to Intel's history of crappy graphics drivers so it won't be expecting to sell many of those until it can demonstrate solid drivers. By then, it will be about to launch DG3.
This would be common-sense thinking in a normal market, but how bad would the drivers have to be for people not to buy their cards in this market? Terrible drivers haven't stopped people from buying ATi/AMD cards. I know, I was one of them for more than a decade. It did eventually stop me, but only because there was price-competitive competition. If my choice is a $900+ 6700 XT or a similarly performing Intel GPU for $650 (which is still an inflated price for $500 RTX 3070-level performance), I'd gamble on Intel.