
News Intel's Raja Koduri Shows Off Xe-HPG Gaming GPU with 512 EUs

thing that shall not be named here
 
According to MLID's leaks and commentary, Xe is going to play the "underdog" card to gain traction in the market, which is good for consumers and what we all need right now. I do agree that if Xe is good at mining, it will just poof away like any other card, irrespective of price. Ironically, while that would be good for the sales pitch and the numbers, it would be bad for Intel in the long run if they actually want to stay in this market. Why? Well, when you put your first product out there and it isn't used for the purpose you're selling it for, you don't get any lessons learned and may as well change the target demographic altogether. So, assuming Intel isn't run entirely by incompetents, I'll assume they've already thought of this and have a plan to get their cards to the real target demographic first, with a small portion going to whatever other markets can find a use for them. At the very least, reviewers will get them, so potential buyers will see what Intel wants to show ("reviewing guidance," remember) in the numbers and, well, whatever else the card can actually do if reviewers get creative.

I want Xe to do well, I really do. Do I have my hopes high? No... not really. But not because of hardware concerns; I'm 150% wary of the drivers (hence my original comment).

Cheers!
 
The only concern I have with Intel making their own GPUs is Intel being Intel and offering "incentives" for system builders to use their cards over anyone else's.
This is a very valid concern to have and one that makes perfect sense to ask Intel point blank.

And I'm sure they will. I think I heard from one of the many YouTubers that Intel was already strong-arming OEMs into not making AMD-based laptops too high end, to keep Intel's own at the top. So...

Regards.
 
Ironically, while this would be good for the sales pitch/numbers, it would be bad for Intel in the long run if they actually want to stay in it. Why? Well, when you're putting out your first product out there, if it's not used for the intended purpose you are selling it for, then you won't get any lessons learned and may as well just change the target demographic altogether.
They can learn all they want and need to from laptops and from the OEM systems that will ship without any dGPU; some percentage of those will go without a dGPU for at least some time.
They should also have already sent development units to all the major game studios. Sure, the studios might decide not to optimize for Intel until it's worth it for them, but for a lot of competitive/esports titles it will basically be a must to include Intel GPUs in the optimization pass, since that will be the only GPU many systems have, and now those systems will actually be able to play those games well.
So plenty of experience will be gained even if the gaming-tier GPUs never reach gamers' hands.
 
This is a very valid concern to have and one that makes perfect sense to ask Intel point blank.

And I'm sure they will. I think I heard from one of the many YouTubers that Intel was already strong-arming OEMs into not making AMD-based laptops too high end, to keep Intel's own at the top. So...

Regards.

On the other hand, Intel is pumping out F-series chips like never before. If they start bundling Intel dGPUs with them for OEMs, it might free up more low/mid-range AMD and Nvidia cards for boutique builders...
 
On the other hand, Intel is pumping out F-series chips like never before. If they start bundling Intel dGPUs with them for OEMs, it might free up more low/mid-range AMD and Nvidia cards for boutique builders...
I wish you were right... Considering the big OEMs build their own GPU cards and don't really affect the AIB market, I'm not sure you'd be correct there, but I hope you are.

Regards.
 
That is true, but if Intel were providing dGPUs at a discount, OEMs wouldn't need to manufacture their own cards (or would switch whoever is doing it for them over to Intel boards). It would take a while to trickle down, though; they'd have to wait for existing contracts/orders to lapse.
 
That is true, but if Intel were providing dGPUs at a discount, OEMs wouldn't need to manufacture their own cards (or would switch whoever is doing it for them over to Intel boards). It would take a while to trickle down, though; they'd have to wait for existing contracts/orders to lapse.
Much like hotaru.hino said: Intel has ways to "convince" OEMs to follow suit and just do what it wants. That is not a trivial point to ignore here, and it scares me a bit. Still, the lesser evil right now is getting more GPU players into the market.

Out of the three, Nvidia should be the most scared if Intel does well.

Regards.
 
Wants the influx of cash? Sure. Needs it? Not so much, I think. Intel isn't suffering financially.

I'm not of the opinion that Intel is hurting. But I am of the opinion that updating many fabs might require many more billions than this, and that's just to get them up to date. That's not counting what it will take to get past 7nm down to 5nm, let alone what it will cost to get ahead and beat TSMC to an even smaller node before they get there.

Extra cash will be needed.
 
They will be made on TSMC's 7nm, and even after they launch, you still won't be able to buy them for another year.
At the rate they are headed, these cards will be last-gen. Ampere Next and Navi 2's successors are already coming in 2022. Remember, it was marketing under Raja Koduri that pushed smack talk like "Poor Volta" when it should have been "Poor Vega," which fell way short of expectations.
 
They'll get tax breaks in the US that will let them cover the costs of new facilities and equipment. Same if Japan snags them for new fabs, I'm sure.

Keep in mind, they are always upgrading, so those costs will already be factored into their current net profit.

As they say, money is cheap. Capital loans to Intel are basically a sure thing.
 
At the rate they are headed, these cards will be last-gen. Ampere Next and Navi 2's successors are already coming in 2022. Remember, it was marketing under Raja Koduri that pushed smack talk like "Poor Volta" when it should have been "Poor Vega," which fell way short of expectations.

Even an RTX 3050 Ti competitor would be welcome at this point, at almost any price. With all the SKUs planned, one will probably fall in that range.
 
I'm not of the opinion that Intel is hurting. But I am of the opinion that updating many fabs might require many more billions than this, and that's just to get them up to date. That's not counting what it will take to get past 7nm down to 5nm, let alone what it will cost to get ahead and beat TSMC to an even smaller node before they get there.

Extra cash will be needed.
Intel's process at a given node number is better than TSMC's; i.e., their 10nm is closer to TSMC's 7nm than to TSMC's 10nm, and their 7nm is suggested to be better than TSMC's 5nm. In fact, it's best to ignore the node numbers altogether, because they've been effectively meaningless since around 22nm.

They may also not update existing fabs, considering they're building new 7nm fabs (https://www.anandtech.com/show/1657...ndry-services-ibm-collaboration-return-of-idf), and not everything needs to be on the bleeding edge. TSMC owns a number of fabs that aren't even making sub-14nm parts.