News Intel Xe Graphics: Release Date, Specs, Everything We Know

waltc3

Reputable
Aug 4, 2019
423
227
5,060
Reminds me of the kind of nonsense people wrote about Larrabee years ago--although hyped for years by people who knew nothing about it at all, Intel cancelled it before the first product hit the market. Intel has a long, long way to go before it catches up to AMD/nVidia on the discrete GPU side of the street.
 
I had an i740 - that model was called "Starfighter", if I recall correctly.

If they had stuck with their graphics development, they could have been on or even surpassing the levels of Nvidia/AMD today, given all of the resources they could have devoted to it.
 
  • Like
Reactions: JarredWaltonGPU

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
"Intel has repeatedly proven over the past decade that it makes inferior GPUs and bundles them into its CPUs" - yet those "inferior" IGPs are one of the reasons AMD cannot get a foothold with OEMs for business customers. As "inferior" as they are, they mean that a video card is not needed in the BoM and there is nothing extra for IT departments to support.
 
Intel's Xe Graphics will join the dedicated graphics card market this year, promising a new architecture and vastly improved features and performance for Intel GPUs.

Intel Xe Graphics: Release Date, Specs, Everything We Know : Read more

Ahhh, the good ol' i740... I bought one, installed it in my K6-2/350 rig and proceeded to reformat, reinstall the OS/chipset/i740 drivers, and install everything fresh four times trying to get it to work, but Quake 2 consistently played with what looked like a moving broken-glass wireframe superimposed on top of the otherwise well-rendered game. Forced to return it and go to the two-card solution, with a Voodoo2!
 
  • Like
Reactions: alextheblue
Reminds me of the kind of nonsense people wrote about Larrabee years ago--although hyped for years by people who knew nothing about it at all, Intel cancelled it before the first product hit the market. Intel has a long, long way to go before it catches up to AMD/nVidia on the discrete GPU side of the street.
Larrabee actually had a ton of potential and was mostly killed by internal politics at Intel -- the CPU guys didn't want it to take over any of their turf, and Intel viewed gaming as something for kids. Still, Larrabee's descendants would live on in the form of the Xeon Phi -- not the best at everything, but certainly capable in the right workloads.

Xe Graphics is a completely different beast, however. Intel isn't trying to do software GPU running on x86 cores this time. It's doing a proper scale up of its existing GPU, with hopefully better driver support. Yes, Intel UHD Graphics 630 is weak compared to any modern dedicated GPU. But it's also only a 460 GFLOPS architecture, built with 24 EUs. Scaling that up isn't trivial, but Gen11 already did a lot of the legwork. Instead of one slice, eight sub-slices, and 64 EUs, Xe Graphics will scale up to at least eight slices, 64 sub-slices, and 512 EUs. Get properly functioning drivers behind that and it's going to be pretty impressive.
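
For a rough sense of what that scaling means on paper, here's a minimal back-of-the-envelope sketch: 8 FP32 lanes per EU and 2 ops per FMA are the usual Gen math behind that 460 GFLOPS figure, while the 1.5 GHz Xe clock is purely an assumed number, not anything Intel has announced.

```cpp
#include <cstdio>

// Peak FP32 throughput for an Intel Gen-style GPU:
//   GFLOPS = EUs * 8 FP32 lanes per EU * 2 ops per FMA * clock (GHz)
double peak_gflops(int eus, double clock_ghz) {
    return eus * 8 * 2 * clock_ghz;
}

int main() {
    // UHD Graphics 630: 24 EUs at ~1.2 GHz boost -> roughly the 460 GFLOPS cited above
    std::printf("UHD 630 (24 EUs): ~%.0f GFLOPS\n", peak_gflops(24, 1.2));
    // Hypothetical 512-EU Xe part; 1.5 GHz is an assumed clock, not a confirmed spec
    std::printf("Xe (512 EUs)    : ~%.0f GFLOPS\n", peak_gflops(512, 1.5));
    return 0;
}
```

At those assumed clocks, 512 EUs lands somewhere around 12 TFLOPS of peak FP32 -- which is why driver quality, not raw throughput, looks like the deciding factor.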

Question is, will Intel sell such a GPU at a reasonable price? 512 EUs for $500 would be very competitive I think. 512 EUs for $1000? Not so much.
 
  • Like
Reactions: PaulAlcorn
"Intel has repeatedly proven over the past decade that it makes inferior GPUs and bundles them into its CPUs" - yet those "inferior" IGPs are one of the reasons AMD cannot get a foothold with OEMs for business customers. As "inferior" as they are, they mean that a video card is not needed in the BoM and there is nothing extra for IT departments to support.

Business leasing is mostly laptops these days. And AMD, thanks to Bulldozer->Excavator, just didn't have anything that was competitive in this space power-efficiency/heat-wise.

Now look at the design wins of Intel's latest versus AMD's latest laptop CPUs and you see a big difference. A smaller power envelope and greater performance at a similar price point with AMD is causing a lot of manufacturers to look twice. It's just a matter of convincing the people at the top to change their ways. They are more concerned with running a business than with the latest tech advancements. Their life is mired in day-to-day operations and what they know. They no longer look at the latest and greatest, but rather at what they know works, even if it's not the best choice.
 
  • Like
Reactions: bit_user
"Intel has repeatedly proven over the past decade that it makes inferior GPUs and bundles them into its CPUs" - yet those "inferior" IGPs are one of the reasons AMD cannot get a foothold with OEMs for business customers. As "inferior" as they are, they mean that a video card is not needed in the BoM and there is nothing extra for IT departments to support.
Do you even read tech reviews, or is it your first time? AMD has had iGPUs for a very long time. AMD's integrated GPUs are superior to Intel's. It has always been like that.

The $599 Xe GPU seems decent for the price if Intel can back it with fast, bug-free drivers and broad software compatibility.
 
Last edited:

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
Do you even read tech reviews, or is it your first time? AMD has had iGPUs for a very long time. AMD's integrated GPUs are superior to Intel's. It has always been like that.

Sure, which has always landed it in that wonderful no man's land of unnecessary power for business desktops while still not being fast enough for any real gaming. A feature with no market.
 
Sure, which has always landed it in that wonderful no man's land of unnecessary power for business desktops while still not being fast enough for any real gaming. A feature with no market.
Who told you it doesn't game, lol? Even the Intel iGPU can play games like CSS. AMD is just better at gaming, too. It can play any game at 720p, and lighter games at 1080p.
There are plenty of people here on the forum with APUs.

Either way, he's wrong to say Intel is better because AMD doesn't have iGPUs, lmao.
 
Last edited:
  • Like
Reactions: alextheblue
Who told you it doesn't game, lol? Even the Intel iGPU can play games like CSS. AMD is just better at gaming, too. It can play any game at 720p, and lighter games at 1080p.

Either way, he's wrong to say Intel is better because AMD doesn't have iGPUs, lmao.
I'm pretty sure the point is that it doesn't run games well. CSGO isn't representative of most games, unless you only play light indie games or stuff like LoL.

Yes, you can play some games on Intel at reasonable framerates (>30 fps at 720p). Plenty of others (Doom Eternal, Red Dead Redemption 2) simply won't run at all on Intel. Yes, Vega 11 integrated graphics is 2-3 times faster than Intel's IGP. But there are still a lot of games that will only manage 30 fps at 720p and low to medium settings on Vega 11. It's playable, it's better than Intel, but it's not great.

As I said above, I'd rather get a Ryzen 1600 AF and then pay for a modest dedicated GPU over the 3400G, because it's a better overall gaming experience for a similar price.

Basically, for gaming performance: Nvidia GPU > AMD GPU > AMD iGPU > Intel iGPU
You can certainly play any current game with a $150 AMD or Nvidia GPU. Nearly every game will at least run on an AMD APU. And many games will have issues on Intel's existing (integrated) GPUs.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
Who told you it doesn't game, lol? Even the Intel iGPU can play games like CSS. AMD is just better at gaming, too. It can play any game at 720p, and lighter games at 1080p.

If your goal is to play AAA titles on day one, you do not want to be using any form of iGPU. With the sacrifices you will have to make, if it plays at all, it will be nothing like the experience the developers intended.
 
  • Like
Reactions: JarredWaltonGPU
Reminds me of the kind of nonsense people wrote about Larrabee years ago--although hyped for years by people who knew nothing about it at all, Intel cancelled it before the first product hit the market. Intel has a long, long way to go before it catches up to AMD/nVidia on the discrete GPU side of the street.

Intel has vastly more money to throw at the problem than AMD and Nvidia. In fact, if I remember correctly, Intel makes more per quarter than the others typically do per year. Maybe slightly less. They have the money to throw at it.

They also have a massive software division with probably more combined experience than most out there.

I think if Intel really wants to do this, they will get the product right. The issue will be the price.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
858
315
19,360
Intel has vastly more money to throw at the problem than AMD and Nvidia. In fact, if I remember correctly, Intel makes more per quarter than the others typically do per year. Maybe slightly less. They have the money to throw at it.

They also have a massive software division with probably more combined experience than most out there.

I think if Intel really wants to do this, they will get the product right. The issue will be the price.

Intel certainly has the financial wherewithal to make it happen. Their CCG (Client Computing Group - desktop PCs - one of many business units) alone pulls in more revenue per quarter ($9.8B 1Q2020) than AMD's entire operation does in a year ($6.73B for 2019), and nearly as much as Nvidia's yearly revenue ($11.7B in 2019), too. Zooming out to Intel's full operations - $19.3B last quarter.

Intel insists on running crazy-high margins, too (60% or more). That meshes with your comment on price, though. Intel isn't interested in low-margin markets, so expect them to price at a premium - regardless of whether it's justified.
 

spongiemaster

Admirable
Dec 12, 2019
2,278
1,281
7,560
Intel insists on running crazy-high margins, too (60% or more). That meshes with your comment on price, though. Intel isn't interested in low-margin markets, so expect them to price at a premium - regardless of whether it's justified.

That may work in the enterprise market; it won't work at all in the gaming market. There is a giant problem in both markets that's going to limit how much of a premium Intel is able to charge, and that problem is Nvidia, which is already running Intel-level margins. In the GPU market, Nvidia has every bit the reputation and product portfolio that Intel has in the CPU market, so Intel is not going to be able to waltz in and charge more than Nvidia's already high-margin prices.

AMD's GPU hardware is typically pretty solid; the problem is their garbage software. There are a lot of gamers who won't put up with AMD's drivers despite a typical price/performance advantage versus Nvidia. It would be shocking if Intel came out of the gate with top-notch drivers. Intel isn't going to sell any cards to gamers if it thinks it can price at parity with Nvidia, or even higher, while shipping AMD-level or worse software.
 
  • Like
Reactions: JarredWaltonGPU

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
858
315
19,360
That may work in the enterprise market; it won't work at all in the gaming market. There is a giant problem in both markets that's going to limit how much of a premium Intel is able to charge, and that problem is Nvidia, which is already running Intel-level margins. In the GPU market, Nvidia has every bit the reputation and product portfolio that Intel has in the CPU market, so Intel is not going to be able to waltz in and charge more than Nvidia's already high-margin prices.

AMD's GPU hardware is typically pretty solid; the problem is their garbage software. There are a lot of gamers who won't put up with AMD's drivers despite a typical price/performance advantage versus Nvidia. It would be shocking if Intel came out of the gate with top-notch drivers. Intel isn't going to sell any cards to gamers if it thinks it can price at parity with Nvidia, or even higher, while shipping AMD-level or worse software.

Fair points. It might behoove Intel's longer-term strategies to absorb some losses/low margin in client to undercut Nvidia in the client market. Nvidia has put a hurting on Intel in DC (just look at how many Xeons a GPU can replace). Granted, that's different types of compute, but Intel is also pressing into HPC/DC with GPU, so stripping away some of Nvidia's economy of scale on the client side might be just what the doctor ordered, at least in Intel's eyes. Pricing will definitely be a big component here.
 
  • Like
Reactions: JarredWaltonGPU
Fair points. It might behoove Intel's longer-term strategies to absorb some losses/low margin in client to undercut Nvidia in the client market. Nvidia has put a hurting on Intel in DC (just look at how many Xeons a GPU can replace). Granted, that's different types of compute, but Intel is also pressing into HPC/DC with GPU, so stripping away some of Nvidia's economy of scale on the client side might be just what the doctor ordered, at least in Intel's eyes. Pricing will definitely be a big component here.
One of the big questions I have is how good Xe HPC will be when it comes to data center workloads. Intel has some relatively solid HPC designs for CPUs in Xeon. Strip out a bunch of the extraneous stuff and focus on just packing in a bunch of high performance compute cores (basically simplified ALUs) with lots of vector processing capabilities, and Intel could regain some lost ground.

I mean, there's a reason Nvidia can make a 4608 core GPU for data center stuff while Intel's CPUs top out at 28 cores -- the Nvidia cores are way less complex. There's no reason Intel can't dumb down some of their ALU complexity to make a fast and efficient compute core. And, who knows, maybe even keep a few bits and pieces that actually help performance relative to typical GPU cores?
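
To put rough numbers on that core-count gap, here's a quick peak-throughput sketch; the clocks below are assumptions, and peak FLOPS ignores memory bandwidth, occupancy, and everything else that matters in practice.

```cpp
#include <cstdio>

int main() {
    // 28-core Xeon: 28 cores * 2 AVX-512 FMA units * 16 FP32 lanes * 2 ops per FMA,
    // at an assumed ~2.0 GHz all-core AVX clock
    double xeon_tflops = 28 * 2 * 16 * 2 * 2.0 / 1000.0;

    // 4608-core GPU: 4608 FP32 cores * 2 ops per FMA, at an assumed ~1.77 GHz boost clock
    double gpu_tflops = 4608.0 * 2.0 * 1.77 / 1000.0;

    std::printf("28-core CPU  : ~%.1f TFLOPS FP32 peak\n", xeon_tflops); // ~3.6
    std::printf("4608-core GPU: ~%.1f TFLOPS FP32 peak\n", gpu_tflops);  // ~16.3
    return 0;
}
```

The gap comes almost entirely from how many simple ALUs the GPU can pack in once per-core control logic is stripped down, which is exactly the trade-off described above.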

My concern is if Intel's Xe HP also ends up being used in data center. That will mean more complexity and higher prices, and then the consumer products might end up with junk they don't need. I hope Xe HPC gets high performance FP64 and Xe HP doesn't -- then Xe HP consumer parts can be priced more competitively, because they won't cannibalize the high-end data center stuff.
 

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
Business leasing is mostly laptops these days. And AMD, thanks to Bulldozer->Excavator, just didn't have anything that was competitive in this space power-efficiency/heat-wise.

Now look at the design wins of Intel's latest versus AMD's latest laptop CPUs and you see a big difference. A smaller power envelope and greater performance at a similar price point with AMD is causing a lot of manufacturers to look twice. It's just a matter of convincing the people at the top to change their ways. They are more concerned with running a business than with the latest tech advancements. Their life is mired in day-to-day operations and what they know. They no longer look at the latest and greatest, but rather at what they know works, even if it's not the best choice.

A lot of the people making the decisions about what goes into the data center were likely the ones who had to clean up the Opteron ECC debacle - and AMD is still not seen as a stable player. The stock price is grossly inflated with a 160x P/E - it should be a $5 stock. So, yeah, a lot to overcome - the main issue is that their VM farms are all Intel and AMD is NOT a direct drop-in replacement - and a homogeneous hardware platform makes managing a large install much easier since all the blocks are the same.

For my use case in a laptop, there is ZERO value in anything over 4 cores - I do not maintain a DTR or a gaming laptop. I replaced my almost 2-year-old Dell 13 2-in-1 with the Ice Lake Dell 13 2-in-1 back in October - it's noticeably faster and more responsive and does everything I need it to do - but it might get 300 hours of use per year; my wife's gets a bit more.

I own 3 businesses and employ well over 1000 people. I take the lead on hardware, and there won't be a point when AMD will be deployed here. When we do upgrade our 1st-gen Xeon Scalable it will be with Ice Lake, the all-flash array will have its 1st-gen Optane U.2 replaced with 2nd-gen U.2... and we can seamlessly move VMs over with zero issues. Three generations of Intel NUCs will be replaced with NUC11s (Tiger Lake). The 20 or so MS Surfaces installed will be replaced with Tiger Lake Surfaces.

The sun is setting on AMD - like it did when Core dropped - zero sunshine for more than a decade. Turns out the world doesn't care about lithography or manufacturing methods... AMD will have some wins - but Ryzen and Epyc have not performed to the expectations Lisa Su had... I know the kiddies LOVE AMD and it can do no wrong - but revenues drive R&D. Intel 10nm and 10nm+ are firing on all cylinders - a flurry of products will be dropping this year - servers, laptops, desktop (Rocket Lake S, on 14nm) and NUCs - as well as next-gen Optane DIMMs and SSDs, GPUs, etc.
 

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
One of the big questions I have is how good Xe HPC will be when it comes to data center workloads. Intel has some relatively solid HPC designs for CPUs in Xeon. Strip out a bunch of the extraneous stuff and focus on just packing in a bunch of high performance compute cores (basically simplified ALUs) with lots of vector processing capabilities, and Intel could regain some lost ground.

I mean, there's a reason Nvidia can make a 4608 core GPU for data center stuff while Intel's CPUs top out at 28 cores -- the Nvidia cores are way less complex. There's no reason Intel can't dumb down some of their ALU complexity to make a fast and efficient compute core. And, who knows, maybe even keep a few bits and pieces that actually help performance relative to typical GPU cores?

My concern is if Intel's Xe HP also ends up being used in data center. That will mean more complexity and higher prices, and then the consumer products might end up with junk they don't need. I hope Xe HPC gets high performance FP64 and Xe HP doesn't -- then Xe HP consumer parts can be priced more competitively, because they won't cannibalize the high-end data center stuff.
WOW - so GPU cores are equal to CPU cores? Why would Intel change what works? I am sure no one has tried to make that comparison - it's kinda like "inject bleach or Lysol to cure COVID-19" levels of WTF.

Xe HP and HPC WILL be used in data centers... The building blocks are basically the Gen12 GPU in Tiger Lake - Ponte Vecchio will be an evolution of that building block. None of this is hard to understand - but you seem to not understand how this is all going to work.

I will make it simple - Intel Xe HP (Intel Xe graphics with a higher power envelope) will compete with Nvidia Tesla (a data center compute card). GPUs are used for acceleration - all those small simple cores do COMPLETELY different things than a CPU (central processing unit - the brain).

So to recap: GPUs are not CPUs, and CPUs are not GPUs. GPUs cannot run Windows or Linux (operating systems), and CPUs are terrible at graphics (the pretty pictures you see on your monitor (the monitor is the screen of your PC (personal computer))).

Xe HPC will be several thousands of dollars, and Xe HP based video cards will range from low end to high end, with prices to match. Pretty sure Intel has done research to determine configurations and pricing. They have people who, you know, develop GPUs and not just write about them.

Seriously, Senior GPU Editor... WOW.
 

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
Fair points. It might behoove Intel's longer-term strategies to absorb some losses/low margin in client to undercut Nvidia in the client market. Nvidia has put a hurting on Intel in DC (just look at how many Xeons a GPU can replace). Granted, that's different types of compute, but Intel is also pressing into HPC/DC with GPU, so stripping away some of Nvidia's economy of scale on the client side might be just what the doctor ordered, at least in Intel's eyes. Pricing will definitely be a big component here.
A GPU cannot replace a CPU - they are used for different things. Yes, Nvidia basically invented the "GPUs as an accelerator" segment - but those GPUs are almost exclusively installed in systems with Intel Xeons. I would imagine that the 1st gen of Xe HP, at least, will be sold at a price that makes it revenue-neutral - but price is only a small part of the equation for these segments.

Intel Xe HP and HPC will make a significant dent in Nvidia - its CUDA ecosystem is entrenched and will take some $$$ and muscle to dislodge. Intel will introduce tools to convert CUDA to oneAPI - at some point the bulwark will be breached. Price for these segments is secondary at best.
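
For anyone curious what "converting CUDA to oneAPI" roughly means in practice, here's a minimal SYCL/DPC++ sketch of a vector add - the kind of kernel a CUDA launch maps onto. This is just an illustrative snippet, not the output of any conversion tool and not Xe-specific code.

```cpp
#include <sycl/sycl.hpp>

int main() {
    sycl::queue q;  // picks a default device (a GPU if one is available)
    constexpr size_t N = 1024;

    // Unified shared memory, visible to both host and device
    float *a = sycl::malloc_shared<float>(N, q);
    float *b = sycl::malloc_shared<float>(N, q);
    for (size_t i = 0; i < N; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    // Roughly what a CUDA launch such as add<<<blocks, threads>>>(a, b, N) becomes
    q.parallel_for(sycl::range<1>{N}, [=](sycl::id<1> i) {
        a[i] += b[i];
    }).wait();

    sycl::free(a, q);
    sycl::free(b, q);
    return 0;
}
```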

Removed
 
Last edited by a moderator:

Deicidium369

Permanently banned.
BANNED
Mar 4, 2020
390
61
290
That may work in the enterprise market; it won't work at all in the gaming market. There is a giant problem in both markets that's going to limit how much of a premium Intel is able to charge, and that problem is Nvidia, which is already running Intel-level margins. In the GPU market, Nvidia has every bit the reputation and product portfolio that Intel has in the CPU market, so Intel is not going to be able to waltz in and charge more than Nvidia's already high-margin prices.

AMD's GPU hardware is typically pretty solid; the problem is their garbage software. There are a lot of gamers who won't put up with AMD's drivers despite a typical price/performance advantage versus Nvidia. It would be shocking if Intel came out of the gate with top-notch drivers. Intel isn't going to sell any cards to gamers if it thinks it can price at parity with Nvidia, or even higher, while shipping AMD-level or worse software.
Some of us are not budget-constrained when we buy gaming hardware - so Intel will sell in the gaming sector - but the real driver of revenue in the Xe business unit will be Xe HP & HPC (competing with Nvidia Tesla cards), not the desktop HP parts. Price-to-performance is a thing in the consumer market and not so much at the high end in data centers.

A single vendor for CPU, GPU, FPGA, AI, networking, normal SSDs, Optane SSDs and Optane DIMMs... not to mention absolute domination in the DC, and the $$$ to get devs to switch from CUDA to oneAPI... Nvidia's ecosystem is formidable, but it can be breached.

<<Removed by moderator>>
 
Last edited by a moderator:

DavidC1

Distinguished
May 18, 2006
494
67
18,860
@JarredWaltonGPU You know Gen11 Intel GPUs exist, right? They are pretty competitive with Vega 11.

And yeah, it's a bit embarrassing to compare GPU "cores" to CPU cores as a GPU editor.

Also, based on recent Intel leaks, the only Xe dGPU we are going to see is DG1, which is only 128 EUs.
 
Last edited:
A lot of the people making the decisions about what goes into the data center were likely the ones who had to clean up the Opteron ECC debacle - and AMD is still not seen as a stable player. The stock price is grossly inflated with a 160x P/E - it should be a $5 stock. So, yeah, a lot to overcome - the main issue is that their VM farms are all Intel and AMD is NOT a direct drop-in replacement - and a homogeneous hardware platform makes managing a large install much easier since all the blocks are the same.

For my use case in a laptop, there is ZERO value in anything over 4 cores - I do not maintain a DTR or a gaming laptop. I replaced my almost 2-year-old Dell 13 2-in-1 with the Ice Lake Dell 13 2-in-1 back in October - it's noticeably faster and more responsive and does everything I need it to do - but it might get 300 hours of use per year; my wife's gets a bit more.

I own 3 businesses and employ well over 1000 people. I take the lead on hardware, and there won't be a point when AMD will be deployed here. When we do upgrade our 1st-gen Xeon Scalable it will be with Ice Lake, the all-flash array will have its 1st-gen Optane U.2 replaced with 2nd-gen U.2... and we can seamlessly move VMs over with zero issues. Three generations of Intel NUCs will be replaced with NUC11s (Tiger Lake). The 20 or so MS Surfaces installed will be replaced with Tiger Lake Surfaces.

The sun is setting on AMD - like it did when Core dropped - zero sunshine for more than a decade. Turns out the world doesn't care about lithography or manufacturing methods... AMD will have some wins - but Ryzen and Epyc have not performed to the expectations Lisa Su had... I know the kiddies LOVE AMD and it can do no wrong - but revenues drive R&D. Intel 10nm and 10nm+ are firing on all cylinders - a flurry of products will be dropping this year - servers, laptops, desktop (Rocket Lake S, on 14nm) and NUCs - as well as next-gen Optane DIMMs and SSDs, GPUs, etc.

Congrats on paying more than you need to. I'm glad your small business is so profitable that you can throw more money at a system than you need to. But your statement is blatantly false and misleading. There's no such thing as a drop-in replacement for Intel processors either, unless you replace the entire board or stay within the same series of chip. The newest Intel server chips also generate a tremendous amount of waste heat. That's wasted power going in, and wasted power in your air-handling bill. And the latter is a lot more expensive than you realize.

As for laptops, when you start doing serious stuff, developers and engineers always benefit from more cores. My compile times have dropped from 1:10 (hours:minutes) to 30 minutes as I've gone from an early 4-core Sandy Bridge i7 to 6 cores with higher clocks. But it's not a panacea, as this laptop constantly throttles due to heat and eats through the battery. The Thunderbolt is also unstable as hell. I breathe on it wrong and I have to reboot the entire system.

In terms of data centers and the major server operators, however (I'm talking Google, AWS, and Azure), they all use a mix, and they are looking at the lowest TCO, which AMD provides. (Not just AMD - I see a mix of solutions. But AMD is slowly gaining traction there. These systems are on a 7-12 year cycle, so if you replace at most 15% of your servers a year, then as an OEM you are doing pretty well if you capture a significant percentage of that market.)

AMD's P/E ratio is based on the fact that its long-term growth prospects look excellent. It's like looking at the P/E of Amazon or even Apple: it's absurd, but people believe in the growth potential. The longer Intel takes to get their @#$@ together on smaller nodes and IPC improvements, the better AMD looks in the long run.

I'm not anti-Intel. I'm anti-monopoly, and Intel has gotten too complacent for their own good. Competition is a good thing. This is why you now have cheaper (although still non-competitive) Intel HEDT chips. You can thank AMD for that.
 
Price-to-performance is a thing in the consumer market and not so much at the high end in data centers.

Ummm okay.

99% revolves around TCO. 1% revolves around having the fastest possible hardware at any cost, for limited-use, time-sensitive applications where the workload cannot be split up. Even Google Stadia, which requires extremely low latency and fast render times, only uses a mid-tier Vega unit.
 
Last edited by a moderator: