News Intel Unveils Xe DG1 Mobile Graphics in Discrete Graphics Card for Developers

What do you consider to be a "real" GPU?
A standalone GPU, on a card, with dedicated GDDRn memory (or HBMn).

At least, when somebody says "a real GPU", that's what I assume they mean. I'm not saying it's exactly an industry-standard definition, but it seems clear enough to me.

What makes you think Intel has a "full working CPU with [gen 12] integrated graphics"?
Uh... numerous roadmaps.

Also, seemingly confirmed by this: https://www.tomshardware.com/news/intel-tiger-lake-cpus-benchmark-geekbench

Anandtech also states it more clearly than Tom's CES coverage:

Tiger Lake is monolithic and the Xe graphics inside will provide full INT8 support for AI workloads (which will be supported through Intel DL Boost)
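For anyone wondering why INT8 support is a selling point for AI workloads: quantized inference trades a little precision for much cheaper arithmetic. Here's a rough illustrative sketch (my own toy example, nothing to do with Intel's actual DL Boost implementation) of symmetric INT8 quantization applied to a dot product:

```python
import random

# Toy sketch: quantize float weights/activations to 8-bit integers,
# do the multiply-accumulate in integer arithmetic (accumulating in a
# wider type, as INT8 hardware does), then rescale back to float once.

def quantize(values):
    """Symmetric quantization: map [-max, max] onto [-127, 127]."""
    scale = max(abs(v) for v in values) / 127.0
    return [round(v / scale) for v in values], scale

random.seed(0)
w = [random.uniform(-1, 1) for _ in range(256)]  # fake weights
x = [random.uniform(-1, 1) for _ in range(256)]  # fake activations

wq, ws = quantize(w)
xq, xs = quantize(x)

# integer dot product, then a single rescale back to float
dot_int8 = sum(a * b for a, b in zip(wq, xq)) * ws * xs
dot_fp32 = sum(a * b for a, b in zip(w, x))

print(f"fp32: {dot_fp32:.4f}  int8: {dot_int8:.4f}  "
      f"error: {abs(dot_fp32 - dot_int8):.4f}")
```

The int8 result lands very close to the fp32 one, while the inner loop is pure small-integer math — that's the kind of work INT8-capable hardware can chew through at several times the FP32 rate.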

There's some more tasty (as in wafer-licious) details, here: https://www.anandtech.com/show/15380/i-ran-off-with-intels-tiger-lake-wafer-who-wants-a-die-shot
they've never produced a discrete GPU before so I don't think what they've done in the past is necessarily relevant.
Jimmy has got this one.
 
A standalone GPU, on a card, with dedicated GDDRn memory (or HBMn).

At least, when somebody says "a real GPU", that's what I assume they mean. I'm not saying it's exactly an industry-standard definition, but it seems clear enough to me.
Err, so all non-socketed mobile GPUs are not "real GPUs"? And to be honest, even the socketed ones are barely (if at all) available for standalone retail purchase.

Uh... numerous roadmaps.

Also, seemingly confirmed by this: https://www.tomshardware.com/news/intel-tiger-lake-cpus-benchmark-geekbench

Anandtech also states it more clearly than Tom's CES coverage:

https://www.anandtech.com/show/1531...e-cpus-soon-tiger-lake-with-xe-graphics-later
There's some more tasty (as in wafer-licious) details, here: https://www.anandtech.com/show/15380/i-ran-off-with-intels-tiger-lake-wafer-who-wants-a-die-shot
We all know how good Intel has been lately at meeting their roadmap dates 😛

By "full working CPU [...]" I took that to mean a finished, production ready design. Your sources describe Tiger Lake as "a long way away" and "out by the end of 2020", so I'm wondering if we might still be looking at engineering samples at this point.
 
Considering they were showing off Tiger Lake laptops with Xe iGPUs playing games, and that they'll be releasing this year, I would say Intel has working CPUs with Xe iGPUs.
I could very well have missed it, but the only Tiger Lake gaming demo I saw used a discrete GPU.

Intel actually has produced discrete GPUs in the past. The i740 was, technically, the first AGP-based GPU, and it was released to market:

https://www.anandtech.com/show/202

It was reviewed but it sucked.

And while Larrabee didn't amount to anything for consumers, it did become a GPGPU in the HPC space, so Intel has experience in both markets it intends to release in.
I admit I had no idea about the i740, although I was aware of Larrabee. But those examples are either 20+ years old (i740), or a niche HPC accelerator with a strange, winding development history (Xeon Phi), both of which were more or less failures as products. I question whether either is really relevant for trying to establish a historical trend for how Intel rolls out its discrete GPUs, that could be applied to how they might roll out Xe.
 
I could very well have missed it, but the only Tiger Lake gaming demo I saw used a discrete GPU.


I admit I had no idea about the i740, although I was aware of Larrabee. But those examples are either 20+ years old (i740), or a niche HPC accelerator with a strange, winding development history (Xeon Phi), both of which were more or less failures as products. I question whether either is really relevant for trying to establish a historical trend for how Intel rolls out its discrete GPUs, that could be applied to how they might roll out Xe.

Might be right on the GPU; however, Intel did show off multiple laptops, mainly concepts, running Tiger Lake CPUs that weren't running game demos. The laptop playing Destiny 2 was also Tiger Lake. Some sources do say it was using a discrete DG1 GPU, but it still had a Tiger Lake CPU, meaning the CPUs are ready to go.

The i740 was a failure, but it did succeed in its goal of showing that AGP was a viable slot for GPUs, and after that we had AGP until PCIe.

Larrabee wasn't a failure, as it never made it to market. It was a concept only, much like Intel's Terascale. But it evolved into something that was sold, which is how concepts typically work. It's unfortunate it didn't make it further, as I wonder whether Intel would have pushed ray tracing heavily back then and we might have had it mainstream sooner. The ray-tracing demo they did was quite interesting.

As for how they plan to bring it out, I would bet we will see Tiger Lake with iGPUs first, possibly with dGPU options in mobile and HPC cards. Enthusiast discrete markets will probably be among the last, but that's just my guess.
 
Something designed from the ground-up to be a GPU
Is that not exactly what the DG1 is, a product designed specifically to be Intel's first discrete GPU (in a long time)?

Note that when I say "discrete GPU" I do in fact mean "GPU", not "graphics card". I know those two are sometimes used interchangeably (although they're technically different things), maybe you're referring to a graphics card.
 
Might be right on the GPU; however, Intel did show off multiple laptops, mainly concepts, running Tiger Lake CPUs that weren't running game demos. The laptop playing Destiny 2 was also Tiger Lake. Some sources do say it was using a discrete DG1 GPU, but it still had a Tiger Lake CPU, meaning the CPUs are ready to go.

Sure, that shows that at the least they have some functional Tiger Lake CPU engineering samples. It doesn't really say anything about the state of the iGPU, though, which is what I was talking about specifically.

Larrabee wasn't a failure, as it never made it to market. It was a concept only, much like Intel's Terascale. But it evolved into something that was sold, which is how concepts typically work. It's unfortunate it didn't make it further, as I wonder whether Intel would have pushed ray tracing heavily back then and we might have had it mainstream sooner. The ray-tracing demo they did was quite interesting.
I realize Larrabee evolved into Xeon Phi, but I was under the impression that Xeon Phi wasn't particularly successful. I could be wrong on that though.
 
Is that not exactly what the DG1 is, a product designed specifically to be Intel's first discrete GPU (in a long time)?

Note that when I say "discrete GPU" I do in fact mean "GPU", not "graphics card". I know those two are sometimes used interchangeably (although they're technically different things), maybe you're referring to a graphics card.

I'm referring to the idea I'm speculating on, which is that the DG1 may be the result of an experimental attempt to re-purpose the integrated graphics from a faulty CPU. That would contrast with a "real GPU", which would not have most of its die area dedicated to a broken CPU.
I don't think there is a good word to describe such a concept, because as far as I know, nobody has attempted it. Marketing could arguably call that a discrete GPU... It sounds like the kind of bad idea that could have come from some executive without any experience in engineering, though.

I doubt Intel actually tried that (unless things are seriously off the rails over there), but I hope somebody gets the chance to check.
 
I'm referring to the idea I'm speculating on, which is that the DG1 may be the result of an experimental attempt to re-purpose the integrated graphics from a faulty CPU. That would contrast with a "real GPU", which would not have most of its die area dedicated to a broken CPU.
I don't think there is a good word to describe such a concept, because as far as I know, nobody has attempted it. Marketing could arguably call that a discrete GPU... It sounds like the kind of bad idea that could have come from some executive without any experience in engineering, though.

I doubt Intel actually tried that (unless things are seriously off the rails over there), but I hope somebody gets the chance to check.
Ah, ok, gotcha. But do you have any reason at all to suspect that's the case though?
 
Sure, that shows that at the least they have some functional Tiger Lake CPU engineering samples. It doesn't really say anything about the state of the iGPU, though, which is what I was talking about specifically.


I realize Larrabee evolved into Xeon Phi, but I was under the impression that Xeon Phi wasn't particularly successful. I could be wrong on that though.

I have no idea how well it did or didn't do, but considering how long they produced it, I would say it was selling well enough. They most likely killed it because x86 is harder to scale than something like CUDA cores or SPUs.

The Horseshoe Bend laptops were Tiger Lake concepts and my guess would be sans a dGPU since those types of systems tend to not have dGPUs in order to be as small as possible.

We will find out more down the road, I assume shortly, as Tiger Lake is supposed to launch this year. My best guess would be the same as Ice Lake, which was Q3 2019, so I'd estimate a Q3 launch for Tiger Lake. As for Xe, no idea; nothing to reference, but I would guess they plan to launch it with laptops when Tiger Lake launches.
 
Warframe is a 7 year old game. If Xe DG1 struggles with Warframe, then I don't want to see how it handles something modern like Monster Hunter World or Forza Horizon 4. I'd like to see more competition in the GPU space, but I'm not seeing any reason to take Intel seriously yet. Too much marketing; not enough performance.

Get it together, Intel!
Love the design tho.