> What do you consider to be a "real" GPU?

A standalone GPU, on a card, with dedicated GDDRn memory (or HBMn).
At least, when somebody says "a real GPU", that's what I assume they mean. I'm not saying it's exactly an industry-standard definition, but it seems clear enough to me.
> What makes you think Intel has a "full working CPU with [gen 12] integrated graphics"?

Uh... numerous roadmaps.
Also, seemingly confirmed by this: https://www.tomshardware.com/news/intel-tiger-lake-cpus-benchmark-geekbench
Anandtech also states it more clearly than Tom's CES coverage:
> Tiger Lake is monolithic and the Xe graphics inside will provide full INT8 support for AI workloads (which will be supported through Intel DL Boost)
There are some more tasty (as in wafer-licious) details here: https://www.anandtech.com/show/15380/i-ran-off-with-intels-tiger-lake-wafer-who-wants-a-die-shot
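For anyone wondering what that INT8/DL Boost line actually buys you: DL Boost is (roughly) Intel's name for the VNNI instructions, which fuse the 8-bit multiply-accumulate at the heart of quantized inference into a single operation. Here's a minimal scalar sketch of that operation, just to illustrate the idea; the values and function names are made up, not anything from the articles:

```c
#include <stdint.h>
#include <stdio.h>

/* Scalar sketch of the INT8 dot-product-accumulate that DL Boost / VNNI
 * accelerates: multiply 8-bit activations by 8-bit weights and accumulate
 * into a 32-bit integer, the core op of quantized neural-net inference. */
int32_t int8_dot(const uint8_t *act, const int8_t *wgt, int n)
{
    int32_t acc = 0;
    for (int i = 0; i < n; i++)
        acc += (int32_t)act[i] * (int32_t)wgt[i];  /* widen before accumulating */
    return acc;
}

int main(void)
{
    uint8_t act[8] = {1, 2, 3, 4, 5, 6, 7, 8};       /* illustrative values */
    int8_t  wgt[8] = {1, -1, 2, -2, 3, -3, 4, -4};
    printf("dot = %d\n", int8_dot(act, wgt, 8));     /* prints dot = -10 */
    return 0;
}
```

The widening to 32 bits before accumulating is the important part: the hardware instruction does the same thing across a whole vector register, summing the 8-bit products into 32-bit lanes so they don't overflow.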
> they've never produced a discrete GPU before so I don't think what they've done in the past is necessarily relevant.

Jimmy has got this one.