News Intel Xe DG1 Benchmarked: Battle of the Weakling GPUs


Given the current situation, any new entry in the GPU market is welcome.

Intel or Zhaoxin entering, or even Matrox someday deciding to get back in the game, would mean more GPUs in the hands of gamers.

The price/performance ratio has been stagnant, and the current generation could have been the turning point, but sadly shortages crushed our hopes.

Let's hope that late 2021/2022 will bring us a refresh and a chance of getting a good GPU at a fair price.
 
The price for the GPU is fair if you consider the market situation. This GPU is designed for the data center; there's no point buying one for a PC, you're much better off going for cloud GPU services ... After a couple of years, buy whatever Nvidia, AMD, or Intel (if they've got a proper GPU and drivers by then) GPU you like. The problem with current market prices is that they're not obtainable for a lot of people due to the price rises. A normal market situation will come after all this hyperinflation 🙁
 
Lol, that's one weak GPU. Didn't expect that low performance. Will it fine-wine like AMD for a little better performance in the future? Not that it matters anyway.
 
The DG1's performance is rather inconsistent. Driver issues, perhaps?

However, I think I'm more keen on the 5700G's benchmarks instead. They pretty much say: unless you are playing only CS and similar games, just screw it and get a graphics card instead. Or you aren't gaming at all.
 
Lol, that's one weak GPU. Didn't expect that low performance. Will it fine-wine like AMD for a little better performance in the future? Not that it matters anyway.

This isn't meant to blow your socks off at 600 fps. It's literally the integrated GPU from a -lake CPU strapped to its own external board. The big boys are coming, and they're quite a bit faster.
 
Lol, that's one weak GPU. Didn't expect that low performance. Will it fine-wine like AMD for a little better performance in the future? Not that it matters anyway.

45% faster memory, and it still loses to the GT 1030. If the full version is 220W as the rumors claim, maybe it will land around the GTX 1080 - RTX 2060 and won't even be close to the RTX 3060 and RX 6600 non-XT.
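That 45% figure roughly matches a back-of-envelope bandwidth comparison; here's a minimal sketch in C, assuming DG1 runs LPDDR4X-4266 on a 128-bit bus and the GT 1030 uses 6 Gbps GDDR5 on a 64-bit bus:

```c
#include <stdio.h>

/* Back-of-envelope memory bandwidth comparison.
 * Assumed specs: DG1 = LPDDR4X-4266 on a 128-bit bus,
 * GT 1030 = 6 Gbps GDDR5 on a 64-bit bus. */
int main(void)
{
    double dg1_gbps    = 4266e6 * (128.0 / 8) / 1e9; /* ~68.3 GB/s */
    double gt1030_gbps = 6000e6 * (64.0 / 8) / 1e9;  /* 48.0 GB/s  */
    printf("DG1 %.1f GB/s vs GT 1030 %.1f GB/s -> %.0f%% faster\n",
           dg1_gbps, gt1030_gbps, (dg1_gbps / gt1030_gbps - 1) * 100);
    return 0;
}
```

That works out to roughly 42% more bandwidth, in line with the ~45% cited above, depending on which GT 1030 variant you compare against.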
 
@JarredWaltonGPU did you get a chance to OC it, measure power consumption at idle/load, etc.? Also, where do you think their high-end card would land in terms of performance? Thanks.
There's not much support for overclocking on the card, and frankly I don't think it would matter much. A combination of lacking drivers, bandwidth, and compute means that even if you could overclock it 50%, it still wouldn't be that great. And for power testing, my equipment to measure that requires a lot of modifications and runs on my regular testbed, so I'll need to poke at that a bit and see if I can transport it over to the CyberPowerPC setup for some measurements. I suspect the 30W TDP is pretty accurate, but I'll have to do some additional work to see if I can get hard measurements.

For the high-end DG2 cards, it's very much a wild card. Assuming Intel really does overhaul the architecture -- more than it did with Xe vs. Gen11 -- it could conceivably land as high up the scale as an RX 6800 XT / RTX 3080. That's probably being far too ambitious, though, and I expect the top DG2 (with 512 EUs) will end up looking more like an RTX 3070/3060 Ti or RX 6700 XT in performance. It could even end up being more like an RTX 3060 (and presumably RX 6600 XT).

Wherever DG2 lands, though, I expect driver issues and tuning will remain a concern for at least 6-12 months after the launch. I did some testing of Tiger Lake with Xe LP and 96 EUs back when it launched, and saw very similar problems to what I saw with the DG1 -- DX12 games that took forever to compile shaders, games that had rendering issues or refused to run, etc. It's better now, nearly a year later, but there are still problems. Until and unless Intel gets a graphics drivers team that's at least as big as AMD's team (which is smaller than Nvidia's drivers team), there will continue to be games that don't quite work right without waiting weeks or months for a fix.
 
I thought these Xe GPUs were supposed to be the best thing since sliced bread? I knew when Intel announced they were making GPUs that it would be the i740 all over again.
 
I thought these Xe GPUs were supposed to be the best thing since sliced bread?
And you thought that based on what, exactly?
All Intel said was that it would be a decent option next to having nothing.

"systems targeted to mainstream users and small- and medium-size businesses. "
" saw the opportunity to better serve the high-volume, value-desktop market with improved graphics, display and media acceleration capabilities."

"a compelling upgrade to existing options in the market segment. They feature three display outputs; hardware video decode and encode acceleration, including AV1 decode support; Adaptive Sync; Display HDR support and artificial intelligence capabilities thanks to DP4a deep-learning inference acceleration. The Iris Xe discrete graphics cards come with 80 execution units and 4 gigabytes of video memory."

https://newsroom.intel.com/articles/intel-releases-iris-xe-desktop-graphics-cards/#gs.6rqocf
Intel codesigned and partnered with two ecosystem partners, ASUS and Gunnir, to launch the Intel® Iris® Xe discrete desktop graphics cards (code-named “DG1”) in systems targeted to mainstream users and small- and medium-size businesses. The cards are sold to system integrators who will offer Iris Xe discrete graphics as part of pre-built systems.

Following the launch of Intel® Iris® Xe MAX for notebooks, Intel’s first Xe-based discrete graphics processing unit, Intel and its partners saw the opportunity to better serve the high-volume, value-desktop market with improved graphics, display and media acceleration capabilities.

The new cards offer a compelling upgrade to existing options in the market segment. They feature three display outputs; hardware video decode and encode acceleration, including AV1 decode support; Adaptive Sync; Display HDR support and artificial intelligence capabilities thanks to DP4a deep-learning inference acceleration. The Iris Xe discrete graphics cards come with 80 execution units and 4 gigabytes of video memory.
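As an aside on that spec list: DP4a is a 4-wide int8 dot product with 32-bit accumulation, which is why Intel cites it for deep-learning inference. Here's a minimal scalar sketch in C of the math the hardware performs in a single instruction (the function name and test values are illustrative):

```c
#include <stdint.h>
#include <stdio.h>

/* Scalar sketch of DP4a: multiply two 4-wide int8 vectors
 * (packed into 32-bit words) element by element and add the
 * products into a 32-bit accumulator. The GPU does all of
 * this in one instruction; this loop just shows the math. */
static int32_t dp4a(uint32_t a_packed, uint32_t b_packed, int32_t acc)
{
    for (int i = 0; i < 4; i++) {
        int8_t a = (int8_t)(a_packed >> (8 * i));
        int8_t b = (int8_t)(b_packed >> (8 * i));
        acc += (int32_t)a * (int32_t)b;
    }
    return acc;
}

int main(void)
{
    /* (1*5) + (2*6) + (3*7) + (4*8) + 0 = 70 */
    printf("%d\n", dp4a(0x04030201u, 0x08070605u, 0));
    return 0;
}
```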