Intel takes down AMD in our integrated graphics battle royale — still nowhere near dedicated GPU levels, but uses much less power

Aug 11, 2024
2
5
15
I'm sorry but why on earth did you test using an AMD laptop with only 8 gigs of RAM?! (In a single channel configuration to boot.) That completely nullifies any and all benchmarks using that laptop. Windows isn't even very happy on its own with only 8 gigs... Let alone running only 8 on an iGPU system that is going to need that RAM as well! Absolutely insane choice.
That laptop uses a dedicated GPU (RTX 3050 Ti), not an iGPU, so RAM is less relevant than in iGPU laptops. All three iGPU laptops have 32 GB of RAM.
 

DS426

Upstanding
May 15, 2024
267
199
360
Jarred: thank you for the thorough retest and much elaboration in this article. Great job on Intel finally making a competitive iGPU! AMD hasn't had a reason to push on this until now, so it's good for consumers as this space heats up. After all, why couldn't we have console-level gaming performance on laptops if the hardware already exists?

That said, a lot of gamers on laptops don't want and/or can't afford an additional subscription like GeForce Now. Your argument has some sense to it, but its relevance is limited, IMO. We're not raising the next generation of PC hardware enthusiasts if we're encouraging them to go with cloud gaming (isn't this Tom's Hardware after all?).

Most games tested here are more recent and demanding, so while the 720p benchmark results make these iGPUs look like they're not even worthwhile, there are thousands of games that are much less demanding and can even be played smoothly at 1080p -- the same ones that are more practical to play on laptops like these than on big, beefy gaming models.

That then would be my advice -- just steer prospective buyers to more casual games or gaming in general.
 

Elusive Ruse

Estimable
Nov 17, 2022
458
596
3,220
Good stuff, Jarred. With Intel competing well and now beating AMD in the iGPU space, hopefully in a few years laptops can sport iGPUs that aren't immediately written off as gaming options.
 

phxrider

Distinguished
Oct 10, 2013
108
54
18,670
Real performance is native performance.
I said "real world performance", not "real performance", and real world performance is what you get on your screen when you set everything up optimally for best performance in the game you're playing - and it's ALL anyone except lab nerds cares about when they plunk down their cash.
 
I said "real world performance", not "real performance", and real world performance is what you get on your screen when you set everything up optimally for best performance in the game you're playing - and it's ALL anyone except lab nerds cares about when they plunk down their cash.
That's a very strange hill you're willing to die on, I have to say.

Measuring the performance of devices has nothing to do with your personal preferences, perceptions, or biases surrounding games.

If you can't understand a methodology, then ask why things are tested in a particular way, but your hell-bent stance of "you're doing it wrong because I say so" is hardly a productive use of anyone's time.

Sure, no one will buy any of these as dedicated gaming machines or even rely on non-upscaled gaming performance (more than likely), but you're missing the forest for the pixelated trees in front of your nose in this instance.

I hope you reconsider your stance and understand what the information in front of you is telling you instead.

Regards.
 
I said "real world performance", not "real performance", and real world performance is what you get on your screen when you set everything up optimally for best performance in the game you're playing - and it's ALL anyone except lab nerds cares about when they plunk down their cash.
So what you're saying is you're not actually interested in benchmarks, but rather some sort of arbitrary optimization. FSR, for example, has varied quality of implementation, and there are two XeSS modes: one can run on anything, while the other requires XMX, which is only available on the LNL IGP. Frame Generation is even more subjective, and benchmarks do not include input latency, which makes using FG a complete disservice in any circumstance (not to mention its inconsistent behavior).

The point of benchmarking is to show how different pieces of hardware compare to each other in performance. The most accurate way to do that is to eliminate as many outside variables as possible.
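As a loose illustration of that point (a sketch with made-up chip names and FPS figures, not results from the article), this is the usual way raw frame rates get reduced to a relative score so different chips can be compared across the same test suite:

```python
# Hypothetical example: normalize raw FPS results against a baseline chip.
# Every chip name and FPS value below is invented purely for illustration.

results = {
    "iGPU A": {"Game 1": 42.0, "Game 2": 55.0, "Game 3": 31.0},
    "iGPU B": {"Game 1": 38.0, "Game 2": 61.0, "Game 3": 27.0},
}
baseline = "iGPU A"

def geomean(values):
    product = 1.0
    for v in values:
        product *= v
    return product ** (1.0 / len(values))

for chip, fps in results.items():
    # Per-game ratio versus the baseline, then a geometric mean so that
    # no single title dominates the overall comparison.
    ratios = [fps[game] / results[baseline][game] for game in fps]
    print(f"{chip}: {100 * geomean(ratios):.1f}% of {baseline}")
```

The key point is that every chip runs the identical workload, so the only variable left in the comparison is the hardware itself.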
 
Jarred: thank you for the thorough retest and much elaboration in this article. Great job on Intel finally making a competitive iGPU! AMD hasn't had a reason to push on this until now, so it's good for consumers as this space heats up. After all, why couldn't we have console-level gaming performance on laptops if the hardware already exists?

That said, a lot of gamers on laptops don't want and/or can't afford an additional subscription like GeForce Now. Your argument has some sense to it, but its relevance is limited, IMO. We're not raising the next generation of PC hardware enthusiasts if we're encouraging them to go with cloud gaming (isn't this Tom's Hardware after all?).

Most games tested here are more recent and demanding, so while the 720p benchmark results make these iGPUs look like they're not even worthwhile, there are thousands of games that are much less demanding and can even be played smoothly at 1080p -- the same ones that are more practical to play on laptops like these than on big, beefy gaming models.

That then would be my advice -- just steer prospective buyers to more casual games or gaming in general.
If you're only interested in more casual / lighter games, you don't need a high-end GPU, or cloud gaming, or anything like that. My point is that, for $20 per month, you can get a laptop gaming experience that will effectively rival what you'd get from a $2,000+ gaming laptop, on a $700 non-gaming laptop (or even a six-year-old laptop). For someone whose main hope is to be able to play games on a laptop and get decent battery life... all they really need is a good internet connection and a GFN subscription.

The RTX 4080 Laptop GPU comes in laptops that basically start at $2,000 (and really it's closer to $2,300). What's more, an RTX 4080 Laptop GPU roughly matches a desktop RTX 4070 Super — slightly more cores, but lower clocks, lower power consumption, and ultimately lower performance. The GFN "4080" isn't the same as an actual 4080 either, but in terms of hardware it's a lot better than the RTX 4070 Super!

So, yeah, a GFN Ultimate subscription will absolutely be faster in raw performance than a laptop 4080. It will even beat a laptop 4090. And a laptop doing game streaming could potentially last four or five hours, I would think. All while saving potentially $1,500 upfront by getting a modest laptop + GFN, rather than a similarly performing (but still slower) gaming laptop.
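A quick back-of-the-envelope check on that math, using the round prices from this thread (a hypothetical ~$2,200 RTX 4080 laptop versus a $700 laptop plus a $20/month GFN Ultimate subscription), nothing measured:

```python
# Rough cost comparison for the laptop + GFN argument above.
# Prices are the round figures from this discussion, not actual quotes.
gaming_laptop = 2200        # realistic RTX 4080 laptop price, USD
modest_laptop = 700         # non-gaming laptop, USD
gfn_monthly = 20            # GeForce Now Ultimate subscription, USD per month

upfront_savings = gaming_laptop - modest_laptop          # $1,500
months_covered = upfront_savings / gfn_monthly           # 75 months

print(f"Upfront savings: ${upfront_savings}")
print(f"That covers about {months_covered:.0f} months "
      f"(~{months_covered / 12:.1f} years) of GFN")
```

In other words, the upfront savings would pay for roughly six years of the subscription before the two approaches even out on cost.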

For playing CS2 or Dota 2 or any number of older, lighter games? Even integrated graphics on a several years old Ice Lake or Tiger Lake processor will often suffice. (I have two boys that play Roblox and Minecraft on such laptops on a regular basis.) But if you want a "gaming laptop" experience, without blowing thousands of dollars? GFN truly isn't a bad option.

That's a fact whether I'm writing at Tom's Hardware or not. It's not the perfect solution for everyone, to be sure, and I personally don't use GFN too often. But then I have access to the fastest desktop hardware around and I don't travel much. If I were traveling frequently and wanted gaming on the go, and I could guarantee I'd have high speed internet available? I'd take GFN over any gaming laptop, because I truly don't want to haul around a five or six pound gaming notebook.
 
This test is literally 100% DX12 when Intel is known to be weak in DX11. I smell sponsorship.
Name a recent and important game that is pure DX11.

Baldur's Gate 3? Well, it has Vulkan, but that's slower, and it was included in my testing, running in DX11 mode. (Oops. I mislabeled the chart, so I need to go fix that.) There goes your 100% DX12 argument. I might be wrong on one or two other games as well... but the majority of new games that are even remotely demanding in terms of graphics all support DX12. DX11 as the baseline has been fading quickly in the past few years.
 

Pierce2623

Prominent
Dec 3, 2023
487
368
560
Name a recent and important game that is pure DX11.

Baldur's Gate 3? Well, it has Vulkan, but that's slower, and it was included in my testing, running in DX11 mode. (Oops. I mislabeled the chart, so I need to go fix that.) There goes your 100% DX12 argument. I might be wrong on one or two other games as well... but the majority of new games that are even remotely demanding in terms of graphics all support DX12. DX11 as the baseline has been fading quickly in the past few years.
I don't disagree, but if you look at the actual playtime charts on Steam, the PC crowd spends a lot more time playing old games than the console crowd. All the emulators use DX11, too. It just seems like a test selection purpose-built to show Intel in the best light possible, more so than to test a sample that's representative of how PC players play, is all I'm saying. A lot of the more popular sims like DCS or iRacing use DX11 (or maybe even older for iRacing).
 

bit_user

Titan
Ambassador
if you look at the actual playtime charts on Steam, the PC crowd spends a lot more time playing old games than the console crowd.
FWIW, the PlayStation store is full of PS4 titles, some of which are refreshed versions of even older titles. I wonder how many PS5 exclusives there even are, at this midpoint in its lifespan.

BTW, I just stuck a 1440p gaming monitor on my PS5, out of curiosity. I tried two PS4 titles and one rendered natively at 1440p60, while the other stayed at 1080p60 and just upscaled. Neither went to 120 fps and, weirdly, the PS5 didn't think my FreeSync Premium Pro / G-Sync-compatible HDMI 2.1 monitor was capable of VRR!
 

Ogotai

Reputable
Feb 2, 2021
407
248
5,060
if you look at the actual playtime charts on Steam,
Yea, because Steam is the be-all, tell-all go-to for any and all gaming stats...

I have never seen a Steam survey pop up, ever. And not all of those who play games use Steam; heck, 8 people I work with who play games don't even have Steam installed.

Take Steam as a guideline, nothing more.
 

phxrider

Distinguished
Oct 10, 2013
108
54
18,670
So what you're saying is you're not actually interested in benchmarks, but rather some sort of arbitrary optimization. FSR, for example, has varied quality of implementation, and there are two XeSS modes: one can run on anything, while the other requires XMX, which is only available on the LNL IGP. Frame Generation is even more subjective, and benchmarks do not include input latency, which makes using FG a complete disservice in any circumstance (not to mention its inconsistent behavior).

The point of benchmarking is to show how different pieces of hardware compare to each other in performance. The most accurate way to do that is to eliminate as many outside variables as possible.
No, I'm saying ultimately, when making an actual purchase decision, it's more important to know the FPS you'll be getting on screen when really playing a game than what a laboratory power comparison shows.
 

phxrider

Distinguished
Oct 10, 2013
108
54
18,670
That's a very strange hill you're willing to die on, I have to say.

Measuring the performance of devices has nothing to do with your personal preferences, perceptions, or biases surrounding games.

If you can't understand a methodology, then ask why things are tested in a particular way, but your hell-bent stance of "you're doing it wrong because I say so" is hardly a productive use of anyone's time.

Sure, no one will buy any of these as dedicated gaming machines or even rely on non-upscaled gaming performance (more than likely), but you're missing the forest for the pixelated trees in front of your nose in this instance.

I hope you reconsider your stance and understand what the information in front of you is telling you instead.

Regards.
Not that strange. No one cares about a lab-derived power comparison when deciding which device to put their hard-earned money down on; they care about how it performs when actually playing a game the way they're going to run it in real life.
 

phxrider

Distinguished
Oct 10, 2013
108
54
18,670
Yea, because Steam is the be-all, tell-all go-to for any and all gaming stats...

I have never seen a Steam survey pop up, ever. And not all of those who play games use Steam; heck, 8 people I work with who play games don't even have Steam installed.

Take Steam as a guideline, nothing more.
It seems like your number comes up at some point and you get surveyed. I had every device I log into pop up a survey one day, after ~20 years of having a Steam account and never getting one.
 
No, I'm saying ultimately, when making an actual purchase decision, it's more important to know the FPS you'll be getting on screen when really playing a game than what a laboratory power comparison shows.
But framegen FPS is not the same as regular FPS. It's just frame smoothing with added latency, and TVs have had that feature for years. The important thing is that the underlying hardware will never change, but the software can and will change.

Rendering 720p and upscaling and frame generating is not the same as 1080p or 1440p native. If company A (AMD) gets enough traction via driver-side upscaling and framegen, company B (Intel) will respond with similar features. And then guess what? Relative performance will be back to native performance.
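For a sense of scale on that point, here's the raw pixel math (standard resolution arithmetic, not anything measured in the review):

```python
# How much of a native frame actually gets rendered when starting from 720p.
# Plain resolution arithmetic; nothing here comes from the review's data.
resolutions = {
    "720p":  (1280, 720),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
}
pixels = {name: w * h for name, (w, h) in resolutions.items()}

for target in ("1080p", "1440p"):
    share = pixels["720p"] / pixels[target]
    print(f"720p renders {share:.0%} of the pixels of native {target}")
# -> 44% of native 1080p, 25% of native 1440p

# Frame generation on top: only every other displayed frame is rendered,
# so a "120 fps" frame-generated result rests on roughly 60 rendered frames.
displayed_fps = 120
print(f"{displayed_fps} fps with framegen ≈ {displayed_fps // 2} rendered fps")
```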

In fact, there are already software solutions that will do upscaling and framegen for a lot of games. See: Lossless Scaling (though I have no idea how it's "lossless," given that upscaling and framegen are inherently lossy in that they create pixels that weren't rendered the standard way). So, anyone who cares so much about AMD's HYPR-RX features should just spend $6.99 to get the same thing for other GPUs and vendors.
 
Not that strange. No one cares about a lab-derived power comparison when deciding which device to put their hard-earned money down on; they care about how it performs when actually playing a game the way they're going to run it in real life.
So this is fantasy testing? What about this testing is not real according to you?

I honestly think you need to understand the difference between quantitative and qualitative testing and analysis.

This tells me all I need to know about quantity. You're complaining there's no qualitative analysis because there's no framegen enabled? Because they're not changing settings to maximise their FPS? Because they're not using each vendor's proprietary tech to check the artificial numbers?

I'll stop here, but like I said, this reads like "you're not testing like I want, so you're wrong".

Regards.
 
This is an all-around impressive article! I'm just a little disappointed that there was little overlap with the regular GPU ranking suite.
I'm reworking the main GPU hierarchy, so this was a bit of a "test the waters" approach. 1080p medium will be the bottom setting on desktop GPUs, which is why I used that here. But I'm gunning for ~20 games in the new suite, with a lot more "newer" titles than before. There are still a few older ones kicking around for reference, though (mostly more demanding stuff, like Control and Cyberpunk 2077).
 