No, that's not what I'm asking. What settings does the mainstream gamer today game at?
4K Ultra settings. ... accept nothing less!!
Jokes aside, 1080p high or 1440p high is pretty typical; even medium is fine.
No, that's not what I'm asking. What settings does the mainstream gamer today game at?
That's fair then. We can agree to disagree and I'll stop here.
Just as a final comment: I can't stand graphical glitches in any capacity, and when testing both FG and upscaling, it just worsens the experience for me way too much. And that's not even getting into the latency from FG. It reminds me of when the first 120Hz monitors were popping up and people were like "nah, 60Hz is alright", or the usual "the eye can't see more than 24 FPS", except in this case it's on the opposite extreme of the spectrum.
Regards.
That's the thing. 8 out of the top 10 GPUs support DLSS, which means they are going to last people a lot longer than pre-DLSS cards. The third most used card is still a 1650. Think about that. A 4060Ti is 4th. How long is that card going to last people with DLSS 3.5 support, when so many people are still somehow gaming on a 1650?
I would not be so sure; the trick here is the proliferation of AI-based techniques, especially now that all the vendors are playing ball with it, with AMD joining the fun.
It's not only that a generational step up would mildly increase raster performance; it will also considerably increase AI compute on one hand, AND there will be two more years of game releases and updates introducing those techniques into games.
I actually think the next two years will result in a major step up there because of that. And then you will end up with a new console generation too, which is almost certainly going to utilize those capabilities as well.
That's the thing. 8 out of the top 10 GPUs support DLSS, which means they are going to last people a lot longer than pre-DLSS cards. The third most used card is still a 1650. Think about that. A 4060Ti is 4th. How long is that card going to last people with DLSS 3.5 support, when so many people are still somehow gaming on a 1650?
There's no new meaningful console generation coming out soon. The PS5 Pro launched 2 months ago, and we're not going to see a PS6 for another 4 or 5 years. Switch 2 games aren't getting ported to PC, and it's still going to be way slower than PC hardware. Early rumors are that Xbox may see a new console towards the end of 2026, but no one really cares. Xbox is typically outsold 2:1 or more vs PlayStation.
The future of gaming is changing, and has been for the last few years. Accept it and be part of the growth, or deny it and play at 1080p 60Hz, just because... reasons.
How many games without raytracing currently require DLSS to run medium settings at 1080p on a 3060-level card?
The whole point is exactly that - it's no longer a novelty, and every half-decent game that is at least somewhat GPU-demanding now not only ships with DLSS or its alternatives but also relies on it.
That's a limitation of the early versions. Version 4 will predict the future so you don't even have to ask.
Soon you won't even play video games; you'll just ask the nVidia AI to show you what it thinks you playing games looks like.
More often than not, it's not hardware...
But we are hitting a ceiling with current software...
It was a simple question: what is a mainstream gamer today? As you referenced, Steam shows two-thirds of gamers game at 1080p or lower, and 70% currently have a GPU with 8GB or less of VRAM. 1080p, probably 60Hz, maybe creeping up to 100Hz. That's a mainstream gamer.