Frame generation takes a load off the CPU.
Gamer Pairs RTX 4070 with Pentium. DLSS 3 Makes it Playable : Read more
In our review of the RTX 4070, we noted that the card offers excellent ray tracing and AI performance for a fairly reasonable (by today's standards) $599.
Yeah, core usage is jumping. If history is any indication, I suspect we'll be mostly stuck with 6/8 cores in gaming until consoles (next gen or gen after?) have more cores onboard themselves. Though with ray and soon path tracing taking over (currently 43% of Nvidia gamers, if Nvidia is to be believed), I could be wrong. I know MS claims their next console will be all in on AI, so who knows what direction/implementations will be used, though I suspect most of it will be via the GPU portion of the SoC. Another thing it tells us is how dependent games have now become on > 2C/4T.
I honestly didn't believe the current-gen consoles would feature 8x Zen2 cores. So, I guess I should probably double-down and state right here & now that the next-gen models will hold at 8 cores.

If history is any indication I suspect we'll be mostly stuck with 6/8 cores in gaming until consoles (next gen or gen after?) have more cores onboard themselves.
Yeah, that line caught my attention too. Makes you wonder if Nvidia sponsored this article; sort of reminds me of the "just buy it" argument about the 2xxx series GPUs made here. The media should not be excusing these prices. People on forums certainly aren't, and luckily see through the nonsense.
I honestly didn't believe the current-gen consoles would feature 8x Zen2 cores. So, I guess I should probably double-down and state right here & now that the next-gen models will hold at 8 cores.
: D
Do we really think games need more than 8x Zen 4 cores? For what? Physics and AI? Add more GPU cores, instead. Keep in mind that the PS4 and Xbox One got by with lousy, single-threaded Jaguar cores.
Nintendo does enough volume that I'd imagine Nvidia wouldn't mind doing a custom SoC for them which cuts back on the core count. When you're talking about volumes like 10M+ chips, the extra die savings will surely add up. I could see the Switch 2 sporting 12 Arm cores if it goes with an Nvidia Orin SoC variant as rumored. But even if it does, I doubt it'll have much effect on x86-64 in gaming due to Arm's lower ST performance, but I could be wrong.
I feel like Nintendo has been singing this tune for a while. The last time they had truly competitive hardware was when the GameCube launched. Work to get games and graphical apps to work on low-end hardware improves the efficiency and overall playability on every system.
Quite similar to the "opposite" approach of the Steam Deck, and how much that has improved Linux gaming, gaming in general, and low-end hardware being able to game.
We're used to throwing high-wattage CPUs and GPUs at terribly written code to make it acceptable to use.
I've heard stories about some game studios giving their developers very mid-range spec machines, with the idea that they'll only optimize their code to run well on what they're personally using. If you want your game to have high-end requirements, then by all means give them high-end development boxes.

I issued a company-wide memo to the programmers working at my #2 computer company in the world that it was perfectly okay to NOT use all of the disk space by paging hideously coded and bloated applications that were now exceeding a whopping 1 megabyte size in-memory.
However, not all games become playable when frame generation is enabled. In the video, we see Witcher 3 running at a 55 fps average with a horrible 1% low of 3 fps, despite having frame generation on.
It's not just the core count: force-lock all of your cores to only 3.7 GHz and reduce the cache to 6 MB, and no matter how many cores you have, gaming will suck.

It's an interesting experiment, I suppose... but rather pointless, when the G7400 is currently selling for about the same price as an i3-12100F.
I guess if you had a prebuilt with the G7400, it shows what kind of gaming performance you could hope to achieve, but then I doubt such a low-end prebuilt would accommodate a card like the RTX 4070.
Another thing it tells us is how dependent games have now become on > 2C/4T.
But CPU usage goes up again when using DLSS... since the render resolution is lower. It's exactly the same phenomenon as when reducing the resolution for real. The reason why frame generation still produces more frames is that it has nothing to do with the CPU. DLSS does. Most of the performance gain should come from the frame generation in that experiment, not DLSS.

Meanwhile AMD and Intel: NO! YOU NEED THE NEW HIGHEST END HARDWARE! YOUR OLD STUFF IS CRAP!!!!
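To make that bottleneck argument concrete, here's a minimal back-of-the-envelope sketch; all the fps numbers are hypothetical, just to illustrate why upscaling shifts the limit back onto the CPU while frame generation sidesteps the CPU entirely:

```python
# Toy bottleneck model: delivered fps is capped by whichever of CPU or GPU
# is slower. The numbers below are made up purely to illustrate the logic.

def delivered_fps(cpu_fps, gpu_fps):
    """Rendered frame rate is limited by the slower of the two."""
    return min(cpu_fps, gpu_fps)

cpu_fps = 45           # what a weak CPU can feed per second (hypothetical)
gpu_fps_native = 40    # GPU at native resolution (hypothetical)
gpu_fps_upscaled = 90  # GPU at a lower internal resolution via DLSS upscaling

native = delivered_fps(cpu_fps, gpu_fps_native)      # 40 fps, GPU-bound
upscaled = delivered_fps(cpu_fps, gpu_fps_upscaled)  # 45 fps, now CPU-bound

# Frame generation interpolates extra frames on the GPU, so it roughly
# doubles displayed frames without asking the CPU for more work.
frame_gen = upscaled * 2                             # ~90 fps displayed

print(native, upscaled, frame_gen)
```

Under those made-up numbers, upscaling raises CPU usage because the CPU is now the limiting factor, while frame generation adds displayed frames without touching it, which is the point being made above.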
Is this really news to anyone? At higher detail levels and resolutions the CPU is far less important than the GPU, and with quality-reduction fakery techniques like DLSS you become even less dependent on the GPU and CPU.
I'll go back to TH's CPU scaling article with the 3080. Even with a 15-year-old quad-core processor you can still do 120 fps at 2560x1440. Add in things like DLSS and you can "increase" the detail level. Use frame interpolation with DLSS 3 and you can "increase" the frame count.
Yeah, the latency is going to be abysmal at some of these framerates. RDR2 might roughly look to an observer as if it's running at 64 fps with 35 fps lows, but it's going to feel worse than if it were running at 32 fps with 17 fps lows, since the input will be delayed an extra half-frame beyond that. Plus, the resulting artifacts will be far more noticeable when they are on-screen for longer periods of time. This will be even worse in first-person games like Cyberpunk. Coverage of DLSS 3 frame generation has suggested that it's mostly only suitable for increasing the apparent smoothness of games that are already running at decent frame rates without it, mainly for high refresh rate monitors in noncompetitive, slower-paced titles. Its usefulness on lower-performance hardware seems questionable.

If you're going to do an experiment like this, then you need to include latency in your results. DLSS 3 doesn't just render fake frames without improving latency; it actually increases the latency.
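A rough sketch of that latency math, assuming a very simplified model (about one rendered-frame of input latency, plus roughly half a rendered frame of extra delay from frame generation holding a frame back to interpolate; it ignores Reflex, render queues, and display lag):

```python
# Simplified input-latency comparison. Assumption: latency ~ one rendered
# frame time, and frame generation adds ~half a rendered frame on top
# because it withholds a frame to interpolate between two real ones.

def frame_time_ms(fps):
    return 1000.0 / fps

native_64 = frame_time_ms(64)                               # ~15.6 ms
native_32 = frame_time_ms(32)                               # ~31.3 ms

# "64 fps" via frame generation is really 32 rendered fps plus interpolated
# frames, with roughly an extra half rendered frame of delay added.
framegen_64 = frame_time_ms(32) + 0.5 * frame_time_ms(32)   # ~46.9 ms

print(f"true 64 fps:        ~{native_64:.1f} ms")
print(f"true 32 fps:        ~{native_32:.1f} ms")
print(f"frame-gen '64 fps': ~{framegen_64:.1f} ms")
```

Under those assumptions, frame-generated "64 fps" lands around 47 ms of input delay versus roughly 16 ms for a true 64 fps, i.e. it feels worse than plain 32 fps even though it looks smoother.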
I could see the core counts changing if AMD were to move to an asymmetrical core design by the time of the next console generation, or if one or both of the consoles were to switch to another APU provider. Especially in consoles, where some cores are already dedicated to system tasks, and heat and power use need to be limited, it might make sense to utilize different kinds of cores for different tasks.

I honestly didn't believe the current-gen consoles would feature 8x Zen2 cores. So, I guess I should probably double-down and state right here & now that the next-gen models will hold at 8 cores.