News Nvidia RTX 5060 is up to 25% faster than RTX 4060 with frame generation in new GPU preview

I feel embarrassed to have bought NVidia GPUs, including an RTX 5070 Ti. I won't buy a single one of their products in the future. My brother said it would only be a matter of time before I regretted buying their products, and that day has arrived.

No respect for the buyers, fake benchmarks, fake reviews, frame-gen forced on. I'm done. I'm done with NVidia for good.
Well, they're a company doing shitty stuff, but really, aren't they all to some degree or another (although NVidia is currently amongst the worst)?

The 5070 Ti is not a bad product. Just enjoy your games and buy whatever is a good deal when you need something new, because you're not going to turn NVidia or any other company into an upstanding citizen whatever you do.
 
Well, they're a company doing shitty stuff, but really, aren't they all to some degree or another (although NVidia is currently amongst the worst)?

The 5070 Ti is not a bad product. Just enjoy your games and buy whatever is a good deal when you need something new, because you're not going to turn NVidia or any other company into an upstanding citizen whatever you do.

Fair enough. I only bought it because the 9070 XT wasn't available either, and the 5070 Ti is the only decent RTX 5000 series card, but I feel like I should learn to just play old games and stop chasing hardware upgrades. NVidia is not delivering moral behaviour NOR good products right now.

Good point that Apple is awful too (they don't even focus on the GPU everyone wants, nor have a portable gaming system, nor a decent home system). AMD can't deliver, and then there's NVidia, Microsoft... ugh.

It feels like big tech is turning more and more into cellphone companies. I can remember genuinely loving NVidia and buying all their products with a smile on my face, like a new console launch. That's all; it sure is different now.
 
It would be interesting to know how much the improvement is, if any, with a more common CPU than the 9800X3D. Like, when a rig is CPU-bound with a 4060, can a 5060 top that? With a few tests like that, one could perhaps even give recommendations about which CPU to use, at least, for which GPU - as in the cut-off point where a rig becomes GPU-bound.

As for frame generation, I got myself a 9070 XT this week and am curious how that looks and plays. It sure boosts the FPS number quite a bit. In Cyberpunk 2077 (which has "only" FSR 3), at 3440x1440, everything maxed and RT Ultra, it went from 83 avg FPS (min 72) to 159 avg (min 141) with FG - with FSR on Auto (on Quality it is 135 avg, min 120, and about 10 less with RT Psycho).

But from what I understand, FG does need a good FPS baseline, like that 83, for player interactions not to feel sluggish. So it's generally worth checking the link below, where Tom's tracks the baseline numbers for the GPUs prior to (any) DLSS/FSR and FG - more solid info than the current situation of "the preview said 66 FPS for a 4060, and no one knows whether that comes from a baseline of 20."

https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
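
For what it's worth, here's a quick back-of-the-envelope check of that FG uplift, using only the numbers I quoted above from my own run (nothing official):

```python
# Rough uplift check using the Cyberpunk 2077 numbers quoted above
# (3440x1440, max settings, RT Ultra, FSR on Auto, 9070 XT).
baseline_avg, baseline_min = 83, 72    # without frame generation
fg_avg, fg_min = 159, 141              # with FSR 3 frame generation


def uplift(before, after):
    """Percentage increase going from `before` to `after`."""
    return (after - before) / before * 100


print(f"avg FPS uplift: {uplift(baseline_avg, fg_avg):.0f}%")  # ~92%
print(f"min FPS uplift: {uplift(baseline_min, fg_min):.0f}%")  # ~96%
```

So FG roughly doubles the displayed number here, but the responsiveness is still tied to that 83 FPS baseline.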
When a game is CPU-limited, you could put in a 5090 and still get the same result as with a 4060; that's basically the definition of CPU-limited.

I think you're right about framegen: it makes games better/smoother when they're already running well (so that's good in my book), but when your game is running badly to start with (I consider anything less than 60 FPS bad, but that's just me), it'll still run badly with framegen - it'll only look a bit smoother and the number will be higher. Actual gameplay might even be worse because of the additional latency framegen introduces.
 
Remember when people waited years for AMD or NVidia to make a large iGPU? Now we finally get a mini PC from NVidia with the RTX 5070 ($550) in it - for $4,000 USD.

Hilarious company. Add some Arm cores and RAM, add $3,450 USD.

Really done with it all.

Disgusted.

Also look at the disgusting used prices for the RTX 4090. That's from market manipulation: intentionally no RTX 5080 that can replace it, and undersupplying the market. Nobody is even buying GPUs anymore; NVidia is selling LESS THAN HALF as many GPUs to gamers now as it did 10 years ago. Just undersupply and raise prices.

BS, I'm done with them.
 
It boils down to this: it's a tightly controlled set of permissions for reviews, meant to force only the kind of inflated "improvements" they want people to believe - much like, what was it, framegen giving the 5070 "the performance of a 4090" or something?

They're unrepentant, and doubling down on that same dirty tactic?

This is very obviously done in the hopes of enormous Day 1 or maybe even pre-order sales, if such are available, to people who will eagerly buy before full, legitimate reviews are out.

Why wouldn't they, after all?
 
What a ridiculously stupid take. Frame generation is one of the features of this generation's cards. If you don't want to take advantage of all of the card's features then don't buy that card. It'd be like buying a car that can go 280 miles an hour and then setting the throttle so it won't go more than 1/8.
I would like someone to explain to me the difference between a "fake frame" which is pixels on a screen and a "real frame" which is made up of pixels on the screen. Both frames are interpreted from code in the game and placed on the screen. Some people are so closed-minded that they don't understand the way forward. I'd label these people Luddites or technophobes, but maybe they're just "fake people".
 
At this point Nvidia might as well not make a 60-tier GPU.

Legit waste of sand.

A 4060 had 66.63 FPS w/ frame gen;
the 5060 had 83.77 w/ 2x MFG...

Nvidia is using those fake frames as a selling point, so going by their own view... the 5060 is making double the frames... yet it's only a mere 19 frames better? It should be around 30 (at the minimum)... 19 is a literal joke considering the downsides that come w/ that MFG.
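
Quick math on those two numbers, if anyone wants to check it (a rough sketch; the FG factors at the end are just guesses, not confirmed settings):

```python
# The two preview numbers quoted above.
fps_4060_fg = 66.63    # 4060 with frame generation
fps_5060_mfg = 83.77   # 5060 with 2x MFG

uplift = (fps_5060_mfg - fps_4060_fg) / fps_4060_fg * 100
print(f"headline uplift: {uplift:.1f}%")  # ~25.7%, i.e. the 'up to 25%' claim

# What the *rendered* frame rate would be under different assumed FG factors;
# the preview doesn't separate generated frames from rendered ones.
for label, fps, factor in [
    ("4060, assuming 2x FG", fps_4060_fg, 2),
    ("5060, assuming 2x MFG", fps_5060_mfg, 2),
    ("5060, if it were 4x MFG", fps_5060_mfg, 4),
]:
    print(f"{label}: ~{fps / factor:.0f} rendered FPS")
```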
I didn't realize that you were the industry standard-setter on what the minimum acceptable generational improvement would be. I guess Nvidia didn't realize that either, and it's lucky they didn't, because I just got a better card than what was available last generation.
 
I would like someone to explain to me the difference between a "fake frame" which is pixels on a screen and a "real frame" which is made up of pixels on the screen.
FG increases latency slightly when raising the frame rate, whereas an actual increase in frame rate lowers latency. That means unless you have a good starting frame rate, it's pretty worthless - unless you're a fan of poor input latency. There are also problems with dynamic scenes and rendering, but those aren't universal.

FG is a great technology, and as it gets better I have no doubt it'll see common usage to utilize very high refresh rates, especially at higher resolution, but it's not the universal improvement it's billed as. It also increases VRAM usage, so Nvidia's current cards with 8GB VRAM can be a problem.
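
A rough way to put the latency point in numbers (a very simplified sketch that just assumes interpolation-style FG holds back about one rendered frame; real pipelines differ):

```python
# Toy model: raising the rendered frame rate vs. interpolating extra frames.
def frame_time_ms(fps: float) -> float:
    return 1000 / fps


base_fps = 40                 # rendered frame rate before any FG

# Actually rendering twice as fast roughly halves the frame time your input sees.
print(f"rendered {base_fps} fps  -> ~{frame_time_ms(base_fps):.1f} ms per frame")
print(f"rendered {base_fps * 2} fps  -> ~{frame_time_ms(base_fps * 2):.1f} ms per frame")

# Interpolation-style FG still renders at base_fps and, in this toy model,
# holds back roughly one rendered frame before display, so responsiveness
# gets slightly worse even though the on-screen counter doubles.
hold_back_ms = frame_time_ms(base_fps)
print(f"2x FG ({base_fps * 2} fps shown) -> input still tied to "
      f"~{frame_time_ms(base_fps):.1f} ms frames + ~{hold_back_ms:.1f} ms hold-back")
```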
 
Not surprised, but disappointed.
Nvidia doubling down on their shady tactics,
generating fake, controlled previews done by a select few "friendly" "journalists".
(At least they expose themselves as sellouts and we don't need to bother reading
any of their stuff ever again.)
 
If someone from Tom's Hardware had written the actual "preview", I might sympathize with your position. As it is, they are reporting on the work of another - not actually writing a shill piece themselves.

I do think the title could be improved.
There is no high road to take when the site posts an article with the test results on the front page for all their readers to see. Whether the site did the testing or not is irrelevant. The end result is all the same and Nvidia is handed exactly what they wanted from this charade.

Nvidia has never really cared, just like every other major company. Now though, they don't even care about hiding it. Gamers are so irrelevant to them at this point, that they have no qualms over publicly giving them the middle finger.
 
Easy on the shill nonsense. TH doesn't have a preview, do they? The ones I read didn't put disclaimers in at all - basically just publishing Nvidia PR. I deleted their bookmarks.
The video, though. I've watched Steve for years and he hardly says the f-word; it's usually bleeped out. It is not bleeped out here. It's sorta like that mellow dude you know getting really pi**ed off. It's shocking. He tells an Nvidia dude to f off (not directly) and just roasts them. He's not perfect, but his credibility allows him to do this. It's glorious. Then a PR dude tried to put an underling in the firing line, but Burke wasn't having it. The email he sent to Nvidia is just gold; it's in the video.
The Aussies @ Hardware Unboxed have GN's back, which is good.
 
There is no high road to take when the site posts an article with the test results on the front page for all their readers to see. Whether the site did the testing or not is irrelevant. The end result is all the same and Nvidia is handed exactly what they wanted from this charade.

Nvidia has never really cared, just like every other major company. Now though, they don't even care about hiding it. Gamers are so irrelevant to them at this point, that they have no qualms over publicly giving them the middle finger.
You know... I've been reading that "nVidia doesn't care about gamers" bit over and over and, while I get where it's coming from, I think that's a self-defeating fallacy.

Let me explain my reasoning: the "gamers" market is still ~$2B+ and nVidia already controls ~90% of it. The fact that nVidia is controlling the narrative around their biggest segment seller is no coincidence: they do not want to part with it. They did it with the 5070, although it caught slightly less backlash (yes, there were "previews" of it as well), so this means they want to absolutely make sure their narrative is the one being told first. First impressions stick, and they absolutely want that. This is basically what they're desperately trying to do here: skew the first impressions and ensure they have a critical volume of sales, which includes SIs and OEMs (laptops included, since the naming is the same). Whether we like it or not, DIY is not that big in comparison.

As for the phrasing itself, I do believe it's not correct ("self-defeating fallacy") for a simple reason: nVidia (or AMD or Intel) has never really "cared for the gamers"; we cannot kid ourselves that they "care" about anyone but their own shareholders. We have to view it as just sheer numbers (and market size) and just keep ourselves honest in the face of the marketing machines.

So this makes me conclude that, in very simple terms, nVidia does not want to part with the ~$2B market that is "gamers", as you can't tell shareholders with a straight face: "nah, we don't need those ~$2B". Additionally, whether we believe him or not, Jensen wants the "gamer" market to be their stronghold.

Regards.
 
The psychology surrounding Nvidia is remarkable. Some gamers cower to Nvidia, grateful for the overpriced cards they produce. Like a trillion-dollar corporation is going to abandon a 90% market share that makes 2 billion dollars.
Trillion-dollar companies with 90% market share will stay in it if they make 100k, to say nothing of 2 billion. (Correct me if I'm wrong; I'm crabby today.)
Also, correct me if I'm wrong, but it takes a lot more tech and money to turn the silicon for a 5090 into a server-type chip they sell for 4k or whatever.
You can't just slap a 5090 or A5000(?) sticker on them when they come out of the oven, right? Server GPUs must need more refining time, I think, compared to a lowly 5090, yeah?
 
You know... I've been reading that "nVidia doesn't care about gamers" bit over and over and, while I get where it's coming from, I think that's a self-defeating fallacy.

Let me explain my reasoning: the "gamers" market is still ~$2B+ and nVidia already controls ~90% of it. The fact that nVidia is controlling the narrative around their biggest segment seller is no coincidence: they do not want to part with it. They did it with the 5070, although it caught slightly less backlash (yes, there were "previews" of it as well), so this means they want to absolutely make sure their narrative is the one being told first. First impressions stick, and they absolutely want that. This is basically what they're desperately trying to do here: skew the first impressions and ensure they have a critical volume of sales, which includes SIs and OEMs (laptops included, since the naming is the same). Whether we like it or not, DIY is not that big in comparison.

As for the phrasing itself, I do believe it's not correct ("self-defeating fallacy") for a simple reason: nVidia (or AMD or Intel) has never really "cared for the gamers"; we cannot kid ourselves that they "care" about anyone but their own shareholders. We have to view it as just sheer numbers (and market size) and just keep ourselves honest in the face of the marketing machines.

So this makes me conclude that, in very simple terms, nVidia does not want to part with the ~$2B market that is "gamers", as you can't tell shareholders with a straight face: "nah, we don't need those ~$2B". Additionally, whether we believe him or not, Jensen wants the "gamer" market to be their stronghold.

Regards.
Every gaming GPU Nvidia produces is a significant loss in revenue for them vs. selling an A-series workstation GPU or AI accelerator. That's why it has been so difficult to find Blackwell GPUs since launch for anything remotely close to MSRP. Nvidia is not producing them in the volume necessary to satisfy the market because their production priorities are elsewhere. Nvidia allegedly moved a number of the gaming driver developers to their AI development teams, which is why driver quality has dropped off so significantly for Blackwell.

Nvidia isn't going to walk away from the gaming market, but they are very obviously doing the bare minimum necessary to drag the market along. The x90 is the only tier that gets worthwhile performance improvements every generation. They aren't upgrading the VRAM most of the time. They've basically abandoned native performance comparisons in their marketing materials. All of these combined demonstrate that Nvidia just doesn't care. If AMD wasn't such garbage, maybe Nvidia would put in slightly more effort. However, AMD doesn't care either and is perfectly happy charging (Nvidia's price structure - $50) while having a much worse feature set.

$2B gaming revenue? Great. Datacenter revenue in Q4 was $35B. If Nvidia had eliminated their gaming GPU production and moved it all to AI accelerators, they would have passed $40B easily. Dropping out of the gaming market wouldn't cost them $2B. It would be replaced with a much larger number selling products to enterprise customers.
 
Every gaming GPU Nvidia produces is a significant loss in revenue for them vs. selling an A-series workstation GPU or AI accelerator. That's why it has been so difficult to find Blackwell GPUs since launch for anything remotely close to MSRP. Nvidia is not producing them in the volume necessary to satisfy the market because their production priorities are elsewhere. Nvidia allegedly moved a number of the gaming driver developers to their AI development teams, which is why driver quality has dropped off so significantly for Blackwell.

Nvidia isn't going to walk away from the gaming market, but they are very obviously doing the bare minimum necessary to drag the market along. The x90 is the only tier that gets worthwhile performance improvements every generation. They aren't upgrading the VRAM most of the time. They've basically abandoned native performance comparisons in their marketing materials. All of these combined demonstrate that Nvidia just doesn't care. If AMD wasn't such garbage, maybe Nvidia would put in slightly more effort. However, AMD doesn't care either and is perfectly happy charging (Nvidia's price structure - $50) while having a much worse feature set.

$2B gaming revenue? Great. Datacenter revenue in Q4 was $35B.
Yep. I don't disagree with anything you said.

Just keep in mind: the amount of money they're getting right now is due to the explosion of everyone and their dog building "AI datacenter" or similar capabilities, but that growth, like with most tech, will hit a cap. Not all growth (hence revenue) is infinite. Yes, they are making big shekels from it, but the question you need to ask yourself is "for how long". Now, I am not saying or predicting that it will go to zero; that's just dumb to even suggest, just by looking at Intel's history in the data center. But I am saying that, much like any corporation with a good grasp on a market, they won't let it go. That's perhaps where you and I are not seeing eye to eye. I fully agree nVidia is doing the bare minimum to keep the "gamer" market fed. I mean, Jensen joked about it last year - "you all have $10K battlestations, right?!" (not a literal quote) - which says all we need to know about what Jensen thinks of the market itself. Just keep in mind, he is still pushing to own that market.

As for AMD or Intel competing - well, nothing to say there. If the alternatives are not good enough for your use case (hence the strong qualifier of "garbage" for AMD, I guess?), then you're part of nVidia's captive audience and their favourite type of consumer.

Regards.
 
Easy on the shill nonsense. TH doesn't have a preview, do they? The ones I read didn't put disclaimers in at all - basically just publishing Nvidia PR. I deleted their bookmarks.
The video, though. I've watched Steve for years and he hardly says the f-word; it's usually bleeped out. It is not bleeped out here. It's sorta like that mellow dude you know getting really pi**ed off. It's shocking. He tells an Nvidia dude to f off (not directly) and just roasts them. He's not perfect, but his credibility allows him to do this. It's glorious. Then a PR dude tried to put an underling in the firing line, but Burke wasn't having it. The email he sent to Nvidia is just gold; it's in the video.
The Aussies @ Hardware Unboxed have GN's back, which is good.

Paul's Hardware wasn't kind either, but Steve went scorched earth. HUB also has something regarding the 5060 out now. I suspect more to come.
 
Paul's Hardware wasn't kind either, but Steve went scorched earth. HUB also has something regarding the 5060 out now. I suspect more to come.
Sounds like a hashtag "me too" movement in the tech space for all the reviewers who have been, ahem, strong-armed by nVidia, and I find it kind of funny it took this long.

I wonder what the 5060 sales will look like in the upcoming weeks.

Regards.
 
FG increases latency slightly when raising the frame rate, whereas an actual increase in frame rate lowers latency. That means unless you have a good starting frame rate, it's pretty worthless - unless you're a fan of poor input latency. There are also problems with dynamic scenes and rendering, but those aren't universal.

FG is a great technology, and as it gets better I have no doubt it'll see common usage to utilize very high refresh rates, especially at higher resolution, but it's not the universal improvement it's billed as. It also increases VRAM usage, so Nvidia's current cards with 8GB VRAM can be a problem.
Also, the fact is Nvidia themselves say NOT to use FG if you can't hit 60 FPS w/o it, because of the downsides.

FG is akin to DLSS upscaling.
It is "usable" no matter how low your frame rate (or resolution) is, but it's not going to be a fun experience.

Like upscaling, if you already run at a "high" setting you get a benefit worth the bit of negatives.

If you are on the low end... you get a "meh" benefit and all the downsides.

Fake frames need high frame rates to get data from, effectively the same way upscaling wants more pixels.

Upscaling 1440p to 4K is gonna look better than trying to go from 720p to 4K because there's more "data" at 1440p than at 720p. It's the same with FG.
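
The "more data" point is easy to put in rough numbers (a quick sketch; upscaler quality obviously depends on more than raw pixel counts):

```python
# Share of a 4K frame's pixels that are actually rendered at each input resolution.
resolutions = {
    "720p":  (1280, 720),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

target_pixels = resolutions["4K"][0] * resolutions["4K"][1]

for name, (w, h) in resolutions.items():
    rendered = w * h
    print(f"{name}: {rendered / 1e6:.1f} MP rendered, "
          f"{rendered / target_pixels:.0%} of a 4K frame")
```

1440p gives the upscaler roughly 44% of the target pixels to work from; 720p gives it about 11%.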
 
What a ridiculously stupid take.
Resorting to insults suggests you don't have much to back your refutation of the point, which was that Nvidia is deliberately tying the hands of the reviewer to highlight a single feature improvement and present the improvement of that one feature as if it was an overall improvement.

Frame generation is one of the features of this generation's cards
Yes. ONE of the features. Pure rasterization performance is still an extremely important metric, and Nvidia is trying to hide that.


If you don't want to take advantage of all of the card's features
But that's not what the test is demonstrating. It is demonstrating a SINGLE feature to the exclusion of all others, with the intent that people will believe that the improvements of the rest of the features (such as pure rasterization, as well as ray-tracing) are equal in degree.


It'd be like buying a car that can go 280 miles an hour and then setting the throttle so it won't go more than 1/8.
No, it would be like buying a car that can go 280 miles an hour, but that can hold cruise control as low as 15 MPH, when all other cars have the cruise control minimum at 25 MPH, and insisting that the testers do NOT test the top speed of the car, nor of the other cars being tested.


I would like someone to explain to me the difference between a "fake frame" which is pixels on a screen and a "real frame" which is made up of pixels on the screen. Both frames are interpreted from code in the game and placed on the screen. Some people are so closed-minded that they don't understand the way forward. I'd label these people Luddites or technophobes, but maybe they're just "fake people".
At this point, you're engaging in a straw-man argument.
 
Frame generation is one of the features of this generation's cards. If you don't want to take advantage of all of the card's features then don't buy that card
It's not a feature, it's a crutch... other than the xx90 cards, all the cards are labeled one tier above what they really are, and then use DLSS and frame gen to get the performance we should be getting at that tier, while still charging more for that tier.

I would like someone to explain to me the difference between a "fake frame" which is pixels on a screen and a "real frame" which is made up of pixels on the screen.
A fake frame is a frame created by the AI or algorithm Nvidia uses to make it; a real frame is a frame the game is programmed to create, which is where rasterization performance matters.

Some people are so closed-minded that they don't understand the way forward
So the way forward is to use tech to make the cards give the performance they used to give natively? Wouldn't a better way be to make a card that can do it without fake frames and DLSS? You know... like how Nvidia did it before the 30 series, when they actually put R&D into their cards to make games run faster and look better? Looks like Nvidia's current products could use a brand-new gen-1 architecture...
FG is a great technology
Not really, it's a crutch, as is DLSS. As I said above, it's Nvidia's way to give us the performance we used to get without using tricks. See my reply just above in this post.
 