To No One's Surprise, the RTX 4060 is an Unimpressive Overclocker

I had to unsubscribe from that guy... not only was every video he made whining about GPU prices, but his voice was beyond annoying. I mean... annoying to the point I'd almost toss a hammer through my PC screen.

He lost credibility with me, though, in his Jedi Survivor review, when he called a five-year-old 2700X a "fairly recent CPU."

That would be like calling my 2017 7700K build "fairly recent."
That is still fairly recent, and the majority of people who game are on even weaker chips. Realistically, paired with something like an RTX 4060, a 2700X should still perform just fine. Not amazing, but it won't be a huge bottleneck, especially considering that the RTX 4060 isn't providing much, if any, more performance than an RTX 2080 Ti from the 2700X's era. Also, versus a 7700K, I'd much rather have the 2700X: the CPU itself has aged much better, and the platform is still relevant. This is all especially true if you're gaming at 1440p or 4K, since the onus is on the GPU at those resolutions.

 
You just described the personal, not objective, experience of an uninformed person who makes a stupid financial decision and hits a brick wall of reality...

But, as far as GPUs are concerned, those are exactly the kinds of decisions we get if we make purchases based exclusively on price.

If hardware quality were supposed to be consumers' top priority, then everyone should be buying the RTX 4090...


And, if price were supposed to be consumers' top priority, we should all be buying the Radeon RX 6600.

Hence, a balance between quality and price is highly recommended.

Something you pointed out very well when you wrote that

price you pay is the most important factor along with quality of a product.

Quality is something you added, and even though you didn't put it as criterion number one, as it should be, I could still concede to that statement. But please don't overlook the fact that my initial response was directed at a guy who, among other things, wrote:

he even stated the price was not good.

and that's ALL that matters about a GPU.

every single modern GPU is "good" at a hardware level.

the only thing that makes it good or bad to the consumer is the price.

it ALL comes down to the price. Not the hardware itself.

so I feel no need whatsoever to backtrack from my original statement:

If a graphics card is not good enough for the games you wanna play, the fact that you bought it cheaply will do little to console you.

A bad purchase is a bad purchase, regardless of how much you paid for it.
 
I'm actually still pondering this move by Nvidia. Not sure what's going on here, but maybe Jayz got an incentive from Asus and Nvidia to post some early preliminary benchmark results, lol? It still makes no sense, though, as other YouTubers didn't get this early embargo lift, as far as I know. There might be more, but I didn't scour the internet.

I know Nvidia wants to showcase CP2077's graphical prowess, since this game can be used as a "tech demo" of sorts, at least to benchmark path tracing and the new RT Overdrive mode. But if this was an early embargo lift, then they should have given that green light to every tech reviewer, including Tom's, other YouTubers, and tech news outlets.

I don't trust YT benchmarks much though.

Kind of OT:

Speaking of CP2077, I know you guys used the Intel Core i9-9900K in your test setup, but many AMD CPU owners noticed that their chips remained under-utilized, most notably with regard to SMT.

I too noticed this on my friend's computer. If you ever use an 8-core+ AMD Ryzen processor for your benchmarks, you should also try the fix/workaround outlined below. I doubt this will get an official fix, though.

This issue was found right after the game's release. CDPR said it was resolved with Hotfix 1.05, but that wasn't the case, because the patch mainly addressed SMT/thread utilization on AMD Ryzen 4-core and 6-core SKUs; the 8-, 12-, and 16-core SKUs were left out.

According to the devs, the 8-core, 12-core, and 16-core chips were running as intended; however, it was later revealed that AMD Ryzen CPUs still faced under-utilization of their cores/threads, which can lead to drastically poor performance.

But someone has just released an unofficial fix for this AMD Ryzen CPU under-utilization. It is basically a simple hex edit of the game's main executable.

This was observed and tested by PC Games Hardware, who did just that; their results showed improvements of up to 27% with the new unofficial patch applied.

Using a Ryzen 7 7800X3D, the average FPS went from 108.3 to 137.9 (137.9 / 108.3 ≈ 1.27, i.e., the quoted ~27% gain). This shows that there is still a MAJOR problem with the game and its optimization around 8-core AMD Ryzen CPUs, which has yet to be addressed by CDPR, IMO.
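
For the curious, a hex edit like this is just a search-and-replace on the executable's bytes. Here's a minimal Python sketch of the general technique; the byte patterns and install path below are placeholders I made up, NOT the real fix, so get the actual search/replace bytes from the community patch thread:

```python
# Minimal sketch of a search-and-replace hex edit on a game executable.
# The byte patterns below are PLACEHOLDERS, not the real Ryzen fix --
# take the actual search/replace bytes from the community patch notes.
from pathlib import Path
import shutil

EXE = Path(r"C:\Games\Cyberpunk 2077\bin\x64\Cyberpunk2077.exe")  # assumed install path
SEARCH  = bytes.fromhex("DEADBEEF00112233")  # placeholder: bytes to find
REPLACE = bytes.fromhex("DEADBEEF99112233")  # placeholder: same-length replacement

def hex_patch(exe: Path, search: bytes, replace: bytes) -> None:
    assert len(search) == len(replace), "patch must not change the file size"
    data = exe.read_bytes()
    offset = data.find(search)  # patches only the first occurrence
    if offset == -1:
        raise ValueError("pattern not found: wrong game version, or already patched")
    shutil.copy2(exe, exe.with_suffix(".bak"))  # always back up the original first
    exe.write_bytes(data[:offset] + replace + data[offset + len(replace):])
    print(f"patched {len(replace)} bytes at offset 0x{offset:X}")

hex_patch(EXE, SEARCH, REPLACE)
```

The length check matters: an in-place patch must not change the file size, or every offset after the edit shifts and the executable breaks.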

And I don't think this is an isolated case, since the problem affects a lot of AMD Ryzen owners out there, and I'm pretty sure other sites/users will also test this fix.



https://twitter.com/CapFrameX/status/1673259920941096960
We used a 13900K, but yes. Nvidia actually included an advisory with a fix for Cyberpunk 2077 on AMD systems, with the RTX 4060 (and 4060 Ti I think) launches.
 
We used a 13900K, but yes. Nvidia actually included an advisory with a fix for Cyberpunk 2077 on AMD systems, with the RTX 4060 (and 4060 Ti I think) launches.

Advisory, as in patch notes? What about the fix? Did it include other Nvidia GPUs as well, and did it cover all 8+ core Ryzen CPUs?
 
Advisory, as in patch notes? What about the fix? Did it include other Nvidia GPUs as well, and did it cover all 8+ core Ryzen CPUs?
It was a zip file with notes saying to replace the Streamline DLLs in the Cyberpunk 2077 folder. I believe Nvidia also sent reviewers an email about the problem. Let me look...

Here's the email sent on May 19:
Hello,

We’re notifying you that we found DLSS 3 stuttering issues on Ryzen 7000 CPUs with Cyberpunk 2077 and The Witcher 3.

The latest SL 1.5.6 .dll files [Note: "SL" is Nvidia's Streamline SDK], which will be included in an upcoming patch of Cyberpunk 2077 and The Witcher 3, fix stuttering on Ryzen 7000 CPUs.

Please download the .zip files on the press site and follow the readme on how to replace the latest .dll files to implement the latest fixes.

As always, we appreciate your time when reviewing the RTX 4060 Ti 8GB FE. Please let us know if you have any questions.
What's unclear is if the latest 1.63 patch for CP77 includes the above, or maybe it's a Steam update that was supposed to take care of it. 🤷‍♂️
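
For anyone who'd rather script the swap than drag files around, here's a rough sketch of what the readme amounted to. The file names and paths are my assumptions (typical Streamline builds ship sl.interposer.dll and friends), not a list taken from the readme itself:

```python
# Rough sketch of the DLL swap: back up the game's Streamline DLLs and
# drop in the updated 1.5.6 builds from the press-site zip.
# Paths and file names below are ASSUMPTIONS -- follow the actual readme.
from pathlib import Path
import shutil

GAME_DIR  = Path(r"C:\Games\Cyberpunk 2077\bin\x64")  # assumed install path
PATCH_DIR = Path(r"C:\Downloads\sl_1.5.6")            # assumed unzip location
SL_DLLS   = ["sl.interposer.dll", "sl.common.dll", "sl.dlss_g.dll"]  # assumed names

for name in SL_DLLS:
    target = GAME_DIR / name
    if target.exists():
        shutil.copy2(target, target.with_name(name + ".bak"))  # keep a backup
    shutil.copy2(PATCH_DIR / name, target)                     # copy in the new build
    print(f"replaced {name}")
```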
 
It was a zip file with notes saying to replace the Streamline DLLs in the Cyberpunk 2077 folder. I believe Nvidia also sent reviewers an email about the problem. Let me look...

Here's the email sent on May 19:

What's unclear is if the latest 1.63 patch for CP77 includes the above, or maybe it's a Steam update that was supposed to take care of it. 🤷‍♂️

I think this fix is basically only about the "stuttering" issues on AMD systems. The fix I referred to was more of a performance-boosting workaround, which increased CPU utilization on Ryzen CPUs.

Not sure if it's the same fix, though, embedded in the 1.5.6 SL .dll files.