News: To No One's Surprise, the RTX 4060 is an Unimpressive Overclocker

Imma go ahead and block every "reviewer" whom Nvidia allowed this pre-pre-release semi-embargo lift.
Because anybody involved is not just the very definition of shilling for access; they shilled so hard that Nvidia deemed them worthy of double-extra-exclusive mega access for being the biggest shills of all.

Nvidia should be ashamed, but so should the YouTube pitchmen who built their entire fake-review business around sucking up to Nvidia.
 
The card still performs well over its predecessor at stock settings, but don't plan on getting an extra boost with overclocking — at least, not in Cyberpunk 2077. The RTX 4060 is very efficient, but it wasn't able to take advantage of any additional power limits it might have access to (perhaps due to voltage limits).

First of all, the CP2077 benchmark is inconsistent, and like you mentioned, we can't really draw any rational conclusion based on a single game.

NOT fully sure, but OC didn't help much here, partly due to the card's limited memory bandwidth, and also because the GPU was already at its limits with the image quality setting at Ultra and some RT enabled.

That's why the benefits from overclocking are so low in this game. Gains from a faster memory OC might also not amount to much, given the narrow bus and relatively low memory bandwidth of the card, but I'd still expect small gains from a mem OC.
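
For a rough sense of the numbers, here's a back-of-the-envelope sketch assuming the 4060's stock 128-bit bus and 17 Gbps GDDR6; the +1 Gbps memory OC is just an illustrative offset, not a tested value:

```python
# Peak memory bandwidth estimate. Stock figures assume the RTX 4060's
# 128-bit bus and 17 Gbps GDDR6; the +1 Gbps memory OC is hypothetical.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps  # bus width in bytes * transfer rate

stock = bandwidth_gb_s(128, 17.0)   # ~272 GB/s
mem_oc = bandwidth_gb_s(128, 18.0)  # ~288 GB/s
print(f"stock: {stock:.0f} GB/s, OC: {mem_oc:.0f} GB/s, "
      f"gain: {(mem_oc / stock - 1) * 100:.1f}%")  # ~5.9% more bandwidth at best
```

Even a decent memory OC only buys single-digit extra bandwidth, which lines up with the small gains expected above.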
 

To No One's Surprise, the RTX 4060 is an Unimpressive Overclocker


The article's title is misleading, though. How is it an "unimpressive overclocker"? Based on just ONE game, and a built-in benchmark at that, we can't draw any concrete conclusion that the 4060 fails to overclock well.

Also, the memory OC was left untouched, which might make some difference in other games. And Nvidia did give some TDP headroom to tweak, so gains might be observed in other titles. A more fitting title would be:

"The RTX 4060 offers marginal gains from OC in Cyberpunk 2077"
 
but they shilled so hard that Nvidia deemed them worthy of double-extra exclusive mega access for being the biggest shills of all.
Jay at least didn't shill.

He even stated the price was not good.

And that's ALL that matters about a GPU.

Every single modern GPU is "good" at a hardware level.

The only thing that makes it good or bad to the consumer is the price.


Is a 4080 bad because its price-to-performance is worse than a 4090's? No.

Is a 4070 bad because its price-to-performance is worse than a 4080's? No.

It ALL comes down to the price. Not the hardware itself.

Yes, the 128-bit bus on the 4060/Ti was a stupid mistake on Nvidia's part, but it isn't always going to be an issue (depends on what you are doing with it).
 
He even stated the price was not good.

And that's ALL that matters about a GPU.

Nothing could be further from the truth.

If a graphics card is not good enough for the games you wanna play, the fact that you bought it cheaply will do little to console you.

A bad purchase is a bad purchase, regardless of how much you paid for it.

It ALL comes down to the price. Not the hardware itself.

If that's the case, then it's just too bad. Hardware quality should be consumers' top priority.
 
First of all, the CP2077 benchmark is inconsistent, and like you mentioned, we can't really draw any rational conclusion based on a single game.
That's not my experience at all, at least if you're not running at settings that exceed your VRAM (e.g. 4K RT-Ultra on an 8GB GPU). Variance between runs for my Cyberpunk 2077 tests is usually something like 0.1%, with the 1% lows as usual showing slightly more of a difference between runs.

But while the headline is a bit debatable, we did note several times that this is a single game result and not particularly significant. (I didn't write it, and news was a bit slow today, so take it for what it's worth.) Definitely pisses me off that Nvidia gave several YouTubers an option to do a preview of performance two days early, though.
 
"the RTX 4060 is an Unimpressive Overclocker"

And yet, whenever "certain" techie websites compare GPUs, they ALWAYS give Nvidia extra points because their cards are allegedly way more overclockable than AMD's...!

And I am sure this dubious drivel will continue.
 
I don't trust any reviews of anything on YouTube. If it reaches 81 FPS, presumably at 1920x1080 with no ray tracing and no DLSS (the same performance as the 2080S and 3060 Ti), then it should be able to put in mid-60s with them. It would take a massive overclock to bring it to the next performance tier of 75 FPS average, which would mean overclocking is effectively useless. The same applies to 2560x1440 with no ray tracing and no DLSS to get to 60 FPS.
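
Quick math on that claim (a sketch using 65 FPS as a stand-in for "mid 60s" and the 75 FPS tier mentioned above; assumed round numbers, not measurements):

```python
# Uplift needed to move from ~65 FPS (assumed "mid 60s" figure) to a 75 FPS average.
baseline_fps = 65.0
target_fps = 75.0
required_uplift_pct = (target_fps / baseline_fps - 1) * 100
print(f"required uplift: {required_uplift_pct:.0f}%")  # ~15%, far beyond low-single-digit OC gains
```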

 
Definitely pisses me off that Nvidia gave several YouTubers an option to do a preview of performance two days early, though.

I'm actually still pondering over this move by Nvidia. Not sure what's going on here, but maybe Jayz got an incentive from Asus and Nvidia to post some early preliminary benchmark results, lol? It still makes no sense, though, as other YouTubers didn't get this early embargo lift, as far as I know. There might be more, but I didn't scour the internet.

While searching YT, I found these two YouTubers as well:
I know Nvidia wants to showcase CP2077's graphical prowess, since this game can be used as a 'tech demo' of sorts to benchmark path tracing and the new RT Overdrive mode. But if this was an early embargo lift, then they should have given the green light to every tech reviewer, including Tom's, other YouTubers, and tech news outlets.

I don't trust YT benchmarks much though.

Kind of OT:

Speaking of CP2077, I know you guys used the Intel Core i9-9900K in your test setup, but many AMD CPU owners noticed that their chips remained under-utilized, most notably in regards to SMT.

I too noticed this on my friend's computer. If you ever use an 8-core+ AMD Ryzen processor for your benchmarks, you could also try the fix/workaround outlined below. I doubt this will get an official fix, though.

This issue was found right after the game's release, and while CDPR said it was resolved with Hotfix 1.05, that wasn't the case, because the patch mainly addressed SMT/thread utilization on AMD Ryzen 4-core and 6-core SKUs. The 8-, 12-, and 16-core SKUs were left out.

According to the devs, the 8-, 12-, and 16-core chips were running as intended; however, it was later revealed that AMD Ryzen CPUs still suffered from under-utilization of their cores/threads, which can lead to drastically worse performance.

But someone has just released an unofficial fix for this AMD Ryzen CPU under-utilization. It is basically a simple hex edit of the game's main executable.

This was tested by PCGamesHardware, and their results showed improvements of up to 27% with the unofficial patch applied.

With a Ryzen 7 7800X3D, the average FPS went from 108.3 to 137.9. This shows there is still a MAJOR problem with the game's optimization on 8-core AMD Ryzen CPUs, which has yet to be addressed by CDPR, IMO.

And I don't think this is an isolated case, since the problem affects a lot of AMD Ryzen owners out there, and I'm pretty sure other sites/users will also test this fix.
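
For anyone curious what a fix like this looks like mechanically, here's a minimal hex-patch sketch in Python. The file path and byte patterns below are placeholders for illustration only, not the actual CP2077 patch bytes; take the real before/after sequences from whoever published the fix, and keep a backup of the executable:

```python
# Minimal hex-patch sketch. OLD/NEW are hypothetical placeholders, NOT the
# real Cyberpunk 2077 SMT patch bytes; substitute the published sequences.
from pathlib import Path

exe = Path("Cyberpunk2077.exe")  # assumed path; point this at your install
OLD = bytes.fromhex("7410")      # hypothetical original byte sequence
NEW = bytes.fromhex("EB10")      # hypothetical replacement (same length)

data = exe.read_bytes()
if data.count(OLD) != 1:
    raise SystemExit("Pattern not found exactly once; refusing to patch.")

exe.with_name(exe.name + ".bak").write_bytes(data)  # back up the original first
exe.write_bytes(data.replace(OLD, NEW))             # write the patched executable
print("Patched. Keep the .bak around in case anything breaks.")
```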


View: https://twitter.com/CapFrameX/status/1673259920941096960
 
While searching YT, I found these two YouTubers as well:

I had to unsubscribe from that guy... not only was every video he made whining about GPU prices but his voice was beyond annoying. I mean... annoying to the point I'd almost toss a hammer through my PC screen.

He lost credibility with me though in his Jedi Survivor review when he called a 5 year old 2700x a "fairly recent CPU."

That would be like calling my 2017 7700k build "fairly recent."
 
but his voice was beyond annoying. I mean... annoying to the point I'd almost toss a hammer through my PC screen.

LOL. :tearsofjoy: This is the first time I've heard of this guy. His name came up via a random Google search. I basically don't follow or watch ANY YouTuber out there, though.
 
>I'm actually still pondering over this move by Nvidia.

There's no mystery to it. As one of the YT'ers said directly, Nvidia wanted to showcase the 4060's feature set, viz. framegen, DLSS 3, etc. The preview explicitly required those features to be tested.

It's Nvidia PR doing their job, as they should. I get why Jarred and other reviewers would feel peeved about this. But the review embargo runs by Nvidia's rules, and if they want to change it, that's their prerogative.

The reason the peanut gallery is squawking about this is that they have their own negative PR campaign going, to pre-emptively lambast Nvidia cards as <Mod Edit> even before the reviews launch, and they don't want any competition.

I've seen the JZ and Daniel Owen vids. They're solid reviewers, and their pieces were balanced and informative. No gushing. Owen brought up an aspect of framegen, which is that framegen latency can be ameliorated with Reflex, so framegen is doable for some twitch games, albeit with caveats. To wit,

View: https://youtube.com/watch?v=jVCcM7V9tME&t=153


To be fair, Jarred covered this in his 4060 Ti review, but on page 7 of a 10-page review, and in text. When it's text vs. video for accessibility, video wins.

My takeaway from this is that Nvidia PR has earned its keep with this foray. Nvidia has managed to nudge my own opinion of the feature set, viz. framegen and DLSS 3, more toward the positive.
 
If a graphics card is not good enough for the games you wanna play, the fact that you bought it cheaply will do little to console you.
.....no duh?

If you are playing at 1080p (which a 60-tier card is mainly for), that's not an issue.

If you expect 4K 60 FPS and buy a 60-tier card, that's not the card's fault; it's you not knowing what you were buying.

If that's the case, then it's just too bad. Hardware quality should be consumers' top priority.
It's not.

Jay even mentions you can buy used cards for less and get better performance.

Wait a bit and vendors will have last-gen cards cheaper (they're priced the same right now, and vendors will need to clear out old stock, so prices will drop).

Price is everything when it comes to whether a GPU is good or not.

Know what your use case is.
Know what GPUs are around that.
See what the prices are and take whichever is the better value.



If the 4090 Ti cost $5,000, that wouldn't make it a bad GPU.

It would still be an amazing piece of hardware... it would just be a bad buy, as its price isn't right.
 
HUB addresses the "preview review" at 12:20:

View: https://youtu.be/DKSd5iO7kiI?t=738


They were approached, and I'm sure other "big" YT'ers were as well, but we only have 3 confirmed ones (thanks @Metal Messiah. !) that accepted. That actually tells you how most of them aligned on this. Sad to see Owen and Jay letting nVidia manipulate them and use them as an extension of its marketing arm instead of standing their ground as independent reviewers.

I also understand why some think this is "ok", but as they portray themselves as "independent tech press", they have to live up to that and act consistently. If you accept this sort of "content" being published, then you're just an extension of nVidia's (or any other company's) PR machine. That is where the issue lies.

Regards.
 
"The card still performs well over its predecessor at stock settings"

Would have been a proper headline. But no... Tom's looks for a negative in everything. 99% of people don't even overclock, CPU or GPU.
I'd argue in the opposite direction: nVidia is locking down the cards more and more and giving its AIBs even less room to get creative, so "proper" or even mild overclocking now requires modifying the hardware directly.

Think about it from this perspective: if voltage modification on these cards were allowed to go higher than just a few mV, you could see AIB partner models get close to the 4060 Ti out of the factory for a small premium. I can't remember when it was, but I do remember there have been a few cases where partner cards were indeed close to the next tier.

I don't think nVidia wants the headache of allowing such a thing so openly, and that's why they don't let their partners "get creative" anymore, which is sad.

All of that being said, I don't like OC'ing GPUs (I do undervolting nowadays) and have always advised against it because of the diminishing returns. But hey, if the option is there, I'm sure a lot of people would take it, especially if it's just a "one click" thing like on some AMD models. Both companies are restricting partner models more and more, and I don't like that trend. From a business perspective it makes sense: you keep segmentation in a tight grip.

Regards.
 
>I also understand why some think this is "ok", but as they portray themselves as "independent tech press", they have to live up to that and act consistently. If you accept this sort of "content" being published, then you're just an extension of nVidia's (or any other company's) PR machine. That is where the issue lies.

This got a chuckle from me. It's precious. Naive, but precious, like a toddler professing its love for Santa Claus, before reality intrudes.

There is no "independent tech press" in this context, only YT personalities, or influencers if you will. As they would be the first to tell you, they are not "press" because they are not journalists. And they are not "independent" in the traditional sense, because you the user do not pay for their work. More succinctly, they don't work for you. Your interests are not their interests. Their obligation to you is zip.

I get it. People like simple, cozy narratives where everything is clear-cut, good/evil, black/white. The good guys in your story are "independent tech press" (working for free, no less!) and the bad guys are all those shifty types who allow themselves to be "manipulated and used as extension of XYZ marketing arm." Must be a storybook world you live in, where good guys do only good things, and bad guys do only evil things.

To belabor the obvious, the world is a complicated place. The relationship between vendors and online tech pubs (incl YT personalities) is complicated. The tech pubs are dependent on ads and views. They are also dependent on vendors to supply them with products to review. The vendors, of course, rely on tech pubs to get publicity for their products, and to get positive PR when they can. It's a symbiotic, interdependent relationship, full of gray areas. Nothing is "independent" here.

There is nothing wrong with Nvidia wanting to showcase the feature set of their products. There is nothing said in the "previews" that is a lie or embellishment. Marketing is not evil or deception. It is a fact of life, and a fundamental function of every company.
 
Leave it to Jayz2Cents to leave you with more questions than answers. I quit watching his content a little over a year ago because I got tired of reaching the end of nearly every video with a feeling of "whelp... that was a waste of my time".

Question for those who did watch: when the 20% PL increase and 100 MHz undervolt (...bad terminology, I know, but that's what's happening when you do this) was applied, what was the resulting core clock? What was the core clock increase (%) over stock?

Was any effort made to determine whether the VRAM size/width was playing a limiting role?
 
Jay at least didn't shill.

He even stated the price was not good.

And that's ALL that matters about a GPU.

Every single modern GPU is "good" at a hardware level.

The only thing that makes it good or bad to the consumer is the price.


Is a 4080 bad because its price-to-performance is worse than a 4090's? No.

Is a 4070 bad because its price-to-performance is worse than a 4080's? No.

It ALL comes down to the price. Not the hardware itself.

Yes, the 128-bit bus on the 4060/Ti was a stupid mistake on Nvidia's part, but it isn't always going to be an issue (depends on what you are doing with it).
Except when it's the 7900 XTX vs. the 4090.

No matter how many times AMD states that they are not competing, these Tubers will keep doing it, and better yet, they always trash the 7900 XTX on everything yet don't mention the price difference.
"the RTX 4060 is an Unimpressive Overclocker"

And yet, whenever "certain" techie websites compare GPUs, they ALWAYS give Nvidia extra points because their cards are allegedly way more overclockable than AMD's...!

And I am sure this dubious drivel will continue.
Don't forget the two Nvidia-mandated points: RT and DLSS 3 Fake Frames (sorry, "real frames" now).
But they never, ever mention that RT is pretty much a useless gimmick (the performance hit doesn't justify the very limited eye-candy return) and that DLSS only works on whichever GPUs Dear Leader Jensen decides.
I had to unsubscribe from that guy... not only was every video he made whining about GPU prices but his voice was beyond annoying. I mean... annoying to the point I'd almost toss a hammer through my PC screen.

He lost credibility with me though in his Jedi Survivor review when he called a 5 year old 2700x a "fairly recent CPU."

That would be like calling my 2017 7700k build "fairly recent."
Likewise, and it doesn't surprise me that he was "allowed" to show the card early, since he hasn't stopped publishing video after video on this 4060, overhyping it like crazy.
 
Nothing could be further from the truth.

If a graphics card is not good enough for the games you wanna play, the fact that you bought it cheaply will do little to console you.

A bad purchase is a bad purchase, regardless of how much you paid for it.



If that's the case, then it's just too bad. Hardware quality should be consumers' top priority.
That's just a failure to study the product if that happens. Sure, there are people impulse buying (statistically 40%, and that's the lowest estimate), but let's just use George Carlin: "Think of how stupid the average person is, and realize half of them are stupider than that."
You just described the personal, not objective, experience of an uninformed person who makes a stupid financial decision and hits a brick wall of reality... by that logic you can criticise any product for not doing what you want, even if it's not supposed to meet your expectations.

If hardware quality were supposed to be consumers' top priority, then everyone should be buying a 4090... like... no. We live in a world of finite and limited resources (time, natural resources, money...), so obviously the price you pay is the most important factor, along with the quality of the product.
 