Intel Core i9-14900K, i7-14700K and i5-14600K Review: Raptor Lake Refresh

Page 7 - Tom's Hardware community discussion

ilukey77

Reputable
Jan 30, 2021
Look at the title of this review, does it say review of a system? This is a CPU review which means that you have to isolate everything as much as possible to only show differences that actually come from the CPU, and not from mobo settings or anything else.

These mobos will still support 14th gen, out of 10 mobos with the default settings only one pushed to 200W while one was as low as 65W, 7 out of 10 stuck with the suggested TDP of 125W.
Yes, these are Budget boards, that is the point we are making, the CPU will run differently depending on the mobo you use.
(You can find the video on youtube by using the title under the pic. )
[attached image]


Quote,link, something?!

Why?! What for?! Just so you can drag this out forever?!
They are the same thing.

So you left out a very important fact, I stated that fact and now you are calling this putting words in your mouth?! Seriously?!

Maybe it would be significant if it started with the 14th gen but it started many gens ago now.
The 13th gen gains this 1% as well, so the final difference is no different.
I'm not sure why, even with the most pathetic release in generations (being 14th), you continue to champion Intel.

I've lost count of the number of times you've attacked me (shown some funky graph about how Intel is doing this or that) over my support for AMD and for just pointing out simple truths about Intel.

Yet even now you try to champion them and justify the terrible decisions Intel makes.

Call a spade a spade!

14th gen is a joke on a dead platform that should never have been released and should not be bought under any circumstances.

It almost feels like an Nvidia situation (where they think their poo doesn't stink, and they can do what they want and customers will buy it). The difference is Intel has NOTHING to win on (at least Nvidia has DLSS, RT, and the 4090).

If people keep supporting these rubbish products, Intel will learn nothing and will still (in this case) get pushed completely out of the CPU market!

That is very bad for consumers, because (yes, I like my AMD products, but) AMD will just take the market in the direction they want, because they have NO competition!
 
  • Like
Reactions: King_V

bit_user

Titan
Ambassador
Look at the title of this review, does it say review of a system? This is a CPU review which means that you have to isolate everything as much as possible to only show differences that actually come from the CPU, and not from mobo settings or anything else.
Unfortunately, Intel has made that difficult by allowing for the motherboard (not to mention cooling solution) to have such a dramatic influence on the results.

These mobos will still support 14th gen, out of 10 mobos with the default settings only one pushed to 200W while one was as low as 65W, 7 out of 10 stuck with the suggested TDP of 125W.
Which 10? Did you arbitrarily pick 10 budget boards to look at, or did you pick gaming boards like most people building an i9 or i7 machine would probably use?

Quote,link, something?!

"adjusting PL2 and Tau, according to Intel, is not overclocking." (April 30, 2020)

View: https://twitter.com/IanCutress/status/1255853364862214145


Why?! What for?! Just so you can drag this out forever?!
They are the same thing.
There's indeed reasoning behind it. The 14th gen clocks higher. That's practically its only major selling point. Yet, we know that power usage increases nonlinearly with frequency. It therefore stands to reason that removing power limits could provide even greater upside on the Raptor Lake refresh.
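That nonlinearity can be sketched with a back-of-the-envelope model: dynamic power goes roughly as C·V²·f, and since voltage generally has to rise with frequency near the top of the V/f curve, power grows roughly with the cube of frequency. This is a first-order approximation of my own for illustration, not a measured Intel figure.

```python
# Back-of-the-envelope CPU power model (illustrative approximation only):
# dynamic power P ~ C * V^2 * f; assuming V scales roughly linearly
# with f near the top of the voltage/frequency curve gives P ~ f^3.

def relative_power(freq_ratio: float) -> float:
    """Power multiplier implied by a frequency multiplier, under P ~ f^3."""
    return freq_ratio ** 3

# Under this toy model, a ~6% clock bump costs roughly 19% more power:
print(f"{relative_power(1.06):.2f}x")  # 1.19x
```

That cubic growth is why the last few hundred MHz are so disproportionately expensive in watts.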

In TechPowerUp's review, I found an upside of as much as 5.0%. That's enough to bring its overall advantage vs. the i9-13900K from 1.9% to 8.1%. That's huge, increasing its margin over the previous gen by a whopping 4.3x!
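For transparency, the "4.3x" is just the ratio of the two margins quoted above:

```python
# Margins over the i9-13900K, as quoted from the TechPowerUp data above.
stock_margin = 1.9      # % advantage at stock power limits
unlimited_margin = 8.1  # % advantage with power limits removed

print(f"{unlimited_margin / stock_margin:.1f}x")  # 4.3x
```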

[chart: cinebench-multi.png]


So you left out a very important fact,
You mean because I didn't say how big the performance impact was? That's yet to be determined (although, I did find some relevant data, above). I just pointed out deficiencies in the ability to assess its performance. It's a separate, though relevant, question of what sort of impact that has.

I stated that fact and now you are calling this putting words in your mouth?! Seriously?!
No, let's look at what you actually said.

"it makes people like you believe that the difference from more power is huge"

That's not an established fact. You're supposing what "people like me believe". That's ascribing a position to me which I never took.

The 13th gen gains this 1% as well, so the final difference is no different.
People don't only look at application-average performance. The reason for multiple benchmarks is so they can look at those which best characterize their own usage profile.

Anyway, let's try to put some data behind it. The TechPowerUp review of the i9-13900K shows a 6.1% advantage from running that CPU with "Power Limits Removed". So, it seems the Gen 14 has less of an upside, but then it's essentially the same CPU running at a higher baseline, so I guess that's not too surprising.

[chart: cinebench-multi.png]


Correspondingly, we see the multi-threaded power consumption of the CPUs with "Power Limits Removed" went from 373 W to 407 W.
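In relative terms (simple arithmetic on the two figures just quoted):

```python
# Multi-threaded package power with limits removed, per TPU (quoted above).
watts_13900k = 373.0
watts_14900k = 407.0

increase = (watts_14900k - watts_13900k) / watts_13900k * 100
print(f"+{increase:.1f}% power gen-on-gen")  # +9.1%
```

So the gen-on-gen clock bump is paid for with roughly 9% more package power under all-core load.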
[chart: power-multithread.png]
 
  • Like
Reactions: King_V

bit_user

Titan
Ambassador
I'm not sure why, even with the most pathetic release in generations (being 14th), you continue to champion Intel.

...

Call a spade a spade!
Terry has to back Intel. Terry cannot call certain spades a spade. Hopefully, everyone now sees that. I challenge anyone to find a single critical thing Terry ever said about Intel. Meanwhile, the attacks and digs on AMD are relentless.

One would expect that even Intel's investors and many of its employees would get frustrated and vent about Intel, from time to time. Yet, Terry never even so much as criticizes Intel's management.
 
Last edited:

sundragon

Distinguished
Aug 4, 2008
You mean the same price? Because while it's far more likely to see 13th gen discounted, Intel's 14th gen shares the exact same tray price.
GamersNexus had the current prices listed as of a few days ago. Let's see how much of a discount they will provide going forward. I suspect it won't be huge, because it would cannibalize sales of their other processors. You can check out their YouTube review for details.
 
  • Like
Reactions: King_V and bit_user

sundragon

Distinguished
Aug 4, 2008

Not that it makes these CPUs any better but they did introduce new features, the application optimizer could be a big change especially for people that can't be bothered to shut off e-cores or to use a tool to confine a game to p-cores only and stuff like that.

And of course toms did report on these things being new but promptly did not benchmark these two games to see if the claims are true.

AI-Assisted Overclocking and the Application Performance Optimizer

[attached image]
While I appreciate that, GamersNexus tests the all-out performance of the chip, and those tests show "improvement" that could be within the statistical margin of error :LOL: I HIGHLY suspect the numbers in the shared image are optimistic, best-case-scenario figures. It's sad that a company with the super-deep pockets and R&D budget of Intel produces products that need a thick layer of marketing to show "improvement" where it mostly doesn't exist.
 
  • Like
Reactions: King_V

sundragon

Distinguished
Aug 4, 2008
To be fair. When it comes to the highest end hardware.... most software does not take advantage of it. You basically buy this grade of hardware for niche use cases or bragging rights.

Most people will be fine with non-K i7's at highest which is about where the improvements this refresh seems to top out..... but that just means they probably should have done a lot of this already in the 13th gen but were saving it up for the pre-planned refresh.
The only rationale for 14 vs 13 is future-proofing, where (hopefully) future games will push these chips, like you said, and thus the difference is within statistical error :ROFLMAO: All that R&D and those deep pockets don't help them make better chips, just better marketing.
 

bit_user

Titan
Ambassador
It's sad that a company with the super-deep pockets and R&D budget of Intel produces products that need a thick layer of marketing to show "improvement" where it mostly doesn't exist.
Remember how Intel kept releasing Skylake-derivative 14 nm desktop products, even while 10 nm parts were being launched in the laptop market? We got two generations of 10 nm laptop CPUs before Alder Lake finally brought those benefits to the desktop. This seems to be a repeat of that situation - except that desktop users don't even get any more cores, this time!

Another example - possibly even more relevant - is how Intel launched Broadwell (their first 14 nm CPU) for laptops, while desktop users got only a Haswell Refresh.

Something went wrong with launching the desktop version of Meteor Lake (known as the S-series, in Intel lingo). Since Intel didn't have enough time to design anything new, they slapped a "Gen 14" label on what - for all we know - could be just the latest stepping of their Gen 13 CPUs.

That's why this product gives such a lackluster showing. Intel had other plans, but it seems their ability to execute on their roadmap still isn't 100%.

All that R&D and those deep pockets don't help them make better chips, just better marketing.
And maybe even hire people to do positive spin, on forums and social media.
 
  • Like
Reactions: sundragon
I guess that depends on how much variation there is between those CPUs.
Exactly why power limits removed by itself isn't a good dataset.
How about not putting words in my mouth?
I'm not; you're the one who has been arguing this entire time that power limits removed is a perfectly valid way to benchmark. I've been saying that it has way too many variables and isn't useful unless you can compare it to stock (which is what TPU did, and is why theirs is the only 14900K review I've seen that I would consider to be good).
Intel didn't officially allow MCE. All that article says is that the author has heard that Intel was willing to honor the warranty on CPUs where MCE had been enabled.
Intel allowed it to exist and never stopped the motherboard makers. They've stopped plenty of other things over the years, but this wasn't one of them, so, explicit or not, they've allowed this behavior for over a decade.
 

King_V

Illustrious
Ambassador
Look at the title of this review, does it say review of a system? This is a CPU review which means that you have to isolate everything as much as possible to only show differences that actually come from the CPU, and not from mobo settings or anything else.

These mobos will still support 14th gen, out of 10 mobos with the default settings only one pushed to 200W while one was as low as 65W, 7 out of 10 stuck with the suggested TDP of 125W.
Yes, these are Budget boards, that is the point we are making, the CPU will run differently depending on the mobo you use.
(You can find the video on youtube by using the title under the pic. )
[attached image]


Quote,link, something?!

Why?! What for?! Just so you can drag this out forever?!
They are the same thing.

So you left out a very important fact, I stated that fact and now you are calling this putting words in your mouth?! Seriously?!

Maybe it would be significant if it started with the 14th gen but it started many gens ago now.
The 13th gen gains this 1% as well, so the final difference is no different.
Dude, look, it's very predictable here.
Intel winds up with outrageous power draw numbers, and you show up every time to declare that Intel is really power efficient, and that there's this grand conspiracy between reviewers and motherboard manufacturers to make Intel look bad, and that poor Intel is absolutely powerless to stop it.

There's no conspiracy, and Intel is not a helpless victim.

You just need to chill out with that. I don't know why you feel the need to sell this false narrative about Intel CPUs being far more power efficient than the reviews say.

Hell, even Intel isn't making the claims you are.

Have you considered writing for Userbenchmark?
 

bit_user

Titan
Ambassador
I'm not; you're the one who has been arguing this entire time that power limits removed is a perfectly valid way to benchmark. I've been saying that it has way too many variables and isn't useful unless you can compare it to stock (which is what TPU did, and is why theirs is the only 14900K review I've seen that I would consider to be good).
Given their review, can you predict how it would behave, out of the box, for a user with a given motherboard? That's the primary purpose of these reviews! I'm still waiting to hear your solution to that problem.

Intel allowed it to exist and never stopped the motherboard makers. They've stopped plenty of other things over the years, but this wasn't one of them, so, explicit or not, they've allowed this behavior for over a decade.
Whether or not this is a new problem doesn't change the facts. Intel is marketing a CPU with a "suggested serving size" that doesn't reflect the out-of-the-box usage, for many users, and providing no clear disclaimer to warn them.
 
  • Like
Reactions: King_V
Given their review, can you predict how it would behave, out of the box, for a user with a given motherboard? That's the primary purpose of these reviews!
First of all, a CPU review is supposed to review the CPU, not the system it's put into.

There are countless motherboards with countless settings and capabilities, so it's absolutely impossible to accurately give information that applies to all of them. Using Intel's stock CPU settings is the most accurate way to show how any CPU you pull off the shelf in any motherboard with sufficient VRM will run (sadly, a lot of manufacturers will skimp here and not be clear about it while still supporting CPUs they can't really run). It's absolutely important to also show the power-limits-removed performance for enthusiasts, because most enthusiast boards come like that. Without the baseline performance numbers, though, it's impossible to make any sort of good comparison, because you have no way of knowing if it's just a bin situation, power delivery, cooling, or the motherboard.
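For readers unfamiliar with what those stock settings actually constrain: Intel's spec defines a sustained limit (PL1), a boost limit (PL2), and a time window (Tau) over which the moving average of package power must stay at or below PL1. Below is a toy simulation of that mechanism; the 125 W / 253 W / 56 s values are commonly cited defaults used purely for illustration, and real firmware behavior is more involved than this sketch.

```python
# Toy model of Intel's PL1/PL2/Tau turbo budget (illustrative only):
# the CPU may draw up to PL2 while the exponentially weighted moving
# average (EWMA) of package power stays under PL1; Tau sets the window.

def simulate(pl1, pl2, tau, demand, dt=0.1, duration=60.0):
    """Return a list of (time, delivered_watts) samples."""
    avg = 0.0
    alpha = dt / tau  # EWMA smoothing factor
    samples = []
    for i in range(int(duration / dt)):
        power = min(demand, pl2)
        # If boosting would push the moving average past PL1, fall back.
        if avg + alpha * (power - avg) > pl1:
            power = min(demand, pl1)
        avg += alpha * (power - avg)
        samples.append((i * dt, power))
    return samples

# Illustrative values: PL1=125 W, PL2=253 W, Tau=56 s, 300 W of demand.
run = simulate(125, 253, 56, 300)
print(run[0][1], run[-1][1])  # boosts at 253 W at first, settles to 125 W
```

The takeaway: two boards running the same CPU can deliver very different sustained performance simply by changing these three numbers, which is the crux of the disagreement in this thread.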
I'm still waiting to hear your solution to that problem.
What problem? You keep saying there's some problem, but there isn't one. TPU did it the right way: they showed PL2-locked performance along with the limits removed (they also showed OC, which I like a lot, but don't find necessarily important).
Whether or not this is a new problem doesn't change the facts. Intel is marketing a CPU with a "suggested serving size" that doesn't reflect the out-of-the-box usage, for many users, and providing no clear disclaimer to warn them.
Why is it Intel's job to warn people that, depending on what hardware they buy, that hardware may run the CPU off stock settings? This is an absolute absurdity that no business would engage in unless it was warranty-related; e.g., Nvidia doesn't warn you that AIBs may run their GPUs at higher power limits.
 

Bamda

Distinguished
Apr 17, 2017
So, a tiny upgrade in performance, check. A new, higher price, check. A nice boost in power consumption, check. This must be what Pat was talking about when he said, "Intel Will Return to Become 'Unquestioned Leader in Process Technology'."
 

Ogotai

Reputable
Feb 2, 2021
Using Intel's stock CPU settings is the most accurate way to show how any CPU you pull off the shelf in any motherboard with sufficient VRM will run

may run the CPU off stock settings
But what is considered "stock" settings? To me, that's whatever the board you are putting the CPU in defaults to. If Intel doesn't dictate that to the board makers, then that gives them free rein to choose what "stock" settings are for that board and CPU.
 
But what is considered "stock" settings? To me, that's whatever the board you are putting the CPU in defaults to. If Intel doesn't dictate that to the board makers, then that gives them free rein to choose what "stock" settings are for that board and CPU.
Stock is what Intel says it is, period (it's their product).

You're just making up your own definition that suits your opinion. To use my prior Nvidia analogy, let me explain it to you this way: my 3080 out of the box runs at 400 W, while stock for a 3080 is 320 W. That fact doesn't change just because my card's AIB chose to run outside of it.
 
Last edited:

Ogotai

Reputable
Feb 2, 2021
Stock is what Intel says it is, period (it's their product).
Ah, but Intel doesn't enforce those specs, do they?
You're just making up your own definition that suits your opinion
Well then, so are you.
To use my prior Nvidia analogy, let me explain it to you this way: my 3080 out of the box runs at 400 W, while stock for a 3080 is 320 W. That fact doesn't change just because my card's AIB chose to run outside of it.
No, which 3080 did you buy? One that runs at the specs Nvidia states, or one of the overclocked cards? For example, a 3080 from Asus probably had 4+ different versions of the card, from bone-stock specs as dictated by Nvidia all the way up to the Strix models. If that's the case, your 3080 is stock, as per the model of the card you bought. The same goes for how a mobo runs a CPU: Intel doesn't enforce its "stock" specs, because if it did, it wouldn't look as good in benchmarks...
 
Ah, but Intel doesn't enforce those specs, do they?
Having stock settings and enforcing them aren't mutually exclusive.
Well then, so are you.
No, I actually understand what the terminology means, and you don't seem to.
No, which 3080 did you buy? One that runs at the specs Nvidia states, or one of the overclocked cards? For example, a 3080 from Asus probably had 4+ different versions of the card, from bone-stock specs as dictated by Nvidia all the way up to the Strix models. If that's the case, your 3080 is stock, as per the model of the card you bought.
It doesn't matter what card I bought; stock for the 3080 is always 320 W, and that doesn't change because I bought a model that doesn't adhere to it. I purchased a card which doesn't adhere to Nvidia's specs; that doesn't mean the specs change because of that.
 
Last edited:

Ogotai

Reputable
Feb 2, 2021
No, I actually understand what the terminology means, and you don't seem to.
I do understand; we just have different ways of interpreting what they mean, by the looks of it.

There is a poster on here who says the same thing you do as to what "stock" is for Intel CPUs, and I believe others also disagree with that person as to what is considered stock and what isn't.
 
I do understand; we just have different ways of interpreting what they mean, by the looks of it.

There is a poster on here who says the same thing you do as to what "stock" is for Intel CPUs, and I believe others also disagree with that person as to what is considered stock and what isn't.
There is nothing open to interpretation: MSI/Asus/Gigabyte etc. do not determine what other companies' stock settings are. They only determine what their own settings are.
 

Ogotai

Reputable
Feb 2, 2021
There is nothing open to interpretation: MSI/Asus/Gigabyte etc. do not determine what other companies' stock settings are. They only determine what their own settings are.
IF Intel mandated which settings to use, and the mobo makers had boards that stuck to those settings while deciding what settings to use for others, then we would both be correct, depending on which board we are referring to. But Intel doesn't mandate anything; therefore, stock settings are whatever the defaults of any given board are.

Bottom line: we have different views of what is stock and what isn't...
 
  • Like
Reactions: bit_user

ilukey77

Reputable
Jan 30, 2021
Terry has to back Intel. Terry cannot call certain spades a spade. Hopefully, everyone now sees that. I challenge anyone to find a single critical thing Terry ever said about Intel. Meanwhile, the attacks and digs on AMD are relentless.

One would expect that even Intel's investors and many of its employees would get frustrated and vent about Intel, from time to time. Yet, Terry never even so much as criticizes Intel's management.
It annoys me to no end, because it does not represent the facts.

PC parts are not cheap by any means (it varies with deals, location, etc.), and people have a right not to be misled into spending hard-earned money (even if money is no object) on products that are just rubbish.

I look at what is best for the consumer. Do I care if you prefer Intel over AMD or vice versa, or which you ultimately buy? NO.

Other than the 11th and now the utterly pathetic 14th gen, I like Intel CPUs; socket life is my distaste with Intel, hence why I buy AMD!

AMD has bad products too that I hate: the 7600 8GB is pathetic, the 7900X3D is a pointless product, the 5600X3D is a bit of a scam; the list goes on. But I state what I think is a good buy and a bad buy to save consumers disappointment and pain!

I understand people have deep passion and support for certain products, but to shove graph after graph of pointless information to discredit anyone who dares to have a different opinion isn't right!

My point, and the point of these forums, should be to offer unbiased advice to others.
 
  • Like
Reactions: bit_user

bit_user

Titan
Ambassador
First of all a CPU review is supposed to review the CPU not the system it's put into.
That only makes sense to the extent the two are separable. For instance, you don't see reviews of laptop CPUs, but rather entire laptops. That's because the behavior of the same CPU is so heavily influenced by the particular laptop that a bench review of the CPU is both difficult and fairly meaningless.

Using Intel's stock CPU settings is the most accurate way to show how any CPU you pull off the shelf in any motherboard with sufficient VRM will run (sadly a lot of manufacturers will skimp here and not be clear about it while still supporting CPUs they can't really run). It's absolutely important to show the power limit removed performance in addition for enthusiasts because most enthusiast boards come like that.
I'll agree that showing those two extremes is useful, though not entirely sufficient.

Without the baseline performance numbers though it's impossible to make any sort of good comparison because you have no way of knowing if it's just a bin situation, power delivery, cooling or motherboard.
Even with baseline numbers, you still are limited in your ability to make predictions. That's the part I'm wrestling with.

To the extent motherboards truly and meaningfully differ in their stock settings and behavior, then maybe we need to revert to what we did in the old days, like when I started reading Tom's back in the late '90s. Back then, motherboards would routinely impact performance in measurable ways. As a result, there were motherboard shootouts that included benchmarks.

What problem? You keep saying there's some problem, but there isn't one.
Of course there is. A DIY builder wants to do a new build or upgrade. They need to decide on which components to use. In order to do that, they need some fairly specific idea of how different options will perform and behave. If you only test CPUs in somewhat artificial scenarios of all-or-nothing limits, then it doesn't give them enough information to make all the decisions that go into a successful build.

Why is it Intel's job to warn people that depending on what hardware they buy that hardware may run the CPU off stock settings?
Intel is the one claiming the CPU uses x & y Watts, in base & turbo, respectively. To make such claims with the knowledge they typically won't hold true is blatant false advertising.

If they made no claims about power, then they would not have to issue any caveat or qualifications. Because they do make such assertions, they need to make it clear that these are merely manufacturer suggestions - and not buried way down in some footnotes or some catch-all clause about "specifications subject to change without notice".

This is absolute absurdity that no business would do unless it was warranty related ex: Nvidia doesn't warn you that AIBs may run their GPU at higher power limits.
Nvidia's situation is different, because they don't sell naked GPUs. You can only buy their products in the form of a board. So, it's the board-provider's responsibility to specify accurate power utilization numbers (if any).

Bottom line: we have different views of what is stock and what isn't...
Ultimately, what's the "right" thing for reviews to test is what's most useful to end-users. These reviews don't exist in a vacuum - their purpose is to inform prospective CPU buyers, so they can make the best decisions for their needs.

Consequently, if you benchmark something other than out-of-the-box behavior, you're failing your most important readers. If different boards have meaningfully different behaviors, then the reviews should try to sample them enough to cover the distribution.

What's decidedly not useful is to review the CPU at settings that it won't typically run at. That's mostly just serving internet commentators who like to have pitched battles between CPUs - like fantasy sports leagues.
 
Last edited:

bit_user

Titan
Ambassador
An inexpensive surprise, a quick OEM cash grab, a last-minute refresh and one last push before its touted mid-2024 total revamp.
More like an "Oh S***!" last-resort move by Intel, when they discovered Meteor Lake S was nonviable. Intel has to release a new gen every year; they couldn't just delay their CPU launch until Arrow Lake was ready... plus, what if Arrow Lake has problems of its own?

Setting all surprise things clear, the Intel Core i9-14900K is very much the same stuff we saw with the Core i9-13900K while keeping the same motherboard and memory.
As shown in this leak, there apparently is no such thing as Gen 14 desktop silicon. The Gen 14 SKUs are all comprised of either Gen 12 or Gen 13 dies.

| Model | Base Freq. | L3 Cache | GPU | Die | Stepping | TDP | BCLK |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Celeron G6900T | 2.80GHz | 4MB | 0.3 GHz / 1.3 GHz | Alder Lake | H0 | 35W | 100 |
| Celeron G6900 | 3.40GHz | 4MB | 0.3 GHz / 1.3 GHz | Alder Lake | H0 | 46W | 100 |
| Pentium G7400T | 3.10GHz | 6MB | 0.3 GHz / 1.35 GHz | Alder Lake | H0 | 35W | 100 |
| Pentium G7400 | 3.70GHz | 6MB | 0.3 GHz / 1.35 GHz | Alder Lake | H0 | 46W | 100 |
| Core i3-12100T | 2.20GHz | 12MB | 0.3 GHz / 1.4 GHz | Alder Lake | H0 | 35W | 100 |
| Core i3-12100 | 3.30GHz | 12MB | 0.3 GHz / 1.4 GHz | Alder Lake | H0 | 60W | 100 |
| Core i3-12100F | 3.30GHz | 12MB | N/A | Alder Lake | H0 | 58W | 100 |
| Core i3-12300T | 2.30GHz | 12MB | 0.3 GHz / 1.45 GHz | Alder Lake | H0 | 35W | 100 |
| Core i3-12300 | 3.50GHz | 12MB | 0.3 GHz / 1.45 GHz | Alder Lake | H0 | 60W | 100 |
| Core i3-13100T | 2.50GHz | 12MB | Intel® UHD Graphics 770 | Raptor Lake | H0 | 35W | 100 |
| Core i3-13100 | 3.40GHz | 12MB | Intel® UHD Graphics 770 | Raptor Lake | H0 | 60W | 100 |
| Core i3-13100F | 3.40GHz | 12MB | N/A | Raptor Lake | H0 | 58W | 100 |
| Core i3-14100T | 2.70GHz | 12MB | Intel® UHD Graphics 730 | Raptor Lake | H0 | 35W | 100 |
| Core i3-14100 | 3.50GHz | 12MB | Intel® UHD Graphics 730 | Raptor Lake | H0 | 60W | 100 |
| Core i3-14100F | 3.50GHz | 12MB | N/A | Raptor Lake | H0 | 58W | 100 |
| Core i5-12400T | 1.80GHz | 18MB | 0.3 GHz / 1.45 GHz | Alder Lake | H0 | 35W | 100 |
| Core i5-12400 | 2.50GHz | 18MB | 0.3 GHz / 1.45 GHz | Alder Lake | C0 / H0 | 65W | 100 |
| Core i5-12400F | 2.50GHz | 18MB | N/A | Alder Lake | C0 / H0 | 65W | 100 |
| Core i5-12500T | 2.00GHz | 18MB | 0.3 GHz / 1.45 GHz | Alder Lake | H0 | 35W | 100 |
| Core i5-12500 | 3.00GHz | 18MB | 0.3 GHz / 1.45 GHz | Alder Lake | H0 | 65W | 100 |
| Core i5-12600T | 2.10GHz | 18MB | 0.3 GHz / 1.45 GHz | Alder Lake | H0 | 35W | 100 |
| Core i5-12600 | 3.30GHz | 18MB | 0.3 GHz / 1.45 GHz | Alder Lake | H0 | 65W | 100 |
| Core i5-12600K | 3.70GHz | 20MB | Intel® UHD Graphics 770 | Alder Lake | C0 | 125W | 100 |
| Core i5-12600KF | 3.70GHz | 20MB | N/A | Alder Lake | C0 | 125W | 100 |
| Core i5-13400T | 1.30GHz | 20MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 35W | 100 |
| Core i5-13400 | 2.50GHz | 20MB | Intel® UHD Graphics 770 | Raptor Lake | B0 / C0 | 65W | 100 |
| Core i5-13400F | 2.50GHz | 20MB | N/A | Raptor Lake | B0 / C0 | 65W | 100 |
| Core i5-13490F | 2.50GHz | 24MB | N/A | Raptor Lake | C0 | 65W | 100 |
| Core i5-13500T | 1.60GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 35W | 100 |
| Core i5-13500 | 2.50GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 65W | 100 |
| Core i5-13600T | 1.80GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 35W | 100 |
| Core i5-13600 | 2.70GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 65W | 100 |
| Core i5-13600K | 3.50GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 125W | 100 |
| Core i5-13600KF | 3.50GHz | 24MB | N/A | Raptor Lake | B0 | 125W | 100 |
| Core i5-14400T | 1.50GHz | 20MB | Intel® UHD Graphics 730 | Raptor Lake | C0 | 35W | 100 |
| Core i5-14400 | 2.50GHz | 20MB | Intel® UHD Graphics 730 | Raptor Lake | B0 / C0 | 65W | 100 |
| Core i5-14400F | 2.50GHz | 20MB | N/A | Raptor Lake | B0 / C0 | 65W | 100 |
| Core i5-14500T | 1.70GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 35W | 100 |
| Core i5-14500 | 2.60GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 65W | 100 |
| Core i5-14600T | 1.80GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 35W | 100 |
| Core i5-14600 | 2.70GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | C0 | 65W | 100 |
| Core i5-14600K | 3.50GHz | 24MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 125W | 100 |
| Core i5-14600KF | 3.50GHz | 24MB | N/A | Raptor Lake | B0 | 125W | 100 |
| Core i7-12700T | 1.40GHz | 25MB | 0.3 GHz / 1.5 GHz | Alder Lake | C0 | 35W | 100 |
| Core i7-12700 | 2.10GHz | 25MB | 0.3 GHz / 1.5 GHz | Alder Lake | C0 | 65W | 100 |
| Core i7-12700F | 2.10GHz | 25MB | N/A | Alder Lake | C0 | 65W | 100 |
| Core i7-12700K | 3.60GHz | 25MB | Intel® UHD Graphics 770 | Alder Lake | C0 | 125W | 100 |
| Core i7-12700KF | 3.60GHz | 25MB | N/A | Alder Lake | C0 | 125W | 100 |
| Core i7-13700T | 1.40GHz | 30MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 35W | 100 |
| Core i7-13700 | 2.10GHz | 30MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 65W | 100 |
| Core i7-13700F | 2.10GHz | 30MB | N/A | Raptor Lake | B0 | 65W | 100 |
| Core i7-13700K | 3.40GHz | 30MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 125W | 100 |
| Core i7-13700KF | 3.40GHz | 30MB | N/A | Raptor Lake | B0 | 125W | 100 |
| Core i7-14700T | 1.30GHz | 33MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 35W | 100 |
| Core i7-14700 | 2.10GHz | 33MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 65W | 100 |
| Core i7-14700F | 2.10GHz | 33MB | N/A | Raptor Lake | B0 | 65W | 100 |
| Core i7-14700K | 3.40GHz | 33MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 125W | 100 |
| Core i7-14700KF | 3.40GHz | 33MB | N/A | Raptor Lake | B0 | 125W | 100 |
| Core i9-12900T | 1.40GHz | 30MB | 0.3 GHz / 1.55 GHz | Alder Lake | C0 | 35W | 100 |
| Core i9-12900 | 2.40GHz | 30MB | 0.3 GHz / 1.55 GHz | Alder Lake | C0 | 65W | 100 |
| Core i9-12900F | 2.40GHz | 30MB | N/A | Alder Lake | C0 | 65W | 100 |
| Core i9-12900K | 3.20GHz | 30MB | Intel® UHD Graphics 770 | Alder Lake | C0 | 125W | 100 |
| Core i9-12900KF | 3.20GHz | 30MB | N/A | Alder Lake | C0 | 125W | 100 |
| Core i9-12900KS | 3.40GHz | 30MB | Intel® UHD Graphics 770 | Alder Lake | C0 | 150W | 100 |
| Core i9-13900T | 1.10GHz | 36MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 35W | 100 |
| Core i9-13900 | 2.00GHz | 36MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 65W | 100 |
| Core i9-13900F | 2.00GHz | 36MB | N/A | Raptor Lake | B0 | 65W | 100 |
| Core i9-13900K | 3.00GHz | 36MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 125W | 100 |
| Core i9-13900KF | 3.00GHz | 36MB | N/A | Raptor Lake | B0 | 125W | 100 |
| Core i9-13900KS | 3.20GHz | 36MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 150W | 100 |
| Core i9-14900T | 1.10GHz | 36MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 35W | 100 |
| Core i9-14900 | 2.00GHz | 36MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 65W | 100 |
| Core i9-14900F | 2.00GHz | 36MB | N/A | Raptor Lake | B0 | 65W | 100 |
| Core i9-14900K | 3.20GHz | 36MB | Intel® UHD Graphics 770 | Raptor Lake | B0 | 125W | 100 |
| Core i9-14900KF | 3.20GHz | 36MB | N/A | Raptor Lake | B0 | 125W | 100 |

Source: https://www.gigabyte.com/Motherboard/Z790-AORUS-ELITE-X-AX/support#support-cpu
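To make the "no new silicon" point concrete, here's a quick check over a hand-picked subset of rows from the table above, showing that every 14th-gen entry's die/stepping pair already ships in 13th gen:

```python
# (die, stepping) for selected SKUs, transcribed from the Gigabyte
# CPU-support table above (subset chosen by hand for illustration).
silicon = {
    "i3-13100":  ("Raptor Lake", "H0"),
    "i5-13600K": ("Raptor Lake", "B0"),
    "i7-13700K": ("Raptor Lake", "B0"),
    "i9-13900K": ("Raptor Lake", "B0"),
    "i3-14100":  ("Raptor Lake", "H0"),
    "i5-14600K": ("Raptor Lake", "B0"),
    "i7-14700K": ("Raptor Lake", "B0"),
    "i9-14900K": ("Raptor Lake", "B0"),
}

gen13 = {die for sku, die in silicon.items() if sku.split("-")[1].startswith("13")}
gen14 = {die for sku, die in silicon.items() if sku.split("-")[1].startswith("14")}

# Every Gen 14 die/stepping pair already appears in Gen 13:
print(gen14 <= gen13)  # True
```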

If you like me already have a 12th or 13th Gen Core architecture, then there's absolutely no reason to upgrade
Oh, it rarely ever makes sense to do a gen-on-gen upgrade. The main case where someone might do that is if they're coming from a lower-spec part, especially if they can keep the motherboard and RAM they already have. So, like going from a Gen 12 i3 to a Gen 14 i7 could make a lot of sense.
 
That only makes sense to the extent the two are separable. For instance, you don't see reviews of laptop CPUs, but rather entire laptops. That's because the behavior of the same CPU is so heavily influenced by the particular laptop that a bench review of the CPU is both difficult and fairly meaningless.
That's why every laptop reviewer who's worth anything says you can't compare performance between them. Nobody reviews just laptop CPUs, because there are too many variables. With desktops you can limit the variables to a degree, but if you don't, the review becomes as pointless as trying to do a laptop CPU review.
I'll agree that showing those two extremes is useful, though not entirely sufficient.
What extremes? Every reviewer other than TPU is only using power limits removed as that's the default motherboard setting.
Even with baseline numbers, you still are limited in your ability to make predictions. That's the part I'm wrestling with.
For enthusiasts if you have both you should be able to come to reasonable conclusions performance wise. It's the best you can do without having reviewers do tons more work (like as you mentioned before swapping CPUs and each running tests on them).
To the extent motherboards truly and meaningfully differ in their stock settings and behavior, then maybe we need to revert to what we did in the old days, like when I started reading Tom's back in the late '90s. Back then, motherboards would routinely impact performance in measurable ways. As a result, there were motherboard shootouts that included benchmarks.
These could still be extremely important as proven by HUB's B650 roundup. Honestly I think the only reason they're not really done much anymore is that it's very labor intensive and likely wouldn't have reasonable ROI. There aren't many motherboard reviews at all anymore which is unfortunate because that means less accountability.
Of course there is. A DIY builder wants to do a new build or upgrade. They need to decide on which components to use. In order to do that, they need some fairly specific idea of how different options will perform and behave. If you only test CPUs in somewhat artificial scenarios of all-or-nothing limits, then it doesn't give them enough information to make all the decisions that go into a successful build.
This entire time you've been advocating for power limits removed testing... that is the default setting...

That's why MSI raised the thermal throttle limit, and likely why Anandtech had a notably higher Cinebench R23 MT score than TPU/Tom's.
Intel is the one claiming the CPU uses x & y Watts, in base & turbo, respectively. To make such claims with the knowledge they typically won't hold true is blatant false advertising.
What is typical? You can't define that unless you start putting in asterisks. The vast majority of these CPUs will never see an enthusiast board, so where's the line here? Intel's specifications are real and do exist; they just don't stop board partners or OEMs from running outside of them (this works both ways, if you've seen any of GN's Dell reviews).
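To make the PL1/PL2 distinction concrete: Intel's turbo budgeting lets the package draw up to PL2 while a moving average of package power stays under PL1, with the averaging window (tau) setting how long a burst can last. This is a simplified threshold model of that behavior, not Intel's exact implementation, using Intel's published 125W/253W/56s defaults for the i9-14900K:

```python
# Simplified sketch of Intel's turbo power budgeting: the package may
# draw up to PL2 while an exponentially weighted moving average of power
# stays below PL1; the window tau controls how long a PL2 burst lasts.
# A board that "removes power limits" effectively sets PL1 = PL2.
import math

PL1, PL2, TAU = 125.0, 253.0, 56.0  # watts, watts, seconds

def sustained_power(demand_w, avg_w, dt=1.0):
    """Return the power allowed this step and the updated moving average."""
    allowed = min(demand_w, PL2) if avg_w < PL1 else min(demand_w, PL1)
    alpha = 1.0 - math.exp(-dt / TAU)  # EWMA weight for a dt-second step
    avg_w += alpha * (allowed - avg_w)
    return allowed, avg_w

# Simulate a sustained all-core load: the chip bursts at PL2 for roughly
# 40 seconds, then the average catches up and it settles at PL1.
avg, trace = 0.0, []
for _ in range(300):
    drawn, avg = sustained_power(300.0, avg)
    trace.append(drawn)

print(trace[0], trace[-1])  # 253.0 125.0
```

This is why the same CPU scores very differently in a 30-second burst benchmark versus a 10-minute render, and why the board's chosen limits matter so much to review numbers.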
Nvidia's situation is different, because they don't sell naked GPUs. You can only buy their products in the form of a board. So, it's the board-provider's responsibility to specify accurate power utilization numbers (if any).
Oh do CPUs work without motherboards then? Does every LGA1700 motherboard, or even a majority of them, remove power limits by default? Does Nvidia not specify power consumption for their GPUs? Does Nvidia not sell founder's edition GPUs that adhere to their specifications?

You want it one way for Intel and another for Nvidia, simple as that, and I don't know why. If you think it's the responsibility of the board partners for Nvidia, then it should be the responsibility of the motherboard makers to specify when their boards remove power limits by default.
Ultimately, what's the "right" thing for reviews to test is what's most useful to end-users. These reviews don't exist in a vacuum - their purpose is to inform prospective CPU buyers, so they can make the best decisions for their needs.

Consequently, if you benchmark something other than out-of-the-box behavior, you're failing your most important readers. If different boards have meaningfully different behaviors, then the reviews should try to sample them enough to cover the distribution.

What's decidedly not useful is to review the CPU at settings that it won't typically run at. That's mostly just serving internet commentators who like to have pitched battles between CPUs - like fantasy sports leagues.
The part you're happily ignoring with this nonsense is that when reviewers only review with power limits off, nobody can draw a reasonable conclusion from that data.

Also just for note because you're so obsessed with out-of-the-box why is XMP/EXPO okay? It most certainly isn't enabled by default.

One thing I do wonder is whether Intel's reviewers guide (assuming they still send them) has a testing recommendation regarding power settings.
 

bit_user
Intel's specifications are real and do exist they just don't stop board partners or OEMs from running outside of them (this works both ways if you've seen any of GN's Dell reviews).
This part is where you keep going off the rails. If Intel isn't doing anything to enforce these limits, then they're mere suggestions and must be noted as such. They're just an arbitrary line in the sand that effectively means almost nothing.

Does Nvidia not specify power consumption for their GPUs? Does Nvidia not sell founder's edition GPUs that adhere to their specifications?

You want it one way for Intel and another for Nvidia simple as that and I don't know why.
The most granular part an end-user can buy is the graphics card. That's where you need to look for the power specifications. I don't know why this is so hard for you to grasp. It's a crystal clear distinction.

The part you're happily ignoring with this nonsense is that when reviewers only review power limits off nobody can make a reasonable conclusion off of that data.
Where did I ever say that? Please tell me, because it would be news to me.

Also just for note because you're so obsessed with out-of-the-box why is XMP/EXPO okay? It most certainly isn't enabled by default.
If reviewers only reviewed with non-out-of-the-box settings, then I do consider that a problem. But, since you're just making up my position as you go along, I guess it doesn't really matter to you what I think.

I still don't know exactly what your agenda is, but I don't think you're truly interested in what's best for the DIY builder. If you are, then you should be able to justify why each of your positions is the best possible option for them. If not, then we will never agree, because we don't have the same interests at heart.
 
This part is where you keep going off the rails. If Intel isn't doing anything to enforce these limits, then they're mere suggestions and must be noted as such. They're just an arbitrary line in the sand that effectively means almost nothing.
Why should they be noted as a suggestion? It's what the part is rated for which means it's what the part is guaranteed to work at.
The most granular part an end-user can buy is the graphics card. That's where you need to look for the power specifications. I don't know why this is so hard for you to grasp. It's a crystal clear distinction.
Then why does Nvidia bother putting power specs out for their parts? Maybe because, much like Intel, they have specifications for the parts they put out but don't demand partners adhere to them. You cannot run a CPU without a motherboard, and the motherboard settings dictate how the CPU runs, as they always have. The end user does not know what they're getting, settings-wise, without transparency from the motherboard manufacturer.
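For what it's worth, on a Linux box with the kernel's intel-rapl powercap driver you can see the PL1/PL2 values a board actually programmed, without trusting the manufacturer's documentation. A sketch (the sysfs paths are standard, but only exist on Intel systems, so each read is guarded; "long_term" is PL1 and "short_term" is PL2):

```shell
# Inspect the power limits the board actually set, via sysfs.
RAPL=/sys/class/powercap/intel-rapl:0
for c in 0 1; do
    if [ -r "$RAPL/constraint_${c}_name" ]; then
        printf '%s: ' "$(cat "$RAPL/constraint_${c}_name")"
        # readings are in microwatts; convert to watts
        awk '{ printf "%.0f W\n", $1 / 1e6 }' "$RAPL/constraint_${c}_power_limit_uw"
    fi
done
```

On a board that removes limits, both values typically come back equal (or absurdly high), which is exactly the transparency problem: nothing on the box tells you that.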

Here's a completely different example of specs/settings not matching: the N series processors only support 16GB according to spec, but the router boxes coming out of China officially support 32GB (according to the box manufacturer and user experience). I'm not trying to say this is the same thing, as it's not, but it's another example of board makers doing what they want.
Where did I ever say that? Please tell me, because it would be news to me.
I don't know how I can possibly make this more clear: the "default" settings you keep championing and defending are power limits removed.
If reviewers only reviewed with non- out-of-the-box settings, then I do consider that a problem. But, since you're just making up my position as you go along, I guess it doesn't really matter to you what I think.
I'm asking you: if it's not okay to change the power settings to enforce PL2 for a review (your stance this entire time), why is it okay to enable XMP/EXPO for one?

Perhaps you're not okay with those being enabled, and if so I apologize for assuming it wasn't an issue for you.
I still don't know exactly what's your agenda, but I don't think you're truly interested in what's best for the DIY builder. If so, then you should be able to justify why each of your positions is best possible option for them. If not, then we will never agree because we don't have the same interests at heart.
I've been extremely clear about what my position is: for the purposes of a CPU review, Intel's spec is far more relevant than a specific board's defaults (review the motherboard if you want board defaults, and I would very much like to see more motherboard reviews), but for enthusiasts, board-default results are very important data to have in addition.
 