News Intel Core i9-14900K, i7-14700K and i5-14600K Review: Raptor Lake Refresh


Order 66

Grand Moff
Apr 13, 2023
Well, there is a point to that AMD argument. It's true that even though the i5 was almost 45% faster in IPC, at 1440p there was no graphics card on the market that would make the FX-8350 feel slower than the i5. However, how many people were on 1440p in 2013? What about people who were doing more than just gaming? What about 2 years down the road when the GTX 1080 Ti launched? If you had gone with the FX-8350, you'd need to replace the whole system in order to upgrade to the 1080 Ti.
The 1080 Ti launched in 2017, so four years down the road from 2013.
 
I do own a 7800X3D, and your pick of games appears to favor the X3Ds more than it "should"... that might be a subjective opinion, of course.
Not that the X3Ds aren't better in many games, just that in those you picked, their % difference is greater than in others.
I'm not saying you did it intentionally; maybe nine games is too small a sample in my eyes... BUT
if you choose something like Factorio (which performs superbly on X3Ds), then you could also put in a title that favors Intel... I don't know, Starfield maybe?
Obviously, there's limited time to test every game around, so I understand.
Keep up the good work!
Factorio and Minecraft (and Flight Simulator) are interesting cases of games where there's tons of CPU work in a different format than what you might see in a typical FPS game. Factorio incidentally isn't in the overall average — it's just a curiosity.

Honestly, I still don't really know why Flight Simulator likes the big L3 caches so much. Obviously it's a game that can be CPU limited, in ways that a lot of other games are not. I suspect there's some weird stuff in the engine code that just responds well to X3D.

The difficulty comes in trying to extrapolate the results shown here to other games and future games. Things that are GPU limited (lots of ray tracing games) will benefit less. But what will happen with future GPUs? Because while the 4090 is extremely expensive, prices come down and people often use multiple GPU generations with the same base platform.

Conceivably there are people that will upgrade their GPU to a 5080 or 6070 in the future and potentially end up with similar performance to the 4090. Those same people might have an RTX 4060 or 4070 right now (or RX 6700 XT / 7700 XT), with an i5-13600K/14600K or Ryzen 7 7800X3D.

This is why we try to find games that scale more with CPU rather than less, because there will always be lots of games where the answer to "which CPU or GPU is best?" is that it doesn't matter. So we want to know: when your CPU (or GPU) does matter, how much of a difference can it make?

Realistically, if you're buying a high-end CPU for gaming, you'll probably also have a high-end GPU... and you'll probably also have a 4K 144Hz or 3440x1440 144Hz ultrawide monitor to go with it. And if that's what you're running, most of the time your choice of CPU will matter very little relative to your GPU (assuming you game at native resolution). I typically suggest allocating about 3X as much money to your GPU as your CPU if you want a good gaming PC. If you're more about video editing and other tasks that need lots of CPU multi-threading performance, then maybe 2X as much for the GPU.
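If it helps to see that rule of thumb as numbers, here's a minimal sketch (the `split_cpu_gpu_budget` name and the exact ratios are just my illustration of the guideline above, not an official calculator):

```python
def split_cpu_gpu_budget(total, gpu_to_cpu_ratio=3.0):
    """Split a combined CPU+GPU budget using a simple ratio.

    gpu_to_cpu_ratio=3.0 mirrors the roughly 3X-for-the-GPU gaming guideline;
    try 2.0 for builds that lean harder on CPU multi-threading (e.g. video editing).
    """
    cpu = total / (1.0 + gpu_to_cpu_ratio)
    gpu = total - cpu
    return round(cpu, 2), round(gpu, 2)

# Example: a $1,200 combined CPU+GPU budget for a gaming-focused build
cpu_budget, gpu_budget = split_cpu_gpu_budget(1200, gpu_to_cpu_ratio=3.0)
print(cpu_budget, gpu_budget)  # 300.0 900.0
```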
 
The i9-14900K is actually less power-efficient than the already overly-thirsty i9-13900K. Well, that places it in the same category as the ridiculousness known as the i9-13900KS, not a good place to be.
It's basically in the same realm as Rocket Lake and Comet Lake, as far as I'm concerned. I mean, sure, Comet Lake 10900K had potentially two more CPU cores than a 9900K, and Rocket Lake 11900K was a new architecture. But in both of those cases, Intel had pretty much reached the end of the line for performance scaling without killing efficiency. It needed a new process node and then some.

Raptor Lake (not the Refresh) was already somewhat questionable. My 12900K test PC runs just fine, and I never really worry about things. The 13900K, though, can get hot, and shader compilation in some games actually triggers a game or system crash. Maybe it's more the MSI mobo (firmware) that's to blame, or the Cooler Master cooler, or a mixture of CPU, mobo, and cooler. But the fact is, the 13900K is faster but runs hotter than the 12900K, and in my experience the end result can be hit and miss.

Given how far Intel had already pushed its "Intel 7" 10nm node with Raptor Lake, I had no expectations that the Raptor Lake Refresh would rein in power use and improve efficiency. We'll need Meteor Lake or Arrow Lake to hopefully improve those aspects. Those future CPUs could actually be faster and more efficient than RPL-R, rather than just incrementally faster... or maybe just slightly faster but at half the power? I guess we'll see when those launch.
 
Comparing 14600K to 13600K:

Running Blender, I notice that temps are about 5 degrees lower with the 14600K: 92 versus 87.

What's the consensus reason for that?

Random chance/margin of error?

Known differences with the IHS?

Rank speculation only, totally unexplained?

Non-gaming improvements generally seem to be 2 to 4 percent in favor of 14600K. Don't know why you wouldn't take that unless 40 or 50 dollars is a significant hurdle.
 
Since Tom's Hardware is supposed to be a tech site and not just a gamer site, I wish these reviews were more detailed, like TechPowerUp's review. They included machine learning, physics, chemistry, and genome analysis, among other non-gaming programs, while TH relied mostly on standard benchmarks like Cinebench.


[Attached chart: ai-upscale.png]

[Attached chart: genomics.png]
 
  • Like
Reactions: bit_user

kiniku

Distinguished
Mar 27, 2009
Can't say I see any sense in this 14th version of the same damn thing, nor do I understand how they claim 6 GHz when it won't hit and hold it. I don't care about video games; I'm all about video editing, and my 13900K will do 6.1 GHz for like a nanosecond, then hangs at 5.4 all day. I won't call it a 5.8 GHz chip when it lives at 5.4. I've limited mine to 295 watts, because any more just turns my PC into a bigger space heater with no real benefit. These chips just want to live on thermal throttle, and I see no sense in that whatsoever. So I'm definitely skipping the 14th gen of the same damn thing!
It's time for a new generation that's smaller and more power efficient. The original Alder Lake was impressive in its time, but it seems the 13th and 14th generations were mostly clock-speed increases for an already power-hungry CPU compared to Ryzen.
 
  • Like
Reactions: grogi

King_V

Illustrious
Ambassador
It's time for a new generation that's smaller and more power efficient. The original Alder Lake was impressive in its time, but it seems the 13th and 14th generations were mostly clock-speed increases for an already power-hungry CPU compared to Ryzen.
Agreed. As was said when Alder Lake came out, the power consumption wasn't great, but at least it was no longer meme-worthy.

It seems like, with Rocket Lake to some extent, and Raptor Lake to a greater extent, Intel is reaching for that meme-worthy crown again.
 
  • Like
Reactions: helper800

bit_user

Polypheme
Ambassador
@PaulAlcorn , thanks for yet another solid review!

However, I have a concern and a suggestion. First, I noticed some of the i9-13900K results looked a bit anomalous, where it's not only ahead - but also separated from all the other Intel models.

[Attached benchmark chart]

When all the other Intel models are fairly well clustered, how did that CPU not only outperform the i9-14900K, but also break away from the pack by more than the amount that separates its other peers?

And this one seems obviously wrong. Could it be a data entry error? Or maybe the test suffered an early abnormal termination, instead of completing successfully?

[Attached benchmark chart]

It really seems like the i9-13900K's score ought to have been something more like 86.6 - not 60.6!!

As for the suggestion: I think it would be more insightful if the less-relevant Intel and AMD processors had been colored with a dull blue and red, respectively, instead of making them all black. It would help make it more visually apparent which benchmarks so heavily favor Intel or AMD that they all cluster together vs. which are more open to contention.
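Going back to that 60.6 outlier: a quick automated sanity check could flag this kind of data-entry slip before the charts go live. Here's a minimal sketch (the scores other than the 60.6 are made-up stand-ins, and the 15% threshold is an arbitrary illustration, not how TH actually validates its data):

```python
def flag_outliers(scores, tolerance=0.15):
    """Flag results that sit unusually far from the median of their group.

    scores: dict mapping CPU name -> benchmark score.
    tolerance: fractional deviation from the group median that triggers a flag.
    """
    values = sorted(scores.values())
    mid = len(values) // 2
    median = values[mid] if len(values) % 2 else (values[mid - 1] + values[mid]) / 2
    return {cpu: score for cpu, score in scores.items()
            if abs(score - median) / median > tolerance}

# Hypothetical scores, loosely shaped like the chart above
results = {"i9-14900K": 85.9, "i9-13900K": 60.6, "i7-14700K": 84.1, "i5-14600K": 80.3}
print(flag_outliers(results))  # {'i9-13900K': 60.6}
```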
 
This is pretty much as expected, though seeing locked PL2 numbers would have been nice. 14th gen is just a replacement for 13th gen, and that's about it. It does sound like the 14600K may be a better overclocker than the 13600K, but I haven't seen enough results to say so with any certainty. Sure seems like nothing in the desktop consumer CPU space is going to be terribly interesting until ARL/Zen 5.
 
The opening shot appears to show three brand-new processors on a piece of grey and white flecked carpet, a well-known source of static electricity. Perhaps they're only dummy non-working CPUs, but it's hardly a good example for Tom's readers to follow.

When buying second hand components on eBay, I avoid all auctions showing CPUs, RAM, GPUs and mobos resting on carpet. It's one of the worst sources of ESD damage.
What if the piece of carpet is blue with orange flecks, will I be ok?
 
  • Like
Reactions: helper800

Phaaze88

Titan
Ambassador
If you check over on Hardware Canucks:
https://www.youtube.com/watch?v=D8qEzL8MM50


Mobo vendors should stop this crap, or Intel needs to put its foot down on them; one or the other.
What we got here with the i9 looks like a slightly more efficient 13900KS with a 200 MHz bump on the P- and E-core turbo... but board vendors are still going balls to the wall with their stock BIOSes in their peen-measuring contest, and the clueless end user suffers for it.
Looking forward to the 14th gen cooling threads... /S
 
  • Like
Reactions: bit_user
A YouTuber being overzealous with review opinions!?!?! I'm shocked... just shocked...
If you haven't taken the time to check out Steve Burke and the crew at GN, I suggest you do. Potentially the most impartial and accurate review site on the internet (Steve's lede can be a little over the top sometimes for humour's sake, but the charts and benches are solid). They have some of the most stringent test suites and probably the most comprehensive testing equipment you'll see outside of an engineering lab. (They even visit AMD's lab, and I believe an Intel tour is in the works?) The stock of hardware they keep for comparison testing is enormous as well. They're generally pretty accountable for the mistakes they make, either pulling the erroneous video or issuing a correction within it, and they no longer do sponsored videos so as to avoid any conflicts of interest (Asus *cough* Asus) over advertising money. If you want fair, impartial testing, they are as close to that as you will ever get right now. Many sites would do well to look at how they are doing things and incorporate some of it (especially how they are always trying to improve, both with process and production).

Generally speaking, if one enjoyed Anandtech of old, GN kinda scratches that itch, but with epic hair.

Anandtech deep dives were legendary.
 

NinoPino

Commendable
May 26, 2022
I do own a 7800X3D, and your pick of games appears to favor the X3Ds more than it "should"... that might be a subjective opinion, of course.
Not that the X3Ds aren't better in many games, just that in those you picked, their % difference is greater than in others.
I'm not saying you did it intentionally; maybe nine games is too small a sample in my eyes... BUT
if you choose something like Factorio (which performs superbly on X3Ds), then you could also put in a title that favors Intel... I don't know, Starfield maybe?
Obviously, there's limited time to test every game around, so I understand.
Keep up the good work!
As they say, Factorio "...doesn't impact our cumulative measurements...".
I agree with you: too few titles for a significant sample, and titles with no Intel/AMD difference also dilute the average result.
IMHO, in the future, reviewers should also include games like Factorio that are only marginally GPU constrained.
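To make the dilution point concrete, here's a minimal sketch with made-up FPS numbers, combined with a geometric mean (a common way reviews aggregate per-game results; I'm not claiming it's exactly how TH computes its average). Adding one GPU-bound title where every CPU ties shrinks the overall gap even though nothing changed in the CPU-bound games:

```python
from math import prod

def geomean(values):
    """Geometric mean, commonly used to combine FPS results across games."""
    return prod(values) ** (1.0 / len(values))

# Hypothetical per-game averages (FPS) for two CPUs across three CPU-bound titles
cpu_a = [200, 180, 160]   # stand-in for an X3D-style chip
cpu_b = [160, 150, 140]   # stand-in for a non-X3D chip

gap_cpu_bound = geomean(cpu_a) / geomean(cpu_b) - 1

# Add one GPU-bound title where both CPUs deliver the same 90 FPS
gap_with_tie = geomean(cpu_a + [90]) / geomean(cpu_b + [90]) - 1

print(f"{gap_cpu_bound:.1%} vs {gap_with_tie:.1%}")  # roughly 19.7% vs 14.4% with these numbers
```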
 
We'll need Meteor Lake or Arrow Lake to hopefully improve those aspects. Those future CPUs should actually be faster and more efficient than RPL-R, rather than just incrementally faster... or maybe just slightly faster but half the power? I guess we'll see when those launch.
And you will never notice, because you (everybody) only run the CPUs full blast...
Only if RPL-R is a terrible, terrible node with an extremely low power ceiling will we see it post low power draw numbers.
See Phaaze88's post for details.
If you haven't taken the time to check out Steve Burke and the crew at GN, I suggest you do. Potentially the most impartial and accurate review site on the internet (Steve's lede can be a little over the top sometimes for humour's sake, but the charts and benches are solid). They have some of the most stringent test suites and probably the most comprehensive testing equipment you'll see outside of an engineering lab. (They even visit AMD's lab, and I believe an Intel tour is in the works?) The stock of hardware they keep for comparison testing is enormous as well. They're generally pretty accountable for the mistakes they make, either pulling the erroneous video or issuing a correction within it, and they no longer do sponsored videos so as to avoid any conflicts of interest (Asus *cough* Asus) over advertising money. If you want fair, impartial testing, they are as close to that as you will ever get right now. Many sites would do well to look at how they are doing things and incorporate some of it (especially how they are always trying to improve, both with process and production).

Generally speaking, if one enjoyed Anandtech of old, GN kinda scratches that itch, but with epic hair.

Anandtech deep dives were legendary.
All the things you say are 100% true, but his personal opinions and jabs at Intel are still super overblown, just to make a boring article more clickbaity...
I mean, imagine if he had made this video as a comparison between 14th gen and AMD's new gen that just didn't come out (13th gen and Zen 4 launched the exact same day, and AMD didn't release anything now), so he'd be comparing 14th gen against nothing, and at the end of every result he'd say, "well, that's a poor showing for AMD," or "AMD failed this test as well," or something like that.
That would still be 100% accurate measurements, but it would also be 100% cringe.
 
  • Like
Reactions: thestryker

bit_user

Polypheme
Ambassador
Amazon has been one of my several favorite sources of buying reviews, giving direct feedback from both 'confirmed' independent novice users and enthusiasts who have already put their own money on the line! As you already know... money talks!
There have been plenty of review brokers that will reimburse people for leaving fake reviews. Just because it's a "Verified Purchase" doesn't mean the review is genuine.


Plus, if you return the product within the Amazon return window, I'll bet your review stays up and even keeps the "Verified Purchase" tag.

Generally, after carefully reading as many as 100-200 relevant reviews, especially on the higher-quality brand products, discerning the truth and reading between the lines serves you much better than a single review offered by influencers on the tech channels at large.
As @JarredWaltonGPU pointed out, Amazon reviews rarely give you much in the way of test data. Not sure about you, but when I buy a tech product, I like to have good data as the basis for my decision. Then, if Amazon reviewers point out issues like reliability or compatibility problems, maybe I'll change my mind. However, I always like to start with concrete data.
 

grogi

Commendable
Jun 27, 2021
1080p is more CPU reliant than 1440p or 4K, such that you need a minimum level of CPU power to avoid becoming too CPU constrained. Something like this is more of a mid to upper-mid level PC. This would be perfect for 1080p-1440p gaming at medium to high settings.
Investing $100 or $200 in a better CPU at 1080p cannot bring even half of the performance you can get from spending that money on a better video card. Say you buy an RTX 4070 instead of an RTX 4060 for $200 more: there is no CPU that can bring remotely comparable gains at 1080p, let alone at higher resolutions, for the same money.
 
  • Like
Reactions: P.Amini
Investing $100 or $200 in a better CPU at 1080p cannot bring even half of the performance you can get from spending that money on a better video card. Say you buy an RTX 4070 instead of an RTX 4060 for $200 more: there is no CPU that can bring remotely comparable gains at 1080p, let alone at higher resolutions, for the same money.
Yes, but this is not a gaming benchmark...
This is a CPU benchmark; it doesn't have to make sense for gamers, it just has to show the differences between CPUs.

If you want to figure out what GPU and CPU you should buy for gaming at 1080 or 1440 or 4k or whatever, then you look at gaming benchmarks.
 
If you haven't taken the time to check out Steve Burke and the crew at GN, I suggest you do. Potentially the most impartial and accurate review site on the internet (Steve's lede can be a little over the top sometimes for humour's sake, but the charts and benches are solid). They have some of the most stringent test suites and probably the most comprehensive testing equipment you'll see outside of an engineering lab. (They even visit AMD's lab, and I believe an Intel tour is in the works?) The stock of hardware they keep for comparison testing is enormous as well. They're generally pretty accountable for the mistakes they make, either pulling the erroneous video or issuing a correction within it, and they no longer do sponsored videos so as to avoid any conflicts of interest (Asus *cough* Asus) over advertising money. If you want fair, impartial testing, they are as close to that as you will ever get right now. Many sites would do well to look at how they are doing things and incorporate some of it (especially how they are always trying to improve, both with process and production).

Generally speaking, if one enjoyed Anandtech of old, GN kinda scratches that itch, but with epic hair.

Anandtech deep dives were legendary.
I usually watch GN, but they have no clue how the PC market works and tend to heavily slant the editorial parts of their videos. The introduction to the 14700K review proves that easily, where they claim Intel is "desperate to put something out". I didn't bother watching the rest, as I tend to watch mostly for the commentary and I knew this one wouldn't be for me. Not to single GN out, as this is how the majority of the techtube space views things. I just find it gets really old on certain topics.

The 14xxx series is just refined 13xxx, and Intel gave it a new name because that's what it does for OEMs all the time. Everyone knew this was all it was before the launch, yet some people feel the need to sensationalize it.
 
  • Like
Reactions: Lafong

rambo919

Great
Sep 21, 2023
The 14xxx series is just refined 13xxx, and Intel gave it a new name because that's what it does for OEMs all the time. Everyone knew this was all it was before the launch, yet some people feel the need to sensationalize it.
They are riding the wave of 2023 disappointment when it comes to HW releases.

Be it CPU, GPU or anything else.... it's been a meh year with everything somehow more expensive than it used to be.
 
  • Like
Reactions: helper800