News Intel Core i9-14900K, i7-14700K and i5-14600K Review: Raptor Lake Refresh


t3t4

Prominent
Sep 5, 2023
Can't say I see any sense in this 14th version of the same damn thing, nor do I understand how they can claim 6 GHz when it won't hit and hold it. I don't care about video games; I'm all about video editing, and my 13900K will do 6.1 GHz for like a nanosecond, then hangs at 5.4 all day. I won't call it a 5.8 GHz chip when it lives at 5.4. I've limited mine to 295 watts, because any more just turns my PC into a bigger space heater with no real benefit. These chips just want to live on the thermal throttle, and I see no sense in that whatsoever. So I'm definitely skipping the 14th gen of the same damn thing!
 

Maebius

Splendid
Feb 17, 2017
In what sense? That too many games benefit too much from X3D? We do try to pick games that are more likely to show a delta based on CPU performance, rather than just taxing the GPU.

For a lot of games, especially if you're not using an RTX 4090, the gap between 7800X3D/7950X3D and the other CPUs will be smaller, yes. You need to look at the big picture, or just find benchmarks of the games you actually play to see if there's a meaningful difference.
I do own a 7800X3D, and your pick of games appears to favor the X3Ds more than it "should"... that might be a subjective opinion, of course.
Not that the X3Ds aren't better in many games, just that in the ones you picked, their % difference is greater than in others.
I'm not saying you did it intentionally; maybe nine games is just too small a sample in my eyes... BUT
If you choose something like Factorio (which performs superbly on X3Ds), then you could also put in a title that favors the Intels... I don't know, Starfield maybe?
Obviously, there's limited time to test every game around, so I understand.
Keep up the good work!
 

evdjj3j

Distinguished
Aug 4, 2017
I do own a 7800X3D, and your pick of games appears to favor the X3Ds more than it "should"... that might be a subjective opinion, of course.
Not that the X3Ds aren't better in many games, just that in the ones you picked, their % difference is greater than in others.
I'm not saying you did it intentionally; maybe nine games is just too small a sample in my eyes... BUT
If you choose something like Factorio (which performs superbly on X3Ds), then you could also put in a title that favors the Intels... I don't know, Starfield maybe?
Obviously, there's limited time to test every game around, so I understand.
Keep up the good work!
Just so you're aware, he's not the author of the article.
 
What's making me laugh is that I jokingly "called it," not ACTUALLY expecting Intel to release what are essentially overclocked 13th-gens. I thought they'd have higher clocks, but I didn't really expect them to consume more power. As we can see, though, they do:
[Image: power consumption chart]

The i9-14900K is actually less power-efficient than the already overly-thirsty i9-13900K. Well, that places it in the same category as the ridiculousness known as the i9-13900KS, not a good place to be.

This isn't a refresh; it's literally just some overclocked 13th-gens, and it's more of a step backward than a step forward. That's the real joke.
 
The i9-14900K is actually less power-efficient than the already overly-thirsty i9-13900K. Well, that places it in the same category as the ridiculousness known as the i9-13900KS, not a good place to be.
That depends heavily on how you have things set up...
Over at TechPowerUp, the 14900K shows quite a bit better efficiency than the 13900K and KS in Cinebench, for example, if you run it at stock rather than overclocked and/or with the power limits removed. In H.265 encoding, TechPowerUp also has the 14900K slightly ahead of the 13900K and KS. Stock to stock, both run at 253 W, and the 14900K wins by a couple of seconds.
[Charts: efficiency-multithread.png and encode-h265.png]
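To make the stock-vs-unlimited point concrete, here's a rough sketch of the math with made-up numbers (my illustration, not TechPowerUp's actual data): efficiency is just benchmark score divided by average package power, so a few percent more score for a big jump in power wrecks the points-per-watt figure.

```python
# Points-per-watt with hypothetical numbers (illustration only, not TechPowerUp data).
def points_per_watt(score: float, avg_package_watts: float) -> float:
    return score / avg_package_watts

stock = points_per_watt(38_000, 253)      # ~150 pts/W at the stock 253 W limit
unlocked = points_per_watt(40_000, 330)   # ~121 pts/W with the power limits removed

print(f"stock: {stock:.0f} pts/W, unlocked: {unlocked:.0f} pts/W")
# About 5% more score for about 30% more power is a big efficiency loss,
# which is why "stock vs. unlimited" can flip which chip looks more efficient.
```

Stock to stock at the same 253 W cap, whichever chip scores higher (or finishes sooner) is simply the more efficient one.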
 

sitehostplus

Honorable
Jan 6, 2018
I recently switched my main home PC from an Intel 12th-gen to an AMD 7900X3D, and after this review I'm even happier with that choice. The AMD CPUs are significantly more power efficient. And now with this near-space-heater-grade "refresh" Intel just launched? Yes, I am feeling smug :)
You and me both. That 7950X3D is aging well right now.
 
Great tests, but I still want to see it all run with an OEM cooler. Can you just use the standard cooler from iBuyPower or Alienware and see if Intel's 385W chips thermal throttle? I know Reddit indicates that real-world Intel chips from OEMs always throttle, falling far shy of the benchmark numbers from reviews.
 
Like most of these CPU articles, they can't just say it makes no difference in real-world game usage...

The problem is that until GPUs catch up, mostly in affordability, the summary for future CPUs will likely be similar: they are all way overpowered compared to the video cards they are commonly paired with.
People buy i7's and i9's for bragging, not for gaming.

EDIT: I should have said "Most" at the front of that sentence.
 

t3t4

Prominent
Sep 5, 2023
People buy i7's and i9's for bragging, not for gaming.
Bragging about what? How efficient they are at heating small spaces? I couldn't care less about games, but I'm all about frames rendered per second. My 13900K will come in handy this winter when it's crunching some real numbers while keeping my 1,000 sq. ft. house warm in the process! There might be a viable balance point in that, somewhere. But all the top-tier chips just run away until they hit their thermal limit. It's just where we're at in 2023/24/25...
 

saunupe1911

Distinguished
Apr 17, 2016
Well, the good news is that there's no need to upgrade if you've bought a CPU within the last few years. They are pretty close to each other unless you bought an extremely low-core-count budget CPU.

I'm not upgrading until a CPU can truly push a 4090 further than current high-end CPUs can. Otherwise my 5900X will do.
 

saunupe1911

Distinguished
Apr 17, 2016
People buy i7's and i9's for bragging, not for gaming.
Totally false. On the flip side, I don't know anyone who buys a CPU specifically for gaming. Every soul on this earth that I've ever met buys a CPU for other needs besides gaming. I see folks saying "mostly" or even "primarily," but gaming won't take up 100% of their time on that machine.
 
The opening shot appears to show three brand-new processors on a piece of grey-and-white flecked carpet, a well-known source of static charge. Perhaps they're only dummy non-working CPUs, but it's hardly a good example for Tom's readers to follow.

When buying second-hand components on eBay, I avoid all auctions showing CPUs, RAM, GPUs and mobos resting on carpet. It's one of the worst sources of ESD damage.
I assumed they did that photo for irony. But you're right that MANY people wouldn't interpret it as a joke, to the point that it's probably a bad idea to show it like that.

I also agree with the eBay carpet deal-breaker.
 
Totally false. On the flip side, I don't know anyone who buys a CPU specifically for gaming. Every soul on this earth that I've ever met buys a CPU for other needs besides gaming. I see folks saying "mostly" or even "primarily," but gaming won't take up 100% of their time on that machine.
Bragging about what? How efficient they are at heating small spaces? I couldn't care less about games, but I'm all about frames rendered per second. My 13900K will come in handy this winter when it's crunching some real numbers while keeping my 1,000 sq. ft. house warm in the process! There might be a viable balance point in that, somewhere. But all the top-tier chips just run away until they hit their thermal limit. It's just where we're at in 2023/24/25...
I should have said "most." You do need an i7: the i7 is a FANTASTIC buy for those who need the horsepower, and so are some of the Ryzen 7s. The i9 isn't much better, but it costs a lot, IMO.

I would argue most of the people posting here could use an i7 or better. But that's not who drives the economy. It's driven by people paying $2.5K for an Alienware desktop that pairs an i9 with an RTX 3060.
 
Like most of these CPU articles, they can't just say it makes no difference in real-world game usage; nobody would read their articles and no advertiser would pay either. Even though a lot of the 14th-gen reviews are very close to saying that.

If we ignore the crazy Counter-Strike player who plays on low settings, nobody who buys a $1,600 video card is going to run at 1080p. For people who can only afford to run at 1080p, the cost of better performance is likely not going to be CPU-related.

This means almost everyone building a machine for gaming is going to be GPU-bound, and they will see no difference even between the newest chips and, say, older chips like a 5800X3D.

The problem is that until GPUs catch up, mostly in affordability, the summary for future CPUs will likely be similar: they are all way overpowered compared to the video cards they are commonly paired with.
Have you even taken a logic course? Because yours is faulty. I'll explain the problem:

1) They need to review the CPU, and part of that review is determining its "power/speed," whatever you want to call it, because clock speed alone means nothing.

2) In order to determine the speed/power of a part in a complex machine, you have to control for the influence of the other parts. Otherwise, instead of measuring the part you want to test, you end up measuring the slowest part of the machine.
- A dead giveaway that you're not testing the one part you're changing in your test rig is when the results don't change; that means you're not testing anything. It's like trying to find the fastest runner in a group by only watching the person who finishes last. That tells you nothing.

3) The solution is to eliminate the slow runners: use fast RAM, fast GPUs, and fast SSDs, make sure you have plenty of power and cooling, and use the same parts for every CPU you're testing.

4) Next, you have to make sure you're not benching the GPU, so you test at low resolution; that way the bottleneck (the slowest part) is the CPU being tested.

They are testing these the right way, because they don't know what YOU will use the CPU for, and knowing which CPU is the strongest/best for gaming helps inform decisions.
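To put point 4 in concrete terms, here's a toy model (my own illustration with made-up numbers, not anything from the review's methodology): the frame rate you actually see is roughly capped by whichever of the CPU or the GPU is slower in a given game, so testing at high resolution mostly measures the GPU's cap and hides real CPU differences.

```python
# Toy bottleneck model with made-up numbers (illustration only, not real benchmark data).
# Effective FPS is roughly capped by whichever of the CPU or GPU is slower.
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    return min(cpu_fps_cap, gpu_fps_cap)

cpu_caps = {"fast CPU": 240, "slow CPU": 180}   # hypothetical CPU-limited frame rates
gpu_caps = {"1080p": 300, "4K": 90}             # hypothetical GPU-limited frame rates

for res, gpu_cap in gpu_caps.items():
    for cpu, cpu_cap in cpu_caps.items():
        print(f"{res}: {cpu} -> {effective_fps(cpu_cap, gpu_cap)} fps")

# At 1080p the two CPUs show 240 vs. 180 fps, a real CPU delta.
# At 4K both show 90 fps, and the test would only be measuring the GPU.
```

That's all low-resolution testing does: it pushes the GPU cap high enough that the number you measure belongs to the CPU, not the graphics card.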

-----

What you're asking for is something else: you want to know about the "your use" experience. This is the old AMD fanboy argument from back when Piledriver was on offer. The argument went like this: I'm gaming at 1080p or 1440p, GPUs get maxed out before the Piledriver CPU does at those resolutions, so why buy an i5-3570K when I can buy an FX-8350 and get the same user experience at 1440p?

Well, there is a point to that AMD argument. It's true that even though the i5 had almost 45% better IPC, at 1440p there was no graphics card on the market that would make the FX-8350 feel slower than the i5. However, how many people were on 1440p in 2013? What about people who were doing more than just gaming? What about a few years down the road when the GTX 1080 Ti launched? If you had gone with the FX-8350, you'd need to replace the whole system in order to upgrade to the 1080 Ti.

So offering test results that show the i5 is 45% faster DOES matter, just as it matters to show that these chips are basically identical to the 13th-gen chips, except they run so hot they're almost impossible to cool, they throttle on 360 mm rads, and they offer almost no performance uplift over the 13th gen. If you want the best chips on the market, you'll still be buying AMD. That's why this testing is needed: even if it doesn't show exactly what you want to see, it shows everything a consumer in the market needs to know to make an informed decision.
 
Great tests, but I still want to see it all run with an OEM cooler. Can you just use the standard cooler from iBuyPower or Alienware and see if Intel's 385W chips thermal throttle? I know Reddit indicates that real-world Intel chips from OEMs always throttle, falling far shy of the benchmark numbers from reviews.
One of the tech-tubers talked about the throttling issues and said they couldn't keep the chips cool with one of the absolute best AIO liquid coolers on the market (Arctic Liquid Freezer II 360 mm); it overheated in pretty much every scenario they tried. They had to use a custom loop with a 480 mm rad to perform their testing.
 
No. That is incorrect on two points. This assumes everyone buys CPUs for gaming, and even if that were true, i7s and i9s are clearly better at doing it than the lesser parts.
My bad. I should have said "most".

While your argument that i7s are better for gaming is technically true, the GPU is practically always the bottleneck with an i5/R5 or better, so it's not really a practical performance point. And the CPU often only becomes the limit at resolutions that a new i7/RTX build isn't even played at.
 
One of the tech-tubers talked about the throttling issues and said they couldn't keep the chips cool with one of the absolute best AIO liquid coolers on the market (Arctic Liquid Freezer II 360 mm); it overheated in pretty much every scenario they tried. They had to use a custom loop with a 480 mm rad to perform their testing.
That's been the case with Intel chips for 5+ years. But reviewers use open benches with extremely high-end AIOs and don't EVER report on throttling. I complain on every article, but you'll NEVER see Paul Alcorn use a mid-range cooling solution or an OEM case with bad airflow in a review.

In the real world, Intel loses at every price point (R9 > i9, R7 > i7, R5 = i5), based on the limited data I've seen from Reddit: about a 12% performance drop from throttling in OEM builds.

Additionally, you won't see reviews pitting Intel running DDR4 against Ryzen running DDR5, a matchup Ryzen always wins. Yet each review still criticizes AMD for not supporting DDR4 while claiming performance equality between the two. It's a blatant logic error that should offend tech/sci-fi nerds. It offends me, at least.

At least this review wasn't all about DDR4 support.
 