Intel Core i9-14900K, i7-14700K and i5-14600K Review: Raptor Lake Refresh

Status
Not open for further replies.
The 14xxx series is just a refined 13xxx, and Intel gave it a new name because that's what they do for OEMs all the time. Everyone knew this was all it was before the launch, yet some people feel the need to sensationalize it.
Yes, anyone with even the slightest bit of tech skill who comes to read a forum here likely understands this. The problem is that the message still isn't loud enough to reach the general consumer. You would have to put a big frowny-face icon on the ads Intel runs on TikTok to even half get the attention of many consumers.

It seems the vast majority of the population must "buy now", "new and improved", with a "bigger number".

The old movie "Idiocracy" still isn't a documentary, right? :)
 

sitehostplus (Honorable, Jan 6, 2018)
I can imagine! Great buy for you too!
Yes it is. It's not only so powerful that it's kind of ridiculous; it's also putting money back into my pocket via my power bill.

I upgraded from an 8th-gen i7 (not a bad chip, but a little long in the tooth), and my monthly power bill dropped $25. It's literally paying for itself. 😺
 

sitehostplus (Honorable, Jan 6, 2018)
Since Tom's Hardware is supposed to be a tech site and not just a gamer site, I wish these reviews were more detailed, like TechPowerUp's review. They included machine learning, physics, chemistry, and genome analysis, among other non-gaming programs, while TH relied mostly on standard benchmarks like Cinebench.


[Image: ai-upscale.png]

[Image: genomics.png]
And how much overhead in software did that review require?

Price it and get back to me with some hard numbers.
 

watzupken (Reputable, Mar 16, 2020)
What's making me laugh is that I jokingly "called it", not ACTUALLY expecting Intel to release what are essentially overclocked 13th-gens. I mean, I thought that they'd have higher clocks but I didn't really expect them to consume more power, but as we can see, they do:
[Image: power consumption chart]

The i9-14900K is actually less power-efficient than the already overly-thirsty i9-13900K. Well, that places it in the same category as the ridiculousness known as the i9-13900KS, not a good place to be.

This isn't a refresh; it's literally just some overclocked 13th-gens, and they're more of a step backward than a step forward. That's the real joke.
When you push clock speed beyond what the chip can comfortably manage, you run into diminishing returns in clock speed versus power. So to me, it's no surprise that the more aggressive clocks over 13th-gen Raptor Lake make these chips less efficient. Even the extra 100 MHz they can squeeze out of the silicon blows the power requirement out of proportion, and that same 100 or 200 MHz isn't going to make any material impact on performance. It goes to show Intel's desperation here, really. To put this into perspective, Intel's CPU-only power draw is heading into flagship-GPU territory, which is something I wasn't expecting. For such a small package, it is near impossible to keep the chip from throttling with a conventional water-cooling solution.
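The diminishing-returns point can be sketched with the classic CMOS dynamic-power relation, P ≈ C·V²·f, where voltage has to rise as frequency climbs toward the top of the V/f curve. Every constant below is a made-up illustrative value, not a measured figure for any actual chip:

```python
# Toy model: why the last few hundred MHz cost so much power.
# Assumes P = C * V^2 * f, with voltage rising roughly linearly with
# frequency near the top of the V/f curve. All constants are illustrative.

def dynamic_power(freq_ghz, base_freq=5.5, base_volts=1.25,
                  volts_per_ghz=0.10, c=35.0):
    """Estimated package power (W) at a given clock, using toy constants."""
    volts = base_volts + volts_per_ghz * (freq_ghz - base_freq)
    return c * volts**2 * freq_ghz

for f in (5.5, 5.6, 5.7, 5.8, 5.9, 6.0):
    print(f"{f:.1f} GHz -> ~{dynamic_power(f):.0f} W")
```

With these assumed constants, going from 5.5 to 6.0 GHz is about a 9% clock bump but roughly an 18% power increase, because power grows super-linearly when voltage rises with frequency.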
 
Yes it is. It's not only so powerful that it's kind of ridiculous; it's also putting money back into my pocket via my power bill.

I upgraded from an 8th-gen i7 (not a bad chip, but a little long in the tooth), and my monthly power bill dropped $25. It's literally paying for itself. 😺
Lol, so you spend like $1,000 on a new system to save $25 per month, which means you break even in about 3.3 years... you know you could have just gone into the BIOS and saved $25 a month without spending a dime, right?!

Also, the 9th-gen i9 drew around 95W at most as shown by Gamers Nexus, the 8th-gen i7 would have drawn even less, and current Zen chips draw more than that; Tom's shows the 7950X3D drawing 121W in Blender, so God only knows how you are saving any money on your power bill at all...
https://youtu.be/2MvvCr-thM8?t=354
 

ManDaddio (Reputable, Oct 23, 2019)
I appreciate your taking the time to test, but...

You can't use XTU with core isolation on. It's a nice feature, but useless if you care about security. Why Intel uses it as a selling point, I don't know.

Who cares about Hitman, Far Cry 6, or Watch Dogs Legion? Look on Steam and other PC gaming platforms and see what people are actually playing. Test the games most people play, and a good number of them. We don't need you to be first; just give us a good swath of games tested at 1080p to 4K. 1080p is useless to someone like me and many others. I'd mostly like to see how AMD behaves at 1440p and 4K. Intel as well.
 

ManDaddio (Reputable, Oct 23, 2019)
Also, power usage will vary based on needs. Some people saw reduced power usage, on average, going from 13th to 14th gen.
A good number of techtubers know the AMD mob will give them lots of views if they trash Intel and Nvidia whenever they can.
I imagine most Patreon supporters are now on the AMD side. It pays off for AMD.
 
It's basically in the same realm as Rocket Lake and Comet Lake, as far as I'm concerned. I mean, sure, Comet Lake 10900K had potentially two more CPU cores than a 9900K, and Rocket Lake 11900K was a new architecture. But in both of those cases, Intel had pretty much reached the end of the line for performance scaling without killing efficiency. It needed a new process node and then some.
I would go so far as to say that, at least on 14nm, as soon as Intel went beyond 4 physical cores at a 4GHz clock, efficiency went out the window in search of absolute performance. Yes, during gaming the 8700K was only using the same power as the 7700K, but games at that time almost never used more than a couple of threads. Once you got into full-core AVX loads, the 8700K was using 40% more power or more.
 

kiniku (Distinguished, Mar 27, 2009)
Yes it is. It's not only so powerful that it's kind of ridiculous; it's also putting money back into my pocket via my power bill.

I upgraded from an 8th-gen i7 (not a bad chip, but a little long in the tooth), and my monthly power bill dropped $25. It's literally paying for itself. 😺
I'm mostly a gamer with some minor productivity and light streaming, so that CPU would have been throwing money away for me. The X3D, though, is an innovative product. But here comes Meteor Lake, Intel's first chiplet design. LOL
 
I usually watch GN, but they have no clue how the PC market works and tend to heavily slant the editorial parts of their videos. The introduction to the 14700K review proves that easily, where they claim Intel is "desperate to put something out". I didn't bother watching the rest, as I tend to watch mostly for the commentary and I knew this one wouldn't be for me. Not to single GN out, as this is how the majority of the techtube space views things. I just find it gets really old on certain topics.

The 14xxx series is just refined 13xxx and Intel did the new name because that's what they do for OEMs all the time. Everyone knew this is all it was before the launch yet some people feel the need to sensationalize it.
I disagree with the part about them not understanding the market. If anything, I feel they understand it better than most, at least insofar as it jibes with my understanding/needs/wants (I've been doing this for over 30 years, if that counts for anything). I mean, it's likely I'm way off base too, so there's that, and ultimately it's just my thoughts on the matter.

Moving on: yes, Steve can get a bit melodramatic, especially in the lede, and I often skip ahead to the charts and more technical parts of their videos, as that's where the meat and potatoes are (the AMD and Taiwan factory tours were fantastic). I find the same thing here at Tom's and at many other sites as well. Opinions, especially contrary-to-the-herd ones, get views or clicks, so you're going to see that everywhere to some extent. As for Tom's specifically, I'm not here for Paul Alcorn's opinion, as it often differs from my own, but I'm here nonetheless because I value Paul's depth of experience and insight, even if I disagree with the conclusion from time to time. Same for the commentariat: I sometimes disagree with others here, but I value what they have to say and oftentimes learn something of value, or something that changes my opinion on a thing or event. It's a good day if something was learned!

On Steve's point about Intel being "desperate to put something out": I agree with you, more or less. I wouldn't say they were desperate, but they had to either release something or release nothing, and which would be worse in the eyes of stakeholders? I don't know. I do think Intel appears to be up against the ropes in some manner, between TSMC's process dominance and AMD's darling image with many (vocal) enthusiasts. Probably some wiping of brows in engineering, but that's about it. If we're in this same situation next year, then there may be some wringing of hands.

Now, not to blather on, as is my habit from time to time, but I'd like to clarify my original post, as I made it late last night after a 14-hour shift. When I say GN is impartial, what I mean is that if this were AMD, or NVIDIA, or ANYONE ELSE, they would rip them just the same. They don't seem to suffer the brand loyalty many reviewers do. THAT is what I enjoy about GN. Steve's little "number bigger better" sketch was funny, but ultimately didn't add much besides some cringey entertainment value, which some odd people do enjoy...
 
I'm mostly a gamer with some minor productivity and light streaming, so that CPU would have been throwing money away for me. The X3D, though, is an innovative product. But here comes Meteor Lake, Intel's first chiplet design. LOL
I think it's very unlikely that Meteor Lake will have a cache large enough to get close to X3D. At the scale at which Intel has to make CPUs, it would be way too expensive, not just in money but mainly in resources, for something that would not increase their sales above what the new architecture would do anyway.
 
When I say GN is impartial, what I mean is that if this was AMD, or NVIDIA, or ANYONE ELSE they would rip them just the same. They don't seem to suffer the brand loyalty many reviewers do. THAT is what I enjoy about GN.
I think it's fine to enjoy YouTubers doing the obligatory rants and such, but it's also important to realize that those rants, on YouTube, are there specifically to drive views. Because you get paid, directly, by your view count. And sure, sites like ours get paid directly by our pageviews (plus other stuff like ecommerce), but this is the big difference between a YouTuber and small tech sites, versus stuff like Tom's Hardware.

Paul, Avram, myself, and everyone else on the team? We don't get paid directly by pageviews. It's a metric that the corporate overlords look at, so indirectly it's still important. But when you see the things that, for example, GN has said about LTT? Yeah, it's because the people running things have too much of a vested interest in traffic and income.

I get my salary whether I do rants that get a ton of traffic, or tame reviews that get modest traffic, or boring news that doesn't get traffic, etc. If I don't do my job, I could get fired. But I don't make a penny more off of a great article or a poor article, as far as traffic and clicks are concerned. Future PLC makes more (or less), but not the actual writers. And that's a good thing. This makes people far more impartial in general.

Feigned anger in search of views is a proven tactic on YouTube, and before that, on TV. This is a fact. Maybe it's not even feigned anger all the time! But the number of "controversial hot takes" that you see in videos versus the number you see from major publications? I don't have proof that it's higher on YT, but my gut says that it absolutely is higher — a lot higher.
 
I think it's fine to enjoy YouTubers doing the obligatory rants and such, but it's also important to realize that those rants, on YouTube, are there specifically to drive views. Because you get paid, directly, by your view count. And sure, sites like ours get paid directly by our pageviews (plus other stuff like ecommerce), but this is the big difference between a YouTuber and small tech sites, versus stuff like Tom's Hardware.

Paul, Avram, myself, and everyone else on the team? We don't get paid directly by pageviews. It's a metric that the corporate overlords look at, so indirectly it's still important. But when you see the things that, for example, GN has said about LTT? Yeah, it's because the people running things have too much of a vested interest in traffic and income.

I get my salary whether I do rants that get a ton of traffic, or tame reviews that get modest traffic, or boring news that doesn't get traffic, etc. If I don't do my job, I could get fired. But I don't make a penny more off of a great article or a poor article, as far as traffic and clicks are concerned. Future PLC makes more (or less), but not the actual writers. And that's a good thing. This makes people far more impartial in general.

Feigned anger in search of views is a proven tactic on YouTube, and before that, on TV. This is a fact. Maybe it's not even feigned anger all the time! But the number of "controversial hot takes" that you see in videos versus the number you see from major publications? I don't have proof that it's higher on YT, but my gut says that it absolutely is higher — a lot higher.
Thanks for this response, Jarred. Clear, measured, and insightful; this is the kind of stuff I come to Tom's for. There aren't many editors who would take the time to respond in such a manner. Now excuse me as I allude to my prior post... it has been a good day!
 
Until it dies we'll probably have the 20th gen, scratch that, until it becomes somewhat slow enough to consider upgrading we'll have the 20th gen.
I'm still rocking an i7-4770K from 2013. The only reason I'm looking to upgrade is that I need more than 32GB of RAM to really make my desktop a good virtual home lab. Overall the CPU has enough grunt for the 6700XT and the games I play.
 
It's basically in the same realm as Rocket Lake and Comet Lake, as far as I'm concerned. I mean, sure, Comet Lake 10900K had potentially two more CPU cores than a 9900K, and Rocket Lake 11900K was a new architecture. But in both of those cases, Intel had pretty much reached the end of the line for performance scaling without killing efficiency. It needed a new process node and then some.

Raptor Lake (not the Refresh) was already somewhat questionable. My 12900K test PC runs just fine and I never really worry about things. The 13900K though can get hot, and shader compilation in some games actually triggers a game or system crash. Maybe it's more the MSI mobo (firmware) that's to blame, or the Cooler Master cooler, or a mixture of CPU, mobo, and cooler. But the fact is, 13900K is faster but runs hotter than 12900K, and in my experience the end result can be hit and miss.

Given how far Intel had already pushed its "Intel 7" 10nm node with Raptor Lake, I had no expectations that the Raptor Lake Refresh would rein in power use and improve efficiency. We'll need Meteor Lake or Arrow Lake to hopefully improve those aspects. Those future CPUs could actually be faster and more efficient than RPL-R, rather than just incrementally faster... or maybe just slightly faster at half the power? I guess we'll see when those launch.
I didn't expect Intel to rein in power consumption all that much; I just expected that, since this was a refresh, they'd be able to push a few hundred extra MHz without increasing power consumption. From what we're seeing here, this isn't a refresh at all; it's just Intel managing to get faster stable clocks on the same 13th-gen silicon. This "refresh" could just be chalked up to Intel selling higher-binned 13th-gen silicon and calling it 14th-gen. I think a more honest name for the i9-14900K would be the i9-13950K, but hey, you and I both know just how important something like honesty is to big corporations like this, eh? ;)
 
Great tests, but I still want to see it all run with an OEM cooler. Can you just use the standard cooler from an iBuyPower or Alienware machine and see if Intel's 385W chips thermally throttle? I know Reddit indicates that real-world Intel chips from OEMs always throttle, falling far shy of benchmark numbers from reviews.
I agree. Intel 13th and "14th" gen CPUs are essentially "Liquid Cooling Required" CPUs if you want to avoid thermal throttling.
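A rough way to see why these power levels strain coolers is a lumped thermal-resistance model, T_die ≈ T_ambient + P·θ_total. The θ values below are guesses for illustration, not measurements, and the lumped model ignores hotspot density, which is the real limiter on these dense dies:

```python
# Back-of-the-envelope check on why ~250-330 W is hard to cool.
# Simple model: T_die = T_ambient + P * theta_total, where theta_total lumps
# die-to-IHS, paste, and cooler resistance. Theta values are rough guesses.

def die_temp(power_w, theta_c_per_w, ambient_c=25.0):
    """Estimated steady-state die temperature in degrees C."""
    return ambient_c + power_w * theta_c_per_w

T_JMAX = 100.0  # Raptor Lake throttles around 100 C

for name, theta in [("big air cooler", 0.30), ("360mm AIO", 0.22),
                    ("custom loop", 0.17)]:
    for p in (125, 253, 330):
        t = die_temp(p, theta)
        note = "throttles" if t >= T_JMAX else "ok"
        print(f"{name:>14} @ {p} W -> ~{t:.0f} C ({note})")
```

Under these assumed numbers, 253 W is already borderline on air and 330 W pushes even good liquid cooling close to the 100 C throttle point, which lines up with the "liquid cooling required" sentiment above.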
 
When you push clock speed beyond what the chip can comfortably manage, you run into diminishing returns in clock speed versus power. So to me, it's no surprise that the more aggressive clocks over 13th-gen Raptor Lake make these chips less efficient.
Then it's not really a refresh, it's just Intel selling higher-binned 13th-gen silicon and calling it "14th-gen". A refresh means that the silicon was re-worked and usually results in higher clocks (even if only slightly so) without increased power consumption. If Intel referred to the i9-14900K as the i9-13950K, then I would have no issues with it but they're trying to call this a whole new generation even though it's not.
Even the extra 100 MHz they can squeeze out of the silicon blows the power requirement out of proportion, and that same 100 or 200 MHz isn't going to make any material impact on performance.
I agree with you on that one for sure!
It goes to show Intel's desperation here, really. To put this into perspective, Intel's CPU-only power draw is heading into flagship-GPU territory, which is something I wasn't expecting. For such a small package, it is near impossible to keep the chip from throttling with a conventional water-cooling solution.
And yet, for some crazy reason, people keep buying them and so do OEMs.
 
I didn't expect Intel to rein in power consumption all that much; I just expected that, since this was a refresh, they'd be able to push a few hundred extra MHz without increasing power consumption. From what we're seeing here, this isn't a refresh at all; it's just Intel managing to get faster stable clocks on the same 13th-gen silicon. This "refresh" could just be chalked up to Intel selling higher-binned 13th-gen silicon and calling it 14th-gen. I think a more honest name for the i9-14900K would be the i9-13950K, but hey, you and I both know just how important something like honesty is to big corporations like this, eh? ;)
To be fair they did increase performance without increasing power consumption, but no premium motherboards on the market run them at their stock PL2. TPU's review shows peak stock 14900K power consumption within 5W of the 13900K and in the application tests they were 1W apart all while maintaining a slight performance bump. This is still on Intel because they don't seem to care about this behavior, but these chips are no different than 9th-10th gen refreshes.
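For anyone curious what power limits their own board actually applies: on Linux, the intel_rapl powercap driver exposes PL1/PL2 through sysfs. This is a sketch assuming that driver is loaded; zone numbering and available constraints can vary by platform, so it degrades to an empty result when the paths aren't there:

```python
# Read the PL1/PL2 power limits the platform is actually enforcing, via the
# Linux intel_rapl powercap sysfs interface. Requires that driver; returns
# an empty dict on systems where the zone doesn't exist.

from pathlib import Path

def read_rapl_limits(zone="intel-rapl:0"):
    """Return {constraint_name: watts} for a RAPL zone, or {} if unavailable."""
    base = Path("/sys/class/powercap") / zone
    limits = {}
    for c in range(2):  # constraint 0 is typically long_term (PL1), 1 short_term (PL2)
        name_f = base / f"constraint_{c}_name"
        limit_f = base / f"constraint_{c}_power_limit_uw"
        if name_f.exists() and limit_f.exists():
            # Values are reported in microwatts; convert to watts.
            limits[name_f.read_text().strip()] = int(limit_f.read_text()) / 1_000_000
    return limits

if __name__ == "__main__":
    for name, watts in read_rapl_limits().items():
        print(f"{name}: {watts:.0f} W")
```

On a board that ignores Intel's stock PL2, you'd expect the short_term value here to be far above 253 W (often set to the maximum), which is exactly the behavior being discussed.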
 
To be fair they did increase performance without increasing power consumption, but no premium motherboards on the market run them at their stock PL2. TPU's review shows peak stock 14900K power consumption within 5W of the 13900K and in the application tests they were 1W apart all while maintaining a slight performance bump. This is still on Intel because they don't seem to care about this behavior, but these chips are no different than 9th-10th gen refreshes.
Well, that is better than it appeared, but I still think that it's disingenuous to call these CPUs a different generation. I think that a much more fitting name for the i9-14900K would have been the i9-13950K because this isn't really a new generation of CPUs.
 

Amdlova (Distinguished)
Well, that is better than it appeared, but I still think that it's disingenuous to call these CPUs a different generation. I think that a much more fitting name for the i9-14900K would have been the i9-13950K because this isn't really a new generation of CPUs.
This is the new gen 11+++ edition; it needs greater numbers to show the advanced node of the advanced node plus edition alpha+. Any day now Intel will launch a new way to run benchmarks, and some benchmarks will have an ultra zoom mode.
 