News Intel Core i9-12900K and Core i5-12600K Review: Retaking the Gaming Crown

because DDR5 performs the same or better in almost every scenario
In most games it doesn't. Also, the point is the i5-12600KF DOESN'T NEED the help of DDR5 to beat the 5800X. It can beat it with or without DDR5. So you were making an apples-to-oranges price comparison.

...if you go a DDR4 board you will be gimping yourself later on....
And you are not gimping yourself by going with the DDR4-only 5800X? My point is that DDR5 is an option, an extra feature, NOT a drawback, as you are insinuating in your apples-to-oranges price comparison.
 
How is this a scientific test? No PBO on any of the AMD processors? An AMD 5600X still wrecks these new Intel chips. Literally with the click of a button in the BIOS you enable PBO and safely run AMD overclocking without any issues, and you go straight to the leaderboards. Would love to see some more scientific and fair testing.
I see what you are on about: apparently MCE ON is now classed as the normal testing setup on all Intel CPUs and motherboards! (By Intel)
It is like having PBO on by default on Ryzen to boost your scores, so realistically PBO should be on for all testing as well. As has been shown, it makes almost no difference to gaming but certainly helps a lot with higher scores for anything multicore!
 
In most games it doesn't. Also, the point is the i5-12600KF DOESN'T NEED the help of DDR5 to beat the 5800X. It can beat it with or without DDR5. So you were making an apples-to-oranges price comparison.


And you are not gimping yourself by going with the DDR4-only 5800X? My point is that DDR5 is an option, an extra feature, NOT a drawback, as you are insinuating in your apples-to-oranges price comparison.
There are no like-for-like comparisons to be made right now anyway, because Intel is gimped on Win10 and AMD is gimped on Win11. The 5800X and 12600K/F are so close at 1080p that they are virtually indistinguishable from one another. My advice: if you already have 9th-gen Intel or Ryzen 3000 and up, it's a hard pass on upgrading to Alder Lake.

2. No. You are not gimping yourself by getting DDR4 on the AM4 platform, because it's the only option. You only gimp yourself if you have a better option and intentionally choose the worse one, as I elaborated in my previous post.
 
  • Like
Reactions: King_V
Intel's new 12900K does <Mod Edit> in Windows 10, at or below the 5950X except for power use. Intel's new chips use WAY more power to get to this performance level; the 12900K uses almost double what the 5950X does at its base. We can hope Windows 11 lowers this somewhat, but with the base wattage so high, it doesn't look likely.
 
I see what you are on about: apparently MCE ON is now classed as the normal testing setup on all Intel CPUs and motherboards! (By Intel)
It is like having PBO on by default on Ryzen to boost your scores, so realistically PBO should be on for all testing as well. As has been shown, it makes almost no difference to gaming but certainly helps a lot with higher scores for anything multicore!
My Asus Prime X570-P has BIOS defaults for PBO, Core Boost, etc. all ON by default. Even after a full BIOS reset it is all still on by default.
So why was this turned off for testing? It appears to be a standard setting for most mid- to high-end X570 boards.
 
I agree with all your other points, but this one is just hyperbole. You are not forced to run the CPU at whatever the mobo happens to default to; you can go into the BIOS and select whatever power limit you want.
The 5900X running at 88W is 8% faster than the 12900K running at 88W; that's not the end of the world, and you don't need heavy cooling or a good board for 88W.
At 125W the 12900K matches the 5950X at 88W, and 125W compared to 88W isn't going to break the bank: heat will be practically the same and you don't need any better equipment.
The difference between the 12900K at 125W and at 241W is a laughable 7-10%.

[attached chart: rq26KdY.jpg]
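A quick back-of-the-envelope check of the scaling described above. The 8% and 7-10% figures come from this post; the normalized scores below are illustrative assumptions, not measurements:

```python
# Rough perf-per-watt comparison using the figures quoted in this thread.
# All numbers are illustrative approximations, not lab measurements.

# Normalized multicore performance at each power limit
perf = {
    ("5900X", 88): 1.08,    # quoted as ~8% faster than the 12900K at 88W
    ("12900K", 88): 1.00,   # baseline
    ("12900K", 125): 1.30,  # assumed: roughly matches a 5950X at 88W
    ("12900K", 241): 1.40,  # quoted: only ~7-10% over the 125W config
}

for (cpu, watts), score in perf.items():
    print(f"{cpu} @ {watts}W: perf={score:.2f}, perf/W={score / watts:.4f}")

# The jump from 125W to 241W nearly doubles power for <10% more performance:
gain = perf[("12900K", 241)] / perf[("12900K", 125)] - 1
power_increase = 241 / 125 - 1
print(f"241W vs 241W cap: +{gain:.0%} perf for +{power_increase:.0%} power")
```

With these placeholder scores, the 125W-to-241W step costs roughly 93% more power for about 8% more performance, which is the point being made.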

But... so the 88W 5900X is about the same as the 12900K at 125W, right? And 125W/240W is just a tiny bit faster... then... what's the point of the 12900K at all? A new mobo, new RAM, new Windows, for "almost" the same performance? I do not think that's a great selling point.

Most people will probably just run it at 241W with a 360mm AIO (at around 85-92°C depending on the workload). Like someone already said here, people looking at the 12900K won't give a crap about energy savings and climate change (they either don't care or don't believe in all that, and they have no idea how much worse it will get very soon if we don't start doing something now, and fast, but this is not the place to talk about that, forgive me).

I think, as someone mentioned already, Alder Lake is a decent starting point for Intel after all the 14nm+++++++ iterations. But I can only guess and hope that the following chips (10nm+?) will really be better (truly better, the whole package).

And I also agree with Phaaze88: Windows 11 and Intel 12th gen are really too immature to jump on quickly, unless you need a completely new build now, in which case they become an option to consider.

EDIT: Sorry, I needed to add this. I'm also very confident that in a few months (2, 3, 4 months) the numbers for Alder Lake will only get better as Win 11, BIOS, and drivers mature, and you can also expect more games and apps to get patches for it. I have no idea what AMD can do in that time with the current 5000 lineup (other than drop prices, if Intel gets really strong stock and prices close to MSRP). But people have also been waiting too long to get a new GPU... so will they spend the money on a new CPU+mobo+RAM, or will they finally get the long-awaited new GPU at any price and leave the platform upgrade for the future?
 
I agree with all your other points, but this one is just hyperbole. You are not forced to run the CPU at whatever the mobo happens to default to; you can go into the BIOS and select whatever power limit you want.
The 5900X running at 88W is 8% faster than the 12900K running at 88W; that's not the end of the world, and you don't need heavy cooling or a good board for 88W.
At 125W the 12900K matches the 5950X at 88W, and 125W compared to 88W isn't going to break the bank: heat will be practically the same and you don't need any better equipment.
The difference between the 12900K at 125w and at 241w is a laughable 7-10%.

[attached chart: rq26KdY.jpg]
The heck? The point you quoted from me has nothing to do with that. It's about the CPU's package and the power used under heavy all-core workloads. Under any other loads it's not even an issue.
Even so, with the last 2 i9s under their Intel-defined PL1 + PL2 limits, the NH-D15 struggles with the 12900K, but not with the other 2 CPUs under those specific loads.
Thermal density is playing a role here. It's a similar deal with Ryzen as it progressed from 14nm FinFET to TSMC 7nm FinFET.
 
You are not gimping yourself getting DDR4 on the AM4 platform because its the only option.
Your options are not limited to AMD or AM4. When you have the competitor's DDR5 option, then you are surely gimping yourself (according to your own argument that DDR5 is better).

Finally, just one word: COPE.
 
Out of stock across the board except Amazon has 2 in stock for $1399. LMAO. Which is exactly why I went with a Ryzen 5900X. This will be a repeat of the RTX 3000 series GPU waiting game.
 
Out of stock across the board except Amazon has 2 in stock for $1399. LMAO. Which is exactly why I went with a Ryzen 5900X. This will be a repeat of the RTX 3000 series GPU waiting game.

Well, it's not a surprise. I mean, I checked Newegg like 5 hours ago and there was no stock and the price was at $650, and no one heard Intel claiming they would flood the market with thousands and thousands of chips.
So it was bound to be out of stock the first day.

And yes, Intel will have to do a lot better than AMD in terms of stock if they really want to take the "gaming" crown.
 
Why? Everybody is on the power-conscious trip right now, so why not show both so that people can actually choose which one is better for them?
The 12900K shows so much improvement that running it at base power would still be fine.
People need both PBP and MTP numbers to make an informed decision.
I think the issue is more that the performance benchmarks are run with MCE on for Intel while AMD's equivalent is off, which is not a fair comparison!
 
Well, I could supposedly buy one right now, but it is priced exactly the same as the Ryzen 5950X at $1,099.
I was led to believe there would be plentiful stock and they would be $589 US.
Normally they are about 50% over the US price, but these ones are getting towards 100% over.
Starting to sound like a paper launch!
Oops, too late! The 12900K ($1,099) is sold out, and so is the 12700KF ($699).
 
My Asus Prime X570-P has BIOS defaults for PBO, Core Boost, etc. all ON by default. Even after a full BIOS reset it is all still on by default.
So why was this turned off for testing? It appears to be a standard setting for most mid- to high-end X570 boards.
Well, there you go. Say no more. That's probably the thing that annoys me the most: everything is skewed in Intel's favour!
Most websites do it happily and knowingly. Some state what they have done and why. Some show you nine-tenths of nothing!
Look, these new Intel CPUs are faster, though not massively, as they have more cores. I have seen one website comparing the 12900K to the 5800X and 11900K for the whole review, which is just dumb!
The one constant is it's always skewed in Intel's favour, and nothing much is ever said about it.
Blatant industry bias!

Don't give me any fanboy crap, as I have had many more Intel systems than AMD.
 
How is this a scientific test? No PBO on any of the AMD processors? An AMD 5600X still wrecks these new Intel chips. Literally with the click of a button in the BIOS you enable PBO and safely run AMD overclocking without any issues, and you go straight to the leaderboards. Would love to see some more scientific and fair testing.

PBO is not enough, because even with PBO, Ryzen will still be limited by power. PBO still has a socket limit of 142W, while these Alder Lake CPUs have a 241W limit... roughly a 100W difference. This still puts Ryzen at a disadvantage, especially the 5900X/5950X.

The results clearly show how the massive power headroom (241W vs 105W, since there's no PBO) gives Alder Lake a big advantage. In reality, however, the difference is far smaller. If we turn off the MTP and limit Alder Lake to just 125W, the results will be much, much closer.

The real problem for AMD now is marketing. Not many people will know about this 241W limit. It's not the advertised 125W TDP; it's boost power.

OK, with Alder Lake we need a different approach to benchmarking. We can't simply set everything to stock and auto, due to the massive disparity in power consumption. Perhaps there needs to be some way to equalize power consumption in order for testing to be fair. If not, there should also be an overclocked mode where both CPUs are set to a fixed clock speed for similar power consumption. Then we would know, given similar power consumption, how each CPU performs.
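The "equalize power before comparing" idea could be sketched as a tiny comparison helper. All of the scores below are placeholders I made up; a real harness would set the limits in the BIOS (Intel PL1/PL2, AMD PPT) and rerun the benchmark at each cap:

```python
# Sketch of the "equalize power before comparing" idea from this post.
# Scores are placeholders, not benchmark results.

results = {
    "12900K": {125: 980, 241: 1060},   # hypothetical scores at each power cap
    "5950X":  {105: 970, 142: 1010},   # 142W = PPT ceiling with PBO enabled
}

def score_at_cap(cpu_results, cap_watts):
    """Best score achieved at or below the given power cap, or None."""
    eligible = {w: s for w, s in cpu_results.items() if w <= cap_watts}
    if not eligible:
        return None
    return max(eligible.values())

cap = 142  # compare both parts within AMD's PBO socket limit
for cpu, runs in results.items():
    print(f"{cpu} at <= {cap}W: {score_at_cap(runs, cap)}")
```

Capping both chips at the same budget (here, AMD's 142W PBO ceiling) is one way to answer "how does each CPU perform at similar power consumption," which stock-vs-stock testing can't.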
 
I did know that Alder Lake supports DDR4. There does not seem to be a point in arguing with you, so I will just state my opinion. There is no point in getting a Z690 board with DDR4 support, because DDR5 performs the same or better in almost every scenario. Also, DDR5 is only ever going to get faster as it matures, so if you go with a DDR4 board you will be gimping yourself later on. The 12600KF is only ever slightly better than a 5800X with a 3090 playing games at 1080p, which nearly nobody does. As soon as you go to 1440p+, or have, let's say, a 3060 Ti, that performance difference melts away. You also have to stick with Windows 11, which is objectively terrible in its current state, to actually take advantage of the new architecture. So I will not "accept and move on."
You must be new to this. The reason the 3090 is used at 1080p is that it makes the test CPU-dependent. At 1440p and up, it becomes GPU-dependent.
 
You must be new to this. The reason the 3090 is used at 1080p is that it makes the test CPU-dependent. At 1440p and up, it becomes GPU-dependent.

Talking about DDR4 vs DDR5: I am not sure why DDR5 uses a lot more power than DDR4. A 15-25W difference is massive.
 
- Because it's faster in most applications, cheaper, and consumes almost the same power... So there's no reason to buy the 5800X anymore unless you're a hardcore AMD fanboy.
I expect that prices of existing Ryzen 5000 processors will be adjusted fairly quickly, putting them roughly in line with the competition from a value perspective. 5000-series pricing was never particularly great to begin with, as AMD's 7nm manufacturing capacity was limited and they knew the chips would be in high demand, given their leading performance. Once that leading performance is surpassed though, the prices should drop to around what we were seeing for the 3000-series on a per-core basis. They should cost a similar amount to make after all, so it would still be profitable for them to sell the 5600X for around $200 or less, and the 5800X for $300 or less. I would expect these processors to see big discounts for "Black Friday" sales this month, and then the sale prices may stick around thereafter, at least until the new 3D V-Cache models drop in a few months. Of course, that's assuming this isn't a paper launch and that you will actually be able to buy Alder Lake for close to MSRP before the year is through.

3) Just skip DDR4 motherboard options. DDR5 kits will only get better with time. The ones out now 'suck', and if you get 12th gen with a DDR4 combo, you'll be 'inclined' to change later. Just ouch.
Eh, I'm not so sure. Even with a DDR5 kit currently costing more than double what a DDR4-3600 kit can be had for, the DDR4 appears to be able to outperform it in things like games, and performs fairly similar overall, making the gains from using DDR5 kits rather questionable. Sure, all-around faster DDR5 will come eventually, but there's no guarantee it will necessarily be fully compatible with these processors.

And it's not like people are going to be clamoring to upgrade to faster RAM that improves their performance slightly over their existing RAM. RAM upgrades typically only make sense if you need more RAM than you currently have, but it seems likely that 32GB will be plenty for the vast majority of systems for a number of years to come, given the slow increase in memory utilization that we've been seeing for years. 32GB of DDR4-3600 CAS18 is readily available for around $120. Going with DDR5 for some slight performance gains in certain software is a bit like paying a huge premium for higher-frequency DDR4. In general, the faster RAM doesn't tend to make that much of a difference, at least not enough to justify the price difference. I suppose going the DDR5 route might not make a huge difference to a multi-thousand dollar enthusiast-level system built around a 12900K, but DDR4 might be better suited to most 12600K builds.

By the time faster DDR5 is available and reasonably priced, there will undoubtedly be newer, faster CPUs out, that will probably require new motherboards anyway. I'm not even sure that getting an overclocking-capable motherboard will be worthwhile for these processors either, seeing as they don't appear to overclock particularly well.
 
The loss of AVX-512 is unfortunate, because Intel engineering did an AMA not long ago where they stated that they're going to continue pushing AVX-512. I'm expecting it to return in future iterations of this architecture once they sort out the additional complexity in the ISA; Intel is not going to confirm or deny that today. 🙂 I'm going to build one of these anyway since I keep two rigs, but for someone looking to keep a machine further out, AVX-512 has proven to be extremely potent when it's used.
But it is still there; go and read AnandTech. If you disable the E-cores and have the right motherboard, you can turn AVX-512 back on and use it. It is listed for the new Xeon CPUs anyway.
 
But... so the 88W 5900X is about the same as the 12900K at 125W, right? And 125W/240W is just a tiny bit faster... then... what's the point of the 12900K at all? A new mobo, new RAM, new Windows, for "almost" the same performance? I do not think that's a great selling point.
If you already have a 5900X or a 5950X, or even an 11900K or a 10900K, there is no point in the 12900K; who said there is?
But if you are upgrading from a much older system, you could go for the much higher single-core performance of the 12900K, and the power draw is not a prohibitive factor.
The heck? The point you quoted from me has nothing to do with that. It's about the CPU's package and the power used under heavy all-core workloads. Under any other loads it's not even an issue.
Even so, with the last 2 i9s under their Intel-defined PL1 + PL2 limits, the NH-D15 struggles with the 12900K, but not with the other 2 CPUs under those specific loads.
Thermal density is playing a role here. It's a similar deal with Ryzen as it progressed from 14nm FinFET to TSMC 7nm FinFET.
5) Probably best to forget about air cooling this CPU for games. The power that can be generated in that small package appears to be too much for them.
Yes, some reviews have shown that air cooling is off the table for this CPU specifically in all-core workloads, but there are some games out there that can do this too, or come close.
You were talking about forgetting air cooling even for games because apparently you think they run at 241W, but even the heaviest all-core loads can be run at 125W and still roughly match the 5950X.
You could cool it with a stock Intel cooler and it would be fine, because it would automatically throttle down to the 125W the cooler can handle, and you would still be close to 5950X performance.
 
It's a good showing from Intel. It either matches or exceeds current AMD offerings in performance for most gaming/productivity workloads while having competitive pricing. However, it is not a game-changer in my opinion. If you consider overall platform costs (a new motherboard is expensive even if you don't go for super-expensive DDR5), it's not the greatest leap, at least in gaming; in certain productivity workloads it's quite a big leap. Also, the power consumption is still kind of worrying, even though they moved to a supposedly much more efficient architecture. IMO, Alder Lake is like what first-gen Ryzen was for AMD: a demonstration of what they can do, and with some more iterations/improvements it could be spectacular. Gaming is marginally better and certain productivity workloads are much better, but various platform areas could be more mature (Windows 11, DDR5 availability and prices, use of efficiency cores in certain areas) before it can be called great. I'm very excited that Intel is back and better than ever, with a lot of hope for what comes next in 13th gen as well as AMD's response.

edit: Also, I'm extremely excited for the released i5 and the unreleased lower-end parts, since Intel's lower end looks increasingly great in perf/$. I just hope Intel has decent B-series and H-series motherboards. Maybe unlock the lower-end boards? Please, Intel!
 
You must be new to this. The reason the 3090 is used at 1080p is that it makes the test CPU-dependent. At 1440p and up, it becomes GPU-dependent.
You do not understand my point; I am not new to this. I was referring to the difference in results if, say, a 3060 Ti were used instead of a 3090. If a 3060 Ti were used, the performance uplift at 1080p that the 12000 series has over AMD 5000 would not be ~8-9%. My point is that so few people with a 3090 play at 1080p that the minute difference between Intel 12000 and Ryzen 5000 would shrink with anything less than a 3090. A more typical setup for most people is 3060 Ti or 3070 levels of performance at 1080p or 1440p. In any case, if you are using less than a 3090 at 1080p or a higher resolution, the performance benefit of Alder Lake shrinks considerably. My other points also stand.
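The argument about lesser GPUs shrinking CPU differences can be illustrated with a crude bottleneck model: delivered fps is capped by whichever of the CPU or GPU is slower. The frame-rate ceilings below are invented for illustration; only the rough ~8% CPU gap mirrors the numbers discussed in this thread:

```python
# Crude bottleneck model: delivered fps = min(CPU ceiling, GPU ceiling).
# All numbers are hypothetical, chosen to mirror the ~8% gap in this thread.

def delivered_fps(cpu_fps_limit, gpu_fps_limit):
    """Frames actually delivered: capped by the slower component."""
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_limits = {"12600K": 216, "5800X": 200}  # hypothetical CPU-bound ceilings
gpu_limits = {"RTX 3090 @ 1080p": 300, "RTX 3060 Ti @ 1440p": 120}

for gpu, gpu_cap in gpu_limits.items():
    a = delivered_fps(cpu_limits["12600K"], gpu_cap)
    b = delivered_fps(cpu_limits["5800X"], gpu_cap)
    print(f"{gpu}: 12600K {a} fps vs 5800X {b} fps ({a / b - 1:+.0%})")
```

With a fast GPU at 1080p the CPUs are the bottleneck, so the ~8% gap shows up in delivered frames; with a slower GPU or higher resolution, the GPU ceiling caps both chips and the gap collapses to zero.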
 
I am astonished to see just how many people don't know that Alder Lake also supports cheap DDR4 memory. The i5-12600KF will still beat the Ryzen 7 5800X even with DDR4, in both multi-threaded and single-threaded benchmarks and real-world applications. Gaming scores will actually be even better with the 12600KF using DDR4 vs DDR5, due to DDR4's lower latency. So the i5-12600KF is the ultimate value king. Just accept it and move on.
It will "beat" the 5800X by 4% on average at 1080p with a 3090, lmao. How about 1440p, 4K, or a lesser GPU?

Yeah, sure take that amazing WIN! /s

Also, why don't you pair your new Alder Lake with DDR4, so you can gimp yourself and need to upgrade again when DDR5 actually makes sense, right? Because then it will actually be much more expensive, for little gain... pfft.

There are so many holes in these Alder Lake CPUs that Swiss cheese will be jealous of them...

This barely-a-win Alder Lake has over Zen 3 is going to be a laugh when Zen 3D comes. Enjoy it while it lasts; it won't be for long.