News Purported 5.5 GHz Intel Core i9-13900K Spotted at Geekbench

rmcrys

Commendable
Jun 11, 2020
The results mean very little until we know how much energy is being pulled from the wall. Intel is known for improving heat handling, not consumption: it chases speed with ever-increasing energy use. AMD and Apple seem to improve instructions per watt; Intel seems to go for more speed, more energy, more heat dissipation. I don't want to burn my house down or increase my energy bills.
 
Reactions: octavecode
Jul 11, 2022
Someone tell me there is CPU technology that dwarfs this i9 - or did we just give China another chunk of our technology to copy?
 

Sippincider

Commendable
Apr 21, 2020
On the other hand, I couldn't care less how much energy it draws from the wall. I just want to go fast. My house is up to fire code and I can afford the extra ~$10 a year in electricity.
For me it's not the energy from the wall, it's getting the heat out of the case!

Somewhere there has to be a practical limit of how many BTUs can be extracted from a given surface area of silicon.
 
Well, the Pentium 4 was ideally going to be worked up to 8 GHz; clearly that couldn't happen.

I hope to see these chips hit a "common" 6 GHz on water... that would be cool.

It leaves me wondering how many more years it will take to hit 8 GHz.

All very exciting.
 
Reactions: KyaraM

shady28

Distinguished
Jan 29, 2007
For me it's not the energy from the wall, it's getting the heat out of the case!

Somewhere there has to be a practical limit of how many BTUs can be extracted from a given surface area of silicon.

Ya, but just how often will you push that CPU to its multi-core limits? I have a 10850K; arguably it has the potential to be one of the most power-hungry chips on the planet when it is power-unlocked and overclocked.

But, to actually see that power ramp, you have to be running something that pegs all cores. How often is that?

Most sites, Tom's included, don't do much power testing for the real world. They run y-cruncher, AIDA stress test, Blender, Handbrake. I don't run any of those normally, nor do I run anything like those, and the only person I know who does run anything like those at all makes a living doing photography - and she uses a Mac.

The most stressful thing most people here will do on their PC is game - and if you have a GPU powerful enough to make a 12900K or this new 13900K really start to draw power, you're clearly not someone who cares much about power. That GPU is going to pull way more power than your CPU.

If you don't have such a GPU (like at least a 3080), and you're not doing rendering and encoding on a regular basis, it's going to be extremely rare to see your CPU draw down a lot of power during actual use.
 
If you don't have such a GPU (like at least a 3080), and you're not doing rendering and encoding on a regular basis, it's going to be extremely rare to see your CPU draw down a lot of power during actual use.
Yep. I've been musing about that. I doubt my PC (excluding monitor) draws 150 watts for more than 20 hours a year... out of about 5,000 hours on per year. It's typically in the 50 to 75 watt bracket.
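The back-of-the-envelope math behind figures like these is easy to check. A minimal sketch (the $0.15/kWh rate is an assumption - plug in your local rate):

```python
# Rough annual electricity cost for a PC's power draw.
# The $0.15/kWh rate is an assumed example, not a quoted figure.
RATE_PER_KWH = 0.15

def annual_cost(watts, hours_per_year):
    """Dollars per year for drawing `watts` over `hours_per_year`."""
    kwh = watts * hours_per_year / 1000
    return kwh * RATE_PER_KWH

# 5,000 hours/year at a typical 60 W desktop load:
print(round(annual_cost(60, 5000), 2))       # 45.0
# Extra cost of the rare 150 W peaks (20 hours/year) over that 60 W baseline:
print(round(annual_cost(150 - 60, 20), 2))   # 0.27
```

Which is why the peak draw barely matters for the bill: it's the baseline times the hours that dominates.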
 

shady28

Distinguished
Jan 29, 2007
Yep. I've been musing about that. I doubt my PC (excluding monitor) draws 150 watts for more than 20 hours a year... out of about 5,000 hours on per year. It's typically in the 50 to 75 watt bracket.
Exactly. I've actually gone so far as to use the performance monitor to model my usage for days at a time. There are, literally, like 2-3 minutes a day where I can see more than 2-3 cores going to high usage. I figured out that those 2-3 minutes are during patching - either the OS, via Steam, or whatever. That's literally like 10 hours per year even if I were to spend 15 hours a day on my computer - which would be 4,500 hours - and I don't average anywhere near that.

Now, if I had a more powerful GPU (I have a 2060 KO) I'd probably see more CPU usage. But this kinda begs the question: I need a 350W GPU to see my CPU go over 150W for more than 2 minutes a day?
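For anyone who wants to reproduce that kind of usage modeling, the counting step is trivial. A sketch - the utilization numbers in the example log are made up, and in practice you'd feed it per-core samples from Performance Monitor or something like psutil's `cpu_percent(percpu=True)`:

```python
# Count how many utilization samples had several cores busy at once.
# Each sample is a list of per-core utilization percentages.

def heavy_samples(samples, busy_threshold=50.0, core_count=3):
    """Count samples where at least `core_count` cores exceed the threshold."""
    return sum(
        1 for per_core in samples
        if sum(1 for pct in per_core if pct >= busy_threshold) >= core_count
    )

# Example: three 1-second samples from a 10-core CPU (made-up numbers):
log = [
    [12, 8, 5, 3, 2, 1, 0, 0, 0, 0],      # light desktop use
    [95, 90, 88, 70, 10, 5, 2, 1, 0, 0],  # a patching/update burst
    [40, 10, 6, 4, 2, 1, 0, 0, 0, 0],     # gaming on a mid-range GPU
]
print(heavy_samples(log))  # 1 - only the patching burst lit up 3+ cores
```

Run over a day's worth of samples, it gives you exactly the "minutes per day with more than 2-3 cores busy" figure described above.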
 
For me it's not the energy from the wall, it's getting the heat out of the case!

Somewhere there has to be a practical limit of how many BTUs can be extracted from a given surface area of silicon.
You are not forced to run an Intel CPU in full overclock mode... it's a couple of clicks in the BIOS to rein it in, the same way it's a few clicks on Ryzen to enable PBO.
Even locked at 125W, the 12900K does very well against the top Ryzen CPUs, and it runs more than 10 degrees cooler with the same cooling while using the same amount of power... So guess which one is going to pump more heat into your room.
If we are pretending to be enthusiasts here, a few clicks in the BIOS shouldn't be an issue.
https://www.hardwareluxx.de/index.php/artikel/hardware/prozessoren/57430-core-i9-12900k-und-core-i5-12600k-hybrid-desktop-cpus-alder-lake-im-test.html?start=8

And you are also not forced to use it at 125W only; you can go as high or as low as YOU want, because no company can dictate how you use YOUR PC.
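For what it's worth, on Linux you don't even need the BIOS trip: the kernel's intel_rapl powercap interface lets you cap the package power limits from userspace. A sketch only - the paths and wattages are examples, root is required, and firmware settings can override these values:

```shell
# Cap Intel package power limits via the Linux powercap interface.
# Example values; run as root. Limits are in microwatts.
RAPL=/sys/class/powercap/intel-rapl:0

cat "$RAPL/name"                                   # expect "package-0"
# PL1 (long-term limit) to 125 W:
echo 125000000 > "$RAPL/constraint_0_power_limit_uw"
# PL2 (short-term burst limit) to 190 W:
echo 190000000 > "$RAPL/constraint_1_power_limit_uw"
```

Same effect as the BIOS clicks, and it's reversible at runtime.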
 

bolweval

Distinguished
Jun 20, 2009
The results represent very little until it is known how much energy is being pulled out from the wall. Intel is known to improve the handling of the heat, not improving the consumption but the speed with increasingly energy use. AMD and Apple seem to improve instruction speed per watt, Intel seems to go more speed, more energy, more heat dissipation. I don't want to burn my house or increase my energy bills.
I worry about energy efficiency when I buy appliances; CPUs, not so much. Unless you are mining or benchmarking, the CPU draws very little power 99% of the time...
 

KyaraM

Notable
Mar 11, 2022
This looks quite nice, actually. I really hope they can push ST performance to a 10% or higher uplift; that would keep AMD at bay. But either way, these are very strong cores already.

Also, this kinda disproves the claim that the E-cores are garbage. Sorry to anyone who thinks that. Actually, no, I'm not.

The results represent very little until it is known how much energy is being pulled out from the wall. Intel is known to improve the handling of the heat, not improving the consumption but the speed with increasingly energy use. AMD and Apple seem to improve instruction speed per watt, Intel seems to go more speed, more energy, more heat dissipation. I don't want to burn my house or increase my energy bills.
A 12900K at stock settings draws as much as a 5950X with PBO, while being faster or equal in most workloads. Just because AMD isn't telling you how much your CPU actually draws doesn't mean it doesn't draw that much. Always remember that such tests are the absolute worst case, too; under normal circumstances, no CPU draws as much as is theoretically possible.

My 12700K has a PL2 of 190W, but runs completely uncapped except for a dynamic undervolt. Not only have I never seen it go far past 160W during stress testing (and now 30W less after the undervolt), I have also never seen it go past 75-80W while gaming, which is most of what I do on the machine (the rest is surfing and some programming). Most of the time it pulls 25-35W, which is laughable compared to the 240W my 3070 Ti pulls. It's also quite easy to cool: it runs under a Pure Rock 2 without going past 80°C thanks to the undervolt.

Also, consider that AMD is upping their power limits while Intel's stay the same. You can still cap a 13900K at PL2 and use 240W without much, if any, performance loss - the graph above points to no loss at all. And then AMD's new chips are right up there with them.
 
Reactions: shady28 and rtoaht

rtoaht

Commendable
Jun 5, 2020
On the other hand, I couldn't care less how much energy it draws from the wall. I just want to go fast. My house is up to fire code and I can afford the extra ~$10 a year in electricity.
You might end up saving power with Raptor Lake, since the E-cores idle at much lower power than Ryzen does. If you run Cinebench 24/7 at peak, then Raptor Lake will consume more. But for most users, web browsing and idling are the more common use cases, and those will be handled entirely by the E-cores.
 
Reactions: shady28

Eximo

Titan
Ambassador
Just says to me that the non-K SKUs are going to be all the more impressive. My 10900F already boosts to 5.2 GHz single-core, more commonly 5.1 GHz in everyday use. The 12900 only goes up to 5.1 GHz max, so a little backwards slide there.

If the 13900 will boost to 5.3 GHz or more with somewhat reasonable power requirements, all is well.

This is an interesting one comparing a severely power-limited 12900 vs the 5800X3D, also undervolted. Skip to 12:25.

View: https://www.youtube.com/watch?v=1rgbJwxSYss
 
Reactions: rtoaht

shady28

Distinguished
Jan 29, 2007
Just says to me that the non-K SKUs are going to be all the more impressive. My 10900F already boosts to 5.2 GHz single-core, more commonly 5.1 GHz in everyday use. The 12900 only goes up to 5.1 GHz max, so a little backwards slide there.

If the 13900 will boost to 5.3 GHz or more with somewhat reasonable power requirements, all is well.

This is an interesting one comparing a severely power-limited 12900 vs the 5800X3D, also undervolted. Skip to 12:25.

View: https://www.youtube.com/watch?v=1rgbJwxSYss
Ya, relevant if you have a 3080 Ti, maybe. Thing is, most people don't have that. And according to TPU, the 3080 Ti FE draws 358W during gaming. So we're gonna undervolt the CPU to save 40W while running a GPU that pulls 360W+?

On lesser cards, the CPU is just idling as it doesn't have to work to keep up with the GPU. Normally, I don't see any significant CPU power draw even during gaming. It's usually below 50W on games.

Case in point: my GPU pulling 215W while my CPU is at 41W. The 10850K is way faster than this GPU can keep up with. This morning, playing and working, it never went above 103.5W.

No reviews have really taken this tack as far as I can see, but my best guess is I'd need at least a 3070 to start drawing any real power during gaming. By that I mean I might maybe see 100W on the CPU, as a 3070 is almost 2x faster than a 2060. But even assuming I hit 100W, with the 3070 running ~240W, both of them together still draw less power than a 3080 Ti does all by itself.

This image shows my GPU pretty much maxing out and my CPU at 41W. The 103W reading was the peak this morning, during some application launches/startups, and it lasted all of about 1 second.

 

Eximo

Titan
Ambassador
The same applies to the GPU, actually. You can lop off a lot of the power demand of a 3080/3080 Ti and still get most of the performance. The performance difference between 2000 MHz and 1800 MHz isn't huge, but the power to do it is.

And, yes, my card by default will sit at 350W. I've been considering undervolting and setting a more reasonable power limit, just to save the heat output. Much as I love blasting the AC, it does get awfully warm in the summer months. Water cooling works almost too well.
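If the card is NVIDIA, the power-limit part of that is a one-liner with nvidia-smi. A sketch - the 280W cap is just an example for a ~350W card, and root is typically required:

```shell
# Inspect and cap the board power limit on an NVIDIA GPU.
nvidia-smi -q -d POWER      # show current, default, and max power limits
sudo nvidia-smi -pl 280     # example: cap the board at 280 W
```

An undervolt via a tool like MSI Afterburner gets you further still, but even a plain power cap trades a few percent of clocks for a large cut in heat output.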
 
Reactions: KyaraM and shady28
