News Intel Core i7-10700K Comet Lake CPU Spotted Boosting to 5.3 GHz

jgraham11

Distinguished
Jan 15, 2010
Seems to be a typo, as I am pretty sure Intel has been having problems with their 10nm node for a lot longer than since 2018...

I wonder if Comet Lake will fix the known CPU bugs: Meltdown, Spectre, ZombieLoad, Fallout, CacheOut.

Will Intel recommend users disable Hyper-threading like they have on their previous and current processors because of these bugs...
 
So another early benchmark results thingy with no benchmark results.

So if the specs are somehow accurate, it will be something like an i9-9900K/KS, just probably not the same performance, because if it were the same performance I can foresee a lot of high-end Intel adopters and fanboys who bought these expensive CPUs getting really pissed.

The only thing to do is wait and see what happens when the independent reviews show up.
 

joeblowsmynose

Distinguished
So is the 5.29 what the CPU reported its boost clock to be, or what the software actually read the boost clock as? Because if it is not the latter, then the article title is very misleading and click-baity ... "spotted boosting to ..." ...
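For what it's worth, the only reading I'd really trust for a claim like that is one where someone sampled the cores while a load was actually running, instead of echoing the chip's advertised maximum. Something like this rough Python sketch (my own example using the third-party psutil package, not whatever tool produced the leak) would show the difference:

# Rough sketch: record the peak frequency each core is actually observed
# running at, rather than the advertised max boost. Assumes Python 3 with
# psutil installed; per-core readings aren't available on every platform,
# in which case psutil may return a single entry.
import time
import psutil

peaks = {}
for _ in range(30):                                   # sample for ~30 seconds under load
    for core, f in enumerate(psutil.cpu_freq(percpu=True)):
        peaks[core] = max(peaks.get(core, 0.0), f.current)   # current speed in MHz
    time.sleep(1)

for core, mhz in sorted(peaks.items()):
    print(f"core {core}: peak observed {mhz:.0f} MHz")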

I am also thinking that Intel may be trying to do with boost what AMD did for Zen2 - very short time-frame boosting that might aid in some "bursty" operations, as AMD put it.

Unfortunately, if this turns out to be the case, it won't offer near the performance boost (pun intended) of the full-time single or dual core boost that we see in the 9th gen. Just as with Ryzen, when AMD "fixed" the boost clock issue, there really wasn't a whole lot of performance gain.

We'll have to wait and see though ...


BTW a 10 core 14nm CPU sounds like a terrible idea to me (10900K). It still won't compete with high-end Ryzen, won't clock as well as the 10700K, and will be hotter than the surface of the sun. Intel should scrap that stupid idea and put the resources into making the best 6 and 8 cores they can on 14nm and defend their tiny gaming crown ... actually, the way these rumours are shaping up, this might actually be what is happening ... the 10700K will be way more desirable than a 10900K ... if you need more than 8 cores, the intelligent choice right now is Ryzen. The 10900K should be relegated to the "fanboi special" ;)
 
Last edited:
  • Like
Reactions: bit_user

So the 7900X and TR 1920X were terrible ideas? Because the 7900X was a 10 core 14nm, first gen Skylake-X, and the TR 1920X was a 12 core 14nm CPU.

The only difference here is the boost clock rates and sustained clock rates for all cores. The 7900X was a much earlier 14nm so it was not as able to sustain the clock speeds that the 10900K will.

I will withhold judgement on this CPU until seeing its actual Battlefield 5 framerates compared to those of the existing 9700K/9900K...

A single core boosting to 5.3 GHz for a max of 30 seconds then downclocking itself by 500-600 MHz due to TDP restrictions will not win over many gaming fans...

I would think the clock speed they can maintain overclocked on all cores would matter more for gamers. I doubt most 9700K/9900K/KS are run at stock speeds, nor would the 10700K or 10900K be.
 
No one is denying that many overclock their CPUs, myself included....

But, true 'stock' behavior might only have a max turbo clock hit for 30 seconds or so... (yes, it's easily overcome, but many comparisons will be done in stock configuration, vice MCE-enabled, TDP limits removed, etc...). It's also possible that with upcoming CPU-integrated Meltdown/Spectre mitigation microcode, the 5.3 GHz CPUs might only match or barely exceed the results of the previous gen, but I hope that is not the case.

With 5.3 GHz turbos, I'm sure many are already dreaming of all-core overclocks of 5.4 GHz, but I'd wager 5-1 that this does not occur very often, and it would undoubtedly be at 160-180 watts TDP if it did...
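If anyone wants to check the wattage claims themselves rather than argue about them, Linux exposes the package energy counter through Intel RAPL. A rough sketch of my own (not a review methodology; the sysfs path can differ per machine and reading it usually needs root):

# Rough sketch: estimate average package power from the Intel RAPL energy
# counter on Linux. The counter is in microjoules and wraps around at
# max_energy_range_uj (wrap-around ignored here for brevity).
import time

RAPL = "/sys/class/powercap/intel-rapl:0/energy_uj"   # package-0 energy counter

def read_uj():
    with open(RAPL) as f:
        return int(f.read())

WINDOW = 5.0                          # seconds to average over
start = read_uj()
time.sleep(WINDOW)
end = read_uj()

watts = (end - start) / 1e6 / WINDOW  # microjoules -> joules, then per second
print(f"average package power: {watts:.1f} W")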

I suspect an 8c/16t part at higher clocks would likely fare better in most gaming scenarios than a 10c/20t part at 200-300 MHz lower speeds...

It will be an interesting year for the enthusiast, as Ryzen 4000 desktop arrives to compare to existing Intel 9th gen and upcoming 10700K/10900K...
 
  • Like
Reactions: bit_user

Of course. A 10700K will probably be able to maintain higher clocks longer. The 9700K is seen as the better gaming CPU and in most cases overclocked higher than the 9900K did.

I don't think this year will be very interesting. When Intel finally gets 10nm out of the way and moves to 7nm, that will be interesting. Ice Lake and Tiger Lake have both shown promising performance improvements over previous generations. Ice Lake seems to perform better at lower clock speeds than its previous-gen counterparts, and Tiger Lake looks to increase that performance. Once we can have those at decent clocks and core counts on consumer desktops, it will be an interesting battle.
 

joeblowsmynose

Distinguished
So the 7900X and TR 1920X were terrible ideas? Because the 7900X was a 10 core 14nm, first gen Skylake-X, and the TR 1920X was a 12 core 14nm CPU.

The only difference here is the boost clock rates and sustained clock rates for all cores. The 7900X was a much earlier 14nm so it was not as able to sustain the clock speeds that the 10900K will.

A few things ... the 7900X was HEDT while Ryzen is desktop, and Ryzen only had 8 cores when the 7900X launched - entirely different product segmentation and a different ecosystem compared to now. Would you rather have a 7900X or a 3950X (if you didn't "need" the HEDT platform)? The 3950X is still $250 less than what the 7900X went for.

In the "desktop" environment, where you can get a 12 core and 16 core at some great bang for buck prices, and superior high thread computing that doesn't look like it'll change at all with just another 14nm refresh, why would a 10 core, that boost to or near 5ghz (while that number alone is pretty impressive), will consume +300w (rumour alert), won't outperform in either raw performance nor bang for buck (reasonably educated speculation), not seem like a bit of a pathetic option?

Especially considering that your 10700K is aiming to retain the "gaming crown" (which is a plus for Intel), at least it will have a win there ... the 10900K will be slower at gaming, and won't have multi-threaded efficiency / bang for buck at all compared to the competition.

I think it's pretty straightforward ... the 10900K will be "less good at gaming" than the 10700K, most likely won't OC as high as a 10700K, will be harder to cool, and won't be usurping AMD's multi-threaded lead. It is the winner of exactly nothing (aka, the loser at everything). The 10700K will be the winner of high clocks, likely best overclocking, and probably gaming.

The 1920x, like the 7900x, might have made sense at the time, but if all you need is desktop, a 1920x doesn't make any sense in a newer environment where the 3950x exists for the same or less money and performs better.

I am suggesting that Intel should play only their best hands in regards to differentiating their products from their competition, at least at this point ... Did you see the enthusiasts' and critics' response to 10th gen HEDT? ... yeah, not good for brand image.

Does that make sense?




I would think the clock speed they can maintain overclocked on all cores would matter more for gamers. I doubt most 9700K/9900K/KS are run at stock speeds, nor would the 10700K or 10900K be.

Again, my point exactly ... unless Intel starts doing some weird "AMD style" binning, the 10700K is likely going to be the better overclocker, because the cooling requirements to keep ten 14nm cores at or above 5.0 GHz are probably beyond most people's ability to afford (or build).

So again, my point (for me) is even more reinforced -- in light of Ryzen for multi-threaded work, and the 10700K for better gaming with a 2080 Ti, a CPU bottleneck, and a high-refresh monitor (or whatever criteria people think they need to "game"), the 10900K at 10 cores on 14nm doesn't make much sense, except to those who will just never try AMD even when it makes no sense not to (which generally are the "fanbois", hence my original name, "the fanboi special" ;)).

And while I'm not comparing Intel 10th gen to Bulldozer -- the FX-9590 was AMD's "fanboi special" - aaand pretty much that whole architecture.
 
Last edited:

bit_user

Polypheme
Ambassador
So the 7900X and TR 1920X were terrible ideas? Because the 7900X was a 10 core 14nm, first gen Skylak-X, and the TR 1920X was a 12 core 14nm CPU.
First, that's an interesting example to cite, because IIRC it did not compare well in gaming benchmarks against the i7-7700K.

Second, it was also hot and expensive.

From the review (https://www.tomshardware.com/reviews/intel-core-i9-7900x-skylake-x,5092.html):

  • AGAINST
    • Performance regresses in some games compared to prior generation
    • High price
    • Poor thermal performance

Third, Comet Lake will probably stick with a ring-bus interconnect, which is pretty much maxed out at about 10 cores. The disadvantages of using ring bus, at this scale, are latency and power.

And, as for TR 1920X, that was a specialized CPU that was not intended primarily for gaming. I wouldn't compare it with a mainstream CPU, like Comet Lake.

if all you need is desktop, a 1920x doesn't make any sense
That. Just leave it at that. It was not made to be a generic desktop CPU, and it wasn't a very good one. You'd only get one if you needed to do what it did well.

The only difference here is the boost clock rates and sustained clock rates for all cores. The 7900X was a much earlier 14nm so it was not as able to sustain the clock speeds that the 10900K will.
It was 140 W, bro. Stock.

I know their 14 nm is improved, but this thing is gonna be more than 140 W.

I would think the clock speed they can maintain overclocked on all cores would matter more for gamers. I doubt most 9700K/9900K/KS are run at stock speeds, nor would the 10700K or 10900K be.
So, you think most gamers on these CPUs are using liquid cooling? I'm not saying they aren't, but I didn't think it was quite that popular.

Do games really need all-core overclocks? I'd imagine games are only bottle-necked on a couple threads.
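That one is easy enough to eyeball, by the way: just sample per-core load while the game is running. A quick sketch of my own (again using the third-party psutil package, not any review's methodology):

# Rough sketch: average per-core utilisation over a 10 second window while
# a game is running, to see how many cores it actually keeps busy.
# Assumes Python 3 with psutil installed.
import psutil

per_core = psutil.cpu_percent(interval=10, percpu=True)   # blocks for 10 s

for i, pct in enumerate(per_core):
    print(f"core {i}: {pct:.0f}%")

busy = sum(1 for pct in per_core if pct > 50)
print(f"cores above 50% load: {busy} of {len(per_core)}")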
 
Last edited:
So the Intel CPU "benchmarked" reached 5.3 GHz, OK, under what conditions?
Was this an open bench system? Was this under liquid cooling or air? If it was liquid cooling, did they use an AIO or a custom loop? What size was the radiator?

I think at this point whatever Intel makes in the mainstream segment won't really matter except to their fanboys. Let me explain before you go berserk: all their current high-end CPUs (9xxx) still play any game out there without any issues and, in some cases if not all, a bit better than the AMD counterpart. I believe they have more important concerns with their current HEDT and server products, and with keeping their products at the same level of sales (which won't be hard at all for them).

As for 10nm, and I know I can be very wrong here, but we will see if they manage to pull off something that works better than this new 14nm+++...+ iteration.
 
First, that's an interesting example to cite, because IIRC it did not compare well in gaming benchmarks against the i7-7700K.

Second, it was also hot and expensive.

From the review (https://www.tomshardware.com/reviews/intel-core-i9-7900x-skylake-x,5092.html):



Third, Comet Lake will probably stick with a ring-bus interconnect, which is pretty much maxed out at about 10 cores. The disadvantages of using ring bus, at this scale, are latency and power.

And, as for TR 1920X, that was a specialized CPU that was not intended primarily for gaming. I wouldn't compare it with a mainstream CPU, like Comet Lake.

That. Just leave it at that. It was not made to be a generic desktop CPU, and it wasn't a very good one. You'd only get one if you needed to do what it did well.


It was 140 W, bro. Stock.

I know their 14 nm is improved, but this thing is gonna be more than 140 W.


So, you think most gamers on these CPUs are using liquid cooling? I'm not saying they aren't, but I didn't think it was quite that popular.

Do games really need all-core overclocks? I'd imagine games are only bottle-necked on a couple threads.

Power draw was not that bad for it. Under AutoCAD and gaming loops it was only a bit above the 7700K per the benchmarks you listed, which is comparing a quad core with SMT to a 10 core with SMT. When it did get well above, that was under torture loads, which makes sense as it's using all cores at their max, and again, considering it has 150% more cores, it was pushing only about double the power draw.
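Just to spell out that arithmetic (a quick sketch using the rough ratios from the point above, not measured figures):

# Back-of-envelope check of "150% more cores, about double the power":
# 2.5x the cores at roughly 2x the power means the per-core draw actually went DOWN.
core_ratio = 10 / 4        # 7900X cores vs 7700K cores = 2.5x, i.e. 150% more
power_ratio = 2.0          # "about double the power draw", per the point above

print(f"{(core_ratio - 1) * 100:.0f}% more cores")          # 150% more cores
print(f"{power_ratio / core_ratio:.2f}x power per core")    # 0.80x per core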

Either way I was just stating I don't think 10 cores on 14nm is a terrible idea. Especially considering that Intel has made a lot of improvements. The 9900KS alone shows that they can achieve better clocks while keeping power as low as or lower than previous spins.

I would say the majority of people that buy the higher-end K CPUs probably buy an AIO. They are affordable, easier to set up, and cool the CPUs well, even the higher-end AMD CPUs. Games don't need all-core overclocks, but most gamers tend to overclock, and it's better to have too much cooling than just enough.

Either way, these CPUs will be nothing to get excited about; however, this is the result of AMD throwing a ton of cores at mainstream. I still don't think 16 cores is beneficial to mainstream at all. There is a very small use case for them. Until Intel has a competitive product on 7nm (I still doubt 10nm will make a major splash in consumer desktops), I just don't see an exciting battlefield.
 
  • Like
Reactions: RodroX

bit_user

Polypheme
Ambassador
all their current high-end CPUs (9xxx) still play any game out there without any issues and, in some cases if not all, a bit better than the AMD counterpart. I believe they have more important concerns with their current HEDT and server products, and with keeping their products at the same level of sales (which won't be hard at all for them).
In the consumer space, they have two concerns:

1. Keep adding value, so that more owners of older CPUs (i.e. not Coffee Lake, but maybe Sandybridge, Ivy Bridge, Haswell, or even Skylake) who didn't bite on Coffee Lake might finally see enough value to pull the trigger on an upgrade.

2. Keep ahead of AMD, which should be launching a Zen2+ generation (possibly 7 nm EUV?), later this year, which could finally leapfrog Intel.

That's why they're doing yet another generation on 14 nm, instead of just waiting until their 10 nm or 7 nm is capable of surpassing their current mainstream segment.
 

bit_user

Polypheme
Ambassador
Power draw was not that bad for it. Under AutoCAD and gaming loops it was only a bit above the 7700K per the benchmarks you listed, which is comparing a quad core with SMT to a 10 core with SMT.
Considering it was released at a time when most gamers still had only quad-core CPUs, I doubt the gaming loop was as stressful as it would be today.

Either way I was just stating I don't think 10 cores on 14nm is a terrible idea.
Where the "bad idea" part comes in is if it's a ring bus topology. Otherwise, I wouldn't care. Their mesh scales much more efficiently, but at the expense of higher latency.

Especially considering that Intel has made a lot of improvements. The 9900KS alone shows that they can achieve better clocks while keeping power as low as or lower than previous spins.
Did you see that replicated anywhere other than on this site? Just curious. I'm not accusing Tom's of anything, other than the fact that they probably got a single review sample of each model.

I still don't think 16 cores is beneficial to mainstream at all. There is a very small use case for them.
Agreed. Even 12 cores is probably overkill. Maybe worth getting, for the sake of "future-proofing", but still probably overkill.
 

JaSoN_cRuZe

Honorable
Mar 5, 2017
I'm still on Ivy Bridge, looking for a worthy upgrade. We'll see whether Comet Lake or Zen 3 takes the catch.

Hopefully I will side with AMD because of its efficiency, wide product stack, and AM4 compatibility throughout.

Having little hope for Intel but still waiting to see those clock speed improvements.
 
  • Like
Reactions: bit_user

bit_user

Polypheme
Ambassador
I'm still on Ivy Bridge, looking for a worthy upgrade. We'll see whether Comet Lake or Zen 3 takes the catch.
I have a Sandybridge machine that I would upgrade to either Coffee Lake-R or Ryzen 3k, if I used it more. Ryzen 4k would probably take the cake.

I also have a fileserver that's Phenom II and I want to upgrade it to Zen 2, but the darn thing is still good enough at what it does. It has an SSD boot disk and I recently replaced the RAID. I'm about to add a 2.5 Gigabit Ethernet card, too!

Hopefully I will side with AMD because of its efficiency, wide product stack, and AM4 compatibility throughout.
I'm not sure their Ryzen 4k will keep AM4. They said it would last through 4 generations, and if you count the back-ported APUs they released on it, I'd say we're due for a new socket.
 

JaSoN_cRuZe

Honorable
Mar 5, 2017
I'm not sure their Ryzen 4k will keep AM4. They said it would last through 4 generations, and if you count the back-ported APUs they released on it, I'd say we're due for a new socket.

I'm hoping the Ryzen 4000 series (Zen 3) will stay on AM4, with a possible launch window of Q3 2020 along with X670 & B650, which is B550. Waiting eagerly for Computex 2020!!
 
In the consumer space, they have two concerns:

1. Keep adding value, so that more owners of older CPUs (i.e. not Coffee Lake, but maybe Sandybridge, Ivy Bridge, Haswell, or even Skylake) who didn't bite on Coffee Lake might finally see enough value to pull the trigger on an upgrade.

2. Keep ahead of AMD, which should be launching a Zen2+ generation (possibly 7 nm EUV?), later this year, which could finally leapfrog Intel.

That's why they're doing yet another generation on 14 nm, instead of just waiting until their 10 nm or 7 nm is capable of surpassing their current mainstream segment.

Well, those are valid points, but they are just reactions to what AMD did; none of that means they give a sh##$$% about consumers. Their sudden price drop is proof of that; they have been overpricing CPUs for years.
And users that have been holding on to their old Intel CPUs (Core i3 and Core i5) aren't doing so because there's nothing new and worth jumping to, but because they really don't have a reason to make the jump; their current CPU is enough for the tasks they do with the PC.

And they are doing another 14nm iteration because they realize their 10nm counterpart doesn't bring anything new to the table.
 
Last edited: