Question: Should I be concerned about Intel's future 10-core CPU, which could potentially be released later this year, when it comes to gaming?

Feb 18, 2019
I plan to either rebuild or replace my gaming PC later this spring or summer, after finishing graduate school. I may hold off to see what the story is with AMD's Ryzen 3000 CPU series, but currently I am leaning toward Intel for a replacement CPU. My current setup uses an Intel i5-7600K CPU and an Asus variant of the Nvidia GTX 970.

I am not certain which CPU I would get if I go with Intel; I am considering the i9-9900K and i7-9700K as candidates. I am also undecided between an Nvidia RTX 2080 and a 2080 Ti. As for gaming resolution, I will purchase a G-Sync capable gaming monitor at either 1080p or 1440p, since I am hearing that the gaming experience at 4K has not entirely reached acceptable levels. A secondary task I will likely perform on the PC is basic drone video editing, such as combining video clips and adding music and basic animation like text and titles.

I do not overclock and do not believe in the futureproofing myth, but my intention is to build a gaming PC capable of handling high and/or max game settings for 3 to 4 years, with an upgrade of the GPU and, if necessary, the memory at the 18-month or 2-year mark.

Right now I understand that the current i7 and i9 core count is 8, and that Intel is potentially going to release 10-core CPUs by the end of the year. Based on your experiences with previous generations of Intel processors, will the i7-9700K or i9-9900K be capable of handling the tasks I mentioned, and will they be powerful enough to handle the demands of newer GPUs, such as the successors to the current RTX cards (e.g. RTX 3080 or 3080 Ti)? Or could I face a CPU bottleneck due to using an 8-core CPU instead of a 10-core CPU?

Thanks.
 
First off, Nvidia GPUs now support FreeSync over DisplayPort, so I'd recommend not buying a G-Sync monitor now that they've effectively been EOL'd.

Tough to tell on Intel Ice Lake this early in the game. I will say the i7-9700K is a great gaming CPU; I doubt that Ryzen 3000 will dethrone it.
 
Any current- or last-gen Intel top-shelf CPU, like an i7 or i9, is more than capable of supporting any existing or upcoming graphics card. Any current- or upcoming-gen Ryzen 7 or Threadripper is capable of supporting all the top-shelf graphics cards as well. Obviously, since Intel still has the IPC advantage, if you want the best performance, period, you go with Intel. If you want the best bang for the buck, you go with AMD.

It's unlikely, and not expected, that any of the Ryzen 3000 series CPUs will outperform even a Kaby Lake CPU (two Intel generations back), much less current-gen models, when it comes to single-core performance. And at some point there have to be diminishing returns on additional cores, because games and programs simply have not been optimized to take advantage of more than six or eight cores anyway. Certainly they could be, but they haven't, so anything more than 8 cores with hyperthreading is probably just giving you support for some background processes or multitasking, not actual in-game performance. If you tend to record and stream while gaming, then you might find that to your benefit.
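
To put a rough number on the diminishing returns, here's a quick Amdahl's law calculation (the 80% parallel fraction is just an assumed figure for a well-threaded game, not a measurement):

Code:
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# Assume a game can parallelize p = 80% of its per-frame work.
p = 0.80
for n in (4, 6, 8, 10, 16):
    speedup = 1 / ((1 - p) + p / n)
    print(f"{n} cores: {speedup:.2f}x")
# 4: 2.50x, 6: 3.00x, 8: 3.33x, 10: 3.57x, 16: 4.00x

Going from 8 to 10 cores buys roughly 7% in that scenario, which is why the extra cores mostly end up serving background tasks or a streaming encode rather than frame rates.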

Otherwise, 8 cores with stronger single core IPC is pretty much at the point where going beyond that may not be terribly helpful. Then again, it's really hard to say what might be in the future, since we don't have that information yet.

As for the FreeSync recommendation: of 400 tested monitors, only 12 met Nvidia's requirements, and even on systems that are supposedly compatible, indications are that using FreeSync with an Nvidia graphics card does not actually work the way it's supposed to. Honestly, I'd avoid going that route if you are particular about things, as it will likely lead to a lot of frustration. I'd stick to a G-Sync monitor if you're going to go with an Nvidia card, but of course that's totally your call.

https://www.pcworld.com/article/333...c-monitor-support-geforce-graphics-cards.html

Even Nvidia says that for most FreeSync monitors, “it may work, it may work partly, or it may not work at all.”

 
FreeSync works perfectly in all the monitor tests I've seen around the net. There's even a sizable user-created list of FreeSync performance using Nvidia GPUs, with overwhelmingly positive results, if you're interested. There are a few crummy, bottom-of-the-barrel FreeSync monitors that don't even give a reliable experience on AMD GPUs, which Nvidia likes to exploit and generalize over the masses. The auto-enable list (this is not a compatibility list) has gone from 12 to 15 now, but Nvidia has only tested manufacturers starting with the letters A-B. They're dragging their feet and slandering FreeSync in an effort to hold on to the last threads of G-Sync sales they can. As long as a FreeSync monitor has a DisplayPort input, you can manually enable it on any 10xx or 20xx GPU, regardless of whether it's on Nvidia's list.
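
For anyone wondering, the manual enable is roughly this (steps from memory, so the exact menu wording may differ by driver version):

1. Update to driver 417.71 or newer.
2. Connect the monitor over DisplayPort and turn FreeSync/Adaptive-Sync on in the monitor's OSD.
3. In the NVIDIA Control Panel, go to Display > Set up G-SYNC and tick the G-SYNC Compatible enable box for that display.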

As you'd expect, the wider the FreeSync range on the monitor you buy, the better. 40-48 Hz is generally the lower bound. Look for a monitor with LFC (most, if not all, 100+ Hz monitors have it).
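
If the LFC bit is unfamiliar, here's a toy sketch of what it does when frame rates dip below the range; the 48-144 Hz range is just an assumed example (LFC needs a max/min ratio of roughly 2x or more):

Code:
# Low Framerate Compensation (LFC): below the VRR floor, the driver repeats
# each frame enough times to land the panel back inside its refresh range.
vrr_min, vrr_max = 48, 144  # assumed FreeSync range

def effective_refresh(fps):
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier

for fps in (30, 40, 60, 100):
    hz, m = effective_refresh(fps)
    print(f"{fps} fps -> panel runs at {hz} Hz (each frame shown {m}x)")
# 30 fps -> 60 Hz, 40 fps -> 80 Hz; 60 and 100 fps stay 1:1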
 
Yes, I see that there are some signs of success, but they seem to require a good deal of manual configuration, so if you're looking for the out-of-the-box, plug-and-play experience you'd have with a G-Sync monitor, I don't think you're going to see that happen.

I also see that while there are a good many people reporting success, there are also a good many claiming it causes glitches and blurry images on some monitors that have FreeSync. I guess it's up to you. I'm not generally willing to be an early adopter, or to recommend early adoption, until enough time has passed to be sure I'm not just being a lemming following everybody else over the edge of the cliff.
 
Are we almost asking if a 9900K will be considered a ...<shudder!> bottleneck... the instant something a little faster comes along, and/or, a faster GPU is released ? :)

That'd be tough to comprehend this soon. (An 8700K was not suddenly slow or inadequate once the 9700K/9900K released, nor was it inadequate for driving a 2080Ti......)

Although the exact clock speeds of the new 10-core i9 are not yet known, one would assume Intel will not allow a performance decrease from the 9900K's successor, so they will likely adopt tricky active-core/max-turbo ratios to ensure the new flagship leads. Even 100 MHz higher with 7-8 cores active would be enough to generate a lead over the 9900K's typical 4.5-4.6 GHz all-core turbo, with perhaps 1-2 cores at 5 GHz. It's hard to imagine they could contain the heat if all 10 cores turboed to 4.6-4.7 GHz or higher without needing to dissipate 150-165 watts, but I look forward to seeing the first samples reviewed, hoping temps are at least 'semi' under control...with a good cooler. Realistically, perhaps they've further 'perfected' 14 nm++++, and we might see slightly higher clocks at the same temps. Maybe 1-2 cores at 5.1 GHz and the rest at 4.5-4.6 GHz?
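
As a back-of-the-napkin sanity check on the heat argument (a very crude model where dynamic power scales with core count and clock at fixed voltage, and the ~150 W all-core baseline is itself an estimate):

Code:
# Crude scaling: P ~ cores x freq (voltage held constant, which flatters Intel;
# higher clocks usually need more voltage, and power scales with V^2).
base_cores, base_freq, base_power = 8, 4.7, 150.0  # assumed 9900K all-core figures
cores, freq = 10, 4.6                              # hypothetical 10-core all-core turbo
power = base_power * (cores / base_cores) * (freq / base_freq)
print(f"~{power:.0f} W")  # ~184 W, before any extra voltage is accounted for

So even with generous assumptions, 10 cores in the mid-4 GHz range lands well past the 9900K's thermals, which is why I'd expect conservative all-core ratios.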
 
I agree, especially since an Ivy Bridge i7 has stronger single-core performance than a current-gen Ryzen 7 CPU, which itself is still capable enough for use with a 2080 Ti should you care to go that route. I'd have no qualms about running my 6700K with ANY graphics card that is currently available.

Trying to guess about new CPUs is always an exercise in futility as there is just no way to know what they're going to do or how they're going to implement certain things.
 
Are we almost asking if a 9900K will be considered a ...<shudder!> bottleneck... the instant something a little faster comes along, and/or, a faster GPU is released ? :)
Still, the dilemma remains: with the 10th-gen CPUs he would be getting more CPU for his money, or he could opt to spend less money and get an 8/16 CPU.

As it is today, the 7700K is still just as fast as any other CPU, so there is no reason to fear that a certain number of cores (above 4) is not going to be enough.
 
The 7700K has only four cores, so no, it's not as fast as any other CPU. It has "close" to the same single-core performance as 9th-gen processors, because Intel hasn't improved IPC much at all over the last few generations (or since Haswell, really), but it has fewer cores than the 8th- or 9th-gen i7s. We know that the majority of what's out there now CAN use up to eight cores before the return on additional cores starts diminishing, so depending on the game or application, even a Ryzen 7 might outperform it on anything that isn't strictly limited to single- or few-core optimized code.

Right now we're at a point where it's a balancing act. Too few cores and you're not taking full advantage of modern games and apps, or multitasking, and too many and you're not likely to see much improvement on most of them either.

Honestly, just get the best CPU you can reasonably afford, and forget about it. Any top-shelf CPU from the current or last few generations is capable enough to do just about whatever you want to do.
 
https://www.hardware.fr/articles/965-3/performances-jeux-3d.html
All heavily multithreaded games: the 7700K is just as fast as the fastest CPUs and about 8-10% faster than the Ryzen systems, and that's at stock. Not that you have a whole lot of overclocking potential (on either Ryzen or the 7700K), but still.
And even in games that get their scaling wrong (oof, just look at those Ryzen numbers if you think they have proper scaling), the 7700K is still as fast as the 8700K, which has 6 cores/12 threads.
https://gamegpu.com/action-/-fps-/-tps/far-cry-new-dawn-test-gpu-cpu

Bottom line: if you only do minimal multitasking with your games, 4/8 is not only still cutting it today, it's pretty much the best you can do.
(Or, as all the AMD fanbois say, you'll never notice that last 10% at that high an FPS.)
 
I don't know who runs that site you referenced, but I sure as heck don't trust anything they publish and can't understand anything there anyhow. There are plenty of English language websites that do comparative testing, so I don't see any reason to go find an obscure alternate language site and cherry pick the results you want to see.

Obviously, they are not the same in anybody else's testing. At the following link, the 7700K gets significantly lower scores in all of the benchmarks. If you continue on to the next page with the gaming benchmarks, there are a few where they score similarly, and those will be games that mainly use only one to four cores, since there is little difference in single-core performance between Kaby Lake and Coffee Lake. But you'll see significant differences again in things that are optimized for multithreading, like Ashes of the Singularity and Battlefield 1, once the GPU bottleneck is removed. Mainly, that is the problem: during much of the testing done on the 8700K, the best card you could get at the time, the 1080 Ti, could not do enough work to avoid being the limiting factor, and when the resolution is dropped to remove the bottleneck, the 8700K gets significantly better results than the 7700K or 2700X.
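
To illustrate the bottleneck point (the frame rates below are invented purely for illustration): what you see on screen is roughly the lower of what the CPU can prepare and what the GPU can render at that resolution.

Code:
# Hypothetical per-component frame-rate ceilings, just to show the effect.
cpu_fps = {"7700K": 110, "8700K": 140}
gpu_fps = {"4K": 60, "1440p": 100, "1080p": 180}

for res, g in gpu_fps.items():
    for cpu, c in cpu_fps.items():
        print(f"{res} + {cpu}: {min(c, g)} fps")
# At 4K both CPUs show 60 fps (GPU-bound); only at 1080p does the CPU gap appear.

That's exactly why reviewers test at low resolutions: it pushes the GPU ceiling high enough that the CPU differences become visible.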


If you look at the gaming benchmarks here, you'll see that there are clear differences in the minimum and average FPS between the 9700K/8700K and the 7700K. It's ridiculous to think there wouldn't be, at least in anything that can use the additional cores to its advantage. For anything that can't, they will be the same.

https://www.gamersnexus.net/hwreviews/3421-intel-i7-9700k-review-benchmark-vs-8700k-and-more
 

If you are worried about Comet Lake 10C/20T making Coffee Lake 8C/16T obsolete, you should also be worried about the Ryzen 3850, or whatever it will be called, making things even worse for Intel by bringing 16C/32T under the $500 mark, on top of 200-300 MHz higher clocks and ~10% better IPC closing most of the remaining single-threaded performance gap with Intel.
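
Rough single-thread math with those rumored numbers (the current-Zen baseline of roughly a 10% IPC deficit and a ~4.3 GHz boost is my assumption, not a measurement): per-core performance scales roughly as IPC times clock.

Code:
# Single-thread perf ~ IPC x clock, everything normalized to Intel.
intel    = 1.00 * 5.0          # Intel IPC baseline, ~5.0 GHz boost
zen_plus = 0.90 * 4.3          # assumed current Ryzen IPC deficit and boost
zen_3000 = 0.90 * 1.10 * 4.6   # rumored +10% IPC and +300 MHz
print(f"{zen_plus / intel:.2f} -> {zen_3000 / intel:.2f}")  # 0.77 -> 0.91

That would indeed close most, though not all, of the single-threaded gap.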

I wouldn't be surprised if Intel nudged the price points up a few extra bucks across all tiers that gain cores or threads, which means Intel's gaming bang-per-buck proposition may get substantially worse than it already is compared to AMD's current lineup, especially after the recent price drops.
 
I don't know who runs that site you referenced, but I sure as heck don't trust anything they publish and can't understand anything there anyhow. There are plenty of English language websites that do comparative testing, so I don't see any reason to go find an obscure alternate language site and cherry pick the results you want to see.
Well, that obscure Russian site shows you a video of the part of the game they tested (actual gameplay), and they also show you CPU utilization; it's all in pictures and graphs, so anybody can understand them.
Of course, the canned/pre-scripted benchmarks you are looking at are only made to test the graphics part on the same CPU, so they will have different results.
But even in the benches you show, the 7700K is still just as fast as, or even faster than, the 2700X in gaming.