Intel Coffee Lake (8th & 9th Gen Core CPUs) + Skylake-X Refresh & W-3175X MegaThread! FAQ and Resources

*********************************************************************************

Hey Community, just an FYI, I've done a massive update to this megathread talking about the new overclockable Xeon, Skylake-X Refresh and Coffee Lake R (9th Gen chips).

This megathread will now encompass all X299, Z370/Z390 and W-3175X discussion.
*********************************************************************************
 

RobCrezz

Expert
Ambassador


In most games, no. In any game that uses more than 4c/8t there would be a performance boost.

I wouldn't recommend it unless you were going for a 2080 Ti.
 

goldstone77

Distinguished
Aug 22, 2012
Ian Cutress at Anandtech added the i5-8400 to their benchmark list, so you can make comparisons. Looking at all the averages, you can see that there is little difference in average FPS between these two processors. I hope this provides a little perspective on what kind of gaming experience you could expect.

https://www.anandtech.com/bench/product/2274?vs=2263
 

Gaidax

Distinguished


You will get two extra cores and four extra threads, which is a nice bump, but overall it depends on what you are doing.

If you are just gaming, you don't really need the extra resources. If you are streaming at high quality and making videos all the time, then yes.
 
For pure gaming the 7700K is still a good performer. No need to upgrade yet.

If you are streaming, I'd recommend switching to NVENC (ShadowPlay) on Nvidia cards; it uses the GPU for encoding. I have done extensive tests personally and notice no difference in quality between x264 (CPU encode) and NVENC (GPU encode), but the performance uplift is absolutely monstrous. (Specifically NVENC on Pascal and Turing. Maxwell runs an older NVENC encoder which is still good, but I think it is showing its age.)
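For anyone who wants to compare the two encoders themselves, here's a minimal ffmpeg sketch using the libx264 (CPU) and h264_nvenc (GPU) encoders. The input file name and the 6M bitrate are placeholders; h264_nvenc needs an NVIDIA card and an ffmpeg build with NVENC support (check `ffmpeg -encoders` on your system).

```shell
# CPU encode with x264 -- hammers the cores while the game is running:
ffmpeg -i gameplay.mkv -c:v libx264 -preset medium -b:v 6M -c:a copy out_x264.mp4

# GPU encode with NVENC -- offloads nearly all the work to the video card:
ffmpeg -i gameplay.mkv -c:v h264_nvenc -preset slow -b:v 6M -c:a copy out_nvenc.mp4
```

Encode both from the same clip and compare them side by side; at streaming bitrates most people can't tell the difference.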
 
Assassin's Creed: Odyssey is very CPU intensive. It can max out an 8700K and a 9700K.
Yeah.. I am witnessing it now.

So is Siege, by the way; it's maxing out my 8700K.
Yeah, but they only fill up the cores without adding any FPS. Odyssey tops out at ~100 FPS with 4c/8t; any additional cores only work to make the game look more demanding than it is, or serve some purpose other than improving performance.
https://www.techpowerup.com/reviews/Performance_Analysis/Assassins_Creed_Odyssey/4.html
 
Yeah, but they only fill up the cores without adding any FPS. Odyssey tops out at ~100 FPS with 4c/8t; any additional cores only work to make the game look more demanding than it is, or serve some purpose other than improving performance.
https://www.techpowerup.com/reviews/Performance_Analysis/Assassins_Creed_Odyssey/4.html

Which is expected; once the GPU is consistently fed, you aren't going to gain any more performance by adding more CPU cores.
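That bottleneck logic can be sketched as a toy model (my own illustration, not from any benchmark): the frame rate is set by whichever of the CPU or GPU takes longer per frame.

```python
def fps(cpu_ms_per_frame: float, gpu_ms_per_frame: float) -> float:
    """Frame rate is capped by whichever unit takes longer per frame."""
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# CPU-bound: the GPU waits on the CPU, so faster CPU work helps...
print(fps(12.0, 10.0))  # ~83.3 FPS

# ...but once the GPU is consistently fed, more CPU headroom adds nothing:
print(fps(6.0, 10.0))   # 100.0 FPS
print(fps(3.0, 10.0))   # 100.0 FPS
```

Halving the CPU frame cost again (more cores, higher clocks) leaves the result pinned at the GPU's 100 FPS ceiling.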
 
Which is expected; once the GPU is consistently fed, you aren't going to gain any more performance by adding more CPU cores.
Exactly. The thing is, though, that normally a game will stop filling up cores at that point; it will only use as many cores as it needs to feed the GPU. If you take a look at the quotes I'm answering, U6b36ef tells us that a 9700K with 8 cores is getting maxed out.
 
Not many folks care what happens at 720p in various games; otherwise everyone would have gone X99/X299 years ago...

But the above 720p graph shows fairly bad core/thread scaling, IMO... I'd expect a better improvement jumping up from 4c/8t, but apparently 4c/8t is enough there. And being stuck below 100 FPS regardless of CPU is not a very well optimized game...
(Maybe we'll know more about this game's quirks in 4-5 years' time...)
 
Not many folks care what happens at 720p in various games; otherwise everyone would have gone X99/X299 years ago...
A lot of people care about future proofing though, and 720p shows you what will happen with a better GPU. Just have a look at our forum to see how many topics are about "I got a better GPU and now I have worse performance".
 

Olle P

Distinguished
Apr 7, 2010
A lot of people care about future proofing though, and 720p shows you what will happen with a better GPU, ...
I disagree that it's useful for "future proofing":
  1. You very rarely (if ever) replace your video card to get better performance at minimum settings. It's far more relevant to know what performance I will get at the resolution I want to use with the card I plan to buy soon.
  2. The performance in future games is also not a known factor. We know that until recently most games were optimized for two to four threads and saw little increased performance with more threads available. Now an increasing number of games are optimized for eight threads, with an ability to make use of more than that. Therefore a "high speed" Core i5 that was really good in this type of test a few years ago will be inferior in modern games to a lower clocked Ryzen 5 that sucks at the older games.
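The diminishing returns from extra threads follow Amdahl's law, which a quick sketch can illustrate (the parallel fraction p = 0.5 is an assumed figure for illustration, not measured from any game):

```python
def speedup(p: float, n_threads: int) -> float:
    """Amdahl's law: best-case speedup when a fraction p of the work parallelizes."""
    return 1.0 / ((1.0 - p) + p / n_threads)

# With only half the frame work parallelizable, extra threads flatten out fast:
for n in (2, 4, 8, 16):
    print(f"{n:2d} threads -> {speedup(0.5, n):.2f}x")  # 1.33x, 1.60x, 1.78x, 1.88x
```

The better a game's engine is threaded (higher p), the more a six- or eight-core chip pulls ahead, which is exactly the shift from the older two-to-four-thread titles to current ones.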
 
I disagree that it's useful for "future proofing":
  1. You very rarely (if ever) replace your video card to get better performance at minimum settings. It's far more relevant to know what performance I will get at the resolution I want to use with the card I plan to buy soon.
  2. The performance in future games is also not a known factor. We know that until recently most games were optimized for two to four threads and saw little increased performance with more threads available. Now an increasing number of games are optimized for eight threads, with an ability to make use of more than that. Therefore a "high speed" Core i5 that was really good in this type of test a few years ago will be inferior in modern games to a lower clocked Ryzen 5 that sucks at the older games.
1. All tests are made with ultra settings for this reason. Resolution is 100% dependent on GPU power (graphics units): if you can hit 100 FPS in a game at 720p, you can hit the same FPS at any resolution as long as the GPU can keep up.
"It's far more relevant to know what performance I will get at the resolution I want to use with the card I plan to buy soon."
Yup, that's what GPU benchmarks are for.

2. Yes, and now we have a couple of years' worth of games, some of which do use 8 and more cores, and we can clearly see a trend with those.
And that trend is that there is no scaling beyond 4c/8t even if you go down to 720p ON A FRIGGIN' 2080 Ti.
https://www.techpowerup.com/reviews/Performance_Analysis/Assassins_Creed_Odyssey/4.html
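The 720p argument can be made concrete with a toy model (all numbers here are assumptions for illustration, not benchmark data): CPU cost per frame is roughly resolution-independent, while GPU cost scales with pixel count, so dropping to 720p exposes the CPU ceiling that a 4K test hides behind the GPU.

```python
def fps(cpu_ms: float, gpu_ms_1080p: float, pixel_ratio: float) -> float:
    """Toy model: GPU frame cost scales with pixel count; CPU cost does not."""
    gpu_ms = gpu_ms_1080p * pixel_ratio
    return 1000.0 / max(cpu_ms, gpu_ms)

CPU_MS = 10.0       # assumed CPU frame cost: a 100 FPS ceiling
GPU_MS_1080P = 8.0  # assumed GPU frame cost at 1080p

print(fps(CPU_MS, GPU_MS_1080P, 0.44))  # 720p (~0.44x the pixels): 100.0 FPS, CPU-bound
print(fps(CPU_MS, GPU_MS_1080P, 1.0))   # 1080p: 100.0 FPS, still CPU-bound
print(fps(CPU_MS, GPU_MS_1080P, 4.0))   # 4K (4x the pixels): 31.25 FPS, GPU-bound
```

Swap in a faster card and the GPU cost per pixel drops; the 720p number is the frame rate you converge toward, which is why it matters for future proofing.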
 

U6b36ef

Distinguished
Dec 9, 2010
There may be an argument that 4c/8t will run AC: Odyssey as fast as an 8700K. However, I would question that.

The CPU scaling graphs added in this thread might be for average frame rates. However, it's the minimum fps rates that matter too.

I have hard evidence, to an extent, that 4c/8t is not enough for AC: Odyssey. Prior to my 8700K, I had the 4790K, but only running with RAM at 1600MHz. The 4790K struggled quite badly with Odyssey.

In fact I gave up playing Odyssey after about an hour, because it was not worth playing. The 4790K would generally run at around 60 fps, meaning I would often see the shimmer and stutter associated with dropping under 60 fps. When Odyssey got even a little intensive, fps dropped quite a bit.

(To compare performance with AC: Origins. In Origins, the 4790K would drop to about 40 fps min at worst. In Odyssey it would be worse, because Odyssey is more CPU intensive. Although I never got far enough in Odyssey with the 4790K to see that for myself.

The 8700K in Origins would manage 75fps in places where the 4790K would manage 40 fps.)

I can't account for a CPU like the 7700K. It would be running faster RAM than my 4790K, and be faster itself too.

Generally in AC: Odyssey, the 8700K beats the 4790K quite handily. CPU load runs at about 40-75%, and sometimes all cores are nudging 100%. FPS might drop to 57 fps. (The 4790K in those places would be struggling badly; maybe 35 fps.)
 

U6b36ef

The short answer, however, to whether it's worth upgrading from a 7700K is different. If I had a 7700K, I would play the games it can play well, and leave and save the games that it could not play well. Then I'd upgrade when I could no longer play many games properly due to the CPU. (That's what I did with my 4790K >> 8700K.)

It also depends on the frame rate that you want. E.g. I played The Division (1) on my 4790K, and it ran perfectly, 70-90 fps all the way. On the 8700K, it runs at 110 fps, which is where I cap the frame rate. (144Hz G-Sync monitor.)
 
There may be an argument that 4c/8t will run AC: Odyssey as fast as an 8700K. However, I would question that.

The CPU scaling graphs added in this thread might be for average frame rates. However, it's the minimum fps rates that matter too.
CPUs with more cores have even higher clock deltas across all cores, meaning that they will run games even more unevenly; a 4c/8t CPU will boost all cores to the same clock.
I have hard evidence, to an extent, that 4c/8t is not enough for AC: Odyssey. Prior to my 8700K, I had the 4790K, but only running with RAM at 1600MHz. The 4790K struggled quite badly with Odyssey.
Nobody has argued that a CPU from 5 generations ago will run games as well as a current CPU.