Time to upgrade from an i7-5820k overclocked to 4.582GHz to the new i7-8700k, or stay put ??

stillsurfing

I have a really good i7-5820k CPU which can overclock to 4.8GHz with 1.4v and stays below 80C with heavy stress tests pushed all the way.

But I have never pushed it that far in daily use; I run it 24/7 at 4.582GHz with lower voltage.

I use a coolermaster HAF XB EVO case with massive cooling, 18 fans (including the 3 on the EVGA 1080ti FTW3).

With that type of case cooling everything runs very cool and quiet !!

I use the computer only for email, web surfing, a little word processing and gaming.

I play graphics-intensive games like Witcher 3 and Skyrim on an Asus 34" 2K PG348Q overclocked to 100Hz.

With max ultra graphics settings on all game features, the system gets in the 70fps range on the really graphics-intensive games (like Witcher 3).

The question I'm having trouble answering is whether there would be sufficient performance gain moving to the i7-8700k ..... would all games then max the monitor out to its limit at 100fps?

Or, should I really wait until Ice Lake is released in 2019, which will use a 10nm process instead of the current 14nm process?

Unfortunately, since the 8700k requires a new z370 motherboard, the cost to make the change will be a little over $700.

I might be able to offset a few hundred by selling the x99 board & 5820k on ebay.

The 5820k and 8700k are both 6 core/12 thread cpus and both made using 14nm process technology.

So, would it really be an "upgrade" or more of a "sidegrade" ???

Any thoughts out there about this ??

Or, any results by anybody who had a high overclocked 5820k that has moved to a similarly overclocked 8700k ??

 
It would be more of a sidegrade. Sure, the 8700K can be clocked a bit higher, but not nearly enough to push you from 70FPS to 100FPS, and not enough to warrant spending money on a new CPU and mobo.

18 fans? I'm surprised your computer is still on the floor and not hovering at the ceiling.
 
The 5820k is based on 22nm, not 14nm++ like the 8700k. The 8700k has roughly 10% higher IPC vs the 5820k. I've had mine since launch day in 2014. I run mine at 4.4GHz with 1.325v. I'm planning the same upgrade myself, waiting for an 8700k on backorder like everyone else, haha. I want a more modern platform, plus with my launch-day 5820k I had to settle in at 4.4GHz to stay at a reasonable voltage, and it took forever to get the memory dialed in. Had to settle on 2666 due to my IMC. Hoping the Z370 setup can be used with 144Hz 4K down the road too. Currently very happy with 60Hz 4K G-Sync. I was using a 34" 60Hz Dell U3415W before, also at 3440 x 1440.

The 8700k should give you a boost in games, but it will vary by game. I don't see you gaining anywhere near 30fps from the CPU alone at 3440 x 1440; maybe if you were using 1080p and the CPU was the bottleneck. Outside of gaming, the higher IPC and clockspeed should give you a performance boost in general use.
 
The limiting factor here is the gpu. The cpu is well capable of handling that fps; resolution means almost nothing to the cpu, that's all on the gpu.

Take a blade of grass. The cpu is responsible for translating the code from source into what a blade of grass looks like: its angles, directions, flow, color, shading, location on screen, etc. It'll do that for 1080p or 4k just the same. That info gets sent to the gpu, which now has to take that info and paint the picture. At 1080p, that's easy for a 1080ti; at 4k it's a lot more involved. The cpu also sends that info to the gpu on demand; if that demand is 70fps, so be it, it's the same info at 1080p as at 4k, so that's really a non-issue. The gpu, however, now has to paint a 4k blade of grass 70x a second vs a 1080p blade of grass 70x a second. Big difference. It could probably paint that blade of grass at 300fps at 1080p, but at 4k it's getting chopped to just 70fps.

The 5820k is quite capable of moving that info at 144Hz; it's the gpu that's struggling to get past 70fps.
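If you want to sanity-check that fill-rate argument with rough numbers, here's a quick Python sketch. It assumes fps scales purely with pixel count (a big simplification), and the 300fps-at-1080p figure is just the illustrative number from the post above, not a benchmark:

# Rough fps-vs-resolution estimate, assuming the GPU is limited purely by
# how many pixels it can shade per second (a deliberate oversimplification).
RESOLUTIONS = {
    "1080p": 1920 * 1080,
    "3440x1440 ultrawide": 3440 * 1440,
    "4K UHD": 3840 * 2160,
}

baseline_fps = 300  # illustrative 1080p figure from the post above
pixels_per_second = baseline_fps * RESOLUTIONS["1080p"]

for name, pixels in RESOLUTIONS.items():
    print(f"{name:>20}: ~{pixels_per_second / pixels:.0f} fps")

Under that assumption, the same card that manages ~300fps at 1080p lands in the mid-120s at 3440x1440 and around 75fps at 4K, which is roughly the drop being described.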
 
Thanks guys !

Yeah, you're all sort of reinforcing the concern that there isn't going to be that big of a jump in gaming FPS with the switch from the 5820k to the 8700k, especially if the 5820k has a reasonably high overclock.

In my situation it sounds like only about 10% improvement, say 7fps .... $700 seems like a lot to spend for that little improvement in gaming.

Unless there are some really unexpected performance gains after more guys get their hands on the 8700K, I think I'm going to just wait for 9th-gen Intel or AMD's Zen CPUs.


 



Thanks for the info !!!

Sounds like we have very similar rigs.

I've got the Asus X99 Pro with Adata DDR4 ram (xmp'd to 2800).

And our monitor loads are somewhat similar: 4K 27" vs 2K 34". Not sure which is really more demanding.....

Just curious, if you're already maxing out the fps of the Asus 4K monitor (at 60fps) why not wait on changing the cpu until you get a more demanding monitor ?

Hey, if it's just because playing with new tech is fun .... I completely understand !!

Oh, and you're right ..... the Haswell chips are 22nm, not 14nm !!!
 



Thanks for the explanation !!!

It makes a lot of sense why a new CPU isn't going to do much for fps ..... especially compared to the GPU !!


 
Contrary to what ppl think (yeah lol), size matters not. A 1080p monitor at 24" has the same pixels as a 60" big screen TV; both have 1920x1080. The difference is in the size and spacing of the pixels. You have no real issues sitting 2-3ft from a 24", but sit that close to a 60" and all you see is fuzz. A 4k resolution has 4x the amount of pixels as 1080p, and each pixel has to be correctly represented every frame. At 2k (really not 2k, it's 1440p, which has about 1.7x the pixels of 1080p) the gpu has to work almost twice as hard to paint every pixel right, and at 4k it's working 4x as hard as at 1080p.

So, 4k on a 27" vs 2k on a 34", size being irrelevant: the 4k gpu is doing over 2x the workload of standard 1440p, and still roughly 1.7x the workload of the 3440x1440 ultrawide. Big difference in fps is the result; at 4k anything close to 60fps is very respectable, whereas 60fps at 1440p is chump change, attainable by a gtx1080, not to mention a 1080ti.
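To put actual numbers on that, here's a quick Python calc (the resolutions are exact; treating GPU workload as proportional to pixel count is the simplification):

# Pixel counts, and how each resolution compares to 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "3440x1440 ultrawide": (3440, 1440),
    "4K UHD": (3840, 2160),
}
base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>20}: {pixels:>9,} pixels  ({pixels / base:.2f}x 1080p)")

That works out to standard 1440p at about 1.8x the pixels of 1080p, the 34" ultrawide (3440x1440) at about 2.4x, and 4K at exactly 4x, so the 27" 4K panel is pushing roughly 1.7x the pixels of the 34" ultrawide.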
 



Very interesting ......didn't know that ........ best thing about life ..... ya can learn something new every single day !!!

Thanks !

What's your opinion about a CPU change from an overclocked 5820k to an overclocked 8700k (assuming both are clocked at about 4.6GHz), using a 34" ultrawide 2K monitor ?

Would there be a significant difference in FPS, assuming the same 1080ti GPU and 16GB of 2800MHz DDR4 RAM ?






 
Not really. Depends a lot on exactly what settings are cpu-bound. Many ppl do not bother running any form of anti-aliasing at 4k, there's really no need, but at 2k that might depend on your personal tastes. So dropping that down from max can be pretty freeing on cpu resources.

But most of the effect is determined not by max fps but by the minimums. So if you are averaging 70fps, you are seeing some frames at 100 or higher, maybe, but others might be dipping as low as 30ish. That's where you'll lose any buttery smoothness: the constant switching between 30ish and 100ish. This is where the 8700k might make a difference, but it'll be small; minimum frames might only go up 5fps or so, and the maximum doesn't matter, being beyond the refresh of the monitor. To me, that's not really worth spending $700+ for.
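A toy illustration of that point (the frame times below are made-up numbers, not measurements from any game): two runs can report about the same average fps while feeling completely different, because one of them keeps dipping.

# Two made-up runs of 100 frames each, both averaging ~70fps.
steady = [14.3] * 100                # every frame ~14.3ms -> ~70fps, all the time
spiky  = [10.0] * 82 + [33.3] * 18   # mostly ~100fps frames with ~30fps dips

def report(name, frame_times_ms):
    avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)
    worst_fps = 1000.0 / max(frame_times_ms)
    print(f"{name}: average ~{avg_fps:.0f} fps, worst frame ~{worst_fps:.0f} fps")

report("steady", steady)
report("spiky", spiky)

Both come out around 70fps average, but only the spiky run has the 30-to-100 swings described above, and that minimum is the number a faster CPU would nudge up, and only by a little.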

Far cheaper to relax a few of the cpu-bound settings and see if that makes a difference. If it does, then you have a decision to make; if it doesn't really change much, then the roughly 10% faster IPC of the 8700k is only going to change things a similar amount at best. Viewing distance, grass detail, fog distance, PhysX, AA (FXAA, MSAA), and AF are all cpu-bound settings. There are others, but those have the most effect. Do you really need to see single blades of grass on distant hills?
 



well, that's a good point.

I've been playing at ultra-ultra everything ....... on every game ...... and letting the fps land where it lands.

The really graphics-intensive ones, like Witcher 3, end up in the 70s.

But I think I'll relax some of the graphics settings until I hit a consistent 100fps on the Asus monitor, game at that for a while, and see if I really notice any difference.

Really, it likely won't even be noticeable .......