Question: i7-13700K vs i9-13900K for 4K gaming

Jan 12, 2023
Hey!

i7-13700K vs i9-13900K with "future proofing" in mind for 4K gaming, Ray Tracing, Unreal Engine 5, etc. What do you think?

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

I got an RTX 4080, after comparing the RTX 3090, RTX 4080, and RX 7900 XTX.

I noticed that the 7900 slightly beats the 4080 in some games where it doesn't matter. What I mean by that is that the 7900 gets +3 to +10 FPS in games where both GPUs already reach 100+ FPS, so I don't care.

However, when it comes to "challenging" games like Cyberpunk 2077, and when it comes to Ray Tracing, the 4080 can stay in the relevant 45-50 FPS range while the 7900 struggles at 25-40 FPS. So when it matters, the 4080 is better.

- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -

Why did I say all of this? Because I want to know the exact same thing about the i7-13700K vs i9-13900K, but I struggle to find good comparison videos/websites. 1080p/1440p videos are useless to me. I want the kind of comparison that actually matters, like the 4080 vs 7900 one above.

So does anybody know if the i9 is worth it over the i7 for challenging 4K games, Ray Tracing, UE5, and "future stuff"?

Thank you and sorry for the long post!
 
The i7 will stay "future proof" for longer than the 4080 will at 4K, IMO.
 
The reason reviewers run CPU comparison tests at 1080p is that when they test at higher resolutions, every CPU gets the same number.
Even at 1080p the numbers have little real-world value. Sure, you might get 560 FPS rather than 520 FPS with one CPU, but that really just lets you rank the CPUs against each other. Very few people buy top-of-the-line CPUs and video cards and then run them at 1080p.

Until the 4090 came out, you could not see much difference even at 1440p in most games.

So except for a few CPU-limited games, until we get a video card much faster than even a 4090, the CPU you use when running at 4K is not going to make much difference. This is within reason, of course: compare some five-year-old CPU to an i9-13900 and there will be a difference.
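
A rough way to picture why the 4K numbers converge, as a minimal Python sketch. The CPU and GPU limits below are made up for illustration, not benchmark results; the frame rate you see is simply the lower of the two.

```python
# Minimal bottleneck sketch. All FPS figures are invented for illustration.
def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """The displayed frame rate is capped by the slower component."""
    return min(cpu_fps, gpu_fps)

# Hypothetical CPU limits: frames prepared per second, roughly resolution-independent.
cpus = {"i7-13700K": 480, "i9-13900K": 520}

# Hypothetical GPU limits for one demanding game.
gpu_limits = {"1080p": 600, "4K": 150}

for cpu_name, cpu_fps in cpus.items():
    for res, gpu_fps in gpu_limits.items():
        print(f"{cpu_name} @ {res}: {effective_fps(cpu_fps, gpu_fps)} fps")

# At 1080p the two CPUs separate (480 vs 520 fps); at 4K both land on
# 150 fps, because the GPU is the bottleneck either way.
```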
 

Karadjgne

Core count, clock speeds, IPC.

Most new games run on 6-12 cores, so a CPU with more than that may or may not see any benefit.

Clock speeds have almost become irrelevant on the Intel side. Game devs target something closer to a 3.2GHz CPU, since that covers the broadest possible audience, plus pretty much anything from AMD. Anything higher than ~3.2GHz may or may not be much of a bonus; it's highly dependent on the particular game.

IPC: Instructions Per Cycle. It goes somewhat hand in hand with clock speed: the more instructions the CPU can process per clock cycle, the higher the FPS. So a 13700 at 4.0GHz is going to roughly equal a 9700K at 5.0GHz, the older chip's higher clock speed making up for its lower IPC.
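
As back-of-the-envelope arithmetic, single-thread throughput scales roughly with IPC times clock speed. A tiny sketch; the relative IPC figures are invented to make the example work out, not measured values:

```python
# Rough single-thread model: throughput ~ IPC x clock. Figures are illustrative.
def relative_perf(ipc: float, clock_ghz: float) -> float:
    return ipc * clock_ghz

older = relative_perf(ipc=1.0, clock_ghz=5.0)    # hypothetical 9700K-class core
newer = relative_perf(ipc=1.25, clock_ghz=4.0)   # hypothetical 13700-class core

print(older, newer)  # 5.0 5.0 -- a 25% IPC uplift offsets a 1GHz clock deficit
```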

CPUs have nothing to do with resolution; that's all GPU. If a CPU can send the GPU 500 FPS, it's up to the GPU whether it renders that at 1080p or 4K and how much of its resources that takes. At 720p you might get all 500 FPS, but at 4K you'd be looking at closer to 150 FPS on screen. So it makes little difference whether a 5600X can send 200 FPS or a 13900K can send 500 FPS if, at 4K, the GPU can only render 150 FPS.

This is why CPUs are tested at 1080p using a 4090: to remove the GPU as any sort of limiting factor.

The differences between CPUs like the 13700 and 13900K boil down to cache size and clock speeds. Lesser CPUs are more limited by the game code itself (its complexity, AI, lighting effects) as well as by clock speed and IPC; as those values go down, so does the possible FPS.

The monitor also has an effect. Any remotely modern system is going to push FPS well beyond a 1080p/60Hz monitor, so the extra FPS becomes almost moot; its only remaining function is a slight reduction in latency, which really only matters in high-motion multiplayer games like CSGO, LoL or CoD.
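
The latency point is simple frame-time arithmetic: each frame represents 1000 / FPS milliseconds, so higher frame rates shrink the per-frame delay. No assumptions here beyond the math:

```python
# Frame time shrinks as FPS rises; this is the latency gain from high frame rates.
for fps in (60, 144, 240):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 60 fps -> 16.7 ms per frame
# 144 fps -> 6.9 ms per frame
# 240 fps -> 4.2 ms per frame
```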

So things have to balance. You only need a CPU capable of putting out more FPS than the monitor can display, and likewise a GPU capable of putting out more FPS than the monitor can display, at whatever resolution you decide on. There's not much point in a 4K/240Hz monitor on a 5600X/3060 Ti, just as there's little point in a 13900K/4090 on a 1080p/60Hz monitor.
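
Putting the whole balance into one line: the same min() idea as the sketch above, now with the monitor included. The frame rate you actually see is the minimum of three limits (CPU, GPU at your resolution, monitor refresh), with invented numbers for the two mismatched builds just mentioned:

```python
# Displayed FPS is capped by the slowest of three limits. Figures are invented.
def displayed_fps(cpu_fps: float, gpu_fps: float, monitor_hz: float) -> float:
    return min(cpu_fps, gpu_fps, monitor_hz)

# Strong CPU/GPU wasted on a 60Hz panel: the monitor is the cap.
print(displayed_fps(cpu_fps=500, gpu_fps=350, monitor_hz=60))   # 60

# A 240Hz 4K panel the GPU can't feed: the GPU is the cap.
print(displayed_fps(cpu_fps=200, gpu_fps=110, monitor_hz=240))  # 110
```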