[SOLVED] Is anybody still using an i5 2500K for gaming but with a high-end GPU like an RTX 2080?

You can't expect a 9-year-old mid-range CPU to be amazing anymore.

https://www.youtube.com/watch?v=KPWEdbfJ0oE

Any new CPU would cream it.

Shadow of the Tomb Raider is like 2 years old and sees these results:
The 2500K gets 54 fps
A new Ryzen 5 3600 gets 137 fps
A new i7-9700K gets 166 fps

Why pay for a $500+ GPU and be so CPU limited that you only see 1/3 of the FPS you would get with a more capable CPU?
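
Just to put that 1/3 figure in perspective, here's a quick back-of-the-envelope check in Python using the numbers quoted above (nothing fancy, just the ratios):

# Quick ratio check using the Shadow of the Tomb Raider numbers quoted above.
avg_fps = {"i5-2500K": 54, "Ryzen 5 3600": 137, "i7-9700K": 166}

baseline = avg_fps["i5-2500K"]
for cpu, fps in avg_fps.items():
    print(f"{cpu}: {fps} fps -> the 2500K delivers {baseline / fps:.0%} of that")
# i5-2500K: 54 fps -> the 2500K delivers 100% of that
# Ryzen 5 3600: 137 fps -> the 2500K delivers 39% of that
# i7-9700K: 166 fps -> the 2500K delivers 33% of that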

With 10th gen coming out, you could get a new 6 core 12 thread i5 for under $200.
 
Have you looked at this?

Literally getting a third of the framerate of new CPUs, with pretty bad stutter indicated by the lows below 30 fps.

You can try to convince yourself that using a 9-year-old CPU worth like $30 and a $500+ GPU is a good idea, but it isn't.
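
For anyone wondering what "lows" means there: benchmarks usually report a 1% low alongside the average, i.e. the framerate of the slowest ~1% of frames. Here's a rough Python sketch of how that kind of number comes out of captured frame times (the frame-time values are made up purely for illustration):

# Rough illustration of average FPS vs. 1% lows, computed from frame times.
# The frame times below are made-up sample data (milliseconds), not a real capture.
frame_times_ms = [16.7] * 95 + [40.0] * 5        # mostly smooth frames plus a few long stutters

fps_values = sorted(1000.0 / t for t in frame_times_ms)          # slowest frames first
average_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
worst_slice = fps_values[: max(1, len(fps_values) // 100)]       # slowest ~1% of frames
one_percent_low = sum(worst_slice) / len(worst_slice)

print(f"average: {average_fps:.0f} fps, 1% low: {one_percent_low:.0f} fps")
# A decent-looking average can still hide stutter if the 1% low drops below 30 fps.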
 
The fact is that you don't need a 6-core or 8-core CPU to play high-graphics, high-FPS games with a decent GPU like an RTX 2080.
Ah ok. So that video comparing 4c/4t to 4c/8t to 6c/6t to 8c/8t to 8c/16t is completely wrong. Actually watch it and there are your answers. News flash: it's not 2011.

Why ask a question if you just want to argue with our CORRECT AND FACTUALLY SUPPORTED answers?
 
I think you should try gaming on that CPU with the 2080, run some tests, and see if you enjoy the performance. Many people are trying to help, and you just want proof that it will bottleneck, so testing it yourself might answer your question! You need to move with the times when your system gets older, because you might not be able to play many games. You have a good GPU, but the CPU is too weak!

Also try this, it might help a little bit:

https://www.userbenchmark.com/Software
 

I don't have that RTX 2080 GPU yet. What I have is a Gigabyte GTX 960 Windforce G1 Gaming 2GB GDDR5 GPU with an i5-2500K overclocked to 4.5 GHz, and fortunately when I play CS:GO the average FPS is 80.
 
CS:GO is an older game and is pretty much a best-case scenario for the 2500K. The game doesn't scale across lots of cores, so the 2500K's low core and thread count by today's standards doesn't mean much, and with an overclock its single-core performance isn't terrible by today's standards. Newer games that actually scale across more cores and threads do poorly on the 2500K. The 2600K hangs in there better thanks to Hyper-Threading and is still a somewhat viable gaming chip on current titles, but even with an overclock it still loses to more modern options.
 
Supernova1138, you got the best answer, bro.
 
Blah, blah, blah 1080p. Bump that to a 4k monitor and the i5-2500k will have almost no issues.

The FPS ceiling is set by the CPU, not the GPU; the FPS you actually see on screen comes from the GPU. The CPU has nothing to do with resolution or detail levels, that's all GPU. So a 9900K with a 2080 Ti will get 60 fps at 4K, and the i5-2500K will be right behind it. The first is GPU limited, the second CPU limited, but on a 60Hz 4K monitor it doesn't matter which one is limited; one of them for sure will be.
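
A crude way to picture that whole argument, as a sketch (the per-part numbers here are made up for illustration, not benchmarks):

# Crude model of "whichever part is lower sets the framerate you actually see".
# All numbers are made-up examples, not measurements.
def onscreen_fps(cpu_ceiling, gpu_ceiling_at_res, refresh_hz):
    # What lands on screen is capped by the slowest link in the chain.
    return min(cpu_ceiling, gpu_ceiling_at_res, refresh_hz)

# 9900K + 2080 Ti at 4K on a 60Hz panel: GPU limited.
print(onscreen_fps(cpu_ceiling=140, gpu_ceiling_at_res=60, refresh_hz=60))  # 60
# i5-2500K + 2080 Ti at 4K on the same panel: CPU limited, same result on screen.
print(onscreen_fps(cpu_ceiling=65, gpu_ceiling_at_res=60, refresh_hz=60))   # 60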

Even at 1440p, the more graphically intense games are going to stress even high-end GPUs.

Yes, there are plenty of people still using Sandy Bridge and Ivy Bridge; not everyone plays the latest and greatest must-have, popularity-contest games.
 
Solution
You got that screwed up. Being ignorant of nothing would make you the smartest person on the planet, basically beyond supra-genius. So how ignorance would be inexcusable (spelled it right this time) is beyond me.

You can fix ignorance, that's called teaching. You can't fix stupid.
 
A 2080 is massive overkill if all you play is CS:GO; it is not a very GPU-demanding game, and it can only use 4 CPU threads. However, if you want to play modern AAA games, then a 2080 would be excellent, especially at 1080p or 1440p. That old quad core will be a problem, though. I've got a second system with an i5 4670K, and that thing struggles to average 40+ FPS in some games due to the CPU. With some games now able to use 12+ threads, any 4-thread CPU is the bare minimum for modern games. Pairing a high-end GPU with a bare-minimum CPU is a waste.
 
You should run that card on Pentium 4 pro, trust me, it works, a better cheap air warmer for winter.
Let's get a few facts straight. One, there was never a Pentium 4 Pro--it was either a Pentium Pro or a Pentium 4.

Secondly, most Pentium 4s didn't have a higher TDP than current processors, and the Pentium Pro generally didn't even have a fan on its heatsink, so neither put out the heat of even a modern GPU.
Have you tried playing a game on an Intel 386 system using Windows 3.1?
I did, very regularly, growing up. It was still fun. Newer doesn't necessarily make a game more fun, which is why games like Candy Crush are still making big bucks.
 
You're most definitely going to bottleneck any high-end graphics card that is PCIe 3.0 or higher, because the 2500K is only capable of PCIe 2.0 x16.
Even more so if you leave the 2500K at stock speeds. I have seen people overclock 2500Ks to offset the lower bandwidth of PCIe 2.0, but you are still not getting the full potential of your graphics card any way you slice it. The same can currently be said about Intel's 10th gen line, which is only capable of PCIe 3.0, when pairing it with a PCIe 4.0 graphics card. That is why I am waiting until Intel releases their Alder Lake-S CPUs.
 
You're most definitely going to bottleneck any high-end graphics card that is PCIe 3.0 or higher, because the 2500K is only capable of PCIe 2.0 x16.
No. Bottleneck implies an obstruction to the flow of data, whether that's FPS or not. The CPU is the source of the data going to the GPU, although with the Ampere models the CPU can be bypassed for some direct renders by the GPU. The CPU will NOT slow the GPU down at all.
but you are still not getting the full potential of your graphics card any way you slice it.
Bingo!
The same can currently be said about Intel's 10th gen line, which is only capable of PCIe 3.0, when pairing it with a PCIe 4.0 graphics card.
With the apparent death of SLI and the dismal showing of mGPU-capable/optimized games, it's all single cards again for graphics. The 2080 Ti is hard pressed to saturate PCIe 2.0 x16 bandwidth, and it takes a 3090/Titan-class card to even come close to saturating PCIe 3.0 x16. The only benefit of PCIe 4.0 is a slight increase in transmission speed, not capability.
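
For rough context on those bandwidth numbers, here's a quick back-of-the-envelope on what an x16 slot gives you per generation (theoretical throughput per direction after line-code overhead, not measurements):

# Approximate usable bandwidth of a PCIe x16 slot by generation.
# GB/s per lane, per direction, after encoding overhead; theoretical figures only.
per_lane_gb_s = {"PCIe 2.0": 0.5, "PCIe 3.0": 0.985, "PCIe 4.0": 1.969}

for gen, lane in per_lane_gb_s.items():
    print(f"{gen} x16: ~{lane * 16:.1f} GB/s")
# PCIe 2.0 x16: ~8.0 GB/s
# PCIe 3.0 x16: ~15.8 GB/s
# PCIe 4.0 x16: ~31.5 GB/s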

With the processing speeds of older-gen CPUs, budget CPUs, slower RAM, slower storage, etc., you'll not be able to fully utilize Gen4 anyway, hence the non-necessity of Gen4 on B450 motherboards, the limited availability of Gen4 on B550 motherboards, and the absence of Gen4 on Intel boards until 11th gen CPUs show up (supposedly).


Gen4 usage is a 300mph race car, but with the supporting tech, it's being driven on a Go-kart track.

A 2080 Ti used on a 4K monitor, even with a 2500K, makes perfect sense, since the GPU will be needed at that resolution. A 2080 Ti on a 1080p monitor is somewhat wasted, as the CPU can't supply enough FPS to make it worth it.
 