Intel Core i5-10600K Review: The Mainstream Gaming Champ


TJ Hooker

Titan
Ambassador
Ultra-high framerates aren't about reaction time. Their main benefit is smooth, natural motion of fast-moving objects.
For competitive gaming, especially professional, improving reaction time/input lag is absolutely an important part of why people chase super high fps. Now, I don't know how much difference it actually makes for an average Joe, but for the pros, higher fps and/or refresh rate makes a measurable difference in performance in something like a fast-paced shooter.
 
  • Like
Reactions: Gurg

bit_user

Polypheme
Ambassador
For competitive gaming, especially professional, improving reaction time/input lag is absolutely an important part of why people chase super high fps.
That might be what they say, but the difference of a couple ms is imperceptible (unless we're talking about VR). Of course, they want every edge they can get, so that might indeed be something they're chasing.
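
For a sense of scale, here is the arithmetic behind that "couple ms" (a quick sketch of my own in Python, with rate pairs I picked for illustration): the latency gap between two refresh rates is just the difference in their frame times.

Code:
# Frame-time arithmetic: how many milliseconds separate two refresh rates.
# The rate pairs below are my own examples, not figures from the review.
def frame_time_ms(hz: float) -> float:
    """Time between frames, in milliseconds, at a given refresh rate."""
    return 1000.0 / hz

for low, high in [(60, 144), (144, 240), (144, 165)]:
    gap = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} Hz -> {high} Hz: frame time shrinks by {gap:.2f} ms")

# Approximate output:
#  60 Hz -> 144 Hz: frame time shrinks by 9.72 ms
# 144 Hz -> 240 Hz: frame time shrinks by 2.78 ms
# 144 Hz -> 165 Hz: frame time shrinks by 0.88 ms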

I think the far greater effect is that you can flick your wrist to glance left/right and see a smooth pan, rather than a couple disjointed frames.

for the pros, higher fps and/or refresh rate makes a measurable difference in performance in something like a fast-paced shooter.
I have no doubt it makes a difference, but I believe it's mainly about helping the brain track everything the eyes are seeing. When you have fast moving objects (including fast-paced view changes), the extra frames make a big difference in how smooth the motion appears.

I've seen it for myself. I doubt most of the high-framerate skeptics ever have.
 

demonsoldier

Distinguished
Jul 29, 2011
They should have somehow gotten that 10nm going, because at this point they've finally turned on threads for all CPUs, which from my understanding could have happened sooner. With 14nm sticking around for so long, the performance gains seem to come from higher clocks and the newly activated threads rather than IPC. The downside to all of this is that they now take the title of the more power-hungry of the two manufacturers.
 

Specter0420

Distinguished
Apr 8, 2010
That's one datapoint, though. I don't doubt there are others, but 5 of the 9 gaming benchmarks (i.e. just those listing fps) have > 100 fps @ 99th percentile!

So, yeah, you can be CPU limited, but that doesn't mean everyone is!

Everyone will become CPU limited eventually. I don't think I'm that unique: I upgrade my video cards until the overclocked base platform can no longer keep up (or I hit a CPU bottleneck, like VR flight sims). Then I upgrade my system and move my higher-end cooler and other components over to the new one.

Ryzen has just now matched the overclocked i7-6700K in gaming ability... My cousin paid 20% more for that Intel option with a cooler vs. AMD at the time. He upgraded the video card to a 1080 Ti along the way. If he had gone the AMD route he would have had inferior performance for 5 years. He would have needed to rebuild the entire PC for Ryzen and upgrade that CPU at least once, maybe twice, before matching the gaming performance he could have had for the last 5 years for FAR less money overall...

I have a similar story: 10 years on a heavily overclocked first-gen i7-920 (base 2.66 GHz, my OC 3.8 GHz). I only needed to upgrade for VR flight sims, and it took 8 years for AMD to match my overclocked performance. That would have been 4+ complete AMD rigs before finally getting the performance I experienced for 8 years...

AMD is now far in the lead on process node; if they can come up with an architecture nearly as good as Intel's 14nm+++++, they will be mopping the floor with Intel at all ends of the spectrum. That said, I'd still go Intel if I needed to upgrade right now, but luckily I don't. I'd also recommend Intel to avoid the AMD VR Tax: VR on a capable rig with a modern VR HMD is amazing, without doubt the future of gaming. The 12-ish percent Ryzen loss becomes a 15-20-ish percent loss once 10th-gen max overclocks start showing up; add another 5 percent loss in VR tax and you are looking at a big mistake for future gaming. In VR that will be the difference between a mind-blowing experience and blowing chunks.
 

bit_user

Polypheme
Ambassador
If he had gone the AMD route he would have had inferior performance for 5 years. He would have needed to rebuild the entire PC for Ryzen and upgrade that CPU at least once, maybe twice, before matching the gaming performance he could have had for the last 5 years for FAR less money overall...

I have a similar story: 10 years on a heavily overclocked first-gen i7-920 (base 2.66 GHz, my OC 3.8 GHz). I only needed to upgrade for VR flight sims, and it took 8 years for AMD to match my overclocked performance. That would have been 4+ complete AMD rigs before finally getting the performance I experienced for 8 years...
Got any stories about Pentium 4? Because that's about as relevant.

Yeah, so AMD CPUs sucked for a while, so you got Intel CPUs and they lasted a long time. Gee... now, why could you keep using them for so long? Maybe because you weren't CPU-limited, at the start? Maybe not even close?

That's all I'm saying - a lot of gamers aren't CPU-limited. Sure, keep any component long enough, and it's likely to become a bottleneck as you upgrade others.

AMD VR Tax,
I never said framerate didn't matter for VR. In fact, I already cited it as one case where I'd agree that milliseconds matter.
 
Last edited:

GenericUser

Distinguished
Nov 20, 2010
Oh, 144 versus 60Hz, absolutely agreed there. But what about 144 vs 120? Or 144 vs 100? etc.

I'm fairly certain that it's a little before hitting triple digits that humans cannot distinguish the frame rate changes, and most certainly cannot physically respond that fast.

I see what you're saying now. I don't remember the name for the term, but it's about the human eye's ability to more easily notice a larger variation between two things as opposed to a smaller variation between the same two things. Like going from a 60 Hz monitor to a 144 Hz one, as opposed to maybe 144 Hz to 165 Hz. The smaller the margin you're increasing by, the less likely one is to be able to distinguish between the two.
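
To put rough numbers on that idea (my own arithmetic, not something from the original post): the 60-to-144 Hz jump cuts the frame time by more than half, while 144-to-165 Hz trims it by only a sliver, which is one way to see why the smaller step is so much harder to notice.

Code:
# Relative size of each refresh-rate jump, measured by how much the frame
# time shrinks. These pairs are my own example figures.
for old_hz, new_hz in [(60, 144), (144, 165)]:
    # Frame time is 1000/hz ms, so the ratio of frame times is old/new.
    reduction = 1 - old_hz / new_hz
    print(f"{old_hz} Hz -> {new_hz} Hz: frame time drops by {reduction:.0%}")

# Approximate output:
#  60 Hz -> 144 Hz: frame time drops by 58%
# 144 Hz -> 165 Hz: frame time drops by 13%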

So basically, if you're only increasing by a small margin, I can definitely see some validity to what you're saying (speaking strictly of perception; more on reaction in a second). I misunderstood what you were originally getting at and thought you were implying that the available monitor refresh rates above 60 Hz weren't capable of being perceived by an ordinary human eye. My mistake.

As far as reaction time goes, I read several articles on this a while back (I should go find them again), and up to a certain point you can capitalize on the gains in terms of reaction times, but your brain needs to have been "trained" to do it. Some people are better "trained" at it than others (like some hardcore gamers), but there's definitely an upper bound for sure. I think fighter pilots were around the top tier for that sort of thing. But as far as refresh rates go, as was mentioned before, I think most people go after the higher numbers just for a smoother picture.
 
........
Ryzen is often the better choice for old fashioned casual gaming. Intel is the choice for more serious gaming or modern VR gaming especially for those like me that are sensitive to ASW artifacts and need the full refresh rate of their VR HMD. But this CPU, with what I am seeing on Userbenchmark, is the far better choice for gaming in general vs anything Ryzen has.
.........

So basically you think you are a "serious gamer" because you have an Intel CPU?

Sorry, I don't/can't agree. I consider myself a "serious gamer" too; I just figured things out and decided to go with the fastest GPU I could buy at the time and a very decent CPU.

Just because the game you play runs better on Intel does not mean it's the same for every single PC setup around the world and for every other game out there.

At 1440p high/ultra detail I don't feel constrained at all in any of the games I play with my Ryzen 5 3600 (in fact, I get a steady 60 FPS average/1% low all the time, even with PB disabled).
It's quite simple, actually: when I got the Ryzen CPU, Intel parts that performed similarly were about US$60 more, and where I live that's a ton of money. Heck, even today Intel CPUs still cost way more than their current Ryzen counterparts.
 
Last edited:

Specter0420

Distinguished
Apr 8, 2010
Gee... now, why could you keep using them for so long? Maybe because you weren't CPU-limited, at the start? Maybe not even close?

That is exactly my point. Far too many AMD fans use the argument, "At 1440 and 4K my GPU is the bottleneck in gaming, so who cares about CPU performance? I could save 20% on the CPU, mobo, and RAM and put that towards a better video card."

Five years later you've built an entirely new rig and upgraded your CPU twice (turning your 20% savings into a 300+% loss), before finally having the performance you could have had for the last 6 years. Guess what: you've finally matched 6-year-old Intel gaming performance, just in time for that 6-year-old performance to bottleneck the latest high-end video cards...

Meanwhile, the Intel guy has upgraded his video card once or twice and pushed his overclock.

Now both are looking into building a new rig. The Intel guy had great gaming performance for the last 5-6 years and is going to stick with Intel because it is still the gaming leader. He is going to spend 20% more again for the superior solution and best experience. It is going to save him money in the long run and provide a better experience the entire time.

The AMD guy is listening to all of his buddies tell him the CPU weakness still doesn't matter. It will still be hidden by the first video card he gets. They tell him, "AMD CPUs used to suck historically, but they don't now! Don't listen to that guy over there with proof that the NEW CURRENT 3RD GEN Ryzen still only beats an i5 from 5 years ago by 2 percent after overclocking both sides, and loses to it in VR! Games are going to use more cores soon! I know we've been saying that for 12 years now, but seriously, for real this time!"

The AMD guy sticks with AMD, but deep down he knows he chose the inferior solution. He'll find out in a couple of years, when he wants to upgrade his video card and discovers he needs a new CPU that needs faster RAM. Two or three years later he is looking to rebuild again to finally match the performance of that i5-10600K he should have gotten to begin with.

I get it. AMD has made great strides and even has the process node advantage. It is a great choice if you are a casual gamer who runs a chess AI, unzips WinZip packages all day, or compiles millions of lines of code daily. It is still not the best choice for gaming, especially modern gaming (VR) and future gaming (VR+).
 
  • Like
Reactions: Gurg and larkspur
All of these 10600K reviews and it's not on store shelves yet. Does anyone actually know when it will be released? And I do not trust "golden" CPU samples sent out by Intel to review sites, because they are often cherry-picked for higher overclock performance.
 
  • Like
Reactions: bit_user

Phaaze88

Titan
Ambassador
All of these 10600K reviews and it's not on store shelves yet. Does anyone actually know when it will be released? And I do not trust "golden" CPU samples sent out by Intel to review sites, because they are often cherry-picked for higher overclock performance.
Some retailers did receive stock, but they sold out in a matter of hours, if that.
Intel's still hurting from the 14nm shortage, so expect this to continue for a few more months, at least.
That's not even taking into account all the preorders that weren't fulfilled...
 

bit_user

Polypheme
Ambassador
Like going from a 60 Hz monitor to a 144 Hz one, as opposed to maybe 144 Hz to 165 Hz. The smaller the margin you're increasing by, the less likely one is to be able to distinguish between the two.
Increasing the framerate not only gives you more samples/sec, but it also reduces jitter (i.e. the error between the intended and actual display time of the frame). Reducing jitter makes it easier for your eyes & brain to track objects and generally makes the motion smoother and more natural.

So, while I'd agree that going from 120 Hz to 144 Hz might not be noticeable, that's not to say it's entirely without benefit.
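
A minimal sketch of that intended-vs-actual display error, under my own simplifying assumption that a frame meant for some instant can only be shown on the next refresh; the average error works out to roughly half the refresh interval, so it shrinks as the rate goes up.

Code:
# Toy model of display-time jitter (my own illustration, not from the review):
# a frame intended for time t actually appears on the next vsync, so the
# error is however long it has to wait, on average about half the interval.
import random

def mean_display_error_ms(refresh_hz: float, samples: int = 100_000) -> float:
    interval = 1000.0 / refresh_hz                 # ms between vsyncs
    total = 0.0
    for _ in range(samples):
        intended = random.uniform(0.0, interval)   # when the frame "should" appear
        total += interval - intended               # it waits until the next vsync
    return total / samples

for hz in (60, 120, 144, 240):
    print(f"{hz:>3} Hz: average intended-vs-actual error ~{mean_display_error_ms(hz):.2f} ms")

# Expect roughly 8.3 ms, 4.2 ms, 3.5 ms, 2.1 ms.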
 
  • Like
Reactions: GenericUser

Phaaze88

Titan
Ambassador
Excuse my filthy casual-ness...

People that obsess over fps and input lag need to uninstall those fps monitors/counters and enjoy the blasted games they're playing.
Those dang things are poison, making people overspend and ruining the game experience.
I feel sorry for those sponsored gamers though, because they play to win, not to have fun. Is that not why everyone else plays games? To have fun? Relieve some stress?
You can't do that if you're constantly watching some bloody software monitor. You create a problem when there really isn't one; the placebo effect.

Yes, Intel is still gaming king - but at what cost?
Is everyone aware of what is needed for that top tier gaming performance? I highly doubt it. You gotta spend big for that edge:
-1080p resolution screen or lower.
-a 'K' SKU CPU.
-a Z series motherboard for overclocking.
-a high end cooler to keep the thermals under control. There's a need to test overclock stability, after all.
-a top-tier GPU. No, a 2070 Super isn't quite there. It's gotta be the models few people can afford to get their hands on.
-fast RAM. Intel Skylake and its refreshes continue to scale with faster RAM, even past Ryzen 3000's hard limit of 3800 MHz in 1:1 mode... albeit with diminishing returns vs. price.
People shouldn't try to cut corners on the above, as they'll end up shortchanging themselves and bringing their gaming performance closer to par with someone running a more affordable Ryzen build... and that's no good, is it?
Because AMD is for plebs, right? < I'm pretty sure some people think like that. Another form of segregation; the financial kind... :pfff:

For the Average Joe, the Poor Man, and the Money Conscious = AMD.
For the top gaming performance with money being no object = Intel.
If you get an Intel cpu and still choose to cut corners anyway, you'll be right there with folks who got themselves a decent Ryzen setup.

On topic:
10600K, and 10th gen in general: Too little - or too much, depending on POV - too late, except for those with lots of... cake? Bank, I meant bank! Slipped my mind for some reason...

-A rant by a filthy casual.
 
Last edited:

TJ Hooker

Titan
Ambassador
That might be what they say, but the difference of a couple ms is imperceptible (unless we're talking about VR). Of course, they want every edge they can get, so that might indeed be something they're chasing.

I think the far greater effect is that you can flick your wrist to glance left/right and see a smooth pan, rather than a couple disjointed frames.


I have no doubt it makes a difference, but I believe it's mainly about helping the brain track everything the eyes are seeing. When you have fast moving objects (including fast-paced view changes), the extra frames make a big difference in how smooth the motion appears.

I've seen it for myself. I doubt most of the high-framerate skeptics ever have.
I'm not so sure.

I'm mainly going off the video LTT did on high fps/refresh rates, which is one of the best investigations I've seen (if you know of any other good articles/videos on the matter, please share). A couple of their findings seem to disagree with what you're saying:

1. They did a couple of pure reaction time tests. One was completely synthetic (how fast you can click after the screen flashed), the other was a scripted scenario in CS:GO (sniping in a fixed location through a narrow gap as a bot ran past the gap at random intervals). So no panning/view changes. In both cases higher refresh rate/fps resulted in measurable performance improvement, for both amateurs and pros.

2. They also tested having high fps (I think around 200) on a 60 Hz monitor. This also resulted in improved performance across the board, for pros and amateurs. I don't think this would really increase the 'smoothness'; it could even degrade the perceived smoothness due to having multiple screen tears in every display refresh. It seems like it would primarily benefit overall input lag (rough sketch below).
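
A back-of-the-envelope way to see why uncapped fps can trim lag even on a 60 Hz panel (my own rough model, not a figure from the LTT video): whenever the display refreshes, the newest finished frame is on average half a render interval old, so rendering faster means the frame you see reflects more recent input regardless of the refresh rate.

Code:
# Rough model (my own, ignoring render and scanout time): at render rate r,
# the most recently completed frame is on average 1/(2r) seconds old when
# the 60 Hz display grabs it, so higher fps shows more recent game state.
def avg_frame_age_ms(render_fps: float) -> float:
    return 1000.0 / (2.0 * render_fps)

for fps in (60, 144, 200):
    print(f"{fps:>3} fps on a 60 Hz monitor: displayed frame is ~{avg_frame_age_ms(fps):.1f} ms old on average")

# ~8.3 ms at 60 fps vs ~2.5 ms at 200 fps: a few milliseconds less lag from rendering alone.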

Increasing the framerate not only gives you more samples/sec, but it also reduces jitter (i.e. the error between the intended and actual display time of the frame). Reducing jitter makes it easier for your eyes & brain to track objects and generally makes the motion smoother and more natural.

What you've described here as jitter seems to be more or less the definition of input lag, i.e. the length of time between when an input is received and when that input is reflected on screen.
 
Last edited:
  • Like
Reactions: Gurg

TJ Hooker

Titan
Ambassador
So basically you think you are a "serious gamer" because you have an Intel CPU?

Sorry, I don't/can't agree. I consider myself a "serious gamer" too; I just figured things out and decided to go with the fastest GPU I could buy at the time and a very decent CPU.
Of course, everyone knows you want an Intel CPU for "serious", "modern" gaming! After all, everyone knows the hallmark of a "modern" game is one that is hugely dependent on single-core performance, in contrast to "old fashioned, casual" games that are infamous for their ability to scale performance with more cores/threads.

/s
 
  • Like
Reactions: RodroX

bit_user

Polypheme
Ambassador
I'm mainly going off the video LTT did on high fps/refresh rates, which is one of the best investigations I've seen (if you know of any other good articles/videos on the matter, please share). A couple of their findings seem to disagree with what you're saying:
Neither of those examples particularly matches the one I cited.

It's pretty easy to understand if you just try to visualize it. In the bit of eSports I've watched, it's pretty common for players to flick their mouse around to see what's going on around them. The smoother the pan (i.e. the more frames in it), the easier it is for your eyes to track and for your brain to spot items of concern.

What you've described here as jitter seems to be more or less the definition of input lag, i.e. the length of time between when an input is received and when that input is reflected on screen.
Nope.

In electronics and telecommunications, jitter is the deviation from true periodicity of a presumably periodic signal, often in relation to a reference clock signal. In clock recovery applications it is called timing jitter.[1] Jitter is a significant, and usually undesired, factor in the design of almost all communications links.
https://en.wikipedia.org/wiki/Jitter
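
As a toy illustration of the difference (my own made-up frame intervals): the mean of the intervals relates to average frame time and, loosely, to latency, while jitter is how much those intervals wobble around a steady cadence.

Code:
# Hypothetical frame intervals in ms: mostly ~16.7 ms with one hitch.
import statistics

frame_intervals_ms = [16.6, 16.9, 16.4, 33.1, 16.7, 16.5]

mean_interval = statistics.mean(frame_intervals_ms)    # average frame time (ties to fps/latency)
jitter = statistics.pstdev(frame_intervals_ms)         # deviation from a steady cadence

print(f"mean frame interval: {mean_interval:.1f} ms")  # ~19.4 ms
print(f"jitter (std dev):    {jitter:.1f} ms")         # ~6.1 ms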
 

bit_user

Polypheme
Ambassador
After all, everyone knows the hallmark of a "modern" game is one that is hugely dependent on single-core performance, in contrast to "old fashioned, casual" games that are infamous for their ability to scale performance with more cores/threads.

/s
Cute, but your conceit is that casual gamers care about framerates as much as serious gamers, and have the same wherewithal and drive to pursue them.

As I'm sure you know, the reason casual gamers are set apart is that we presume they fit @Phaaze88's description and are simply gaming for a bit of fun. To the extent they have a competitive streak, we presume the games they play are older or otherwise targeted towards lower-end hardware, or aren't terribly sensitive to lowish framerates.

Now, what I find interesting about the past few years is that 4k monitors became affordable rather quickly, and have gotten significant uptake by "normies" with otherwise unremarkable hardware. As a result, even many "casual" gamers probably won't get away with integrated graphics.
 

bit_user

Polypheme
Ambassador
6700K. While technically not the same cpu, they are the same.
Let's not play this game. Somebody says i7-7700K, I assume that's what they mean (unless it's quite obviously not).

Anyway, it's perhaps more accurate to cite Kaby Lake, since the 7700K had a 4.5 GHz turbo, while the i3-10320 has a 4.6 GHz turbo.

BTW, Skylake launched in Q3 of 2015. So, that'd be closer to 5 years than 4.
 

Phaaze88

Titan
Ambassador
Let's not play this game. Somebody says i7-7700K, I assume that's what they mean (unless it's quite obviously not).

Anyway, it's perhaps more accurate to cite Kaby Lake, since the 7700K had a 4.5 GHz turbo, while the i3-10320 has a 4.6 GHz turbo.

BTW, Skylake launched in Q3 of 2015. So, that'd be closer to 5 years than 4.
Ok, I'll stop.
 
Last edited:
  • Like
Reactions: bit_user
Cute, but your conceit is that casual gamers care about framerates as much as serious gamers, and have the same wherewithal and drive to pursue them.

As I'm sure you know, the reason casual gamers are set apart is that we presume they fit @Phaaze88's description and are simply gaming for a bit of fun. To the extent they have a competitive streak, we presume the games they play are older or otherwise targeted towards lower-end hardware, or aren't terribly sensitive to lowish framerates.

Now, what I find interesting about the past few years is that 4k monitors became affordable rather quickly, and have gotten significant uptake by "normies" with otherwise unremarkable hardware. As a result, even many "casual" gamers probably won't get away with integrated graphics.
Even if you are a casual gamer, and even if you don't actually care about the final FPS you will be getting, if you have to choose what your money gets you, you are going to pick whatever gives you the most of what you will actually be using.

Getting a CPU that is 10 times faster at productivity when you do zero productivity work is still 0% beneficial to you. Getting 10% more FPS is 10% beneficial to you, because you are going to play some horribly optimized games or some games that are 100% single-threaded or whatever. 10% is 10%; 0% is nothing.