Intel Core i5-10600K Review: The Mainstream Gaming Champ


King_V

Illustrious
Ambassador
Let's not get ahead of ourselves.

At least not until we get one of these:

[GIF: the flux capacitor from Back to the Future]
 

TJ Hooker

Titan
Ambassador
Neither of those examples particularly matches the one I cited.

It's pretty easy to understand, if you just try to visualize it. In the bit of eSports I've watched, it's pretty common for players to flick their mouse around to see what's going on around them. The smoother the pan (i.e., the more frames it contains), the easier it is for your eyes to track and for your brain to spot items of concern.
I know my examples didn't match yours; that was the point. If the improved performance from high fps/refresh only relates to smoothness of motion when you pan/flick your mouse/etc. (as you claim), then logically it seems the examples I described should not have seen an improvement from high fps/refresh. But they did.

I know what jitter is, but it seemed that the way you described it was a bit different than the usual definition. But I realize now I may have misinterpreted you.

But jitter and input lag are linked. Every ms between when a frame finishes rendering and when it is displayed contributes equally to jitter and input lag. If you increase the refresh rate/fps and reduce jitter, you will have also reduced input lag. Of course, there are other sources of input lag besides the delta between when a frame finishes and when it is displayed.

I am definitely skeptical overall that shaving off ~10 ms of input lag with a high refresh rate display would in itself make any measurable difference in performance. So there does seem to be something else going on, and what you say about reducing jitter/improving smoothness could very well be it. But at the same time, it seems that may not be the whole story either.
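For a rough sense of where a figure like "~10 ms" could come from, here's a minimal back-of-the-envelope sketch (my illustration, not anything from either poster). It only compares the worst-case wait for the next monitor refresh at common refresh rates and ignores every other source of input lag:

```python
# Back-of-the-envelope sketch: worst-case extra delay a finished frame can
# spend waiting for the next monitor refresh, at common refresh rates.
# This ignores every other source of input lag (USB polling, game loop,
# render queue, pixel response), so the numbers are illustrative only.

REFRESH_RATES_HZ = [60, 144, 240]

for hz in REFRESH_RATES_HZ:
    frame_interval_ms = 1000.0 / hz  # time between refreshes
    print(f"{hz:>3} Hz: refresh interval = {frame_interval_ms:5.2f} ms "
          f"(worst-case wait for the next refresh)")

# Worst-case difference between a 60 Hz panel and a 240 Hz panel:
delta_ms = 1000.0 / 60 - 1000.0 / 240
print(f"60 Hz vs 240 Hz worst-case delta: {delta_ms:.1f} ms")  # ~12.5 ms
```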
 

TJ Hooker

Titan
Ambassador
Cute, but your conceit is that casual gamers care about framerates as much as serious gamers, and have the same wherewithal and drive to pursue them.

As I'm sure you know, the reason casual gamers are set apart is that we presume they fit @Phaaze88's description and are simply gaming for a bit of fun. To the extent they have a competitive streak, we presume the games they play are older or otherwise targeted towards lower-end hardware, or aren't terribly sensitive to lowish framerates.

Now, what I find interesting about the past few years is that 4k monitors became affordable rather quickly, and have gotten significant uptake by "normies" with otherwise unremarkable hardware. As a result, even many "casual" gamers probably won't get away with integrated graphics.
I'm actually not really sure how this relates to what I said.

The situation being described is incredibly niche. If someone is running an i7 8086K at 4.9 GHz and a GTX 1080, they would not be CPU bottlenecked in the vast majority of cases, even in modern games (except maybe if you're playing at 720p or something silly). The only times you would be CPU limited are if you're playing a horribly optimized game, and/or VR (based on what people have said here; I'm taking their word for it), the latter still being a fairly niche market. The idea that only people who play VR and/or poorly optimized games are "serious" gamers is silly, hence my remark.

You could be a hardcore gamer, but if you're gaming at 4K you will not be CPU bottlenecked regardless of whether you have an AMD or Intel CPU (unless you're playing a game that's old enough that you're getting hundreds of fps anyway). Even at 1440p it would be highly circumstantial as to whether you'd see a difference. So again, saying that only Intel would be suitable for these people, when in reality it wouldn't make a difference, doesn't make any sense. Unless people consider "serious" to be synonymous with competitive twitch shooters or whatnot, and thus assume that all serious gamers are playing at 1080p or lower with a 240 Hz monitor and high end GPU. In which case I don't agree at all with that assumption.
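The bottleneck point above can be made concrete with a toy model (a sketch with made-up numbers, not data from the review): delivered fps is roughly capped by whichever of the CPU or GPU is slower, and only the GPU cap falls as resolution rises.

```python
# Toy bottleneck model with hypothetical numbers (not measurements):
# delivered fps is roughly the minimum of what the CPU can simulate/submit
# and what the GPU can render at a given resolution.

gpu_fps_cap = {"1080p": 220, "1440p": 140, "4K": 70}   # hypothetical GPU limits
cpu_fps_cap = {"CPU A": 160, "CPU B": 180}             # hypothetical CPU limits

for res, gpu_cap in gpu_fps_cap.items():
    for cpu, cpu_cap in cpu_fps_cap.items():
        fps = min(cpu_cap, gpu_cap)  # the slower component sets the pace
        print(f"{res:>5} + {cpu}: ~{fps} fps")

# At 1080p the two CPUs separate (160 vs 180 fps); at 4K both land on the
# same ~70 fps GPU limit, which is the point being made above.
```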
 

bit_user

Titan
Ambassador
If the improved performance from high fps/refresh only relates to smoothness of motion when you pan/flick your mouse/etc (as you claim), then logically it seems the examples I described should not have seen an improvement from high fps/refresh. But they did.
Because I didn't say it's the "only" consequence, the presence or absence of other benefits is irrelevant to my point.

jitter and input lag are linked. Every ms between when a frame finishes rendering and when it is displayed contributes equally to jitter and input lag.
Not true. Jitter is simply the error between when a sample is expected vs. when it is delivered. If all we're concerned about is smooth motion, then input lag doesn't even enter the equation.

If you increase the refresh rate/fps and reduce jitter, you will have also reduced input lag.
Two dependent variables can be influenced by the same independent variable, without having any dependency relationship between them.
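To make the distinction concrete, here's a minimal sketch (my own illustration with hypothetical timestamps, not anything from the posts) that computes jitter and the render-to-display delay separately from the same frame timing data. Both tend to shrink as the frame interval shrinks, but neither is defined in terms of the other:

```python
import statistics

# Hypothetical frame timing data (milliseconds):
# "render_done" = when each frame finished rendering,
# "displayed"   = when it actually appeared on screen.
render_done = [0.0, 16.2, 33.9, 49.5, 67.1]
displayed   = [8.0, 24.7, 41.3, 58.0, 74.6]

expected_interval = 16.7  # ms, e.g. a 60 Hz target

# Jitter: error between when each frame was expected and when it was delivered.
interval_errors = [
    (displayed[i] - displayed[i - 1]) - expected_interval
    for i in range(1, len(displayed))
]
jitter = statistics.pstdev(interval_errors)

# Input-lag contribution from this stage: render-done -> displayed delay.
display_delay = statistics.mean(d - r for d, r in zip(displayed, render_done))

print(f"jitter (std dev of interval error): {jitter:.2f} ms")
print(f"mean render-to-display delay:       {display_delay:.2f} ms")
# The two values are computed independently: you could shrink one without
# touching the other, even though raising the refresh rate tends to shrink both.
```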
 

bit_user

Titan
Ambassador
I'm actually not really sure how this relates to what I said.
In the last paragraph, I moved on to a tangent (about casual gaming vs. hardware requirements, in the age of cheap 4K displays). Perhaps that's what threw you off.

The situation being described is incredibly niche. If someone is running an i7 8086K at 4.9 GHz and a GTX 1080, they would not be CPU bottlenecked in the vast majority of cases, even in modern games (except maybe if you're playing at 720p or something silly).
Right. Which is why it's so ridiculous that this article only tested gaming performance at 1080p... with an RTX 2080 Ti!!

Even giving Paul the benefit of the doubt that he was just trying to tease out the differences between the CPUs, he could've at least acknowledged that, in the text. If he did, I must've missed it.

If a reader approached this review with the question "which of these CPUs should I get for gaming?", they'd likely come away with a wrong impression.
 

Specter0420

Distinguished
Same for me, I would go for the 3700X anytime

You would pay more for 9-11% worse gaming performance? That is 11% worse in 2D, so 16-21% worse in VR. Far fewer AMD users overclock because it doesn't make much sense to eliminate your already-lacking single-core boost for an even slower all-core overclock.

That said, you could probably compare the Intel OC results to the standard Ryzen results without much grumbling from realistic people. There isn't a performance downside to overclocking Intel at all. When comparing like that, the difference is staggering, especially in VR.

You'll be able to fuel at least one more video card generation with that rig, possibly two extra generations. So you upgrade your GPU and CPU in two or three years (historically, still not matching the OCed Intel from 4 years ago in gaming). A couple of years after that you'll be building a new base rig and upgrading your video card to finally match the Intel CPU I had all along. I'll be looking into upgrading just my video card again and sticking with my original investment for another 2-3 years.

You spent 240% in the same 6 years to save the 20% upfront; I had the better experience all along. I don't have a crystal ball, so this is based on history, but...

Pay more for less? That would be an insanely stupid move. Do you even game?

https://cpu.userbenchmark.com/Compare/Intel-Core-i5-10600K-vs-AMD-Ryzen-7-3700X/4072vs4043
 
Yikes! To use the Intel-biased userbenchmark of all things... and to make it worse, I know that CSGO result is wrong, as the Source Engine appears to favor Ryzen.
Benchmarking sites use the CS:GO replay feature that records all the inputs of all the players and lets you render the results afterwards. It's the equivalent of Cinebench or something and has nothing to do with actual gameplay; the same goes for PUBG, LoL, Fortnite, and probably many more. Most of the time, if they show videos of it side by side, you can see that they are the exact same frame for frame.

Some benchmarking sites were even cute enough to run the ancient CS:GO benchmark that was the de facto standard years ago and is so small that it fits into Ryzen's cache, if not completely then at least to the largest degree.

And nobody can touch them because there are zero standards on how to conduct a proper benchmark.
 

Existz

Prominent
Reading all the comments in here is making me confused.
People keep comparing the i5-10600K to the Ryzen 7 3700X when it should be the R5 3600/3600X.
Maybe that's because AMD is way cheaper in the US or in most Western countries, but in my country it's more worth it to buy the 10600K with a Z490 motherboard compared to the R7 3700X with a B-series motherboard... the difference is very small. So I built with the 10600KF (identical performance, just no iGPU) and an MSI Z490, and it works great at any task!
It's very easy to cool with a cheap budget air cooler like the Gammaxx 400. It can sustain a 5 GHz OC with temperatures around 50-70C most of the time, and if I lower it to 4.8 GHz it cools down to 50-60C in gaming or heavy tasks. That's with a cooler worth just $15 (for those of you who always complain about the stock Intel cooler). Again, for $15 you get to overclock your CPU to 5 GHz all-core, comfortably! By the way, my settings are a bit different from the TPU review: I set the Vcore to 1.320 V (adaptive), LLC mode 2, and the ring to only 4400 MHz. It's stable in all tasks: downloading, encoding video, gaming, etc., with no crashes or BSODs.
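For readability, here is the reported overclock restated as a small Python summary (values copied from the post above; nothing is added or verified):

```python
# Reported Core i5-10600KF overclock, restated from the post above.
# These are the poster's settings and temperatures, not verified recommendations.
reported_oc = {
    "cpu": "Core i5-10600KF",
    "motherboard": "MSI Z490",
    "cooler": "Gammaxx 400 (~$15 budget air cooler)",
    "all_core_clock_ghz": 5.0,      # ~50-60C gaming when dropped to 4.8 GHz
    "vcore_v": 1.320,               # adaptive
    "llc_mode": 2,
    "ring_clock_mhz": 4400,
    "typical_temps_c": (50, 70),    # reported range at 5.0 GHz
}

for setting, value in reported_oc.items():
    print(f"{setting:>20}: {value}")
```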
 
Feb 19, 2020
Well, I have a 144 Hz monitor and averaged 130 fps in BFV at High settings, RTX off, with a 1660 Ti and a 2700 non-X.

165 fps average at Ultra, RTX off, with the 3600X + 2060 Super at 1080p. My framerate with RTX on isn't bad really, even with the 1660 Ti, but it adds this odd input lag where I click and people aren't stabbed when they should be. /shrug

Still bust 100 fps average at 1440p.

Granted, I overclock and had the #1 2700 in all categories on UserBenchmark off and on for 14 months; hit 489 in CPU-Z single-thread with it, with a 240mm AIO, heh.

The same AIO got me the #6 US-region Time Spy score with the 3600X; I've had it for about a week now, same cooler.

The review guys do the best they can, but they don't get to spend as much time with these pieces of hardware as we do, never mind all the different RAM kits, motherboard combinations, cooling solutions, BIOS settings, etc. that we users do.

I used Intel from the Pentium 120 MHz to the P2 450, P3 800 MHz Coppermine, Core 2, Core 2 Duo, etc. The 2700 non-X was my first AMD CPU (cheap at the time, and the mobo platform had life left). Folks likely upgrade the CPU waaay too often. This hype over the consoles? We have to consider that many of the games they are boasting about won't run at 120 Hz or 144 Hz in every frame/scene, in some cases IF AT ALL.

Magically AMD created a CPU we should be concerned with that can keep itself cool and blow us all away at stock inside a shoebox? Anyone remember how hot the 5700/5600 GPU series ran? Gonna stick a Navi card in a shoebox next to the 16-thread CPU?

I am not worried, heh. When they give console buyers the ability to change settings and hardware, THEN... well, then none of them would keep it in those lunchboxes anyway, and they'd be on PCPartPicker with the rest of us, lol.