Question I always had Intel CPUs. Convince me about AMD.


Wikingking

Commendable
Nov 11, 2019
Hi there!

So I'm looking to build a new gaming rig. The focus will be gaming at 1080p (21:9, 144 Hz monitor) and possibly 1440p later on. I'm planning to use this setup for the next 4-5 years and I'm willing to spend quite a lot of money, so high-end but not NASA-killer setups are on the table (i5-9600K, i7-9700K, RTX 2080-ish GPU).

I've always had Intel processors, but I'm aware that AMD has once again become a heavy hitter in the business. I'm glad about that, but it's hard to shake my initial discomfort about changing manufacturer and platform. Still, I'm willing to do it if there are enough valid reasons behind it. Emotions play a part, and I probably can't really go wrong with an i5 or i7, nor with a Ryzen 5 (3600 or 3600X) or Ryzen 7 (3700X).

So please, try to convince me to buy a new rig built on AMD! But keep in mind: framerate and longevity are the two key factors. I'm not really interested in things like faster video export or faster WinRAR times :D

Thank you :)
 

joeblowsmynose

Distinguished
... Intel for highest fps in games, but more expensive and with fewer cores (except the 9900K, and only until next year). AMD for 10-20% less fps in games, but at a lower price point with higher core/thread counts, making production work faster.

Isn't it 30-40% now? ;) (why does this number keep growing?)

There's a lot more to consider than simplifying it to that equation (which had wrong variables anyway)

"Testing with Assassin's Creed: Odyssey, 3rd-gen Ryzen is ~10% faster than the 2700X which is good, but not good enough to beat the 9900K, at least when looking at the average frame rate. Despite similar frame time performance, the 9900K was 4% faster on average at 1080p with an RTX 2080 Ti. " - testing R7 3700x --- this testing was done with very high or ultra game quality settings and a mix of 1080p and 1440p - which the OP said he would be likely using both. Unless he's the kind of guy to turn all the game settings to low, it is this 4% difference he should be considering, along with other points

Source: https://www.techspot.com/review/1869-amd-ryzen-3900x-ryzen-3700x/

Steve Burke, with his testing on medium game settings and strictly at 1080p, noted a 6-8% difference on average, as I mentioned earlier in the thread.

It's not 10-20% in real life. Also note the comment about frame times being roughly equal - smooth gameplay is superior to higher fps with worse frame times. Average FPS does NOT tell the story of how smoothly a game plays.
 
I have that board in mATX form. I didn't really have any issues with the included BIOS, but I did update fairly early on. I was able to run some "not approved" Corsair RAM at 3200 MHz with tighter timings than XMP -- maybe I was lucky :)

I do plan on putting a 3900X into it soon, and I love that I don't have to shell out for a new board - I don't need PCIe 4.0 anyway, as is the case for most people.

Due to my planned upgrade I recently updated to a Zen 2 compatible BIOS -- I wish I hadn't done that yet. MSI didn't leave enough "room" for the Zen 2 BIOS on their boards, and it was first reported that they would NOT support Zen 2 at all. AMD worked with them and managed to create a very cut-down version of the new BIOS that would fit.

There's no more RAID support (though I didn't need it), and almost all my OC features are gone now. The GUI is all text-based now (reminds me of the old days), but that makes it pretty hard to set fan profiles where there used to be a graph.

I have an R7 1700 currently, and it needs to be OCed (it can go 900 MHz over base and still stay really cool), but Ryzen Master seems to reset all settings when the computer sleeps - a pain in the butt.

Here's my takeaway -- and this goes to the other fella back there in the thread ... I have a feeling that MSI's first-gen boards won't support Zen 3 at all, due to this misstep on their part.

Other than that issue, the board is very good.
The thing is that previous AGESA versions had a branch for each CPU generation, and they were all delivered side by side. With more recent AGESA releases, AMD managed to shrink them down to a single branch, making it quite a bit smaller. This may help with further releases.
Moreover, MSI is far from the worst "offender" there, as they did provide a 16 MB flash chip for the BIOS on that board - all the first-gen cheapo boards that came with only 8 MB of flash are in a much worse position, as newer BIOS versions actually force you to choose the BIOS image according to the CPU you're going to put in it.
 
Isn't it 30-40% now? ;) (why does this number keep growing?)

There's a lot more to consider than simplifying it to that equation (which had wrong variables anyway)

"Testing with Assassin's Creed: Odyssey, 3rd-gen Ryzen is ~10% faster than the 2700X which is good, but not good enough to beat the 9900K, at least when looking at the average frame rate. Despite similar frame time performance, the 9900K was 4% faster on average at 1080p with an RTX 2080 Ti. " - testing R7 3700x --- this testing was done with very high or ultra game quality settings and a mix of 1080p and 1440p - which the OP said he would be likely using both. Unless he's the kind of guy to turn all the game settings to low, it is this 4% difference he should be considering, along with other points

Source: https://www.techspot.com/review/1869-amd-ryzen-3900x-ryzen-3700x/

Steve Burke, with his testing on medium game settings and strictly at 1080p, noted a 6-8% difference on average, as I mentioned earlier in the thread.

It's not 10-20% in real life. Also note the comment about frame times being roughly equal - smooth gameplay is superior to higher fps with worse frame times. Average FPS does NOT tell the story of how smoothly a game plays.
A single reviewer doesn't tell the whole story, especially when, as you say, they're using medium settings. Many people run lower settings in more competitive games for higher fps. If you are one of those people who prefer better graphics over higher fps, then it doesn't matter which CPU you get at the higher end.

Personally, I prefer AMD right now for three reasons. First, AMD is currently much cheaper for similar or slightly worse performance compared to Intel's currently available CPUs. Second, the lower temperatures allow for cheaper cooling solutions than what a 9700K or 9900K would require. Lastly, there's the higher core and thread count on both the budget and higher-end CPUs.
 

joeblowsmynose

Distinguished
A single reviewer doesn't tell the whole story, especially when, as you say, they're using medium settings. Many people run lower settings in more competitive games for higher fps. If you are one of those people who prefer better graphics over higher fps, then it doesn't matter which CPU you get at the higher end.

Personally, I prefer AMD right now for three reasons. First, AMD is currently much cheaper for similar or slightly worse performance compared to Intel's currently available CPUs. Second, the lower temperatures allow for cheaper cooling solutions than what a 9700K or 9900K would require. Lastly, there's the higher core and thread count on both the budget and higher-end CPUs.

For me the pros also outweigh the cons, as I enjoy visual fidelity in the games I play and I do 3D rendering and animation ... Intel isn't much of an option for the latter except for the 9900K (too expensive for me, with little or no benefit over the less expensive Ryzen 8-core). My old 1700 non-X still rocks rendering amazingly well once OCed.

I feel the TechSpot review is a little closer to representative of the average gamer with a 2080 Ti, as opposed to those who think 220 vs 250 fps will make them a better gamer ... 60 vs 240, yes, there's a big difference ... but 220 vs 250? Come on, let's be real with ourselves, no one's eyes will notice that difference even in twitch gaming.
 

joeblowsmynose

Distinguished
Input latency is another aspect of high refresh rates. However, I'd agree the difference between 220 and 250 Hz wouldn't be much, but it's not always about visuals.

Video cards now have features that can reduce input lag. But again, the lag reduction going from 220 to 250 fps is going to be very minimal. It's all greatly diminishing returns, which is why I laugh at 300 Hz monitors - I think they're a waste of money. 144 Hz is plenty for almost everyone.
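To put rough numbers on the diminishing returns (just an illustrative calculation, the rates are arbitrary points on the curve):

```python
# Frame time at various frame rates -- shows why each step up buys less and less.
rates_hz = [60, 144, 220, 250, 300]

prev = None
for hz in rates_hz:
    frame_ms = 1000 / hz
    note = f"(saves {prev - frame_ms:.2f} ms vs the previous step)" if prev else ""
    print(f"{hz:3d} fps -> {frame_ms:5.2f} ms per frame {note}")
    prev = frame_ms
```

Going from 60 to 144 fps shaves almost 10 ms off every frame; going from 220 to 250 fps shaves barely half a millisecond, which is why the returns fall off so hard.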
 

schaperb9

Prominent
Nov 23, 2018
What are you talking about?
The i9-9900K and the R9 3900X cost about the same, and the i9 is about 10% faster on average, which means it has at least a 10% better FPS-to-dollar ratio even with today's GPUs that severely limit how fast the i9 can go, since it's still 10% faster while effectively running at 4 GHz.
If you are talking about specific CPUs you have to state the models; nobody can read minds in here.

Why is everybody trying to convince a gamer that CB scores (or whatever productivity benchmark) are somehow relevant to what he is going to do with the system?

My response here is no longer on topic, but I had to respond to this. Go look at UserBenchmark and compare the 3600 and the i9-9900K. In the top 5 games right now, the i9 leads by as little as 4% in Overwatch and as much as 12% in Fortnite. While this will vary game to game, and I don't know what games the original poster plays, are you trying to tell me that an extra 4-12% in frames is worth 250% more money?!? Sure, if money is no object and you are chasing every possible frame, go with the 9900K. But don't try to pitch me on how Intel is still worth it with those kinds of numbers. The only thing I agree with in your post is that a 3900X doesn't make sense for gaming. But people who buy that CPU aren't buying it exclusively for gaming; they're most likely also going to leverage the four extra cores in whatever productivity tasks they do. Your mistake is trying to compare two CPUs from both sides at the same price point, but you don't need to shift sideways, you can shift down to the lowest-end CPU on AMD's side. And when the original poster is asking to be convinced to switch, I think saving $250-$300 and only losing 4-12% is a pretty compelling argument, no? That money goes a long way toward upgrading his RAM or graphics card to help close that gap even more. So I have to return the question: what are you even talking about?
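Just to make the value math concrete (a rough sketch -- the prices and frame rates below are placeholders, not numbers from any benchmark):

```python
# Crude FPS-per-dollar comparison. Prices and average FPS are illustrative
# placeholders only -- plug in current street prices and your own benchmark numbers.
cpus = {
    "Ryzen 5 3600":  {"price_usd": 200, "avg_fps": 170},
    "Core i9-9900K": {"price_usd": 500, "avg_fps": 185},   # ~9% faster for ~2.5x the money
}

for name, d in cpus.items():
    print(f"{name}: {d['avg_fps'] / d['price_usd']:.2f} fps per dollar")
```

The exact percentages move around from game to game, but the ratio makes it obvious how much each extra frame actually costs.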
 
My response here is no longer on topic, but I had to respond to this. Go look at UserBenchmark and compare the 3600 and the i9-9900K. In the top 5 games right now, the i9 leads by as little as 4% in Overwatch and as much as 12% in Fortnite.
Yeah, and why exactly would you want to compare the 9900K to the 3600?
The 9600K is at the same price as the 3600, is about 10% faster in 3 of the 5 games and still faster than Ryzen in the other two, and you also have about 20-25% more overclocking headroom.
It's still at least 10% better value for money when comparing FPS to $.
https://cpu.userbenchmark.com/Compare/Intel-Core-i5-9600K-vs-AMD-Ryzen-5-3600/4031vs4040
And when the original poster is asking to be convinced to switch, I think saving $250-$300 and only losing 4-12% is a pretty compelling argument, no? That money goes a long way toward upgrading his RAM or graphics card to help close that gap even more. So I have to return the question: what are you even talking about?
Yeah, but you can save $250-300 by still going with Intel without losing 4-12% of performance, and still have the ~5 GHz ceiling instead of Ryzen's ~4 GHz all-core.
Vote with your pocket, man; you just get much more for your money by going with Intel.
Ryzen only makes sense if you have to do 3D rendering stuff.
 
Yeah, and why exactly would you want to compare the 9900K to the 3600?
The 9600K is at the same price as the 3600, is about 10% faster in 3 of the 5 games and still faster than Ryzen in the other two, and you also have about 20-25% more overclocking headroom.
It's still at least 10% better value for money when comparing FPS to $.
https://cpu.userbenchmark.com/Compare/Intel-Core-i5-9600K-vs-AMD-Ryzen-5-3600/4031vs4040

Yeah, but you can save $250-300 by still going with Intel without losing 4-12% of performance, and still have the ~5 GHz ceiling instead of Ryzen's ~4 GHz all-core.
Vote with your pocket, man; you just get much more for your money by going with Intel.
Ryzen only makes sense if you have to do 3D rendering stuff.

Save $250-300 with what? The i5-9600K is still $25 more than the 3600 and needs an aftermarket cooler.
Using UserBenchmark to show performance differences doesn't exactly validate your point, since they're well known for being biased towards Intel.

The benchmarks for the 9600K vs the 3600 are also very telling. In the games where the 9600K leads, both perform well, but where the 3600 leads, the i5 is getting noticeable stuttering and frame drops.
It shows that non-HT hexa-cores will, in a year or two, be in the same position non-HT quad-cores are in now.
 

joeblowsmynose

Distinguished
Save $250-300 with what? The i5-9600K is still $25 more than the 3600 and needs an aftermarket cooler.
Using UserBenchmark to show performance differences doesn't exactly validate your point, since they're well known for being biased towards Intel.

The benchmarks for the 9600K vs the 3600 are also very telling. In the games where the 9600K leads, both perform well, but where the 3600 leads, the i5 is getting noticeable stuttering and frame drops.
It shows that non-HT hexa-cores will, in a year or two, be in the same position non-HT quad-cores are in now.

100% true ... 12 threads is basically the new minimum, even for gaming. Gamers Nexus now recommends against the 9600K in favor of the 3600 due to the 9600K's bad 1% and 0.1% lows in some games. The 9700K gets a pass in my book, just because it's fast enough over the 9600K that you don't get the frame drops, but maybe in a year or two games will be even more reliant on extra threads for smooth gameplay. Who knows, but it's something to consider.


Average FPS tells you nothing about how evenly frames are delivered. Elite Dangerous used to have a micro-stutter issue on some setups (mine included, until I got a new CPU) -- the FPS counter would read 70, but it might as well have been 20 for how playable it was, due to the extremely poor 1% lows.

We need to stop thinking that average FPS alone reflects user experience ... high 1% and 0.1% lows at a 60 fps average are a superior experience to low 1% and 0.1% lows at an 80 fps average. Average FPS is deceiving in this case.
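A quick toy example of how an average hides stutter (frame times made up purely for illustration):

```python
# Toy data: 99 frames at 10 ms plus one 100 ms hitch. The average FPS still
# looks respectable, while the single worst frame (a stand-in for the 1% low
# with only 100 samples) is what you actually feel.
frame_times_ms = [10.0] * 99 + [100.0]

avg_fps = 1000 / (sum(frame_times_ms) / len(frame_times_ms))
low_fps = 1000 / max(frame_times_ms)

print(f"average: {avg_fps:.0f} fps")   # ~92 fps -- looks smooth on paper
print(f"1% low:  {low_fps:.0f} fps")   # 10 fps -- the hitch you feel
```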
 
My last 3 computers have been AMD. Someone has to support the "competitor" or there will not be a competitor. Without a competitor you get 2 core processors for $1000.

The first dual-core mainstream CPUs launched in 2005, and by 2006 they were substantially down in price, with the AMD Athlon 64 X2 3800+ at circa $295 and the Pentium D 805 dual-core at circa $162. As you can no doubt tell, we have had competition for a long, long time, with AMD on the rise during the Athlon days (which were great CPUs, and I was fully Athlon'd up at the time) and then Intel taking over with the Core series, etc. Swings and roundabouts, with AMD coming back now... Having a choice is good, but making a different choice seems to hurt some people on both sides... absurd.
 
My last 3 computers have been AMD. Someone has to support the "competitor" or there will not be a competitor. Without a competitor you get 2 core processors for $1000.

You made me laugh a good deal, thanks, but you still have a valid point there. Not everyone needs a Core i3, Core i5, or higher to use Word or browse the internet.

My work computer (not mine, it belongs to the IT department I work for), which I mainly use for Word, Excel, Adobe Reader, a browser, etc., is an AMD A8-9600 (APU) + 2x4 GB DDR4-2400, and it's more than enough for any of those tasks (even with really big files). The only thing I miss every single day is an SSD; man, it feels so slow to work with a mechanical drive these days.
 

AnUnusedUsername

Distinguished
Sep 14, 2010
Eight years ago, when I built my last system, everyone was making the same argument of 'you're going to need more than four cores soon, AMD is a better value in the long run'.

Turns out it wasn't. We've had 8+ core CPUs available for a decade now, and nearly all games are still almost entirely reliant on single-core performance. Many games will still run on a dual core, just about all will run on a quad. I just upgraded from an (overclocked) i5-2500k to a stock i7-9700k and while the performance gain in gaming is significant, it's not nearly as high as you'd expect from doubling the core count and eight years worth of architecture improvements. Why? Because the extra cores don't really do anything for gaming right now.

It really shows how little things have changed. We hit a wall on clockspeed and heat around 2010 and since then the major improvements have just been more cores that game engines can't use.

Not to say there's no reason to ever buy AMD. If you do any sort of computational work, they're the obvious answer. They also have better options at lower price points, it's better to buy 6 slow AMD cores than 4 slow Intel cores. AMD has been making great mobile chips in the past few years, although not nearly enough laptop manufacturers actually build with them.

If you're looking for gaming longevity, buy the fastest single-core performance you can get. Chances are, whether it has six or sixty cores, you're going to need to replace your CPU because its per-core performance can't keep up, not because you need more cores. Be smart about it though. Quad cores are very slowly beginning to phase out now; it'd be a bit of a risk to buy a brand-new CPU with fewer than six threads.

At least in my case, I was seriously considering AMD, but Micro Center had the i7-9700K for $300, so it was cheaper than a 3700X at the time.
 

Karadjgne

Titan
Ambassador
All that, and I have a 60 Hz monitor. So really ask yourselves if getting 150 fps on an R5 3600 or 200 fps on an i9-9900K really matters at all. Honestly, it's only a bunch of benchmarks, half of which are useless info, and the other half rarely apply.

If the 3600 got 100fps and the 9900k got a massive 20% boost, that's still only 120fps, and it takes a very rare human being to be able to physically see that difference.

Which really leaves performance differences regarding games as nothing much more than numbers on cheap paper.
 

InvalidError

Titan
Moderator
If the 3600 got 100fps and the 9900k got a massive 20% boost, that's still only 120fps, and it takes a very rare human being to be able to physically see that difference.
Seeing a difference is one thing, FEELING the difference is another. While the average person may not be able to distinctly perceive individual images beyond 60 fps, most can still feel smoother motion beyond 144 fps. 60 Hz is merely the lower threshold for persistence of vision, so most people didn't get headaches from flickering back in the CRT days; it's not as much of an issue with LCDs, where the image/backlight is either always on or strobed multiple times between refreshes.
 

joeblowsmynose

Distinguished
Eight years ago, when I built my last system, everyone was making the same argument of 'you're going to need more than four cores soon, AMD is a better value in the long run'.

Turns out it wasn't. We've had 8+ core CPUs available for a decade now, and nearly all games are still almost entirely reliant on single-core performance. Many games will still run on a dual core, just about all will run on a quad. I just upgraded from an (overclocked) i5-2500k to a stock i7-9700k and while the performance gain in gaming is significant, it's not nearly as high as you'd expect from doubling the core count and eight years worth of architecture improvements. Why? Because the extra cores don't really do anything for gaming right now.

It really shows how little things have changed. We hit a wall on clockspeed and heat around 2010 and since then the major improvements have just been more cores that game engines can't use.

Not to say there's no reason to ever buy AMD. If you do any sort of computational work, they're the obvious answer. They also have better options at lower price points, it's better to buy 6 slow AMD cores than 4 slow Intel cores. AMD has been making great mobile chips in the past few years, although not nearly enough laptop manufacturers actually build with them.

If you're looking for gaming longevity, buy the fastest single-core performance you can get. Chances are, whether it has six or sixty cores, you're going to need to replace your CPU because its per-core performance can't keep up, not because you need more cores. Be smart about it though. Quad cores are very slowly beginning to phase out now; it'd be a bit of a risk to buy a brand-new CPU with fewer than six threads.

At least in my case, I was seriously considering AMD, but Micro Center had the i7-9700K for $300, so it was cheaper than a 3700X at the time.

No, wrong ... I guess you didn't read the thread through ...

Just three years ago, four cores was all you could get - so it was enough. Gamers Nexus now recommends against 9th-gen i5s completely because the lack of threads gives really low 1% and 0.1% lows. Regardless of average framerate, frame time is the metric you need to use to determine overall playability, smoothness, and lack of stutter.

You can easily have a 120 fps average with 10 fps 1% lows and it will be almost unplayable, whereas you can have a 60 fps average with 50 fps 1% lows and it will look and play like gold compared to the former.

Game engines are offloading things like map loading to extra threads; if you don't have a free thread for that, it has to be shared with another, and you get stutter and micro-stutter.
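Very roughly, that offloading looks like this (a bare-bones sketch, not code from any real engine):

```python
# Sketch of a render loop handing map/asset loading to a background thread.
# If there's no spare hardware thread, this work ends up competing with the
# frame loop instead, and you get the stutter described above.
import queue
import threading
import time

load_requests = queue.Queue()

def loader_worker():
    while True:
        asset = load_requests.get()
        time.sleep(0.05)                      # pretend disk I/O + decompression
        print(f"loaded {asset} in the background")

threading.Thread(target=loader_worker, daemon=True).start()

for frame in range(10):                       # stand-in for the render loop
    if frame == 2:
        load_requests.put("next_map_tile")    # fire-and-forget, the frame isn't blocked
    time.sleep(1 / 60)                        # ~16.7 ms frame budget
    print(f"frame {frame} done")
```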

It's not changing as fast as people assumed it would, but four cores, and even six cores without SMT, no longer offer as good a playing experience as 12 threads do, hence why, again, Steve Burke from Gamers Nexus recommends against the 9600K completely and for the R5 3600.

Per-core performance made almost no strides in the last six years, until AMD came onto the scene and forced Intel to max out all their OC headroom at stock and keep refining their process to push clocks as high as they can ... 5.0 GHz is pretty much the limit of the x86 architecture.

You'll never ever see a 6 GHz CPU ... it doesn't work that way ...
 
Save $250-300 with what? The i5-9600K is still $25 more than the 3600 and needs an aftermarket cooler.
I was going to use the 9600 non-K in my argument, but they don't have FPS results for that.
That would be the same price, with the Intel still being about 10% faster.
The K version just has another 20% of overclocking potential, so it's still by far the better deal.
Using UserBenchmark to show performance differences doesn't exactly validate your point, since they're well known for being biased towards Intel.
That's what schaperb used for his argument so that's what I used to counter his argument.
The benchmarks for the 9600K vs the 3600 are also very telling. In the games where the 9600K leads, both perform well, but where the 3600 leads, the i5 is getting noticeable stuttering and frame drops.
Well, anybody who is OK with running games at 1440p or even 4K at ultra to make believe that his choice is not worse than a different choice can make the same argument for the i5; at 1440p, and even more so at 4K, the stutter would disappear.
Gamers Nexus' HT test for RDR2 explains this pretty well.
It shows that non-HT hexa-cores will, in a year or two, be in the same position non-HT quad-cores are in now.
Look above... stuttering is the result of messed-up engines and can be countered by lowering FPS, but you can't easily raise FPS if your CPU isn't up to it.
 
My last 3 computers have been AMD. Someone has to support the "competitor" or there will not be a competitor. Without a competitor you get 2 core processors for $1000.
There is always competition; even if there were only one company left, every user would still have the choice of keeping their CPU for another year instead of upgrading every year, so no, a $1000 dual-core would not happen.
They would still have to sell CPUs every year, so prices would have to stay low enough for that to happen.
All that, and I have a 60 Hz monitor. So really ask yourselves if getting 150 fps on an R5 3600 or 200 fps on an i9-9900K really matters at all.
If someone forks out $500+ for the CPU alone, they'd better have a decent display.
The number one argument in favor of Ryzen is that at 1440p and higher it gets pretty close to Intel.
And if you have a 1440p monitor that can handle 144 Hz, then yes, being well above that to prevent drops is much better than barely hitting the target and dropping below it.
Gamers Nexus now recommends against 9th-gen i5s completely because the lack of threads gives really low 1% and 0.1% lows. Regardless of average framerate, frame time is the metric you need to use to determine overall playability, smoothness, and lack of stutter.
That's a really hard sell if you look at the 9900K 4c/8t scenario in the right picture against the full 8c/16t overclocked to 5 GHz on the left; with 4c/8t you still do not drop below 65 FPS even in the 0.1% lows.
If the next "gen" of Intel i3s comes with HT enabled, it's going to be a fun time.
View: https://www.youtube.com/watch?v=-pRTweQp2uw&t=1038s

(chart embedded from the linked video: 9900K frame rates at different core/thread counts)


Per-core performance made almost no strides in the last six years, until AMD came onto the scene and forced Intel to max out all their OC headroom at stock and keep refining their process to push clocks as high as they can ... 5.0 GHz is pretty much the limit of the x86 architecture.
Intel is 10% ahead in gaming, when GPUs allow, plus about another 25% ahead in clocks; anytime a thread needs a bit more power, which one do you think will handle it, and which one do you think will drop performance?
AMD did improve over Bulldozer by a huge amount, but that's not really hard to do.
View: https://www.youtube.com/watch?v=RmxkpTtwx1k

(chart embedded from the linked video)
 

joeblowsmynose

Distinguished
...
That's a really hard sell if you look at the 9900K 4c/8t scenario in the right picture against the full 8c/16t overclocked to 5 GHz on the left; with 4c/8t you still do not drop below 65 FPS even in the 0.1% lows.
If the next "gen" of Intel i3s comes with HT enabled, it's going to be a fun time.
View: https://www.youtube.com/watch?v=-pRTweQp2uw&t=1038s




Intel is 10% ahead in gaming, when GPUs allow, plus about another 25% ahead in clocks; anytime a thread needs a bit more power, which one do you think will handle it, and which one do you think will drop performance?
...

What I see in that data is that there's almost a 20% drop in 0.1% lows if you restrict the i9-9900K to 8 threads. That is significant.




Without anything to compare it to, this chart is rather useless ...
 
I was going to use the 9600 non-K in my argument, but they don't have FPS results for that.
That would be the same price, with the Intel still being about 10% faster.
The K version just has another 20% of overclocking potential, so it's still by far the better deal.
Yeah... that would not beat a 3600, and the cooler point still applies.

That's what schaperb used for his argument so that's what I used to counter his argument.
Ok then

Well, anybody who is OK with running games at 1440p or even 4K at ultra to make believe that his choice is not worse than a different choice can make the same argument for the i5; at 1440p, and even more so at 4K, the stutter would disappear.
Gamers Nexus' HT test for RDR2 explains this pretty well.
I didn't mention anything about 1440p or 4K.
Not that it would matter, as the 3600 and 2600 are still cheaper.

Look above... stuttering is the result of messed-up engines and can be countered by lowering FPS, but you can't easily raise FPS if your CPU isn't up to it.
So you'd rather artificially take away what little the i5 had going for it, to make up for its shortcomings, than get a 3600?
 

Wikingking

Commendable
Nov 11, 2019
How do you "restrict" the number of working cores/threads of Intel? (or AMD) Is it by using Task Manager and manually tweak with the cores or some other method?
And why would you do that? What does that represent, other than the importance of single-core power, or more precisely: the game's dependence of it?
 

joeblowsmynose

Distinguished
...

Look above... stuttering is the result of messed-up engines and can be countered by lowering FPS, but you can't easily raise FPS if your CPU isn't up to it.

It would be entirely unrealistic and unreasonable for anyone to believe that stuttering is only caused by "messed up game engines" and that the solution is to artificially lower fps.

That weird quirk noticed in RDR2 is definitely not typical, but consider that it was built for consoles, which would never reach 120 fps. There was no reason to ever fix this issue in that one game engine because the game was console-only until just now.

Lack of smooth gameplay can be caused by a multitude of issues; not having a free thread available for offloading texture streaming, object loading, etc., is one of them.

We were talking about "best for gaming" here --- putting up with microstutter caused by any reason isn't lending itself to "best for gaming" in any way.

That's why the 9700K is really the best Intel CPU for gaming: the 9600K runs into the lack-of-threads issue in some game engines (the RDR2 issue is not this, but something else entirely), and the 9900K is a lot more money for almost no gain over the 9700K, especially with both OCed to 5.0 GHz.
 

joeblowsmynose

Distinguished
How do you "restrict" the number of working cores/threads of Intel? (or AMD) Is it by using Task Manager and manually tweak with the cores or some other method?
And why would you do that? What does that represent, other than the importance of single-core power, or more precisely: the game's dependence of it?

Ryzen has such features in the BIOS and in Ryzen Master (but I don't recommend messing with those settings - I turned off four cores once and had to reflash my BIOS to bring it back to eight).

I imagine Intel has a similar utility. You can also turn off SMT (HT) on chips that have it to cut the thread count in half while leaving the physical cores operational.

You wouldn't normally do it at all; it's usually just for testing. It made some sense to test this on Ryzen because of its modular architecture, to try to reduce memory latency.

Steve from GN does tests like this occasionally to see the thread count/performance scaling in specific games. He likes to test as many variables as possible to try to find patterns.

The consensus, though, is that leaving everything at default works best on average for Ryzen; while a few games do see a boost with SMT off, many see worse performance.
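For quick per-process tests you don't even need the BIOS -- Task Manager's "Set affinity" pins a process to chosen logical cores, and you can script the same thing with something like psutil (a third-party Python package; the PID below is made up):

```python
# Pin an already-running process to logical cores 0-5 (works on Windows/Linux),
# e.g. to mimic a 6-thread CPU for a quick test. Requires the psutil package.
import psutil

pid = 12345                               # hypothetical PID of the game process
proc = psutil.Process(pid)
print("before:", proc.cpu_affinity())
proc.cpu_affinity([0, 1, 2, 3, 4, 5])     # restrict to the first six logical cores
print("after: ", proc.cpu_affinity())
```

Anything set this way resets when the process restarts, whereas cores/SMT disabled in the BIOS stay off until you change them back.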

sorry, I edited this like 100 times ... :)
 