AMD Ryzen 5 2600X Review: Spectre Patches Weigh In

Status
Not open for further replies.

nitrium

Distinguished
Jul 27, 2009

That's an interesting way to look at it - i.e. Intel threw AMD a bone by NOT releasing CPUs multiple generations ahead of AMD even though they easily could have, which allowed AMD to remain competitive and not go bankrupt.
The counter-argument is that, sans any genuine competition, Intel has been slumming it and raking in bucketloads of cash by effectively rebranding decade-old CPUs with minor tweaks while watching its stock price soar. Wonder which is true (it's pretty obvious, imo)?
 

While it's true that they haven't done anything huge, they did produce a steady line of improvements. Just because most people only compare benchmarks with decade-old software that has never increased its use of IPC doesn't mean the CPUs haven't increased IPC.
CPU cores have much more IPC than any software can use up, which is why every CPU gets the same result in things like Cinebench. But compare a Ryzen quad to an Intel quad in gaming and you will see a very big difference, because now the CPUs have to run something that gives them much less IPC to work with.
 

Ninjawithagun

Distinguished
Aug 28, 2007
Regardless of the whole HPET fiasco, the bottom line is anyone who games higher than 1080p won't notice anyway. Do a blind game test. Have one PC use the 8700K and the other a 2700X. Both systems get identical 16GB (2 x 8GB) 3200MHz DDR4 and a GTX 1080 Ti. Play several games at 2560 x 1440, 3440 x 1440, and 3840 x 2160 using the same monitors for all runs. I'll bet you can't tell the difference ;-)
 


With this line of thinking you could just get the quad-core i3-8100. Even with the 1080 Ti at 1080p ultra you won't be able to tell any difference, and at 1440p and 4K? There won't even be any difference.
 

toyo

Distinguished
Sep 24, 2011
People are way too attached to PC gear these days; it's like part of their identity for whatever reason. We should just stop this emotion-based thinking and simply get to the facts, which in this case happen to be quite simple.
- The 8700K is the fastest gaming CPU, and it does a decent job in anything else - stellar if Ryzen were not on the market.
- The 2700X is a great gaming CPU, and better than the 8700K for almost anything production-related that can use all those threads.
At this point, one should simply make a choice based on these things and on what platform you want to use.

But hey, nothing is really that simple these days, is it? So why not test the CPUs with a 1070 or 1080, where the differences will be so small that you can see an 8400 and a 7700K (not OC) suddenly looking better than a 4.9GHz 8700K - like we see in these tests, for example FC: Primal.

Immediately any reasonable person would go "hey! Stop right here. That's not possible - especially with a 6-core CPU locked at 4.9GHz." It should be mathematically impossible for a 4.5GHz CPU from a previous generation to push more fps than the 4.9GHz new CPU. I mean, did nobody stop and think about this?

For better or worse, we all know that the cores Intel gives us are quite similar, performance-wise, when you run them at the same frequency. So no, that 7700K is not suddenly faster than the OC'd 8700K. Something is wrong, and nobody cared to check. Maybe the 8700K was throttling because it wasn't cooled properly? It's not my job to guess why it happened, but a CORRECTLY functioning 8700K shouldn't do that.

So here we are now, where these types of tests are muddying the waters unnecessarily, just like AnandTech did, suddenly discovering days later that they had borked the 8700K's fps by up to 75% (!). And they irresponsibly published that as if it were a good result, with the Internet celebrating that "Spectre killed Intel" and other such crap, while people who actually own the CPU in a properly working PC watched in disbelief, knowing the results to be pure BS.

At the opposite end of the spectrum, Ryzen reviewers suddenly discovered that the 2700X "does not boost to the advertised speed of 4.3GHz". Really, now? When does the 8700K run at 4.7GHz? Basically never; it's something it achieves only while idling on the desktop without background apps. But nobody cried about that. Yet when AMD came up with the same type of Turbo, people started to cry.

It's kind of hard, after seeing these "mistakes" continue to happen, to just give reviewers a pass. Maybe, just maybe, if your results look out of the ordinary, STOP! Retest, find out why, don't publish. Maybe treat AMD and Intel the same, and don't cry about only one of them when they do something questionable, when the other did the same for years.
 

PaulAlcorn

Senior Editor
Editor
Feb 24, 2015


We have known that the -8700K was not performing correctly for some time. We reported this during our -8700K review. This issue was present on both MSI and Gigabyte motherboards. We have communicated said issues to Intel, MSI, and Gigabyte, and sent over test data that reflects the performance disparities. We tested with the same cooling and power supply after the BIOS update, which corrected the issue.

 
Looked today, and the 8700K is going for $339.00 without a cooler while the 2700X is going for $329.00, plus it comes with a good cooler. The 8700K has 6 cores & 12 threads; the 2700X has 8 cores & 16 threads.
 

alextheblue

Distinguished
Apr 3, 2001

The post I replied to stated that it had ZERO impact. I posted a link to an article discussing the impact.

I never disputed that HPET has sometimes produced performance issues. What impact it has depends massively on the board/CPU/etc. As I pointed out, there are cases where forcing HPET *resolved* stuttering problems in games (prior to all the Meltyghost patches). It was a major YMMV situation and HAS been for years.
 


You think these are "mistakes"?
I own an Intel CPU but get to work and play with some AMD ones as well. Tom's (since being bought out) has always had a dig at AMD.
Just now, they mention and quote AMD's new release with all the problems fixed and not fixed, which is factual, so no problem there.
But when you look at an Nvidia driver release, they never list the problems Nvidia still lists in its release notes.
Just things like that, twisting of truths. They should be politicians!
 

Kahless01

Distinguished
Sep 14, 2009
So what's the non-X like? Why wouldn't you just save the $30 if you were going to overclock? Buy a stout cooling solution and overclock the 2600?
 

alextheblue

Distinguished
Apr 3, 2001
Blame Nvidia. Not Tom's.

In general you're correct but there's a couple of things to consider:

  ■ The X comes with a substantially better HSF. Almost worth the extra money by itself, unless you have special requirements (silent cooling or aggressive overclocking).
  ■ The X is better binned (in theory; chip lottery still applies).
  ■ The X turbos high out of the box, so a "mild" all-core overclock would actually have worse lightly-threaded performance. Only a serious overclock can match it on lightly-threaded work.
With that being said, I like big air coolers (and I cannot lie) so I will probably get the non-X 2600 and a Noctua HSF.
 

dalauder

Splendid
You guys are looking at it wrong. We all have biases. My bias, which seems to be the same as Tom's, is that I prefer a competitive industry. Monopolies are always bad for the consumer, so I celebrate every AMD victory. I think we all were happy to see first-gen Ryzen, but it's pretty exciting to see that the 2nd gen doesn't have any real problems.

I'm not going to lie to people seeking advice and blindly tell them to buy AMD (or Radeon). But all things being equal, Team Red is the better choice for the consumer long-term. I generally state something to that effect in my recommendations, which are always for Ryzen systems these days now that they've gotten their act together.

 

Weeeeeeeeeeell, that's not quite that simple, is it now?!
The 8700K gives you 6 cores, and at the same number of cores and the same clocks it has 100% of the top speed possible today: no matter what you run, no other CPU is faster at the same core count and clocks. Plus you can reach the highest sane clocks possible today, ~5GHz, which AMD also reached before but can't with Ryzen.
You can even see this in this somewhat flawed benchmark: while the 8700K has a 16% lead in clocks and the Ryzen a 33.3% lead in cores, the 8700K comes out ahead by up to 40% in quite a few of the benches.
 

alextheblue

Distinguished
Apr 3, 2001

The IPC and clock advantages are undeniable. The 8700K is an absolute beast. But the core count argument is silly and could be applied against all high-core processors. Even the hexa-cores in many cases, depending on the software in question. Especially when looking at lab benches.
 


No, it can't be applied against all high-core processors. With the same IPC, even (especially) in lab benches, you will only get the difference that exists in clocks: if you have a quad with 16% more clocks, you will get at most a 16% higher score.
 

alextheblue

Distinguished
Apr 3, 2001
I'm specifically talking about the difference in core count, which you brought up. Hence why I said "the core count argument". You could apply the same logic ("oh, it has X% more cores but only Y% more performance") to hexacores, for example comparing an Intel quad to an Intel hexacore. In most benchmarks a CFL i5 with 50% more cores simply isn't going to get 50% more performance than a CFL i3 at the same clocks. Performance doesn't scale linearly with cores past a certain point in most current software and games, especially in variable-eliminating, non-multitasking lab benches.

So again, the whole core count vs. performance argument is silly. We already KNOW a lot of software doesn't scale with the extra cores. That doesn't mean the extra cores aren't desirable to a lot of users who plan on streaming or who knows what multitasking. With that being said, I personally don't need more than a cheap hexacore at most.
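The non-linear scaling described above is just Amdahl's law in action. A minimal sketch (the 60% parallel fraction is a hypothetical figure chosen only for illustration, not a measured property of any game):

```python
# Amdahl's law: why 50% more cores rarely means 50% more performance.
def amdahl_speedup(cores, parallel_fraction):
    """Speedup over one core when only part of the workload parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical game where 60% of the frame time parallelizes:
quad = amdahl_speedup(4, 0.6)  # ~1.82x over one core
hexa = amdahl_speedup(6, 0.6)  # ~2.00x over one core
print(hexa / quad)             # ~1.10 -> 50% more cores, only ~10% more fps
```

With a higher parallel fraction (say, a streaming + gaming workload) the gap widens, which is exactly why the extra cores still matter to some users even when lab benches barely show them.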
 
