AMD Ryzen 7 2700X Review: Redefining Ryzen

Status
Not open for further replies.

MCE2k

Prominent
Mar 2, 2017
Damn. I was hoping Tom's would do comparisons in solid modeling, engineering, and scientific applications like the 1000-series Ryzen article last year.
 


What false sense of security? Do you think most of TH's members here don't know that the higher you go in resolution, the less CPU speed/power matters as the GPU becomes more important? We are not Apple users. And again, you ignore why EVERY website has 1080p benchmarks in new CPU reviews (or, in the case of an Anandtech Pentium IV review from 2000, 640x480 instead of the era's high-resolution 1280x1024: https://www.anandtech.com/show/661/16)

^Note they even hit a GPU bottleneck at 1024x768 (the mainstream 1080p of that era) with their GeForce2 GTS.



WHAT whole picture? A straight across the board same FPS among all CPUs at 4K that we all know would be the case? That's a waste of bandwidth. Go check out Guru3D's review if you want to see a 1-2 FPS difference at 1440p or 4K. Again, you are making it sound like most of TH's readers are "special needs" people or something. GG it's like trying to talk to a brick wall.

Oh by the way: I note you do not complain about Ryzen's superior productivity performance over Intel. Is that because you just happen to like those results?

 

Rogue Leader

It's a trap!
Moderator


This whole post is just a giant No.

Take a look at any site that benched this at 4k. What do you see? A bunch of results that bounce between 60 and 62 fps. What in the world does that tell anyone?

The point of a CPU review is to ELIMINATE anything that may affect the CPU's performance due to things the CPU cannot control. CPUs do not care about higher resolutions; they care about game logic, data loading, etc. In a few years, when we have a top-line GPU that can run anything at 4K at a super high fps without blinking, then sure, but even the 1080 Ti bottlenecks at 4K.

You are right, the card is a bottleneck in your scenario, but that result is irrelevant in a CPU PERFORMANCE COMPARISON. It would be relevant if we were testing say, the best system you can build for $x price, or comparing prebuilts. Or comparing "midrange combos". But this article is a pure CPU performance comparison.
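The bottleneck logic both sides keep arguing about can be sketched with a toy frame-time model: each frame takes as long as the slower of the CPU and GPU. All the millisecond figures below are hypothetical, made up purely for illustration, not taken from the article's benchmarks.

```python
# Toy model of why CPU reviews test at low resolution.
# A frame is gated by whichever component is slower, so
# fps = 1000 / max(cpu_ms, gpu_ms).

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower component gates each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

fast_cpu_ms, slow_cpu_ms = 5.0, 8.0   # per-frame CPU cost (hypothetical)
gpu_1080p_ms, gpu_4k_ms = 4.0, 16.0   # per-frame GPU cost (hypothetical)

# At 1080p the GPU is fast enough that the two CPUs separate clearly:
print(fps(fast_cpu_ms, gpu_1080p_ms))  # 200.0
print(fps(slow_cpu_ms, gpu_1080p_ms))  # 125.0

# At 4K the GPU gates both systems and the CPU difference vanishes:
print(fps(fast_cpu_ms, gpu_4k_ms))     # 62.5
print(fps(slow_cpu_ms, gpu_4k_ms))     # 62.5
```

In this sketch, raising the resolution only inflates the GPU term of the max(), which is exactly why a 1080p test exposes the CPU gap while a 4K test flattens every bar to the same number.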
 

AgentLozen

Distinguished
May 2, 2011
@Redgarl
I've re-read your comment several times because I want to make sure I'm addressing your point correctly. You're saying that "the results from using a top end video card at 1080p testing don't apply to the average customer." Furthermore, "this is because their performance is practically the same when you use a less powerful, more accessible video card." I'll try and break down my counter-argument on a per paragraph basis.


This is all true. I don't disagree that the most popular components are middle to low end. Many new computer builds are composed of these parts. In the situation you described, the user will be relying on the power of their graphics card for games. The CPU won't matter.


I'm a little foggy on what scope creep is. At low resolutions with top-end video cards, 1080p benchmarks can show a big divide in framerates, but at higher resolutions the results get diluted and all look the same. Are you saying that CPUs look artificially more appetizing when there is a more noticeable distinction in benchmarks? Are you saying that this article is misleading its readers? Or are you also implying that it's not worth investing in a more expensive CPU because you won't notice the difference at higher resolutions? None of those cases represent the intention of this article.


The hardware configuration selected for the testing of the Ryzen 2700x in today's article wasn't meant to represent hardware commonly found in new budget desktop builds. It's not even meant to show off shiny new gaming performance in a variety of resolutions. This hardware was chosen to STRICTLY REMOVE THE VIDEO CARD BOTTLENECK FROM TESTING.

In other words, this hardware configuration isn't meant to "impress". It's meant to create a testing environment where the author could isolate the differences between CPUs. There shouldn't be any interference from an underpowered video card at any resolution, and the "whole picture" of CPU behavior at 4K is outside the scope of CPU testing.



This goes along the lines of my last paragraph. The author of this article isn't trying to demonstrate the leaps and bounds of extra performance you'll get from buying the brand new Ryzen 2700x. They're not trying to "impress" their readers with the big numbers Ryzen 2700x will give you. If the author just wanted to show you ridiculous numbers, they could render the games in 1024x768 so you could see Grand Theft Auto 5 run at 300fps. But this is useless and outside the scope. Showing results tainted by video card bottlenecks is outside the scope.

The scope of this article is ONLY meant to show the difference between CPUs.



This final sentence encompasses your problem with the article and why we're talking about this right now. You walked into this review with a certain set of expectations for how the 2700x should be tested and what the results should show. Based on my understanding, you wanted to see how the 2700x would augment the average mid-range PC with a mid-tier graphics card. What you got instead was an article trying to isolate the performance differences between modern CPUs. Since your expectations weren't met, you ended up with "a problem with the way that bench is done...".

I'm not trying to fight with you or prove how much smarter I am than you or anything. I'm not trying to be insulting. I just want to make sure you understand why the author made his testing decisions and why they were the right choices.

edit: a bunch of phrasing changes.
 
Agent, he just won't get it, nor will he ever. Let's flip his argument on its side and look at Anandtech's 4K bench of GTA 5 using a GTX 1080:

https://images.anandtech.com/graphs/graph12625/97195.png

Note the Ryzen 5 1600 is only 1 FPS behind the Ryzen 7 2700X. So why buy the Ryzen 7 2700X when you can save big bucks on a Ryzen 5 1600 build for 4K gaming?

Edit: And I'll take another of his points and flip it: he likes to point out that only 2-3% of users have 144Hz 1080p monitors. Well, that's about the same share as those running the 1440p and 4K resolutions he wants to see, according to Steam's hardware survey the last I checked.
 

AgentLozen

Distinguished
May 2, 2011
I looked at those benchmarks from Anandtech earlier. You're right. They show that there's virtually no difference between CPUs at high resolutions. That's what we expect to see and it's also useless when comparing the computational power of CPUs. This is the point we've been restating.

As I mentioned in the final paragraph of my last post (that one took me a while to write), I think Redgarl's expectations don't align with the author's intentions.
 

papality

Honorable
Dec 1, 2012
Agentlozen, I appreciate you being more civil and less condescending than Rogue and tentacle who are just shouting at this point.

I'm not trying to say that 1080p testing is bad, or doesn't count, or shouldn't be used, or anything. I just think it would be interesting to have the CPU benchmarked at other resolutions, because it does matter in many games. Not a lot, but 5 fps at 4K, as many of the Anandtech games show, can make the difference between playable and unplayable. Especially when all of their tests are, I believe, without any overclocking. You could also pick a midpoint between total GPU bottleneck (4K and up) and total CPU bottleneck (1080p and below), like regular and ultrawide 1440p, because what y'all seem so mad about is using the GPU at all.

Again, my question is: what harm are you really doing by adding the information for consumers, especially when AMD and Intel are advertising 1440p and 4K? (Other than, of course, showing that Intel has no real performance lead once you start to use a GPU that has any thump to it.)
 


Who is SHOUTING here? We are just trying to explain to you why, for nearly 20 years, hardware tech websites have used lower resolutions in CPU tests. Did you even read that 18-year-old Anandtech review I linked earlier of a Pentium IV and see what resolutions they tested at? And again, why waste bandwidth on benchmarks showing flatlined bar graphs at higher resolutions with a GPU bottleneck?

Please look at Guru3D's or Anandtech's benchmarks showing only a 1-3 FPS difference at 1440p and 4K using GTX 1080s (4K in and of itself is not realistic, since that's GTX 1080 Ti territory). Having to repeat the same reasons over and over during every new CPU test gets frustratingly old. Finally, as I made the point above, using those higher-resolution settings is counter-productive to needing a faster/more powerful CPU in the first place (for gaming purposes at higher resolutions). A Ryzen 5 1600 keeps pace with a Ryzen 7 2700X at 1440p with a GTX 1080.
 

Rogue Leader

It's a trap!
Moderator


Capitalizing a couple of words in a response is hardly shouting or condescending; shouting is capitalizing all the words, while one or two is for emphasis. I have been nothing but respectful to you in this entire thread. What I pointed out is that you kept discounting the crux of the argument: that in the end there is no "performance boost" at higher resolutions. This is a pure CPU comparison, and it's clear that at higher resolutions you see basically the same results, only with a much smaller differentiation.

Looking at Anandtech's results, almost everything was around 60 fps; a 5 fps difference at that high an fps is not the difference between playable and unplayable.

When it comes down to it, a CPU performance review should show as much CPU-only performance data as possible. Testing 40 games at 1080p is far more useful than testing 20 games at 1080p and 4K. In three years, if you are still using this CPU and GPUs have caught up with 4K, you will see the larger fps discrepancies that you see now at 1080p, as games become CPU-bound at that resolution.

Intel not having a performance lead at higher resolutions is a flawed argument, because it's not driven by CPU performance; it's driven by GPU performance. An argument that becomes completely invalid once new GPUs are released.

All posting 4K benchmarks does right now is add redundant information, and if they prioritized 4K over 1080p at this point, the article would be straight-up useless.
 

FormatC

Distinguished
Apr 4, 2011


GTA V at UHD is like the Golden Gate Bridge at rush hour in a traffic jam. A Fiat 500 might be faster than a Ferrari if its driver picked the faster lane. But you can learn simply nothing from this; it depends only on the driver and his lane selection / experience :D

 

AgentLozen

Distinguished
May 2, 2011
You're going to see testing in 1080p, 1440p, and 4K during the next video card launch. I don't know exactly when that will be, but I'm sure Tomshardware will use a 2700x and an 8700K in their benchmarks when it lands.
 

AgentLozen

Distinguished
May 2, 2011


Cars make a great analogy to this situation. When you're testing a car's performance, you need to use a track that's wide open with no obstructions. In the case of the Golden Gate Bridge during rush hour, traffic is so unpredictable and slow that any tests you run in that environment wouldn't represent the capabilities of the car.
 

Giroro

Splendid
I have heard that the X470 chipset improves memory latency on 2000-series Ryzen compared to the X370 chipset, even when everything is clocked the same. Is this true, and if so, how much of a performance impact is there?
 

Neuspeed

Distinguished
BANNED
Sep 24, 2007
I wonder why people are raving about Ryzen... AMD / Ryzen vs Intel is like comparing a fat kid to an athlete. The fat kid, who is out of shape and has not been good at anything, finally decides to step it up, starts making improvements, and thus becomes competition for the athlete. The athlete, on the other hand, has been delivering results for many years. I suppose people prefer to cheer for the underdog over the one who has been continuously delivering for many years? I'm not impressed with AMD. I currently own an AMD system and my next system will definitely be Intel.
 

Rogue Leader

It's a trap!
Moderator


Because competition drives innovation. Intel has become complacent and lazy, basically selling incremental upgrades to the same formula for the past 5 years because AMD had nothing to put against them. Now AMD is competitive with them again, forcing Intel to up their game. If AMD didn't exist or continued to be significantly behind, the results would not be good for enthusiasts, and innovation would continue to be stifled.

If you have an AMD system that's not Ryzen-based, then surely you'd want to replace it with Intel; I would, because their prior product was weak. However, if you actually read these articles, you would see AMD Ryzen offering great value for the money, and performance that is on par with its Intel equivalents, and in some cases better.
 

mellis

Distinguished
Jun 17, 2011
I would still take a 7700K over this processor. I also still do not see a reason to upgrade my 3930K system with a 390X GPU at this time. When a system configuration supports 4K gaming at a decent price, I will then consider upgrading.
 
I disagree; price is the biggest factor for most of us regular shoppers. AMD has led with more cores for years, and now the performance per core is getting them back into the conversation. So most people would make their final decision on the price of the CPU, since AMD gives a good core count and can be OC'd on pretty much any AM4 motherboard, whereas with Intel it has to be the Z370, their high-end chipset. I would have liked to see a little more GHz, but this is still a good offering, and I already have stock on day one! Now if someone would just show up to buy something. This has been the deadest month this year.

 

FD2Raptor

Admirable


Because to buy and use the Ryzen CPU, you'd also have to buy the almost non-existent graphics cards?


Anyway, I already have a 2600/2600X (whichever is available first in my region) on order, and I'm looking at the X370-PRO, as upgrades to my secondhand i5-3470. But frankly speaking, the fanboys have come close to pissing me off enough to change my mind about trying AMD.
 

Fernando_8

Commendable
Feb 18, 2016
For the 8700K you must add the price of water cooling to run it: $359 + $109 means the i7-8700K costs $468.
Nobody is even talking about overclocking; the 8700K needs really superb cooling just to run at its stock clocks of 3.7~4.7 GHz.
In AIDA's system stability test, even the non-K i7-8700 easily reaches 90°C with a Gamerstorm Lucifer V2 or Phanteks PH-TC14PE cooler.
Why do you think they all test the i7-8700K with water cooling?
The Ryzen 7 2700X comes with the Wraith Prism RGB cooler as its thermal solution. At stock clocks, do you need more than this?
 