AMD Ryzen 7 1700X Review

Status
Not open for further replies.

Amdlova

Distinguished
For me, less power and more cores is a clear winner. Today the i7 will be the fastest, but tomorrow it will be junk. Why not put CPU utilization in these benchmarks? All I've seen is "LAGS behind Intel." I've had two Xeon E3s, two i7s, and two i5s in the past few years, and I clearly see the superiority of AMD with Ryzen.
 
People tend to forget that Ryzen will be in future consoles. Octacore will be mainstream in 3-4 years, however Ryzen is finally there for the general public for initial integration.

Also, for gaming benchmarks... please, Tom's... please... you are once again presenting scenarios that don't apply to the real world. At 1440p, Ryzen closes the gap, and at 4K it doesn't matter.

I am a 4K user... why would I spend $50 less on a 7700K for the same 4K performance with a GTX 1080 as a Ryzen 1700X, when I can benefit from all the strengths of the 1700X and none of the 1080p weaknesses you are reporting...?

That's why this review is rubbish. You don't present a truly global point of view. You are sticking to your guns and refusing to admit these results are situational at best. Should a CPU's performance really be reviewed at 1080p in a non-GPU-bound situation rather than in real-world applied situations...?

Unfortunately, with today's technology, the answer can be no. Technology is changing, and yesterday's methodology might not translate well to today's reality.

The only thing this proves is that a Ryzen CPU at 1080p is not a good idea when gaming is part of the equation; however, it's a non-issue at 4K, and barely one at 1440p.
 

Oranthal

Commendable
Jun 29, 2016
23
0
1,520


Woot to a fellow QHD user. Best upgrade I have done in ages. To the point that these tests conceal: the main item of note for gaming is that there is no issue you will notice. The R7 requires a discrete GPU, all but ensuring someone is using an above-average GPU. The ideal 1080p 144Hz games are not even tested, games like CS:GO, Overwatch, etc. Autocad 2.0 and other such tests are far more important in showing where the CPU will change your experience. I understand your point, but the review could and should have included extra data points.
 


We all have our own anecdotal "real world" scenario. That doesn't mean it applies to everyone else. Tom's would spend half a year getting a good snapshot across the board if it tried to represent everyone's hardware scenario in benchmark tests.

And as I stated previously, the vast majority of users still use 1080p; QHD 1440p is still a small niche market, and UHD 4K is even smaller. An increasing number of gamers prefer faster 1080p FreeSync and G-Sync monitors for smoother gameplay over a 60Hz 1440p eye-candy monitor upgrade at the same GPU power level.

But yes, for years game benchmarks have mostly shown that the higher you go in quality settings and resolution, the more the GPU takes over from the CPU (remember, this is a CPU review, not a GPU review). That's nothing new. It's also why, for years, CPU gaming benchmarks have always been run at the mainstream resolution of the time. Case in point, this old 2009 Intel Core 2 Duo E8400 game benchmark using 1024x768 (before 1080p became mainstream): http://www.pcstats.com/articleview.cfm?articleid=2394&page=10
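The GPU-takeover effect is easy to picture with a toy bottleneck model (all the per-frame costs below are made-up illustration numbers, not measurements): a frame can't finish faster than the slower of the CPU and GPU, so once the GPU's per-frame cost dominates at high resolution, differences between CPUs stop showing up in the FPS numbers.

```python
# Toy bottleneck model: FPS is capped by whichever of the CPU or GPU
# takes longer per frame (assumes their work overlaps fully).
def fps(cpu_ms, gpu_ms):
    """Frames per second given CPU and GPU per-frame costs in milliseconds."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# Hypothetical costs: CPU time is resolution-independent,
# GPU time grows with resolution.
cpu_fast, cpu_slow = 5.0, 7.0
for res, gpu_ms in [("1080p", 4.0), ("1440p", 8.0), ("4K", 16.0)]:
    print(f"{res}: fast CPU {fps(cpu_fast, gpu_ms):.1f} FPS, "
          f"slow CPU {fps(cpu_slow, gpu_ms):.1f} FPS")
```

With these numbers, the two CPUs differ by almost 60 FPS at 1080p and are identical at 4K, which is exactly why low-resolution tests are used to isolate the CPU.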

^^And pertinent to your comment "Technology is changing and yesterday's methodology might not translate well to today's reality": that road goes both ways. What is pertinent today may not be so three years from now (more CPU-demanding games, even when higher resolutions and quality settings are factored in). We don't know what tomorrow holds; we only know what holds today. With that said, here's to Ryzen going into the PlayStation 5 and Xbox Two!
 

InvalidError

Titan
Moderator
"Intel's Core i7-7700K is the only processor in our test field handicapped by cheap thermal paste between its die and heat spreader."

If Kaby Lake is like past Intel chips, the thermal paste itself is of perfectly fine quality and outperforms most aftermarket pastes. The problem lies in how thick the gap between the IHS and die is. Usually, you'd have some physical contact between the two surfaces, with the paste filling the 0-50 micron differences in surface flatness in between. For some reason, though, Intel gaps the IHS around 100 microns off of the die, making the paste layer 100-150 microns thick. Any thermal paste's performance would suffer horribly at that thickness.
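The thickness point can be sanity-checked with a simple one-dimensional conduction estimate (the conductivity and die area below are illustrative assumptions, not Intel's actual figures): the thermal resistance of a paste layer grows linearly with its thickness, so a 125 µm layer is five times worse than a 25 µm one regardless of paste quality.

```python
# 1-D conduction estimate: R = thickness / (k * area).
# k and area are assumed, round-number values for illustration only.
K_PASTE = 5.0      # thermal conductivity of a decent paste, W/(m*K)
DIE_AREA = 1.0e-4  # ~1 cm^2 of die contact area, in m^2

def paste_resistance(thickness_um):
    """Thermal resistance (K/W) of a paste layer of the given thickness in microns."""
    return (thickness_um * 1e-6) / (K_PASTE * DIE_AREA)

for t_um in (25, 125):
    r = paste_resistance(t_um)
    print(f"{t_um:>3} um layer: {r:.2f} K/W -> {r * 90:.1f} K rise at 90 W")
```

Under these assumptions, the thick layer alone adds roughly 20 K at a 90 W load where the thin one adds under 5 K, which matches the intuition that the gap, not the paste, is the problem.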
 
G

Guest

Guest
Only an idiot could think that Ryzen will get better as more cores are used by games. Well, true, Sherlock, but more cores in the i7 will be used too, meaning Ryzen will never get better relatively. 1080p is a true representation of Ryzen performance, and once you remove the GPU bottleneck at 4K you will see the same crappy performance compared to Intel at those resolutions as well. Right now you don't see that because the current GPU generation is not up to the task. The only good thing about Ryzen is the price, and if that is not your concern... don't waste time with it.
 

techy1966

Reputable
Jul 31, 2015
149
3
4,685
Tom's, good review, but as has been proven, memory speed is a very big factor when it comes to Ryzen performance. I guess it has something to do with the Infinity Fabric speed being tied to the memory speed: the faster the memory, the faster the Infinity Fabric bus. There was a review last week on YouTube that explains it all and also shows the increase in FPS in all games with the fastest memory speed he could get from his kit, which was DDR4-3600.

In this YouTube video he ran the Ryzen CPUs at different memory clocks, from 2666MHz at the low end up to 3600MHz. The gains were huge. The Ryzen itself was running at 3.97GHz. It went from getting beaten by Intel at the lower memory speeds to outdoing the Intel part at 5.0GHz in most games once the memory hit 3600MHz, with the Ryzen still at 3.97GHz.
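A rough sketch of the coupling being described (assuming, as widely reported for first-generation Ryzen, that the fabric clock runs 1:1 with the memory clock, which is half the DDR4 transfer rate):

```python
# First-gen Ryzen reportedly runs the Infinity Fabric clock 1:1 with the
# memory clock (MEMCLK), which is half the DDR4 transfer rate.
def fabric_clock_mhz(ddr_rate):
    """Fabric/memory clock in MHz for a DDR4 transfer rate in MT/s, assuming 1:1."""
    return ddr_rate / 2

for rate in (2666, 2933, 3200, 3600):
    print(f"DDR4-{rate}: MEMCLK/FCLK = {fabric_clock_mhz(rate):.0f} MHz")
```

Under that assumption, going from DDR4-2666 to DDR4-3600 raises the cross-CCX link clock by about 35%, which would explain why games that bounce data between CCXs benefit so much from faster memory.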

I think Tom's needs to try doing a write-up like this too. I guess you need the latest BIOS and Windows updates; the latest BIOS gives better memory support. I think the board used was the Asus ROG board. He also did not use the AMD press-kit memory; I think it was G.Skill memory that was said to be made for Ryzen CPUs and rated for 3600MHz.

I think AMD's Ryzen is a good CPU, and even at lower memory speeds it does pretty well in games. 97% of people would not be able to notice the difference between the Intel part and the AMD part, except that, from what I gather, games do tend to seem smoother on Ryzen systems. I cannot confirm that since I am on an Intel Sandy Bridge 2700K @5.2GHz, and for me things run smoothly even on the old Sandy CPU.

I guess if a site like Tom's did tests like the ones I described above, then whether or not they got the same results, it would at least show if the guys doing these tests on other sites are on the level or just full of BS. :) I tend to take my information from big sites like this and trust it more than an unknown site or YouTube videos.
 

silverblue

Distinguished
Jul 22, 2009
1,199
4
19,285

I think AMD have said that they will be using equal core counts in both CCXs, meaning that the Infinity Fabric is still in use. However, I would imagine traffic will be lower due to the reduced overall core count, or that these chips will simply be less sensitive to memory bandwidth requirements.
 

s4fun

Distinguished
Feb 3, 2006
191
0
18,680
What is this, another BS AMD propaganda piece? There are no "shockingly low prices"; when it benches worse than an i7-7700K for games, it does NOT deserve to be priced higher than the $299 you can get at Microcenter. This should be an objective article, not an ad for AMD.
 

InvalidError

Titan
Moderator

An i7 has four cores, a 1700 has eight. If games capable of effectively leveraging more than four cores become more common, your precious i7 is out of extra cores for said games to leverage and should fall behind unless you paid 2-4X as much for an LGA2011 CPU and associated platform.
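How much those extra cores are worth depends entirely on how parallel a game's workload is. A quick Amdahl's-law sketch (the parallel fractions here are made-up examples) shows why doubling cores from four to eight only pays off when most of the per-frame work can actually be spread out:

```python
# Amdahl's law: speedup on n cores when a fraction p of the work parallelizes.
def speedup(p, n):
    """Ideal speedup for parallel fraction p (0..1) on n cores."""
    return 1.0 / ((1.0 - p) + p / n)

# Hypothetical parallel fractions for illustration.
for p in (0.5, 0.8, 0.95):
    print(f"p={p}: 4 cores -> {speedup(p, 4):.2f}x, "
          f"8 cores -> {speedup(p, 8):.2f}x")
```

At p=0.5 the jump from four to eight cores gains almost nothing, while at p=0.95 it is substantial; which regime future games land in is exactly what this argument hinges on.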

With AMD putting Ryzen test platforms into game developers' hands, there is a high probability that many recent and future games will have optimizations specifically for Ryzen and those could turn tables on Intel too, or at least fix the bulk of performance hiccups.


Yes, AMD has confirmed that all R5 will be either 3+3 or 2+2. Sounds like a huge waste of silicon for the quad-core version but that's how it is. At this point, it looks like people will need to wait for the SoC/APU versions to get a single-CCX quad-core CPU. Even then, AMD may still use the fabric to connect the memory controller with the CCX and IGP.

 

Jo_7__

Prominent
Mar 27, 2017
3
0
510
You forget the R7 1700 is just $326 and is better than the 7700K. For gaming you are only 10 FPS behind.


"It would be easier for us to recommend Ryzen 7 to gamers if it was less expensive. But with Core i7-7700K and Core i5-7600K performing so well, and both CPUs less expensive than the 1700X we're reviewing today, Kaby Lake maintains its leadership. But there remains near-term hope for the Ryzen family: AMD's Ryzen 5 series will surface early in April at price points better suited to take on mainstream Core CPUs. "
 

InvalidError

Titan
Moderator

Computers are used for stuff aside from gaming and in heavily multi-threaded number-crunching scenarios, Ryzen puts the i7-7700k to shame and even gives the $1000+ i7-6900/6950 a run for their money. There is no reason for Ryzen to be priced at a discount based on gaming benchmarks alone where it still performs more than reasonably well.

As noted many times in different reviews, even Intel's own $400+ HEDT chips (i7-6800 and up) are generally worse for gaming than the i7-6700/7700k. You aren't seeing Intel price LGA2011 CPUs below LGA115x ones because of that either.
 

Jo_7__

Prominent
Mar 27, 2017
3
0
510
You forget the R7 1700 is only $326, cheaper than the 7700K; that's better value. Don't keep misleading people that Ryzen sucks for gaming when it's only 10 FPS behind a mature 14nm Intel Kaby Lake chip with five years of an ecosystem optimized for Intel. This baby Ryzen performs well for a firstborn that only needs to mature.


"It would be easier for us to recommend Ryzen 7 to gamers if it was less expensive. But with Core i7-7700K and Core i5-7600K performing so well, and both CPUs less expensive than the 1700X we're reviewing today, Kaby Lake maintains its leadership. But there remains near-term hope for the Ryzen family: AMD's Ryzen 5 series will surface early in April at price points better suited to take on mainstream Core CPUs."
 

s4fun

Distinguished
Feb 3, 2006
191
0
18,680


Quit trying to change the subject and be an ad for AMD. I do NOT care about rendering, nor would I pay more for HEDT chips. The focus is GAMING first; all the others are bonus extra credit. Just like cars on a track, say the Nurburgring: we DON'T give a rat's ass if you bring a Top Fuel drag racer or a super heavy-duty deuce-and-a-half for towing. They can each do some odd things better (tow more, hit a higher top speed in a straight line, etc.) than your race car, but they will never win the race on a typical road track. Quit it with this bait-and-switch, hand-waving, look-at-all-this-other-crap distraction. We are NOT fooled. We are NOT idiots like Trump supporters. We do NOT fall for BS and LIES! Stick with the program. AMD needs to get its act together and stop trying to scam people. Price it at 90% of the 7700K and no one will get mad at AMD.

Only then can Tom's write about shockingly low price with a clear conscience. Otherwise it is just another fluff piece ad for AMD.
 
Thanks for the review as usual!

And why not test at higher resolutions in a follow-up review? I don't mind having 1080p, but by neglecting higher resolutions you're leaving information out of the scope of your conclusions. Although it's fair to extrapolate the usual "lower resolutions stress the CPU more," that is just an assumption, not a fact.

Would you ever follow up with a full-fledged review going all the way up to 4K?

Or at least investigate how the outliers (Deus Ex and Tomb Raider) scale?

Cheers!
 

Yup, but twice the cache per core plus half the transfer between them, so it shouldn't matter nearly as much. The CCX split will still be in play but shouldn't matter. The Ryzen 3 1200, with only 8MB, is the one that may not have the CCX split at all.
 

InvalidError

Titan
Moderator

That's only you. In case you didn't know already, there are other people on the internet and on the planet, many of whom use their PCs for stuff other than gaming.
 

jlyu

Distinguished
Dec 25, 2009
48
0
18,530
Awesome review. I'd like to see a follow up after/if software gets a good optimization for Ryzen and their CCX design. Until then I'd like to see a comparison of all the chips used in this review overclocked. FX 8350 @4.8-5GHz+, 7700/7600K @4.9-5GHz+, 6900K @4.3-4.4GHz+. Maybe even throw in 6-Core Broadwells.

.:edit:.

People keep spreading this BS that playing at 1080p is not real-world usage, but they need to realize there are different people with varying opinions on what is more important.

Not only are 1080p monitors the most common, but there are still games that can't run at a smooth 60FPS with a $500+ GPU when maxed out. Consider the high-refresh-rate monitor users as well: it's much harder to run 120Hz+ at 1440p/4K. Then there are ultrawide monitors and multi-monitor setups. If some games can't run maxed out on a single GPU at a minimum of 60FPS at 1080p, how do you expect them to do so at higher resolutions and refresh rates? Having a CPU advantage helps.

If you are happy with the performance, great. Some people may have a different opinion or expectation; that doesn't make one right and the other wrong. I'm a user in between: I'm gaming less and less and working more, but when I do game I like every option turned on. The perfect balance for me is a 6-core i7. It should have slightly higher single-core IPC, a higher overclock, and better gaming performance than Ryzen, but a better work/rendering workload than a quad-core i7.

I just hope software can catch up quickly for Ryzen, at least in time for Zen2. Skylake/Kabylake-X HEDT vs Coffeelake/Cannonlake 6-Core vs Zen2 should be a nice battle.
 
I want Ryzen to be good, preferably with both games and productivity.
But apparently, in a shot at getting both, we have neither.

We can't claim that Ryzen was meant to compete against Xeons when the highest Ryzen, the 1800X, is an 8C/16T part, with all motherboards that I'm aware of being single-socket.

Xeons scale to a much higher core and thread count along with being compatible with multiple sockets.

Now, of course, these Xeons cost many times more than an 1800X, with the E5-2667 v4 being about the closest competitor at $2200 each, but anyone needing that level of performance knows it won't be cheap (unless... foreshadowing; see the end of this post).

The Ryzen chips do beat the i7s in a few productivity applications like the "Large-scale Atomic/Molecular Massively Parallel Simulator".
But seriously, with a name that long, are those scientists really going to choose an i7 for the job?

I would have much preferred it if AMD had kept to 4C/8T and optimized that for games instead of going for 8C/16T, maybe even with a frequency boost.

If AMD wanted to go higher thread count for productivity then don't stop at 8C/16T.

Go for the throat and make a 16C/32T part to really pressure the Xeon chips into lowering their astronomical pricing.

I really do hope AMD has actual server chips in the works.

These they are not.

PS: This chip reminds me of:
https://www.pinterest.com/pin/453385887468075371/

Trying to do both gaming and productivity.
 

mavikt

Distinguished
Jun 8, 2011
173
0
18,680
They should've added World of Warcraft to the benchmarks; from what I remember, it actually scaled a lot with added CPU resources.
About threads: it was ages ago that I first read about game engines getting multi-threaded. I can't believe the problem is that games use too few threads.
I just fired up Refunct, a $3, half-hour indie game I bought off Steam last weekend: 47 threads. Not representative by any means, but still.
Perhaps the thread work is grossly unbalanced, or Ryzen just doesn't get the oomph out of them. Perhaps 'game threads' need to share data with each other, whereas 'productivity application threads' just work on their own data until done.
Something's missing for a general-purpose CPU like this.
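That shared-data-versus-private-data distinction can be sketched in miniature (a toy illustration, not how any real engine is structured): threads that funnel every update through one shared value behind a lock serialize on each other, while threads that keep private partial results and merge once at the end barely interact.

```python
import threading

# 'Game-style': threads share one piece of state and contend for a lock
# on every single update.
def shared_state_sum(data, n_threads=4):
    total = 0
    lock = threading.Lock()
    def worker(chunk):
        nonlocal total
        for x in chunk:
            with lock:          # every update serializes here
                total += x
    chunks = [data[i::n_threads] for i in range(n_threads)]
    threads = [threading.Thread(target=worker, args=(c,)) for c in chunks]
    for t in threads: t.start()
    for t in threads: t.join()
    return total

# 'Productivity-style': threads work on private data and only merge
# their results once at the very end.
def independent_sum(data, n_threads=4):
    results = [0] * n_threads
    def worker(i, chunk):
        results[i] = sum(chunk)  # no shared writes until the final merge
    chunks = [data[i::n_threads] for i in range(n_threads)]
    threads = [threading.Thread(target=worker, args=(i, c))
               for i, c in enumerate(chunks)]
    for t in threads: t.start()
    for t in threads: t.join()
    return sum(results)
```

Both compute the same answer, but the first pattern scales poorly with core count because the lock becomes the bottleneck, which is one plausible reason a 47-thread game might still fail to load down many cores.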
 
I would have liked to see i5 performance included in some of the workstation tasks, like Photoshop, where they still perform quite well. Not all workstation tasks are heavily threaded, and a ~$200 i5 may very well have tied or beaten the ~$300+ Ryzen chips. Yes, in certain tasks Ryzen competes very well, but in just as many others, mainstream i5s/i7s are still competing with or beating AMD at the same or lower price points.

It's definitely an upgrade over FX, so that's good. In quite a few tasks it seems that AMD is falling back into the traditional lots-of-cores-but-falling-behind scenario, especially in many of the games. It will be interesting to see how the Ryzen 3/5 chips pan out. In many games the 8C/16T model was struggling to keep up with 4C/4T, but I'd agree with others: similar to the HEDT chips, moar cores/threads aren't for everyone or every task. Power consumption seemed pretty good for Ryzen.
 

InvalidError

Titan
Moderator

Many rumors say there is a dual-die Ryzen coming on a new socket with quad-channel memory for HEDT to enter Intel's single-socket LGA2011 space and beyond that, you have AMD's Naples Ryzen-based server chips which have up to 32 cores per socket, two sockets and eight memory channels per CPU.

The R7 was only the beginning.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
866
354
19,360


Thanks, I added the version number.
 