AMD Ryzen 7 1700X Review


SomeTechDude

Prominent
Mar 28, 2017
Seriously, Ryzen is a new platform and needs some time for the glitches to be worked out. It is every bit as capable as the Core i7-7700K in gaming. Just wait a while until the optimizations and tweaks start appearing.

Why do you need more than 60fps with a good G-Sync or FreeSync monitor anyway? It is just fine for gaming right now and will be better soon.
 

computerguy72

Distinguished
Sep 22, 2011
I bought the 7700K and I'm happy with it, but the testing methodology for Ryzen is a little confusing to me. If you invest in a nice CPU and motherboard ($400+), and probably new memory, etc., isn't it also likely you are upgrading to move from 1080p to something higher? I'm not sure who this review is for. People who upgrade to relatively high-end equipment but still game at 1080p? I'm not sure too many people will benefit from that analysis.
 

InvalidError

Titan
Moderator

This is a CPU review, not a GPU review. The point of choosing 1080p for CPU benchmarks is to see how quickly the CPU can run the game's logic when the GPU bottleneck is reduced as much as possible. In an ideal world, you would use a NULL driver to eliminate the GPU altogether, effectively emulating a GPU with infinite processing power.
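
To put some rough numbers on that, here's a toy model (my own illustrative figures, nothing from the review): each frame takes roughly whichever is longer of the CPU's game-logic time or the GPU's render time. Crank the resolution up and two different CPUs look identical; relieve the GPU and the CPU gap shows up.

```python
# Toy model (illustrative numbers, not measured data): per-frame time is
# roughly limited by whichever of the CPU or GPU takes longer on that frame.
def fps(cpu_ms, gpu_ms):
    """Approximate FPS when the CPU and GPU each work on frames in parallel."""
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_a_ms = 7.0   # hypothetical faster CPU: 7 ms of game logic per frame
cpu_b_ms = 10.0  # hypothetical slower CPU: 10 ms per frame

for label, gpu_ms in [("4K (GPU-bound)", 20.0), ("1080p (GPU relieved)", 5.0)]:
    print(f"{label}: CPU A = {fps(cpu_a_ms, gpu_ms):.0f} fps, "
          f"CPU B = {fps(cpu_b_ms, gpu_ms):.0f} fps")
```

Obviously a simplification (real frame pacing isn't a clean max()), but it's exactly why reviewers deliberately create a CPU-bound scenario at 1080p.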
 

Well, getting headaches can be problematic, this I can understand for sure. The thing is, NTSC is only 30fps and so are consoles. Do you get headaches from movies and consoles as well? PAL is even worse, it's only 25fps. I'm not trying to dismiss what you're saying, because I do believe you and I've read of others having similar problems with headaches or even vomiting. I'm trying to understand it because, thankfully, I don't get these symptoms.

Sorry if I'm being inquisitive, it's in my nature. Is it simulator sickness (I've heard of that) or is it something else? I fully agree that in your situation a higher frame rate would be a requirement, but I was talking in general, and I know that such reactions, while not overly rare, are still far from common. So my question is, do other forms of media affect you this way? Is it related to the delay between mouse movement and screen reaction (I've read that can do it too)? You don't have to answer if you don't want to, I only ask because I always try to get all the perspectives that I can.

I do stand by what I said about frame variance, however. If you want to nit-pick at the maybe 2% of cases where the CPU causes it, compared to the 98% of cases where the GPU causes it, I really don't know what to tell you. Developers code their games to avoid this because they want their games to run well. When a GPU causes it, it's almost always a problem with the drivers. Focusing on the CPU for frame variance is like a judge throwing the book at a shoplifter and letting the drug lord go free.
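
For anyone wondering what reviewers actually measure when they talk about frame variance, here's a minimal sketch (the frame-time numbers below are made up) of how a frame-time log gets turned into an average FPS and a 99th-percentile / "1% low" figure:

```python
# Minimal sketch: turning a frame-time log (ms per frame) into average FPS
# and a 99th-percentile / "1% low" style number, which is roughly how frame
# variance gets quantified in reviews. The sample data here is invented.
frame_times_ms = [16.7] * 95 + [33.4] * 5   # mostly smooth, a few stutters

frame_times_ms.sort()
avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
p99_ms = frame_times_ms[int(0.99 * len(frame_times_ms)) - 1]

print(f"average FPS: {avg_fps:.1f}")
print(f"99th percentile frame time: {p99_ms:.1f} ms "
      f"(~{1000.0 / p99_ms:.0f} fps '1% low')")
```

Point the same couple of lines at a real log (PresentMon, OCAT, FRAPS frame times, whatever) and you can see for yourself whether the stutter is rare or constant.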
 


Another "expert" with zero best answers and no expertise badges. It's funny how that works isn't it? LOL
 
Check this out: there's a YouTube channel called "Game Testing" that pits different CPUs against each other, and also different GPUs against each other, and shows them in split-screen in real time. On the left side of the screen you have one contender and on the right side you have the other, both running at the same time.

I think that these are really creatively done (and I didn't do them) and they really do show the difference (especially the lack thereof) in gaming between the Summit Ridge (R7-1700) and Kaby Lake (i5-7600K) architectures. Both CPUs are at stock clocks, the resolution is demonstrated to be 1080p and an nVidia GeForce GTX 1070 is used to ensure that there are no GPU bottlenecks. Enjoy!

We'll start with a video that pits the R7-1700 against the i5-7600K in seven modern titles. Those titles are Project Cars, Arma III Apex, Fallout 4, Rise of the Tomb Raider, Hitman 2016, Just Cause 3 and Far Cry Primal.
https://www.youtube.com/watch?v=RBbJtOPUVcU

Now to add more, like Grand Theft Auto V:
https://www.youtube.com/watch?v=UFffMS6m2vg

The Witcher 3:
https://www.youtube.com/watch?v=TXOHmZEQmAc

Crysis 3:
https://www.youtube.com/watch?v=piNUGg3pejA

Need For Speed 2015:
https://www.youtube.com/watch?v=y57Beci-vf0

Far Cry 4:
https://www.youtube.com/watch?v=jJ4Mi62bvkA

Watchdogs 2:
https://www.youtube.com/watch?v=rNq2hPMrmTw

And Battlefield 4:
https://www.youtube.com/watch?v=vgLNFpgFOl4&t=48s

So there are fourteen games, all played at 1080p with a GTX 1070, with one side of the screen using an R7-1700 and the other side using an i5-7600K. I hope this helps give a real and meaningful demonstration, far better than any benchmark bars ever could. I didn't make these videos, but they are the best thing I've ever seen for demonstrating the real gameplay experience between Summit Ridge and Kaby Lake. Their channel has a whole bunch of other videos, including ones saying that I don't even need to upgrade my old FX-8350 for gaming just yet!

And the best thing about it? Nobody can argue with what's clearly in front of their face, and they don't have to take my word for it! LOL
 

shknawe

Respectable
Oct 22, 2016


90% of gamers still game at 1080p or lower; that's why.
 


YouTube reviews are not trustworthy, IMO. We do not know what they are doing "behind the scenes" and we're only taking their word for it. I prefer to trust credible professional hardware review websites like Tom's that have been in the business for at least 15 years and built up a reputation.

That's the same reason I never trust anyone bragging about their rig's FPS and showing the FPS counter in the video during gameplay. Anyone can say anything: "Check out how my RX 470 is hitting 90fps at 1080p on ultra with 4xAA in Doom!"



That's been mentioned here at least a half dozen times and people still don't get it. The fact that CPU performance in games matters less at higher resolutions and quality settings is nothing new in hardware testing. You can only tell someone something so many times; if they don't want to accept it, you can't force it on them.
 

falchard

Distinguished
Jun 13, 2008
I don't see much to gain from watching side-by-side YouTube videos of CPUs being benchmarked. At best you will see variances in how the GPU handles color, and that's about it. Image quality can be the result of compression. The videos themselves cap out at 60Hz, so you won't notice the difference in these benchmark suites where the minimum frame rate of nearly all the CPUs is above 60fps. You might even get microstutter from the compression itself.
 

spooh_

Prominent
Mar 14, 2017
I don't get something...
How does it happen that, in single-threaded workloads, an 1800X fixed at 3.8GHz beats a stock 1800X XFR-boosting to 4.0-4.1GHz?
 

orifiel

Distinguished
Nov 12, 2010
Today the AMD Facebook page promoted a video by a guy on YouTube; he used higher-latency RAM at 3600 and the results are better than the i7-7700K in gaming! Maybe we can have some more tests and reviews in the near future. I am not sure if I can post the video AMD promoted, but the user who ran these tests is called... MindBlank Tech

He used a G.Skill CL16 kit, I think, and that makes it even more interesting. We have to invest in specific RAM to make the CPU perform.
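
Worth noting that the CAS number alone doesn't tell the whole story, because it's counted in memory clock cycles. A rough back-of-the-envelope (the kits below are just example figures, not necessarily the ones in that video):

```python
# Rough first-word CAS latency in nanoseconds: the CAS number is counted in
# memory clock cycles, and DDR transfers twice per clock, so
#   latency_ns = CL * 2000 / data_rate_MTs
# These kits are example figures only, not the ones from the video.
kits = {"DDR4-2400 CL15": (15, 2400),
        "DDR4-3200 CL14": (14, 3200),
        "DDR4-3600 CL16": (16, 3600)}

for name, (cl, rate) in kits.items():
    print(f"{name}: {cl * 2000 / rate:.2f} ns")
```

So a 3600 CL16 kit actually has lower absolute latency than a slower CL15 kit on top of the extra bandwidth, and on Ryzen the memory clock also drives the Infinity Fabric between the CCXs, which is the usual explanation for why fast RAM helps it so much.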
 

Robert_390

Prominent
Mar 29, 2017
The interesting thing to me is how well the 8350 does in gaming, relatively speaking, given that it's ancient tech and you can pick one up for $150. 25% to 50% better performance at 200% of the price doesn't strike me as an awesome deal unless you absolutely need to have 50fps at full resolution and detail.
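
Just to put that in performance-per-dollar terms, using the rough figures from the post above (not measured numbers):

```python
# Quick performance-per-dollar check using the figures quoted above
# (illustrative only): FX-8350 at $150 as the baseline, versus a chip at
# twice the price that is 25% or 50% faster.
baseline_price, baseline_perf = 150.0, 1.00

for uplift in (0.25, 0.50):
    new_price, new_perf = 2 * baseline_price, baseline_perf * (1 + uplift)
    ratio = (new_perf / new_price) / (baseline_perf / baseline_price)
    print(f"+{uplift:.0%} performance at 2x the price -> "
          f"{ratio:.2f}x the performance per dollar")
```

Either way you slice it, the pricier chip delivers less performance per dollar; whether that matters depends on whether you actually need the extra frames.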
 

InvalidError

Titan
Moderator

Once the R5-1400 launches, it'll be 25-50% more performance for about the same price, since the R7's extra cores are under little to no load in most games anyway.

Most people don't buy massively multi-core CPUs for gaming. They buy them for productivity first and gaming is either a bonus or non-applicable.
 

lsatenstein

Distinguished
Mar 8, 2012
How can Intel fight back?

Intel can introduce some new instruction(s) that offer a "patented" function not available with Ryzen. True enough, AMD could do that too, but with AMD doing it, it would be similar to the tail wagging the dog.

I am interested to know the differences in instruction sets between the two vendors' chips. Do both have AES instructions, for example?
 
An i3 and an i5 are both better value CPUs than the 8350, as is a Sandy Bridge i7.

 
I'm pretty sure it's frame variance or simulator sickness. I get nausea, or at least headaches from eye strain, with frequent dips into the 20fps range, despite the fact that movies at 24fps with visible frames (during scenery pans) don't make me sick at all.

And for the most part, I barely get motion sickness from things like teacup rides or mountain roads.
 

falchard

Distinguished
Jun 13, 2008
I think Disney found that the human eye can detect fps differences up to 56 fps. Newer research found the human eye can see above 200 fps if it is focusing on something. I think just a dip in fps can cause eye strain, even if it's from 100 fps to 60 fps. I feel a constant FPS is much easier on a person's eyes.
The 24 fps thing is actually more of a minimum to achieve fluid motion in animation. It has nothing to do with the limits of the human eye or whether it can cause nausea.
 

InvalidError

Titan
Moderator

24fps is a compromise between the minimum necessary for reasonably fluid motion and the amount of film needed to shoot with. On today's huge screens though, even 60fps can still cause very noticeable trailing in fast-moving scenes.
 


What's funny though is that we've gotten so used to 24fps that when most people watch stuff at 60fps, they don't like it. It doesn't have that "movie" feeling.
 

inmotion

Distinguished
Jul 11, 2011
It would be good to see a review that was not based on gaming; some of us actually work for a living and require a machine that can handle the load.
 


Yes, both have AES instructions, but Intel has a more powerful variant of it.
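
If anyone wants to check for themselves, on Linux you can read the CPU feature flags straight from the kernel; here's a Linux-specific sketch (the flag names are whatever /proc/cpuinfo reports: aes for AES-NI, sha_ni for the SHA extensions, and so on):

```python
# Linux-specific sketch: read the CPU feature flags from /proc/cpuinfo and
# check for a few crypto/SIMD extensions. Won't work on Windows/macOS.
def cpu_flags(path="/proc/cpuinfo"):
    with open(path) as f:
        for line in f:
            if line.startswith("flags"):
                return set(line.split(":", 1)[1].split())
    return set()

flags = cpu_flags()
for feature in ("aes", "sha_ni", "avx2", "avx512f"):
    print(f"{feature}: {'yes' if feature in flags else 'no'}")
```

Both Ryzen and Kaby Lake report aes; Ryzen also reports sha_ni, which Kaby Lake lacks, while neither desktop chip reports avx512f.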
 

ah

Reputable
Oct 29, 2014
People complain that the 6800K is not as good as the 7700K, but my 6800K clocked to 4GHz with a G1 1080 at 1440p runs Ghost Recon Wildlands at an average of 74 fps on the very high preset, while most reviews have managed only an average of 67 or 68 fps at that preset. Also, my 6800K is always 20C cooler than the G1 1080: 47C vs 67C after 2 or 3 hours of gaming.
 