Ryzen Versus Core i7 In 11 Popular Games


Talking about this problem.
Someone found this (check the two videos):
https://forums.anandtech.com/threads/ryzen-strictly-technical.2500572/page-21#post-38789965
I think that this is an architectural issue due to Ryzen's MCM nature (2x CCX).
That's why AMD said that there is nothing to be fixed.
I might be wrong though...
edit: Does Paul Alcorn have time to check this?
edit2: My bad, PCPer already checked it from what I read on their article.
 
This is the first time I've heard someone recommend disabling the BIOS setting for the High Precision Event Timer (HPET).

What would happen if you also disabled that setting on the Intel motherboards?

Would switching this setting also help overclocking?

Can you elaborate a little more on what exactly this BIOS setting does at the CPU-motherboard level?

Thanks a lot in advance.
Roberto.
 
Just find it hilarious how butthurt fanboys are so quick to jump to conspiracy theories. This review has been on par with what I've read everywhere else. More cores don't always equal more FPS; it's just not that simple.
 


Then you proceed to show benchmarks in which a 4 core, 4 thread 7600k outperforms an 8 core, 16 thread 6900k and the similarly equipped Ryzen processors. What was the point of adding that line when it's clearly not very accurate, at least beyond 4 cores? Going by these results, the "sophisticated engine" of Project Cars clearly does not scale well with additional CPU cores. Even if we account for the higher clock rates (which should benefit any CPU-limited game, no sophistication required), it's clear that the game isn't making much use of more than 4 threads. Perhaps it scales well up to 4 threads, but that's not particularly useful when we're testing 8 core, 16 thread processors, and it shouldn't be implied that it scales well on these chips.




One thing to keep in mind though, is that most people tend to keep the same CPU for several years or more. It's likely that games will start to make use of more threads in the coming years, and it's very possible that these 8 core CPUs could start to outperform their higher-clocked, 4 core counterparts in demanding games before long. And of course, none of the games benchmarked here were tested on Ryzen hardware during their development, so they haven't been optimized for the new architecture, and it's likely that we'll see optimizations to better make use of Ryzen processors in upcoming games. Added together, there's a decent chance that a 6 or 8 core Ryzen CPU could have better staying power in the long run than a similarly-priced 4 core Kaby Lake. Nearly all existing games run well enough on any of these processors to perform identically at the refresh rates that most people game at, and by the time new games start to push CPUs notably harder, those games are likely to be better optimized for the Ryzen architecture.

About the only way I'd say Kaby Lake clearly comes out ahead, is if someone is using a 144Hz screen, and has a high-end graphics card with enough performance to push those kinds of frame rates in these demanding games. Of course, for most competitive first-person games, where high frame rates on a 144Hz screen might have the most impact, all these chips would once again perform similarly, since games like Overwatch or CS:GO are designed to keep CPU usage low to maintain high frame rates even on lower-end hardware.
 

You are WAY overstating the importance of 4K gaming. According to the latest Steam hardware survey, only 0.69% of gamers use 4K resolution. You have it backwards :)
1080p is still overwhelmingly dominant.

 
I know it probably sounds stupid to some of you guys, but I want to see Factorio and Minecraft as games for CPU benchmarks. Modded Minecraft with a high view distance is completely crippled by CPU performance, and Factorio mega-factories bring the game down to 1/3 to 1/5 of the normal physics rate.
 



Wait are you really just writing off my question?

Do I need to post links? lol. TechPowerUp and TechSpot aren't some fanboy blogs, buddy. Stop projecting your bias on me...
 
Interesting, so the 6900K Broadwell-E arch is now powerful enough to beat the 7700K Kaby Lake arch at 1080p. Even though it didn't seem that way in any other reviews.

C'mon tom$hardware, we need serious reviews. NOT PAID.
 
Interesting, so the 6900K Broadwell-E arch is now powerful enough to beat the 7700K Kaby Lake arch at 1080p. :B
 
Wait, are you saying AMD paid Tom's Hardware to make the 6900K perform worse than the 7700K in games?

Damn, where's MY green green?

 


This is a good point, actually. Steam statistics, right now:

Factorio: 4,488 players online

Ashes of the Singularity: 73 players online

11 Popular games... indeed. : P
 
Out of interest, what do you consider "great" framerates and what games is your rig struggling with? Do you have a variable refresh rate monitor?

 
I read that in the Commando's voice from Command and Conquer... :-D


I'm guessing that you're referring to my question about the Intel C++ Compiler, what they were found guilty of, and whether they're doing it again. I highly, highly doubt that they are, but one occasionally needs to check things that appear to be obvious and logical. Intel's (and any for-profit company's) primary purpose, after all, is to create a return for their investors, and they are staffed by humans, and humans can occasionally make bad decisions.

And yes, I agree with you: more cores does not always result in better performance.

 
They are only teething issues; all new tech goes through this, and even Intel has its issues. I may own Intels at the moment for reliability, but I'm waiting 6 months for the price and the issues to be worked through before I buy. Think about when Cannon Lake comes out: Intel is going to have similar issues to what the Ryzens have now, and then we will see who comes out swinging. Give AMD a chance; they're only learning with a new architecture. I predominantly use Intel, but only up to Haswell, as I see no point wasting any $ on next-gen Intel when I gain nothing much more for the setups I'm running for teenager gaming and streaming requirements. I am saving for a Ryzen; just wait 6 months, then watch Intel squeal.
 
I have an i7-6700K, and in my opinion Ryzen is a very good CPU. It's better in most applications than the 6900K, at half the price. For gaming, I have a question: who on earth can see the difference between 80 fps and 90 fps, or 100 fps and 110 fps, in a game when most of us have a 1080p monitor with a 60 Hz refresh rate?
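The point about 60 Hz monitors can be put in numbers: a 60 Hz panel shows a new frame only every ~16.7 ms, so anything above 60 fps is capped by the display. A minimal sketch of the arithmetic:

```python
# Frame time (ms) for a given frame rate, versus what a 60 Hz panel can show.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

REFRESH_MS = frame_time_ms(60)  # ~16.67 ms between display refreshes

for fps in (60, 80, 90, 100, 110):
    ft = frame_time_ms(fps)
    note = "capped at 60 fps by the display" if ft < REFRESH_MS else "fully visible"
    print(f"{fps:>3} fps -> {ft:5.2f} ms/frame ({note})")
```

At 80 fps the game renders a frame every 12.5 ms, but the panel still only refreshes every 16.7 ms, so without a variable-refresh monitor the extra frames are never seen.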
 

This.

I'm running a 6700 (non-K) and I'm VERY happy with it. It's a massive performance jump over my old faithful E6750. Actually, it's more performance than I need right now, outside of Battlefield 1, which uses 8 threads. (Even the VMs that I used ran fine on my E6750!)


  • Am I going to rush out and replace my 6700 with a 1700? Nope.
  • Would I mind having a 1700? Yeah, it would be nice, but I wouldn't notice a difference outside of using Handbrake.
  • Will I recommend a Ryzen to someone if it's 1) in their budget range, 2) they don't want to upgrade their PC for a few years, and 3) they aren't ONLY gaming? Ab. So. Lute. Ly.
 
I'd be more interested in performance analysis in Adobe applications. That's where my aging Ivy Bridge struggles most; improved times when working with RAW files would make me buy new hardware. I was hoping for AMD to be competitive in this area, and I was disappointed. I have my gaming needs already covered by the 8-core AMD in my XBOX ONE, no complaints.
 
Can't really see why this CPU is so hyped.

Yeah, it does reach Intel in a few benchmarks, but in most regards an i5 outperforms it pretty easily.

And let's face it: I highly doubt any serious business will choose AMD, with its poor reputation for stability, for any mission-critical server systems. I surely wouldn't, and the average home user really doesn't need more than 4 cores.
There are very few applications that seriously take advantage of more than the 4 (8) cores of a standard i7.

IMO, it's going to take a whole lot more than Ryzen to put AMD back in the game after being left completely outpaced by Intel for most of the last 20 years.

The big problem has always been that even when you could find some AMD models that really shined and made Intel shiver a little, the motherboards and chipsets were always their big weakness.
I have yet to see an AMD motherboard that could actually run 24/7.
Even for a home user, there have been so many examples of crappy and buggy AMD chipsets that never got to run stably, even after several firmware updates.
 


They do not mention it because this rumor was denied by both MSFT and AMD. W10 is using Ryzen correctly, according to AMD. It's not the OS; the applications are not utilizing the CPU efficiently.
 
@PaulyAlcorn I find the Civilization VI results very interesting. In the AI Benchmark tests your results show the Core i5 7600K having the fastest average turn time, which I wouldn’t have expected. I was under the impression that the CPU bound title would benefit from additional cores, as I’d read in another article attempting to measure CPU performance for CIV VI. I assume the 2 second difference between the i5-7600K and the i7-6900K is outside the margin of error? Do you have a theory to why the beefier processor lags in average turn time, but then blows the i5 out of the water in the frames per second results in the AI test? I don’t really understand why the i7-6900K would have slower turn time processing, yet still provide a higher frame rate. I’d appreciate your expounded thoughts on these results.

Also, somewhat separate from the results discussion, do you have an opinion on which processor provides the better experience in game? I wonder if the benchmarks provide one result where your in game experience may be different.

Thanks for your input!

Cheers,

Moose
 

Check out Puget Labs's results for Adobe.
Photoshop: https://www.pugetsystems.com/labs/articles/Adobe-Photoshop-CC-2017-AMD-Ryzen-7-1700X-1800X-Performance-907/#Conclusion
Lightroom: https://www.pugetsystems.com/labs/articles/Adobe-Lightroom-CC-2015-8-AMD-Ryzen-7-1700X-1800X-Performance-910/#Conclusion
My conclusion based on Puget's data: Get a 7700K as it seems to offer the best balance of performance in both applications, as well as budget.

Tom's results: http://www.tomshardware.com/reviews/amd-ryzen-7-1800x-cpu,4951-8.html
My conclusion based on Tom's data: Get a Ryzen if you mainly use After Effects, otherwise get a 7700K.

However, I would wait for a little bit to see if Adobe is releasing a patch containing Ryzen-specific enhancements (because hey, who actually cared to optimise their product for anything other than Intel before now?)
 
Sounds like the thread scheduler may need to treat these quad-core groups similar to separate CPU sockets. Context switching is expensive, thread migration is even more expensive, and transferring data between caches does not help.
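The "treat CCXs like separate sockets" idea can be sketched as a placement heuristic: prefer the thread's last core, then any idle core on the same CCX (which shares an L3 slice), and only migrate across CCXs as a last resort. A minimal sketch, assuming a Ryzen 7-like topology of 2 CCXs with 4 cores each (the core numbering is an illustration, not AMD's actual topology tables):

```python
CORES_PER_CCX = 4  # assumption: 8-core Ryzen = 2 CCXs of 4 cores

def ccx_of(core: int) -> int:
    """CCX index for a core, given contiguous core numbering per CCX."""
    return core // CORES_PER_CCX

def pick_core(last_core: int, idle_cores: set):
    """Prefer the thread's last core (warm caches), then an idle core on
    the same CCX (shared L3, cheap), and only then migrate across CCXs
    (expensive: caches must be refilled over the fabric)."""
    if last_core in idle_cores:
        return last_core
    same_ccx = [c for c in idle_cores if ccx_of(c) == ccx_of(last_core)]
    if same_ccx:
        return min(same_ccx)
    return min(idle_cores) if idle_cores else None
```

For example, a thread last run on core 1 with cores 2 and 5 idle would be placed on core 2 (same CCX) rather than core 5, and would only cross to the other CCX when its home CCX is fully busy.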

I've always questioned AMD's use of high level exclusive caches. Inclusive caches reduce the latency among cores by keeping the high level cache primed. This does reduce efficiency of cache usage and incurs a constant overhead cost, but the reduced latency is a huge win for the kind of multi-threading that is needed for interactive programs.

If you want heavy number crunching with large datasets, like what GPUs are already good for, then exclusive large caches are awesome.
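The capacity trade-off between the two policies can be made concrete: with an inclusive L3, every L2 line is duplicated in L3, so the unique capacity is just the L3; with an exclusive L3, the L2 and L3 hold disjoint lines, so the capacities add. A sketch with illustrative sizes (the 512 KB L2 per core and 8 MB L3 per CCX figures are assumptions for illustration):

```python
# Unique cacheable data per CCX under inclusive vs exclusive L3 policies.
L2_KB_PER_CORE = 512        # assumption: Zen-like 512 KB L2 per core
L3_KB_PER_CCX = 8 * 1024    # assumption: 8 MB L3 shared by 4 cores
CORES = 4

def unique_capacity_kb(policy: str) -> int:
    total_l2 = CORES * L2_KB_PER_CORE
    if policy == "inclusive":
        # L3 holds a copy of every L2 line; the L2s add no unique capacity.
        return L3_KB_PER_CCX
    if policy == "exclusive":
        # L2 and L3 contents are disjoint; capacities add.
        return L3_KB_PER_CCX + total_l2
    raise ValueError(policy)

print(unique_capacity_kb("inclusive"))  # 8192
print(unique_capacity_kb("exclusive"))  # 10240
```

The exclusive policy buys an extra 2 MB of unique capacity per CCX in this sketch, which is exactly the win for large working sets, at the cost of the extra latency and traffic when a line cached by one core is needed by another.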
 