AMD Ryzen 2 vs. Intel Coffee Lake: What's the Best CPU Platform?

A great time to be a computer consumer and enthusiast, for sure... if not for what amounts to asinine RAM prices as well as inflated GPU prices due to miners or what have you. :/

I'm holding out hope that things even out at some point, seeing as I'm interested in building a new rig, despite my FX build from 2012 still working just fine for my needs.
 

theyeti87

Honorable


Doesn't Intel also have to cross-license with AMD for the x64 architecture, which AMD (AFAIK) patented?

Ryzen 1's 40% IPC gain is what made me go from a gen 1 i7 to AMD for the first time, ever. Loving the difference between my 65W R7 1700 and the old 150W i7-950.
 

magnus909

Distinguished
Aug 3, 2011
9
1
18,520
One aspect is often forgotten in benchmarks: DAW performance. That is, performance for music production with lots of virtual instruments and digital effects in a sequencer/audio recording program like Cubase, Ableton Live, Studio One, Reason, etc.
When it comes to this, Intel still rules.
Even the old Intel 7700K with its 4 cores/8 threads got better benchmarks on another site at low latencies than the best first-gen Ryzen.
I can only imagine how much faster the 8700K with its 6 cores/12 threads is now compared to the fastest second-gen Ryzen (which is still only about 10% faster than gen 1).
 

Non-Euclidean

Distinguished
Nov 5, 2009
463
0
18,810


You make me wanna dig through the closet.

I was a big Diamond fan. So I have a Diamond Stealth, a Diamond Fire and I need to go see what else. There is an ASUS 6830 or so on the W7 box I never use...

 


As far as I know, sound mixing is very dependent upon latency; 1st-gen Ryzen had very high latencies, which were much reduced in the second gen. On the other hand, the 8700K uses a different, somewhat slower memory bus configuration than the 7700K - so you might actually see a much smaller delta between Ryzen 2xxx and i7 8xxx.
 

magnus909

Distinguished
Aug 3, 2011
9
1
18,520


It's also dependent upon raw CPU frequency per core on some occasions. But for what it's worth, there are also differences in how fast the SSE/AVX/AVX2 instruction sets are executed, and that matters for DAWs. AMD's implementation was lacking in that department, and I'm not sure if it is much better now.
Another big thing is how the Ryzen cores are split into two groups, which have to go over a slower path (through memory) for some operations. On Intel, everything is communicated directly between the cores.

Edit: I just took a look at https://techreport.com/review/33531/amd-ryzen-7-2700x-and-ryzen-5-2600x-cpus-reviewed/7
For me, the most important scores there are for VI (virtual instruments). There the 8700K wins comfortably at 48 kHz (most people record at 44.1, like I do, but that is close enough).
The Ryzen 2s are clearly better than the first gens were, though.
But choosing the 8700K over the Ryzen 2700X is a no-brainer for DAWs, especially if you consider that the 8700K is much, much more overclockable, which will kill the Ryzen in these types of benchmarks.
Maybe gen 3 of Ryzen will be close enough....
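
To put the 44.1 vs. 48 kHz point in context: for a given ASIO/Core Audio buffer, the latency you feel is just the buffer size divided by the sample rate, so the two rates are nearly interchangeable. A rough sketch (the buffer sizes below are only illustrative examples):

# Rough sketch: one-way audio buffer latency in milliseconds.
# The buffer sizes below are just illustrative examples.
def buffer_latency_ms(buffer_samples, sample_rate_hz):
    return 1000.0 * buffer_samples / sample_rate_hz

for rate in (44100, 48000):
    for buf in (64, 128, 256):
        print(f"{buf} samples @ {rate} Hz -> {buffer_latency_ms(buf, rate):.2f} ms")

The smaller the buffer you can run without dropouts, the harder the CPU is hit, which is why those VI-at-low-buffer tests separate the chips so clearly.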

 


They patched what they brought to light initially, yes; however, as time has gone on, new attack vectors have surfaced.
 

bit_user

Titan
Ambassador

Click the "Chart" link, at the very top of the page. You'll find CPU charts which compare CPUs across multiple generations. Hasn't been since 2015 (allegedly, although that link seems broken?).

We need to lobby for these to get updated!
 

Specter0420

Distinguished
Apr 8, 2010
114
35
18,710
I think it is funny watching all the AMD fanboys complain that Intel winning in gaming doesn't count because most people don't game at FPS that high. It still counts as a win; it still provides more performance. Yeah, you can't tell the difference between 110 FPS and 120 FPS on your monitor right now. But what about when you finally get a VR headset and realize your CPU requirements to feed the GPU just doubled, and anything at or below 89 FPS really equals 45 FPS or worse? How about in two years, when new games bring your FPS down and you're looking at upgrading your rig again?

I've been happily gaming on a 10-year-old i7-920 with a decent overclock paired with a 1060 6GB. It destroys anything I throw at it except DCS World 2.5 in VR; she is starting to struggle with that one, but DCS recommends a 1070, so it probably isn't even my CPU holding it back. Regardless, Ryzen really struggles in VR, and it struggles extra in DCS World VR. There are people that need more performance than you, and the Intel chip has more gaming performance, so it gets the win.
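
For anyone wondering about the "89 FPS really equals 45" bit: most VR runtimes, when the system can't sustain the headset's refresh rate, fall back to rendering at an integer fraction of it and synthesizing the rest (reprojection). A simplified sketch of that behaviour, assuming a 90 Hz headset (exact fallback rules vary by runtime):

# Sketch: effective VR frame rate with simple reprojection/frame-halving.
# Assumes a 90 Hz headset that falls back to refresh/2, refresh/3, ... when
# the renderer can't keep up; real runtimes differ in detail.
def effective_vr_fps(rendered_fps, refresh_hz=90):
    divisor = 1
    while refresh_hz / divisor > rendered_fps:
        divisor += 1
    return refresh_hz / divisor

print(effective_vr_fps(120))  # 90 - capped at the refresh rate
print(effective_vr_fps(89))   # 45 - just missing 90 halves you
print(effective_vr_fps(40))   # 30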
 

stdragon

Admirable


The 35% hit was primarily in IOPS only. That's because a user application that has to traverse calls through to the kernel forces a CPU cache purge at each transition. Perfect examples would be databases, file transfers to storage, or effectively anything where a user application needs to commit changes to disk via the file system. To see this, try running a disk benchmark that measures IOPS both before and after the Spectre BIOS update fix.

Playing games, however, is not much of an issue, because it's all happening in user-land, with the exception of having to read from and write back to disk.

The really nasty one in terms of performance hit across the board will be the new Spectre variant 4, called Speculative Store Bypass. That can have up to an 8% hit on integer performance; the hit to floating-point and vector (SSE) performance is unknown. FYI, most CPU operations are integer-based. It's so bad that the security fix will be disabled by default, and Intel states the user must decide between performance and security, but not both. OUCH!
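
If you want to try that before/after comparison yourself and don't have a benchmark handy, any syscall-heavy loop will show the effect, since every read is a user-to-kernel transition. A minimal, purely illustrative sketch (the file name and sizes are made up; a real tool like fio or CrystalDiskMark will give more reliable numbers):

# Minimal sketch of a syscall-heavy 4 KiB random-read loop, for comparing
# before/after the Meltdown/Spectre patches. Mostly hits the OS page cache,
# which is fine: the patch overhead shows up in the kernel transitions.
import random, time

PATH = "testfile.bin"          # hypothetical test file; create it first, >= SIZE bytes
SIZE = 256 * 1024 * 1024       # 256 MiB
BLOCK = 4096
SECONDS = 10

ops = 0
with open(PATH, "rb", buffering=0) as f:   # unbuffered: every read() is a syscall
    end = time.time() + SECONDS
    while time.time() < end:
        f.seek(random.randrange(0, SIZE // BLOCK) * BLOCK)
        f.read(BLOCK)                      # one user->kernel transition per read
        ops += 1

print(f"~{ops / SECONDS:.0f} reads/s (4 KiB random reads)")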
 

Indeed - however, if you're coming from a 7700K, the Ryzen 2xxx do provide a noticeable improvement for much cheaper than the 8700K (as in, more than a hundred bucks when you consider the motherboard price premium and the cooler you have to buy for the Intel system). They're nice runners-up instead of being out of the race.
 

zetzabre

Distinguished
Jan 27, 2011
54
0
18,630
Nice to know all the details. I feel pretty comfortable with my 8700K, but I know there are others who will find a Ryzen a better solution.
 
"Ryzen 2 wins this competition by a nose, winning five rounds to four."

So much for the AMD fanboys claiming that Tom's Hardware (and Paul Alcorn specifically) is biased against AMD and pro-Intel (yeah, you know who you are). Their silence here, with none of the complaining they normally do when an AMD review mentions its cons, is deafening. That's right, fanboys: there are pros and cons to BOTH chip brands. Hiding any negatives about an AMD chip just because you favor AMD is not being truthful in a review. If you want a pro-AMD echo chamber, go read WCCFTECH, aka the US National Enquirer/UK Daily Mail of tech websites. Nice, factual, balanced review and comparison, Tom's!

And I fully agree with the conclusion: I'm going Ryzen 2 for my next gaming build later this summer, which will also be used as a Vegas Studio video editing machine for 4K. Besides, at 4K resolution there is no difference between the two chip brands in gaming when two directly competitive chips are compared (say, Ryzen 2700X vs. i7-8700K). Now if we can just persuade AMD to invest in developing a competitor to Nvidia's high-end GTX 1180 Ti, which (based on the history of new-generation Nvidia GPU releases) should arrive early next year.
 

Karadjgne

Titan
Ambassador
Heh. With what Vega can do with half of an APU in graphics, why something like a full-sized "Vega 96" couldn't be at least comparable to a 1080 Ti or better is beyond my understanding.

It's like AMD has a recipe for awesome pizza, but an extra-large is only 12" around.
 

bit_user

Titan
Ambassador

Have you seen the power consumption figures for Vega 64?

The first problem with making it bigger is cost. The die is already huge and expensive. AMD probably had a razor-thin profit at the stated launch price.

The second is that you would have to clock it lower, to keep power dissipation from being insane. Then, to compensate, it would have to be still larger, which runs against the point above.

The third issue is that HBM2 is really expensive (signs indicate they mis-predicted this, and it might be the limiting factor in supply), and a larger Vega would probably need at least 3 channels.

Finally, I suspect that GCN has architectural limitations around > 4096 shaders, which is why they spent a lot of their silicon budget on higher clocks.
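
To put a rough number on the "clock it lower" trade-off: dynamic power scales roughly with voltage squared times frequency, and voltage has to rise with clocks, so a wider, slower chip can hit the same throughput for less power. A back-of-the-envelope sketch with entirely made-up figures:

# Back-of-the-envelope: dynamic power ~ C * V^2 * f.
# The unit counts, clocks and voltages below are made up purely to illustrate
# why a wider, lower-clocked GPU can match throughput at lower power.
def relative_power(units, clock_ghz, volts):
    return units * volts**2 * clock_ghz   # per-unit capacitance folded into 'units'

small_fast = relative_power(units=4096, clock_ghz=1.60, volts=1.10)
big_slow   = relative_power(units=6144, clock_ghz=1.07, volts=0.95)

# Both configs deliver roughly the same units * clock throughput...
print(4096 * 1.60, 6144 * 1.07)
# ...but the wider chip burns roughly 25% less power in this toy model.
print(big_slow / small_fast)

Which, of course, is exactly where the die-size and HBM2 cost problems above bite you.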


I think the diameter of Vega 64 isn't the problem - it's the thickness. It looks really good when you open the box, but you pick up the slices and they're really thin, with sparse toppings. Either way, I think we agree that more was needed.

IMO, GCN is running out of steam.
 

Olle P

Distinguished
Apr 7, 2010
720
61
19,090
If you can't tell the difference, it's not a win.
I have no idea how VR influences the CPU load. It seems likely that the CPU usage will be altered as well, which means that if the Intel CPU was marginally faster in the regular game, it might run worse in VR...
Only tests can tell - if VR ever becomes relevant to a significant portion of users.

So what makes you think today's CPUs will become "bad" in just two years? It's far more likely that a) the GPU will still be the primary bottleneck, and b) future games will be better at utilizing more CPU threads.

And what about all those who like to stream video while gaming? The video stream from a Coffee Lake i5/i7 leaves a lot to be desired, with stuttering and frame drops, while Ryzen 5/7 fare a lot better.
 

madmat9

Distinguished
Feb 13, 2009
10
0
18,510
I see no reason to buy a new CPU until a newer generation is released that is not vulnerable to Spectre/Meltdown.
 

bit_user

Titan
Ambassador

We seem to be in an era where there's a continuous stream of vulnerabilities getting announced, so any new chip will be vulnerable to something and require a performance-robbing fix of some sort.
 

Gadhar

Reputable
Sep 26, 2016
189
6
4,715
I like the fact that, for the first time in a long time, a person can choose either AMD or Intel and have almost the same experience. Yes, Intel has a slight edge in most games and AMD has a slight edge in most productivity, but instead of trying to prove which one is better, we should simply sit back, buy what we prefer, and be happy that either one is a good choice. Trying to argue that one is better than the other is like trying to convince a person with different political views that your political views are the best.
 

urbanman2004

Distinguished
Aug 17, 2012
209
9
18,695
I'm just ready for Intel to release their supposed secret 8-core Coffee Lake processor, based on the 14nm process, this summer, just to give myself a reason to upgrade.
 

bentremblay

Distinguished
Jan 2, 2012
138
0
18,680
The image at the very top gives the wrong impression that it's Ryzen vs. i7.
Because I'm a p**** I persisted, to find what I wanted: a comparison with the i5-8400 I just bought.

^5
 


Personally, I don't think so. The fact is that the more parallel an architecture is, the more of a problem keeping it fed becomes.
Currently, people cater to driver-side optimization because that's what the market leader (i.e. Nvidia) recommends and invests massively in. AMD has, since GCN 1.1, recommended letting the hardware allocate its own resources with fine-grained prioritization - i.e. Mantle, DX12, Vulkan, async compute and other nifty little elements. Clock for clock, Vega gives Pascal a run for its money in pure processing power.
The thermal envelope on AMD's current chips being sh*t is true, preventing them from clocking higher - but GCN as an architecture is far from a failure, and were Vega cards clocked at speeds similar to the 1080's, AMD would perform much better.
It would indeed require some reworking to run at higher frequencies with lower power consumption, but I don't think GCN has any limitation (other than practical chip size) beyond 4096 units, apart from diminishing returns under a monolithic/single-threaded API.
 