AMD Ryzen Threadripper 1950X Review

Status
Not open for further replies.

iPanda

Reputable
Mar 25, 2015
So, I'm going to build a Threadripper rig... but I also want to revamp a newer Intel build by throwing a 7700K into it. But about the upgrade path to the 8700K: will it be a new socket or a drop-in?

That's what I like about this AMD setup so far, since they've stated the socket will be stable for a while. But what about Intel? Any confirmations or leaks on this yet?
 

spdragoo

Expert
Ambassador


It's a new socket, LGA 1151 v2. Same number of pins, but not pin-compatible. So Coffee Lake CPUs won't work in the 100/200-series boards, & Skylake/Kaby Lake won't work in the 300-series boards.

Basically, yeah, if you're looking for a primarily gaming platform, you're not going to consider Threadripper...but you won't be considering Skylake-X, either. Which means you're back to the Kaby Lake vs. Ryzen debate...& while Intel might be able to claim absolute #1 performance in that category, they only hold that title if the consumer doesn't care how expensive their system is. Most people have to consider their budget when buying a gaming PC -- either because they have other financial considerations (house payments, car payments, food, etc.) or because they are dependent on someone else (usually their parents) for the income/cash to buy the system.

As for the actual performance, I saw 4 games (BF1, Shadows of Mordor, Rise of the Tomb Raider & Witcher 3) where the spread between the fastest & slowest results was very, very small -- only 15 FPS on Mordor, under 10 FPS on the other 3, all of them under a 10% spread. That's not only pretty far from a "commanding lead" or "definitive win", it's bordering on "too close to call". Also, in all 4 of those games, none of the chips saw significant improvements from overclocking (I think the best was a 10% gain for Threadripper on Far Cry). And that's considering that they were able to overclock the i7-7700K by a whopping 36%.
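As a rough illustration of the "spread" arithmetic above, here is a minimal sketch (the FPS figures are made-up placeholders, not the review's data) of how a fastest-vs-slowest gap translates into a percentage:

```python
def spread_pct(fastest_fps, slowest_fps):
    """Spread between fastest and slowest chip, as a % of the slowest result."""
    return (fastest_fps - slowest_fps) / slowest_fps * 100

# A 15 FPS gap on a ~155 FPS baseline stays under a 10% spread:
print(round(spread_pct(170, 155), 1))  # 9.7
```

The point being: the absolute FPS gap only matters relative to the baseline frame rate it sits on.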

But the whole "Intel's such-and-such chip has a higher clock speed than AMD's chip, so it must be better overall" line smells way too much like how AMD fanboys crowed about Bulldozer's clock speeds compared to Intel's chips...& how they then got burned when initial performance & IPC were way down. And that's still a problem this time around, only now the problem is on Intel's side. Sure, that i7-7700K was able to overclock to almost 5GHz, & it managed an OC speed 22.5% faster than the R7 1800X... but unless it was able to significantly outperform the R7 chip (i.e. provide more than 22.5% faster performance compared to the R7 1800X), then the Intel chip provides less performance per clock cycle...which, last time I checked, is part of the definition of IPC.

In short, it's almost like we're getting to a situation where Intel has to compensate for poor IPC by raising their clock speeds...
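The IPC argument above can be made concrete with a small sketch: relative IPC is just the performance ratio divided by the clock ratio. The numbers below are illustrative placeholders, loosely echoing the post's 22.5% clock gap, not measured results:

```python
def relative_ipc(perf_a, perf_b, clock_a, clock_b):
    """IPC of chip A relative to chip B (>1.0 means A does more work per cycle)."""
    return (perf_a / perf_b) / (clock_a / clock_b)

# Chip A clocks 22.5% higher (4.9 vs 4.0 GHz) but only performs 10% better:
print(round(relative_ipc(110, 100, 4.9, 4.0), 3))  # 0.898
```

So in this hypothetical, chip A's higher clock masks roughly 10% lower per-cycle throughput, which is exactly the trap the post describes.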
 

bloodroses

Distinguished


That statement is completely untrue. On a clock-for-clock basis, the Intel chips are faster than the Ryzen chips; i.e., Intel still has higher IPC. AMD outright admitted that before the launch of Ryzen. The situation you're seeing, with those benchmarks being so close, is due to the tests being GPU-bound, not CPU-bound.
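A toy model of the GPU-bound point: delivered frame rate is capped by whichever of the CPU or GPU is slower, so two CPUs that differ substantially can still post identical results. This is a deliberate simplification with made-up numbers, not benchmark data:

```python
def delivered_fps(cpu_fps, gpu_fps):
    """Frames per second the system actually shows: the slower side wins."""
    return min(cpu_fps, gpu_fps)

# Two CPUs 25% apart, same GPU at a demanding resolution: identical result.
print(delivered_fps(cpu_fps=200, gpu_fps=90))  # 90
print(delivered_fps(cpu_fps=160, gpu_fps=90))  # 90
```

Drop the resolution so the GPU cap rises above both CPU figures and the CPU difference reappears.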

 
I'm going to ignore the apparent flame war that's occurring and simply state that this happens to be exactly the kind of CPU I've been waiting for.

I use tons of bandwidth quite frequently, and also value a quiet computer. I was about to get an i9, but that fell through when I saw the cooling requirements.

It's also nice to see that this CPU can also handle gaming on the rare occasion that I decide to run one. That said, it'd still be a good fit for me even if it couldn't play games at all.
 

Brian_R170

Honorable
Jun 24, 2014


Wait, I thought the Core i9 CPUs are either 140W or 165W (for the unreleased 12-18 core models) and ThreadRipper is 180W?

 
You're correct. Unfortunately, the i9 uses a paste as a TIM, and that happens to create a bottleneck in the thermal transfer chain. You end up with a delta-T across the paste that is larger than the delta-T across the entire rest of your cooling system.

As a result, you're forced to run the fans at a much higher speed than the TDP would indicate, and even then you can't push the CPU to its limits in some situations or it will throttle. It's a giant headache in my application, where I would essentially need to keep the cooler below ambient to prevent throttling. The resulting temperatures also raise some reliability concerns in my situation.
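The delta-T argument can be sketched with back-of-the-envelope numbers: each stage of the cooling path contributes a temperature rise of power times its thermal resistance. The resistance values below are hypothetical placeholders chosen to illustrate a paste-dominated stack, not measurements from the linked article:

```python
def stage_deltas(power_w, resistances_c_per_w):
    """Temperature rise (degrees C) across each stage of the cooling path."""
    return {name: power_w * r for name, r in resistances_c_per_w.items()}

# Hypothetical thermal resistances in C/W for a ~180 W load:
stack = {
    "die-to-IHS paste": 0.25,   # the suspected bottleneck stage
    "IHS-to-cooler TIM": 0.05,
    "cooler-to-air": 0.10,
}
for name, dt in stage_deltas(180, stack).items():
    print(f"{name}: {dt:.0f} C")
```

With these assumed values, the paste alone accounts for a 45 C rise versus 27 C for the rest of the stack combined, which is the "larger than the entire rest of your cooling system" situation described above.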

UPDATE: See this for more detailed information and the measured data that I based my conclusions on:
http://www.tomshardware.com/forum/id-3464475/skylake-mess-explored-thermal-paste-runaway-power.html

The results have been verified by a few other reviewers, as well as Der8auer.

Much of the issue is due to the integrated voltage regulator that Intel decided to put inside the CPU package. God only knows why they decided to do something that silly.
 

Crystalizer

Distinguished
Dec 11, 2011
Can you add VR to the tests? It needs higher resolution and frame rates, and the experience has to be completely flawless (no lag, lag spikes, or stuttering). This would be a really interesting comparison.
 

grozzie

Reputable
Feb 26, 2014
64 PCIe 3.0 lanes! Can we expect M.2 SSDs in bootable RAID? It might not be on-board, but there's plenty of scope for add-on cards. Also, with such a big die, is there room to add HBM2 memory to the CPU?
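For a sense of the lane budget behind that question, here is a quick sketch (illustrative only; the actual platform reserves some lanes for the chipset, which this ignores):

```python
def remaining_lanes(total, allocations):
    """PCIe lanes left over after the listed devices are connected."""
    return total - sum(allocations.values())

# Hypothetical build: one x16 GPU plus three x4 NVMe drives for RAID.
build = {"GPU x16": 16, "NVMe #1 x4": 4, "NVMe #2 x4": 4, "NVMe #3 x4": 4}
print(remaining_lanes(64, build))  # 36
```

Even after a GPU and a three-drive NVMe array, a 64-lane budget leaves plenty of headroom for add-on cards.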
 

rwinches

Distinguished
Jun 29, 2006
I see in one chart that your 1920X OC'd to 4.1 GHz (pg 10, AutoCAD 2016 3D), so it would have been your duty to your readers to include the 1920X in these charts, as you always say, "to be fair". I mean, if your 1950X lost the silicon lottery, then the 1920X would be more representative.
What voltage was required to get the 1920X to 4.1 GHz?
What temps at that OC?
It seems Tom's never consistently gets the AMD OC-at-voltage numbers, or the memory clocks, that most others get.

Using synthetics that don't scale is disingenuous, as is using stripped-down systems that have little or nothing else going on. In the real world, very few have systems with nothing but games or content apps loaded. Single-core performance is getting old as an argument; software is going to make use of cores and threads going forward, and VR/AR is getting better and will be more demanding, especially when combined with streaming.
Multi-monitor setups are getting more popular as prices for good performance drop, and they're a better solution than high-priced widescreens, so testing with multiple screens should be included in reviews of GPUs and CPUs.
Just what percentage of users use compression software for anything other than packaging?
If a movie is converted to run on a phone or tablet, who doesn't do other things while waiting? A couple of minutes has little meaning these days, as speeds have increased all around.
 

M.2 RAID is apparently in the works, but it isn't ready yet. AMD has been pretty good about keeping their word on features like that lately, though, so I suspect we won't have to wait long.

Regarding HBM2, it's unlikely that they'll add it. They'd have to redesign quite a large chunk of the CCX in order to leverage HBM in any of its variations.
 

rantoc

Distinguished
Dec 17, 2009
1,859
1
19,780
Hardly surprising that this chip isn't designed purely for gaming.

Gaming is like running a race track with quick turns... what's the point of bringing a truck? That said, the truck can haul some serious a$$ with the right workload compared to the car. Still, it performs very well in 1440p+ gaming, and if someone shells out this kind of money on a CPU/platform, chances are quite darn high they won't settle for 1080p, IF they game at all.

Very nice platform for productivity!
 
I don't know why everyone is so upset about the gaming benchmarks. They are complaining over on Anandtech about it as well. They act as if nobody uses high-end chipsets for gaming, only strictly for productivity. Nothing could be further from the truth. Besides, at 1440p, 4K, or in multi-monitor setups, there will not be any real CPU difference in gaming vs. the Intels. But there most certainly will be a difference when you are gaming, streaming, and running game-footage capture software all at the same time. Now that's what I'd like to see a comparison of -- real-world applications like that. I'm sure more tests will be done down the road, so everyone just take a chill pill.
 

It's pretty simple when you think about it. It's easier to justify a $999 CPU purchase based on non-gaming metrics (i.e. to your business, spouse, parents, etc.) than it is to justify something that they would interpret as "I'm going to go get a glorified X-Box, and just one of the ten or so components costs $999."
 

bit_user

Polypheme
Ambassador
AMD's 16C/32T Ryzen Threadripper is likely the biggest processor release of the year, quite literally.
*Ahem* Epyc? Both in terms of working die area and sales volume/revenue, it should significantly overshadow TR.

TR is the biggest desktop CPU release this year.
 

boringpie

Distinguished
Sep 20, 2011
Finally, a review that includes mathematical/computational performance!!! Big like to Tom's Hardware. Been struggling for years to find a suitable CPU for financial modelling. Time to replace my Xeon E7.
 


Everyone is acting like gaming is the focus here. I don't think you understood my point. There are more people than you think who use a high-end CPU for both productivity and gaming. It has nothing to do with their domestic or professional situation (outside of wealthy kids, I don't know of many who would get their parents to buy them a $1,000 CPU anyway). This is not the same thing as, say, someone buying a $1,000+ workstation GPU like a Vega or Quadro.

Tom's has plenty of bandwidth to run pages and pages of benchmarks to cover a broad spectrum of performance results - like the guy above said with mathematical/computational performance. To ignore the gaming segment would be an omission and you'd get an equal number of complainers asking why no gaming benchmarks were done. Tech review sites can't win no matter what they do/don't do. Someone is always going to complain about something because they don't like what they see.
 

DerekA_C

Prominent
Mar 1, 2017
I can tell you this is just a warm-up for what's coming. They have Samsung as a partner, and that is the biggest game-changer for AMD. As Samsung expands and includes AMD in their fab designs and ideas, AMD will eventually take over the chip market. They are already positioning themselves at the right angle.

This Infinity Fabric design is multi-purpose -- GPU+GPU+GPU+GPU, or CPU+GPU, or CPU+GPU+CPU+GPU -- and when they get to 7nm+, who knows. Then there is Samsung doubling down on HBM2, which will eventually mature into HBM3, the fastest, lowest-latency memory type to date. And to think they only half-piped Vega at 2048-bit, 484 GB/s bandwidth; they could easily double that.

I would also note that the IPC improvement for AMD has been 60% in one generational leap; when was the last time Intel did that? With one chip refresh, AMD could get Ryzen up to possibly a 5 GHz baseline, but more likely 4.1 GHz with a boost of 4.5-ish, I'm sure. This is all first-gen stuff here, and I never buy first gen, sorry. That's exactly why I waited for Devil's Canyon (the Intel i7-4790K over the 4770K): I knew they could do a refresh, and they did, and it runs at 4.4 GHz @ 1.1 V.
 


At stock, yes, but I would guess Intel's 18-core CPU will be close to the 10-core.

What is interesting, though, is that when it's balls to the wall, both CPUs suck power like no other.



The F250 is a pretty poor example especially since the F series is the best selling truck in the US.



True but the dark side does have cookies.
 

verndewd

Distinguished
Mar 27, 2009
DAWs have been around for ages. Digital audio workstations can pound a CPU and memory load, and yet nothing in the so-called tech community has ever evolved for the millions of composers doing mock-ups for film and TV. A test bench isn't available for heavy audio users in studios, because all geeks think about is gaming and office productivity, so this was mostly useless for me. I wouldn't expect Hans Zimmer to glean anything useful here. That being said, AMD finally made the chip I've been waiting for, and the future looks interesting now, for AMD. Great stuff. The Ryzen 1800 looks pretty amazing in some benches, and the numbers really caught me off guard. But you seriously need a benchmark for a 100-piece orchestra with Native Instruments patches and libraries; until reviewers start doing that, this is almost useless reading for people with music production degrees.
 

ah

Reputable
Oct 29, 2014
I'm waiting for the review of the 1900X; 4 cores with Precision Boost to 4.2 GHz is good enough for me, no need to overclock.
 