AMD Ryzen Threadripper 1920X Review


Aldain

Prominent
Jul 3, 2017
Great review as always, but on the power consumption front: given that the 1950X has six more cores than the 7900X and the 1920X has two more, they are more power efficient than the 7900X in every regard, especially relative to Threadripper's high stock clocks.
 
"Ryzen Threadripper 1920X comes arms with 12 physical cores and SMT"

...

Judging from the Threadripper 1950X versus the Threadripper 1900X, we can infer that a difference of 400 MHz is worth the TDP of 16 whole threads.

I never realized HT/SMT was that efficient. Or is AMD holding something back with the Threadripper 1900X?
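For reference, the back-of-the-envelope math behind that inference, using AMD's published TDP and base clock figures (quoted from memory, so double-check them):

```python
# Per-core power budget: 1950X vs. 1900X (spec numbers from memory; verify
# against AMD's product pages before relying on them).
specs = {
    "1950X": {"cores": 16, "base_ghz": 3.4, "tdp_w": 180},
    "1900X": {"cores": 8,  "base_ghz": 3.8, "tdp_w": 180},
}

for name, s in specs.items():
    print(f"{name}: {s['cores']} cores @ {s['base_ghz']} GHz base, "
          f"~{s['tdp_w'] / s['cores']:.2f} W/core in a {s['tdp_w']} W TDP")

# Same 180 W envelope both ways: the 1950X spends it on twice the cores at
# -400 MHz, while the 1900X spends it on +400 MHz across half the cores.
```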
 
Hats off to AMD and Intel. The quantity (and quality) of processing power is simply amazing these days. Long gone are the times of (literally) taking days off for the "rasterizing and rendering" stages of workflows.

...or is AMD holding something back with the Threadripper 1900x?
I think the better question is, "Where is AMD going from here?"

The first-revision Socket SP3r2/TR4 mobos are simply amazing, and AMD has traditionally maintained (and improved!) their high-end stuff. I can't wait to see how they use those 4,094 lands and massive bandwidth over the next few years. The next iteration of the 'Ripper already has me salivating :ouch:

I'll take 4x Summit Ridge 'glued' together, please!!

 

RomeoReject

Reputable
Jan 4, 2015
This was a great article. While there's no way in hell I'll ever be able to afford something this high-end, it's cool to see AMD trading punches once again.
 

ibjeepr

Distinguished
Oct 11, 2012
I'm confused.
"We maintained a 4.1 GHz overclock"
Per chart "Threadripper 1920X - Boost Frequency (GHz) 4.0 (4.2 XFR)"

So you couldn't get the XFR to 4.2?
If I understand correctly, manually overclocking disables XFR.
So your chip was just a silicon-lottery loser at 4.1, or am I missing something?

EDIT: Oh, you mean a 4.1 all-core OC, I bet.
 

sion126

Prominent
Aug 19, 2017
Actually, the view should be that you cannot afford not to go this way. You save a lot of time with gear like this; my two 1950X rigs are killing my workload like there's no tomorrow... pretty impressive... For just gaming, maybe... but then again... it's a solid investment that will run for a long time...
 

AgentLozen

Distinguished
May 2, 2011


A big die shrink like that would be helpful, but I think Ryzen suffers from other architectural limitations.

Ryzen has a clock speed ceiling of roughly 4.2 GHz. It's difficult to push it past that regardless of your cooling method.
Also, Ryzen experiences nasty latency when data is shared over the Infinity Fabric. Highly threaded workloads are artificially limited when data passes between dies.
Lastly, Ryzen's IPC lags a little behind Intel's. Coupled with the relatively low clock speed ceiling, Ryzen isn't the most ideal CPU for gaming (to be fair, it holds up well at higher resolutions).
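To put rough numbers on how the clock ceiling and the IPC gap stack up, here's a toy model; the figures are invented for illustration, not measurements:

```python
# Toy model: single-thread performance ~ IPC x clock.
# Both numbers below are hypothetical, purely to show how the gaps compound.
chips = {
    "Ryzen (hypothetical)": {"rel_ipc": 0.95, "max_ghz": 4.2},
    "Intel (hypothetical)": {"rel_ipc": 1.00, "max_ghz": 4.8},
}

baseline = chips["Intel (hypothetical)"]
base_perf = baseline["rel_ipc"] * baseline["max_ghz"]
for name, c in chips.items():
    rel = (c["rel_ipc"] * c["max_ghz"]) / base_perf
    print(f"{name}: relative single-thread performance = {rel:.2f}")

# A ~5% IPC deficit stacked on a ~600 MHz clock-ceiling deficit compounds
# into a mid-teens single-thread gap -- which is what gaming benchmarks see.
```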

Threadripper and Ryzen only look as good as they do because Intel hasn't focused on improving its desktop chips in the last few years. Imagine if Ivy Bridge hadn't been a minor upgrade. If Haswell, Broadwell, Skylake, and Kaby Lake hadn't been tiny 5% improvements. What if Skylake-X weren't a concentrated fiery inferno? Zen wouldn't be a big deal if all of Intel's latest chips were as impressive as the Core 2 Duo was back in 2006.

AMD has done an amazing job transitioning from crappy Bulldozer to Zen. They're in a position to really put the hurt on Intel but they can't lose the momentum they've built. If AMD were to address all of these problems in their next architecture update, they would really have a monster on their hands.
 
Sure, Billy Gates: at 1080p, with an $800 CPU and an $800 GPU made by a competitor... sure...

At 1440p and 2160p, gaming performance is the same; however, multi-threaded performance is still better than that of the overpriced Intel chips.
 

Ditt44

Honorable
Mar 30, 2012
<Moderator edit for rudeness>

Who buys this class of CPU and bases that choice on how well it 'games'? The performance is not 'mediocre', either; it's competitive. And as for Ryzen, you do get what you pay for: exceptional value for performance. Back under the bridge with you, please.
 

evarty

Distinguished
Oct 7, 2011
Thank you for the review. However, I hate to be that guy, but it's 2017 and still no 1440p or 2160p benchmarks? I get that a large portion of people are still on 1080p, and I appreciate those results, but a large portion of people are also not going to be using Threadripper yet, and yet you bring us those benchmarks.

By that same logic, I would think that 1440p (for sure) and 2160p (maybe) benchmarks should be included, to say the least.
 

Solarion

Prominent
Jul 6, 2017
"Of course, switching into Game mode might enable higher performance in some situations, but we don't think professional users will tolerate constant reboots to toggle back and forth."

A professional would simply find a more palatable solution. For instance, one could use a program like Bitsum's Process Lasso to force a troublesome application to run on specific cores.
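A minimal sketch of that same trick in script form, using the cross-platform psutil library; the PID and core list are placeholders, not recommendations:

```python
import psutil

# Pin a troublesome application to a fixed set of logical CPUs so its
# threads stop migrating across dies (what Process Lasso does via its GUI).
pid = 1234                  # placeholder: the target application's PID
one_die = list(range(12))   # placeholder: on a 1920X, logical CPUs 0-11
                            # would typically map to one die's 6 cores + SMT

proc = psutil.Process(pid)
proc.cpu_affinity(one_die)  # restrict scheduling to those CPUs only
print(f"{proc.name()} pinned to logical CPUs {proc.cpu_affinity()}")
```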
 

sion126

Prominent
Aug 19, 2017
Well, I would challenge anyone who says multi-threading is limited on Threadripper. I compile a lot of code, and the more cores and threads I can spread the work across, the faster my stuff gets done. Even the Ryzen 1800X I was using before I moved to Threadripper was doing better than any i7 or i9 I tested with.

There are, for sure, some annoying latency issues that seem to freeze out everything else while it loads the CPU and executes a job, but for me it is much more effective at multi-core, multi-threaded loads simply because I have more cores and more threads.

And the thermals are much better than Intel's; I had MAJOR issues with the i9 packages I tried before I went this way.

I am sure that for most people's average use cases Intel will be better, but for mine it is not, and I get a price bonus to boot!! :)

I mitigated the latency issue by leaving one core free for the rest of the system to process its requests; that seems to be working well for me.
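That "all threads minus one" rule is easy to script, too; a rough sketch, assuming a make-based build (the build command is just an example):

```python
import os
import subprocess

# Leave one hardware thread free for the OS and interactive work so a long
# compile doesn't freeze out everything else on the machine.
jobs = max(1, (os.cpu_count() or 2) - 1)   # e.g. 23 on a 24-thread 1920X

# Example build invocation; substitute your own build system's parallel flag.
subprocess.run(["make", f"-j{jobs}"], check=True)
```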
 

spdragoo

Expert
Ambassador


I know it's been said multiple times before, to multiple people, but I guess it needs to be said again:


• At 2160p/4K (& even 1440p) resolutions, performance is limited by the GPU. Even with monsters like the GTX 1080 Ti & GTX Titan Xp, there are games out there that are still limited by the GPU's performance even on 60Hz monitors, let alone 144Hz monitors.
• At 1080p & lower resolutions, you have the potential to be limited by either the GPU or the CPU. The former, however, only happens if your GPU just isn't powerful enough -- i.e. trying to use a GT 710 to play Grand Theft Auto V, or hoping that a Radeon HD 7750 will do well in The Witcher 3.
• The reason is that the amount of data processed by the CPU stays the same regardless of the monitor resolution/refresh rate. Whether you're rocking a 4K display or plinking along on an old 640x480 CRT screen, the same data has to be crunched on each Civilization VI turn, the same number of enemies has to be tracked in PUBG, the same number of potential runover targets -- I mean pedestrians -- has to be accounted for in GTA V, etc.

As proof, consider what happened when [H]ardOCP tested a Ryzen 7 & a Kaby Lake Core i7 against an old Sandy Bridge Core i7 with a GTX 1080 Ti (https://www.hardocp.com/article/2017/05/26/definitive_amd_ryzen_7_realworld_gaming_guide/1). For the games they tested, while they were able to see small differences in performance at 4K between the 3 CPUs, the key word was "small"; we're talking 1-5 FPS margins on average FPS for most of them, & (except for maybe 1 game) percentage margins of only 5%. Maybe some people might call those "definitive" differences; me, I call that "too close to call a real winner". And their margins at 1080p were the same or lower, because instead of using that powerful GTX 1080 Ti to eliminate any potential GPU bottleneck, they dropped down to a GTX 1060 instead... a fine GPU for 1080p gaming, but with it you're unable to tell whether it's your CPU or your GPU that's limiting performance.
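The whole argument fits in a one-line model: per-frame time is roughly the slower of the CPU's work (constant across resolutions) and the GPU's work (scales with pixel count). A toy sketch, with all timings invented for illustration:

```python
# Toy bottleneck model: fps = 1000 / max(cpu_ms, gpu_ms[resolution]).
cpu_ms = 6.0                                          # same at every resolution
gpu_ms = {"1080p": 4.0, "1440p": 8.0, "2160p": 16.0}  # grows with pixel count

for res, g in gpu_ms.items():
    frame_ms = max(cpu_ms, g)
    limiter = "CPU" if cpu_ms >= g else "GPU"
    print(f"{res}: {1000 / frame_ms:.0f} FPS ({limiter}-bound)")

# Only at 1080p does the CPU finish last, so only at 1080p do CPU differences
# show up in the FPS numbers -- which is why reviewers test CPUs there.
```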
 


Pretty much all current games are optimized for 4-core Intel chips from Sandy Bridge and up (if recent benchmarks such as this one on GamersNexus are any indication). So, what to get:

• only gaming, hefty budget: get a Core i5 or an i7;
• only gaming, tight budget: the Ryzen R3s are priced close to an i3 but perform like an entry-level i5, especially when overclocked a bit on the stock cooler;
• gaming + rendering: a Ryzen R5 1600(X) is cheaper than an i7 and performs on par or better;
• rendering and compiling with gaming on the side: the R7s beat the i7/i9 on price/performance hands down.
As for me, I'm waiting for RAM prices to come back down before replacing my Haswell + 16 GB: maybe by then Zen+ will be out; if not, I'll just grab an R7 1700 (provided Intel doesn't come up with a full-featured, cheaper, better-performing, cooler CPU by then - yeah, right).
 

D3M1G0D

Prominent
Mar 28, 2017
@Billy Gates

Anybody who buys a 12-core/24-thread CPU just for playing games is an idiot. Hell, I don't even have Steam installed on my 1950X, nor do I ever intend to. I use my Threadripper system for mining and grid computing, and it is putting up dominating performance in World Community Grid. My Ryzen 7 system is for gaming.
 

rwinches

Distinguished
Jun 29, 2006
There is no real-world case for loading up the CPU for gaming when simply moving up to a higher resolution will keep the graphics load on the GPU, where it belongs. 1080p testing is nice-to-know info but basically irrelevant; maybe it would matter if you included more multi-monitor testing.

Prime testing is also nice to know, but it does not reflect real use.

Those price-efficiency graphs are dubious in value, as they aggregate individual tests along with tests that simulate simultaneous operations. A 4C/8T CPU can never be a better buy in this type of use case; it is simply out of its class.
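To illustrate the aggregation problem, a sketch with invented scores and prices (higher score = better):

```python
from statistics import geometric_mean

# Invented numbers: a cheap 4C/8T chip vs. a 12C/24T chip. "single" stands in
# for individual tests, "multitask" for simulated simultaneous workloads.
chips = {
    "4C/8T":   {"price": 350, "single": 100, "multitask": 40},
    "12C/24T": {"price": 800, "single": 90,  "multitask": 100},
}

for name, c in chips.items():
    agg = geometric_mean([c["single"], c["multitask"]])
    print(f"{name}: aggregate score/$ = {agg / c['price'] * 100:.1f}, "
          f"multitask score/$ = {c['multitask'] / c['price'] * 100:.1f}")

# The 4C/8T chip "wins" the aggregated price-efficiency chart while losing
# badly on the simultaneous workloads that define this class of CPU.
```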
 


Well, considering both the Xbox One and the PS4 contain an AMD CPU, my guess is that AMD DOES make CPUs for gaming - but game makers target Intel CPUs only on PC.
The 15-30% performance increase in existing games that got patched after Ryzen came out seems to indicate that said game makers are revising their judgement.
 

Lieutenant Tofu

Prominent
Apr 11, 2017
"...and a Legacy setting to disable one CCX, solving compatibility issues."
Shouldn't this read "disable one die", since each die is composed of 2 CCXes itself?
Thanks for the great article!
 