Ryzen Threadripper 2 2990WX and 2950X Review - AMD Unleashes 32 Cores

Status
Not open for further replies.

rwinches

Distinguished
Jun 29, 2006
888
0
19,060
Just set Performance Enhancement to Level 3 and a -50 core voltage offset for the best overall performance on the 2990WX; the cooler temps allow higher clocks when needed, the best balance of IPC and boost speeds.
Try running two benchmarks at once. Try transcoding 4K video to multiple formats at once. This CPU is for heavy workloads and multitasking where productivity is key, not for running one thing at a time. Buy only if you really need 32 cores / 64 threads at $28 per thread.
Perf settings on the 2990WX:
https://www.youtube.com/watch?v=LD66CSR8mnU
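
The "run two benchmarks at once" point is easy to sanity-check yourself. A minimal sketch, with an invented CPU-bound stand-in workload rather than any real benchmark:

```python
# Hypothetical throughput check: run two CPU-bound jobs back to back,
# then at the same time. On a many-core part like the 2990WX the
# concurrent run should finish in roughly the time of one job.
import time
from concurrent.futures import ProcessPoolExecutor

def busy_work(n):
    # stand-in for a render or transcode job
    total = 0
    for i in range(n):
        total += i * i
    return total

def timed(fn, *args):
    start = time.perf_counter()
    result = fn(*args)
    return time.perf_counter() - start, result

def main(n=3_000_000):
    serial = timed(busy_work, n)[0] + timed(busy_work, n)[0]
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=2) as pool:
        list(pool.map(busy_work, [n, n]))
    parallel = time.perf_counter() - start
    print(f"serial {serial:.2f}s vs concurrent {parallel:.2f}s")
    return serial, parallel

if __name__ == "__main__":
    main()
```

With plenty of idle cores, the concurrent time should land near the time of a single job rather than the sum of both.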
 

Lasselundberg

Distinguished
Dec 31, 2007
27
0
18,530
People who buy these aren't gaming at 1080p. I know the tests are run at 1080p to show off the CPU, but please, can we see some 2K and 4K tests? If I'm only losing a couple of frames, I'd like to support the underdog that finally pushed Intel hard enough that they released something interesting. I'm about to drop $4K on my next build, and gaming at 3440x1440 means the world to me, and probably to many others spending this kind of money on a CPU. I'd go straight to an 8700K if it weren't for the hundreds of hours of video I also have to get through.
 


Then get the 2950X - the 8700K is only worth it for pure gaming rigs at 1440p and above, so if you have any video processing to do, it's Zen+ for you.
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
858
315
19,360


There isn't much value to showing repetitive charts with little to no difference between them, and that's what you get with 4K testing. The CPU isn't the bottleneck there. If you do that type of work and want solid gaming at QHD and above, just go TR 2950X and don't look back.

 

agello24

Distinguished
Feb 8, 2012
136
2
18,715
1. Are any of those benchmarks optimized to run on AMD Ryzen or Threadripper? 2. Why were outdated benchmarks run on these AMD chips? 3. None of the Intel chips should have won ANY of the tests if things were optimized for multi-core chips. Programmers need to stop being lazy and overcharging for products that are poorly optimized. 4. To say the platform is expensive is nuts. It's a workstation chip, so what are you expecting? 5. Stop running outdated benchmarks. Push these companies to update their software. If they don't, then DON'T USE IT. Make up a test.
 


^^LOL, from someone with a track record of only 5 posts in two years and eight months on Tom's Hardware. Typical hit-and-run Tom's Hardware hater, upset that the review didn't sing complete glorious praises to AMD.

Did you even read the entire article before whining? Paul clearly stated that overclocking these chips presents real challenges. Also, Ryzen has NEVER overclocked well compared to Intel, and as someone else said, Tom's already had those Intel CPU overclocks from previous reviews.

Finally, when Ryzens are overclocked, it's not by a large margin. That's one of Intel's key selling points: high overclocking potential with proper cooling, which not only provides significantly higher performance but also extends the chip's competitive life against newer-generation chips.
 

Vladimir Iliev

Honorable
Dec 21, 2015
9
3
10,515
10tacle, if my comment history is what's supposed to give my opinion weight, fine. But a gaming review of a 32-core part? lol. The other site under the same ownership as Tom's - AnandTech - is the same. For some real-world scenarios for developers and content creators, check something like Phoronix. I was reading Tom's every day, but this was the last straw.
 

Eastaman

Prominent
May 24, 2017
3
0
510
Why people are still contemplating buying this for gaming is beyond me. It's really for multitasking: you can render one workload while you work on something else. Kudos to AMD for this release. It's not the perfect CPU for every scenario, and I doubt one exists. It's basically for people who earn a living from their computer. For gaming you have the 8700K, or if you wait a few more weeks, Intel will release the 9900K.
 
Aug 14, 2018
1
0
10
I'll go for the 1920X for now, as I can utilize all 24 threads. Maybe a v2 CPU with otherwise the same hardware once prices come down.
 
Aug 14, 2018
1
0
10
What does the DIMM config "4 of 8" on the first page mean? Is that 4 RAM sticks in an 8-slot motherboard? If so, why are they slower (DDR4-2666) than putting the same sticks in a 4-slot motherboard (DDR4-2933)?
 


The programmers don't set the price or the specs; publishers do a whole lot more of that than programmers. Those poor souls only help develop the specs and write the code. The publishers set shipping dates and prices.
 

mapesdhs

Distinguished
A gaming-focused review for parts like this is bizarre. For a more sensible writeup, see:

https://www.phoronix.com/scan.php?page=article&item=amd-linux-2990wx&num=1

Gaming is not remotely the relevant market for these CPUs, so implying it is by showing the gaming results first is silly. Hardly a wonder everyone is banging on about the 2950X instead. Do some more relevant tests and the 2990WX will shine, but as with anything, the right tool for the right job. Besides, it's going to be a while before OS variants can adapt to tech like this, ditto optimised BIOS/chipsets and especially applications. Kudos to AMD for actually pushing things along though, unlike Intel which sat on its butt for so long.
 

mapesdhs

Distinguished


Problem with CB is it's starting to break at this level of performance, often with large variance between runs. See:

https://www.servethehome.com/cinebench-r15-is-now-a-broken-as-a-benchmark-and-11-5k-surpassed/

The benchmark needs an update.
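
The run-to-run variance point is easy to quantify if you log a few runs. A rough sketch with invented scores (not real measurements from that article):

```python
# Given repeated scores from the same benchmark on the same machine,
# report the mean and the run-to-run spread. If the coefficient of
# variation is a few percent, score gaps smaller than that between
# CPUs are noise. Scores below are made up for illustration.
import statistics

def run_spread(scores):
    mean = statistics.mean(scores)
    cv = statistics.stdev(scores) / mean * 100.0  # percent of mean
    return mean, cv

runs = [5078, 4890, 5210, 4955, 5132]  # hypothetical CB R15 scores
mean, cv = run_spread(runs)
print(f"mean {mean:.0f}, run-to-run variation ~{cv:.1f}%")
```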
 

PaulAlcorn

Managing Editor: News and Emerging Technology
Editor
Feb 24, 2015
858
315
19,360


This is hardly a gaming-focused review. We tested eight games; compare that to the 41 application tests, including SPEC workloads for workstations. The games are listed on the first pages, sure, but that's because we apply the same tests and format to all CPUs.

Threadripper 2 X-Series is specifically marketed to gamers. Yes, gaming is relevant.

 

mapesdhs

Distinguished


It's gaming focused if the games are tested first, and a lot of the app testing isn't really all that relevant to where or how this class of CPU would be most useful. They may be marketed to gamers (what a shock) but that won't be their natural core audience. Just because AMD is marketing a CPU in a particular way doesn't mean that actually makes any sense. I was the biggest SGI fanboy around back in the day, but SGI's PR dept. was sometimes completely whacko, claiming all sorts of stuff that wasn't true (or at best misleading), and they were dealing with tech oodles more expensive. I guarantee the engineers who designed these chips think AMD's PR dept. is crazy.

These are not gaming CPUs. To claim otherwise is to ignore the fact that the stock 2990WX performs comparatively poorly for every single one of the gaming tests (remember PBO means operating out of warranty, if I read the info correctly). Can't have it both ways; if the 2990WX is being touted as a gaming CPU then the data suggests it's ruddy awful, so surely instead it makes more sense to say it just isn't that kind of product at all, AMD's marketing is nuts. Have you looked into science focused tasks that could properly push hw of this kind? That would be far more interesting, especially with ECC. Stuff like FEA, CFD, CQD, etc. The 2950X does better for gaming, but that's no surprise, though again the total cost still ignores the more sensible mainstream chipset/CPU choices.

There's a vast range of tasks which would be more suitable for working out what the 2990WX could really be good for, but few of them were tested here. Production rendering isn't Handbrake. Have you looked at how Alfred works on CentOS? A cluster of these things could be very potent indeed. What about V-Ray? There are so many other possibilities for which this CPU could be a real winner, but in the end it's AMD's fault if they're aiming the marketing at entirely the wrong audience. The kind of people & companies who'd love this CPU in a workstation are exactly those to whom I provided SGI technical & upgrade advice for 15 years while that tech was still current. Sure, such users would go EPYC if they could afford it, but many can't (especially solo pro types, academics on tight budgets, etc.), and for them this could be the best thing since sliced bread. Academic researchers in particular - biochem, GIS, physics, fire & explosion studies (built environment), all sorts. I've dealt with hundreds of such people over the years, and all too often their workstations are budget-constrained compromises of the ideal (in some places, just off-the-shelf consumer builds).

The more pertinent question is why the heck AMD is pushing them as gaming products; it makes no sense. Five seconds of thinking ought to conclude that buying a mainstream platform with an 8700K or the looming 9900K (or a 2700X, whatever) and spending the cost difference on a more powerful GPU is way more logical for gaming, and there'd be change enough for a 970 Pro C-drive. :D

Reviews used to be more forceful in pointing these things out. Nobody should be advising consumers that any of these TR2 CPUs make sense for gaming, they really don't. I'm delighted that AMD is back in the game, but I'm not going to tell a friend to get a TR2 for gaming when a Ryzen or CL is far more sensible, and one can't point to SLI/CF because the utility of those technologies has been severely diluted in recent years, with poor driver support and even hw lockouts from some quarters.

I mean ye gods, the cost of just a 2990WX (1640 UKP in the UK) is enough to build an entire very nice gaming machine. :D Add in the cost of the RAM & mbd and that's enough for even a prebuilt i7 8086K rig with a 1080 Ti (code LN84893 on Scan). The 2950X does better, as I say, but the platform cost difference is still large enough that a mainstream build with a better GPU will always be faster for gaming. If one tries to justify such a purchase by saying it would be better if one also wanted to do some content creation, well, in that case the marketing focus on gamers is no longer there. Fact is, one can build a far more potent gaming system on the same budget by going mainstream, especially for those playing at higher resolutions (since they won't be so dependent on CPU performance anyway, in which case a non-K CPU is fine and hence an even higher available budget for a better GPU). And remember that even for content creation, GPU acceleration has become a lot more relevant these days; e.g. the system I built for the Learn Engineering channel on YouTube is eight times faster than their previous best system thanks to decent GPU power (CUDA in Blender), which blew away the CPU-based setup they'd been using until then.

Btw, SPEC has been flawed for a long time, and CB OGL has been broken for years.

Ian.

 


1) Yeah, it carries weight - specifically, your credibility in chastising Tom's, as well as how long you've been here.

2) If you don't like the way Tom's, AT, or any other tech website reviews things for that matter when it comes to AMD products, then go find your favorite fanboy fluff website like WCCFTech and stop lurking here. I never see you whiners complain about their choice of bench platforms when it comes to Intel chip reviews. Why is that?

3) The majority of Tom's Hardware readers are gamers and only have ONE machine. Many of them want a balanced rig between productivity apps and gaming performance. In fact, most, like me, became members here and learned how to build our first PC. I've been here for nearly 20 years. Game benchmarks WILL be part of a review here. It's called a BALANCED review. This is not a website based solely on deep learning applications or programming. With that said, Toms, and Anandtech for that matter, choose the most widely used productivity programs to test in a CPU review. What you are talking about is relevant to probably about .08% of enthusiast PC users out there.

4) ----> Here's the door. <----

 

logainofhades

Titan
Moderator



Then go fanboy somewhere else. If THG were so Intel-biased, as your earlier post suggested, the 2700X would not have been given the Best Overall title in the best gaming CPUs roundup: https://www.tomshardware.com/reviews/best-cpus,3986.html

For straight-up gaming, no, these CPUs don't make sense. But many people don't want multiple machines, or don't have the room for them. A review like this shows what to expect from a system that is a workhorse but also games on the side.

Every CPU article, people whine and complain about the methods used - like the complaints about the resolution, when they clearly don't understand that lower resolutions remove the graphics bottleneck to show a true representation of CPU-only performance. That has been standard method since the beginning of CPU reviews. It wasn't all that long ago that 720p was used for such testing; GPUs have become powerful enough that 1080p can now be used.
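
The low-resolution point can be illustrated with a toy bottleneck model (all the frame rates below are invented, not from the review):

```python
# Delivered frame rate is capped by whichever side is slower: the CPU
# feeding frames or the GPU rendering them. At high resolution the GPU
# cap hides the CPU difference; at low resolution the difference shows.
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_a, cpu_b = 180, 120  # hypothetical CPU-side frame rates
for res, gpu_fps in [("1080p", 240), ("4K", 70)]:
    print(res, delivered_fps(cpu_a, gpu_fps), delivered_fps(cpu_b, gpu_fps))
# at 1080p the CPU gap shows (180 vs 120); at 4K both read 70
```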

 


ROB_DF_MX

Distinguished
Jul 8, 2015
18
0
18,510
@PaulyAlcorn
Mr. Paul Alcorn and Mr. Igor Wallossek,

At page 1 ( Introduction ) you wrote:
"Today, the platform supports ECC memory and up to 256GB of capacity."

That is something I am very interested in.
Could you tell me which actual memory modules allow that 256GB RAM configuration?
Can you post a link to those RAM sticks?

Thanks a lot in advance.
 
Aug 17, 2018
1
0
10
Thanks for this review, PAUL ALCORN & IGOR WALLOSSEK - very informative and explanatory. I hope in the future you do a new review of these TR2 chips on an X499 mobo, or under Linux.

I was impressed that these two monsters still haven't shown their true power.

After seeing these 60 tests, the overall performance order looks something like this:

Core-I9 7960X @4.3
Core-I9 7980XE @4.2
Threadripper 2950X PBO / Core-I9 7960X
Core-I7 8700K
Core-I9 7980XE
Threadripper 2990WX PBO
Core-I9 7900X
Ryzen 7 2700X
Threadripper 2950X
Threadripper 1950X @3.9
Threadripper 1920X @4.0
Threadripper 2990WX
Threadripper 1950X
Threadripper 1920X

For specific cases:
Games: Intel
Single Thread: Intel
Rendering: AMD
Multi Thread: AMD
Price: AMD

My favorites, and the best value for the money, are:
Ryzen 7 2700X and Intel Core i7-8700K
 

It's more a matter of how many memory slots you can address - and the biggest sticks I can find are 64GB. Those MIGHT work, but I just can't find any sTR4 motherboard that has them certified as working. 32GB sticks also exist and are more likely to work (especially single-rank ones).
Still, considering how finicky Zen(+) is with RAM, I strongly suggest you stick to certified RAM kits for your platform, and right now those seem to top out at 128GB.
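
For the article's 256GB figure, the arithmetic is just slot count times module size (assuming the usual eight-slot sTR4 boards):

```python
# 8 DIMM slots x 32GB modules reaches the platform's quoted 256GB cap;
# 8 x 16GB tops out at the 128GB the certified kits currently reach.
def max_capacity_gb(slots, module_gb):
    return slots * module_gb

print(max_capacity_gb(8, 32))  # 256
print(max_capacity_gb(8, 16))  # 128
```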

 


Reading the promotional material, AMD is advertising these chips to content creators first - including game creators. And yes, you can game on them too. Try gaming on a 28-core Xeon and suffer. The review, in that sense, is fair - inasmuch as it includes some games among all those tasks, and it does indeed show that you can game properly on Threadripper.

I might nitpick on one point, in that we get overclocked results for Intel while there are only stock results for Threadripper - but, as said before, those Intel results were included because they were available. And it's less unfair than it sounds: while you need to manually tune Intel chips to overclock them, merely cooling Zen+ chips well makes them ramp up their frequencies on their own. As long as this is covered in a later review (as said, again), then I'm fine.
Another point, raised in other comments, is that we're reaching the limit of core-count scaling for many benchmarks. It may be useful to go back to basics and take a good long look at how some of this software is compiled - that may explain why Phoronix sees so much variation between Windows and Linux numbers, and this time the differences aren't only in I/O-intensive tasks (where Windows traditionally gets eaten for breakfast).
 

bobiseverywhere

Distinguished
Jan 7, 2008
28
0
18,530
Turned my phone sideways to look at the chart. NOPE - with all the ads I can't see the chart anymore, plus some stupid video you can't make go away. No luck for you; go read the article on another site.
 