AMD Threadripper 3990X Review: Battle of the Flagships

A large number of applications don't scale well with NUMA architectures, particularly with Windows, which is the operating system of choice for visual effects artists.
I work in the VFX industry; I've been at ILM, DNEG, MPC and Cinesite, which work on most of the blockbuster movies, and I can tell you this: Windows is definitely not the OS of choice. That would be Linux.
I do, however, currently work at a smaller VFX studio, and they use Windows.
 
Really enjoyed that one! Great comparison of the HEDT CPUs vs. server and mainstream: the good, the bad, and the ugly!

Although, I don't get the almost apologetic tone in the Gaming Test notes. Yes, we know these CPUs aren't meant for gaming, but HEDT users, I'm sure, like to down tools and game after a hard day's slog! I suspect they'd like to know, along with the majority of the community, and anyone who'd be genuinely interested in these CPUs in the first place, what kind of gaming performance they can expect from them (and it's pretty damn good, by all accounts!).

Anyway, including the gaming metrics is just being comprehensive. That's why I come to Tom's. Comprehensive is good. Don't resist the urge to include these benches in future comparisons. Don't mind the detractors! 😀
 
So you could run a 21-node Cassandra cluster on one PC with 21 virtual machines, each allocated 6 threads, keeping 2 threads for the host. With a mobo max memory of 256GB, each VM could be allocated 11GB, leaving 25GB for the host. AMD enables you to have fun 🆒
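The arithmetic checks out; here's a quick sketch of that hypothetical allocation (all figures from the post, not from a real deployment):

```python
# Back-of-envelope check of the 21-VM Cassandra layout on a 3990X.
# The VM counts and sizes are the post's assumptions, not measured limits.
CPU_THREADS = 128    # 64 cores x 2 SMT threads per core
TOTAL_RAM_GB = 256   # motherboard maximum

vms = 21
threads_per_vm = 6
ram_per_vm_gb = 11

host_threads = CPU_THREADS - vms * threads_per_vm
host_ram_gb = TOTAL_RAM_GB - vms * ram_per_vm_gb

print(host_threads)  # 2 threads left for the host
print(host_ram_gb)   # 25 GB left for the host
```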
 
I work in the VFX industry; I've been at ILM, DNEG, MPC and Cinesite, which work on most of the blockbuster movies, and I can tell you this: Windows is definitely not the OS of choice. That would be Linux.
I do, however, currently work at a smaller VFX studio, and they use Windows.
Would your line of work actually enjoy using the 3990X, or would it just stick with something Intel again, due to the time and money lost swapping platforms?
 
I have also worked at Framestore, ILM, MPC, etc., and the idea of running Windows for VFX at that scale is seriously scary. I think it's fair to say 95%+ of VFX is on Linux, because only a few smaller houses run Windows, often with horrendous results.
 
Hypothetically, with 256 megabytes of L3 you could also have a 128-thread Monero miner.

My extrapolation from the 3970X: (28,900 hashes/second × 2) = 57,800 hashes/second × 0.9 (scaling won't be perfectly linear, due to the lower clock speeds) = 52,020 hashes per second.

Putting that into a Monero calculator with a 300-watt power draw for the system and $0.06 cost per kWh, we get $1,364 profit a year.

$3,990 / $1,364 = 2.9 years to recoup your investment.

https://www.cryptocompare.com/mining/calculator/xmr?HashingPower=52020&HashingUnit=H/s&PowerConsumption=300&CostPerkWh=0.06&MiningPoolFee=1

Comparing this to a GeForce 2080 Ti, we get a strangely similar 2.89 years to recoup its investment:
$1,300 / $1.23 a day = 1,057 days / 365 days = 2.89 years

With the 3950X clocking higher and being 35% cheaper per core, it would make more sense to use three 3950Xs in three separate rigs than one 3990X.
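The payback math above can be sketched in a few lines (all inputs are the post's assumptions — TDP-level power draw, static difficulty, and the quoted calculator figures — not measurements):

```python
# Rough Monero-mining payback estimate: 3990X vs. GeForce 2080 Ti.
hashrate_3970x = 28_900            # H/s, from the 3970X result
scaling = 0.9                      # doubling cores isn't perfectly linear
hashrate_3990x = hashrate_3970x * 2 * scaling

cpu_price = 3990                   # USD
cpu_profit_per_year = 1364         # USD/year, from the calculator link
cpu_payback_years = cpu_price / cpu_profit_per_year

gpu_price = 1300                   # USD
gpu_profit_per_day = 1.23          # USD/day
gpu_payback_years = gpu_price / gpu_profit_per_day / 365

print(round(hashrate_3990x))        # 52020
print(round(cpu_payback_years, 2))  # 2.93
print(round(gpu_payback_years, 2))  # 2.9
```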
 
"The 32-core 3970X outperforms the 3990X throughout this suite of tests, but we found that the workloads don't execute across both processor groups. That means the workload executes on 64 threads for both processors, giving the 3970X's higher clock speeds the upper hand. "


It sounds like a lot of workloads would benefit from disabling SMT
 
"The 32-core 3970X outperforms the 3990X throughout this suite of tests, but we found that the workloads don't execute across both processor groups. That means the workload executes on 64 threads for both processors, giving the 3970X's higher clock speeds the upper hand. "


It sounds like a lot of workloads would benefit from disabling SMT

Or start using Windows Server 2016...
 
Hypothetically, with 256 megabytes of L3 you could also have a 128-thread Monero miner.

My extrapolation from the 3970X: (28,900 hashes/second × 2) = 57,800 hashes/second × 0.9 (scaling won't be perfectly linear, due to the lower clock speeds) = 52,020 hashes per second.

Putting that into a Monero calculator with a 300-watt power draw for the system and $0.06 cost per kWh, we get $1,364 profit a year.

$3,990 / $1,364 = 2.9 years to recoup your investment.

https://www.cryptocompare.com/mining/calculator/xmr?HashingPower=52020&HashingUnit=H/s&PowerConsumption=300&CostPerkWh=0.06&MiningPoolFee=1

Comparing this to a GeForce 2080 Ti, we get a strangely similar 2.89 years to recoup its investment:
$1,300 / $1.23 a day = 1,057 days / 365 days = 2.89 years

With the 3950X clocking higher and being 35% cheaper per core, it would make more sense to use three 3950Xs in three separate rigs than one 3990X.
I'm like 97% sure you don't just buy a GPU or CPU by itself to mine. You need power supplies, motherboards, fans, etc. And 300W is not the power usage; it's the TDP. And you're forgetting that the network gets bigger over time, so you get diminishing returns. It's simply not worth it.
 
The Threadripper 3990X's performance gap is not enough to justify it over the 3970X, which I think is the one to buy.

It's a 64-core CPU. It's not for everyone. It's not a gaming CPU, and neither is it meant for everyday use.

To get its benefits, you need apps that can utilise these cores. And then you don't run Windows 10 Pro on it, either...

It will perform very well in highly specialised areas such as multi-core rendering, AI, virtualisation...
 
Hypothetically, with 256 megabytes of L3 you could also have a 128-thread Monero miner.

My extrapolation from the 3970X: (28,900 hashes/second × 2) = 57,800 hashes/second × 0.9 (scaling won't be perfectly linear, due to the lower clock speeds) = 52,020 hashes per second.

Putting that into a Monero calculator with a 300-watt power draw for the system and $0.06 cost per kWh, we get $1,364 profit a year.

$3,990 / $1,364 = 2.9 years to recoup your investment.

https://www.cryptocompare.com/mining/calculator/xmr?HashingPower=52020&HashingUnit=H/s&PowerConsumption=300&CostPerkWh=0.06&MiningPoolFee=1

Comparing this to a GeForce 2080 Ti, we get a strangely similar 2.89 years to recoup its investment:
$1,300 / $1.23 a day = 1,057 days / 365 days = 2.89 years

With the 3950X clocking higher and being 35% cheaper per core, it would make more sense to use three 3950Xs in three separate rigs than one 3990X.

No one in their right mind will buy this CPU for mining...
 
Threadripper 3990X performance gape is not enough to justify it over 3970x.
I agree. The difference is not gaping enough.

Of course, you should also consider what they said about NUMA vs. processor groups.

Some applications can span across both groups, but many cannot, and we have the added complexity of multiple NUMA nodes. That means we experience sub-par scaling in some workloads with both the 3990X and our server platforms.

On the 32-core version, it sounds like you could just put all the threads within a single processor group and avoid the whole mess.
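The group arithmetic behind that is simple: Windows caps a processor group at 64 logical processors, so core count × SMT determines how many groups an app must span. A quick sketch (the helper name is mine; the 64-per-group limit is the documented Windows behavior):

```python
import math

# Windows caps a processor group at 64 logical processors, so the
# number of groups a chip needs depends on cores x SMT threads per core.
GROUP_LIMIT = 64

def processor_groups(cores, smt_threads_per_core):
    logical = cores * smt_threads_per_core
    return math.ceil(logical / GROUP_LIMIT)

print(processor_groups(64, 2))  # 3990X with SMT on  -> 2 groups
print(processor_groups(64, 1))  # 3990X with SMT off -> 1 group
print(processor_groups(32, 2))  # 3970X with SMT on  -> 1 group
```

That's why disabling SMT on the 3990X (or just buying the 3970X) keeps everything in one group for software that isn't group-aware.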
 
Just three years ago, an eight-core $1,000 chip represented the best the industry had to offer on an HEDT platform,
Huh?

Broadwell E5 Xeons reached up to 22 cores. You could drop an E5-2699 v4 in a single-socket workstation board, if you wanted that many cores. That's certainly why they didn't take the E5-1600 series beyond 8 cores.

With the Skylake Scalable Series, they split off the server CPUs into a new socket, which is why I think they suddenly took the workstation/HEDT CPUs all the way up to 18 cores.


now we have up to 64 cores and 128 threads at our disposal, and AMD says it won't slow down as it shrinks to smaller process nodes. As crazy as it sounds, we'll see higher core counts in the future.
Yes, it does sound crazy. I think it doesn't make a lot of sense to scale up further, at this point.
 
I think it doesn't make a lot of sense to scale up further, at this point.

I really hope their engineers don't agree with you. I hope they double the cores every other year! Why? Because the more cores they max out at, the more cores the lower-priced CPUs will have. So, let them scale as much as possible. I would gladly take a 64c/128t CPU for $300.

Be careful, you could become a meme one day... remember these:

"Nobody will ever need more than 640k of RAM"
"There is no reason anyone would want a computer in their home."

😉
 
Be careful, you could become a meme one day... remember these:
Those are absolute statements, while mine is heavily qualified and well informed by the poor scaling we frequently saw in the data.

If you still want to add it to the list, I'm fine with that. In fact, let's see how it compares:

"Nobody will ever need more than 640k of RAM"​
"There is no reason anyone would want a computer in their home."​
"I think it doesn't make a lot of sense to scale up further, at this point."​

If you want memes, there's one of Oprah that I think works better ("YOU GET A CORE"), but I can't seem to link it from here.
 
Who is"we"? Why would they charge less on something that is a niche product with no real competition? If you want better bang for buck, get a 3700X.

Whos "we" for sure not you lol ... I need the most core one can get for lowest price, I use Handbreak 24/7 on file converting machine non stop ... and I get paid for converting the movie files into HEVC
 
Be careful, you could become a meme one day... remember these:

"Nobody will ever need more than 640k of RAM"
"There is no reason anyone would want a computer in their home."

😉

The first one, attributed to Bill Gates, was never said by him. The second statement loses all context when printed on its own. Ken Olsen gave a speech in the late '70s about how people would not want a centralized, all-controlling computer (think HAL 9000) in their home. That is as true today as the day he said it. At the time he gave the speech, he had computers in his home that his family used, which were known as personal computers and eventually evolved into the modern PC. He was not talking about those, just a very specific type of computer.
 