[News] AMD vs Intel 2020: Who Makes the Best CPUs?

I consider myself an enthusiast, but I'm a developer who doesn't have time to experiment with hardware and worry about whether it will fail. For instance, I've used PBO and been impressed, but I disabled it to head off future problems.
If the article is simply comparing silicon from the two biggest players, as it says it is, then to be fair the 3970X and 3990X should have been included, along with benchmarks that don't rely on graphics.
IMHO Intel has clearly made overclocking more of a thing, because they have to.
btw: my only regret with the 3990X is that I didn't trust AMD enough to go all the way to Epyc for increased memory support. tomshardware.com has covered Epyc quite a bit. How many gamers use Epyc processors?

Again, you would be better off at Phoronix. TH is an enthusiast site, and the vast majority of enthusiasts (themselves a small sliver of the market) live in the mainstream and HEDT world. You are comparing server- and workstation-class products, which are only a small sliver of the enthusiast world.

Yes, TH covers almost everything, but have you ever noticed that the majority of reviews cover mainstream and HEDT products and include many game benchmarks? Or that the motherboard reviews tend to cover mostly enthusiast-class products? I can't remember the last time TH reviewed a 28-core Xeon server CPU.

https://www.tomshardware.com/reviews

If you look at that and still think this site is anything but an enthusiast site, then I can't help you.

And again, Intel made overclocking a thing because that's what people wanted. It's why the Q6600 G0 was one of the best CPUs of the late 2000s (a free OC to 3GHz). Hell, we had articles on how to unlock the disabled cores on Phenom II X3 CPUs and how to get the best overclocks. Our forums are filled with overclocking guides for both Intel and AMD: which components do best, how to cool properly, and so on. It was not Intel pushing it; it was enthusiasts pushing it, and we have been forever. Much like with a car, we are never happy with "stock" performance. We want to tune and get the best we can, squeeze out every last drop. It's why there are LN2 extreme OCers on both sides.

We are just slightly crazy.
 

CerianK

Distinguished
We are just slightly crazy.
I get where both you and Ronald are coming from.
My first overclock doubled the rated speed of the Intel CPU. It was hand-picked from dozens of candidates. Simultaneously, the FPU was running async at 25% faster than the CPU... right at its max rating of 20 MHz. I used that setup for a few years to create graphics for short music videos. That is enthusiast, by any definition. Some of you may have been around long enough to know exactly which Intel CPU and non-Intel FPU I am talking about. Only the most advanced would hazard a guess at the manufacturer of the passive-backplane CPU card.

Nowadays, if I dare overclock my servers, they will crash within a week of starting a month-long job... so I back them down before starting anything big. As an experienced enthusiast, I know this is necessary.

Also, having been a moderator for years on what was once one of the biggest enthusiast programming websites on the internet, I have some thoughts about promoting inclusivity.
 

Jim90

Distinguished
AMD does not "make" anything. AMD has turned over Intel's cross-licensed patents to TSMC (aka Chinese government) and in return TSMC makes CPUs.
AMD is a shell organization that provides little (or no) value to the development of CPU technology.

-->What an utterly stupid, STUPID thing to say.
-->In deliberately whitewashing the truth, you, sir, earn the 'Dumbass of the Year' badge.
 

Jim90

Distinguished
The issue with the AMD Ryzen 3000 CPUs is the need to overclock the RAM to 3800 with the best B-die so you can have the tightest timings possible. With the EDC bug my 3800x will perform like a 9900ks. Not as fast, but so close it does not matter. The same can be done with the 3950x, and this makes it the fastest CPU money can buy. The issue with the 3950x is that it's not really a gaming CPU, but with the EDC bug it becomes a monster. With the EDC bug you can have a multi-thread clock of 4.6GHz with the 3800x in games. https://www.overclock.net/forum/13-amd-general/1741052-edc-1-pbo-turbo-boost.html

.....
.....

--> I don't need to quote everything.
In summary: the expensive and extremely power-hungry 9900K will gain you a few extra gaming fps - AT 1080p! - BUT increase the screen resolution above 1080p and guess what happens, in practice, to any differential you might see? It disappears! And when the average gamer buys an expensive CPU like this, guess which aspect of a monitor they'd be looking to 'increase' - yup, resolution.
ONLY professional gamers will see a use case for the 9900K. For the vast majority of us non-professionals, Zen 2's many significant advantages over an Intel CPU matter far more.
 

zx128k

Reputable
--> I don't need to quote everything.
In summary: the expensive and extremely power-hungry 9900K will gain you a few extra gaming fps - AT 1080p! - BUT increase the screen resolution above 1080p and guess what happens, in practice, to any differential you might see? It disappears! And when the average gamer buys an expensive CPU like this, guess which aspect of a monitor they'd be looking to 'increase' - yup, resolution.
ONLY professional gamers will see a use case for the 9900K. For the vast majority of us non-professionals, Zen 2's many significant advantages over an Intel CPU matter far more.

My 3800x is faster than most reviewers' 9900k samples. It even matches a stock 9900ks in some reviews. That means only a few overclocked 9900k builds are faster. Mine is massively faster than a stock 9900k, even at 1080p.
 
Intel does hold a slight edge in gaming at the very top end, but even then, the benefits of AMD CPUs outside of that easily outweigh such a slight lead. They have a better upgrade path too, as AMD promises existing motherboards will continue to work with new AMD chips in 2020.
 

zx128k

Reputable
Intel does hold a slight edge in gaming at the very top end, but even then, the benefits of AMD CPUs outside of that easily outweigh such a slight lead. They have a better upgrade path too, as AMD promises existing motherboards will continue to work with new AMD chips in 2020.

What gaming edge does Intel have at the top end?



https://www.tomshardware.com/reviews/intel-core-i9-9900ks-special-edition-review/6

My 3800x with a 3800 MT/s CL15 RAM OC and an overclocked RTX 2080 averages 174.910 FPS in Fraps. That's with stock cores; the only OC is on the RAM side, with the IF @ 1900MHz. My 3600 MT/s 8-pack B-die kit can't even do CL14. Basically I get away with this OC because AMD's stock boost algorithm keeps it all stable. I hope forever.

In the benchmark, Fraps was started as close as possible to the moment the counter began incrementing and stopped as close to the end as humanly possible.

The 9900k @ 5GHz with 3600 MT/s RAM has an RTX 2080 Ti; its average score is 176.3 FPS. All the CPU scores in the bar chart were run with an RTX 2080 Ti.

Intel has a 1.4 FPS lead with a faster GPU. Or I did something wrong.
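That gap works out to well under one percent. A minimal sketch checking it, using only the two averages quoted in this post (the helper name is my own):

```python
# Percentage lead of the 9900k/RTX 2080 Ti average over the 3800x/RTX 2080
# average, both numbers taken from this post. Function name is hypothetical.
def pct_lead(baseline: float, other: float) -> float:
    return (other - baseline) / baseline * 100.0

ryzen_avg = 174.910  # 3800x, RAM OC, RTX 2080 (Fraps average)
intel_avg = 176.3    # 9900k @ 5 GHz, RTX 2080 Ti (chart average)

print(f"{pct_lead(ryzen_avg, intel_avg):.2f}% lead")  # ~0.79% lead
```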

[bar chart: CPU gaming averages from the Tom's Hardware 9900KS review, all with an RTX 2080 Ti]


Here's my 3800x with an RTX 2080 vs. the mighty 9900ks with 3600 MT/s RAM and an RTX 2080 Ti.



My 3800x OC and RTX 2080 OC nets 147 FPS at 1080p ultra, with nice 114 FPS lows.

World of Tanks enCore RT



46115 / 166.66 = 276.70 FPS. Not sure about this one; I was expecting Intel to win hands down.
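For clarity, a one-liner showing the conversion used above. I'm assuming, as the post does, that dividing the enCore score by 166.66 yields average FPS; the constant comes from this post, not from any official documentation:

```python
# World of Tanks enCore score-to-FPS conversion as used in this post.
# The 166.66 divisor is taken from the post itself (assumption).
score = 46115
print(f"{score / 166.66:.2f} FPS")  # 276.70 FPS
```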

Shadow of the Tomb Raider

1080p

The 9900k system (Z370):
  • CPU: Intel Core i9-9900k @ 5GHz
  • Memory: 16GB G.Skill Trident Z DDR4 3200
  • Motherboard: EVGA Z370 Classified K
  • Storage: Crucial P1 1TB NVMe SSD
  • PSU: Cooler Master V1200 Platinum
https://wccftech.com/shadow-of-the-tomb-raider-ray-traced-shadows-and-dlss-performance/

[chart: Shadow of the Tomb Raider, RTX 2080, 1080p (wccftech)]

127 FPS @ 1080p



3800x OC + RTX 2080: 141 FPS.

For testing we’ve used our usual gaming rig comprised of an Intel Core i9-9900K clocked at 5 GHz with 32GB of DDR4-3400 memory.

4K

[chart: Shadow of the Tomb Raider, 4K (wccftech)]




3800x OC + RTX 2080: 56 FPS average, 50 FPS 95th percentile.

Assassin's Creed Origins

[chart: Assassin's Creed Origins, 1080p, from the GamersNexus 9900k review]

https://www.gamersnexus.net/hwrevie...paste-delid-gaming-benchmarks-vs-2700x/page-4



3800x RAM OC + RTX 2080: 132 FPS average, 78 FPS 0.1% low.

Time Spy CPU
https://www.tomshardware.com/uk/reviews/amd-ryzen-7-3800x-review,6226-2.html


[chart: Time Spy CPU scores from the Tom's Hardware 3800X review]

9900k @ 5GHz with DDR4-3466

Fastest run 3800x OC


Best out of 15 runs; the 3800x boost behaves oddly in 3DMark. (The run-to-run spread is summarized in the sketch after this list.)
#1 11508
#2 11373
#3 11373
#4 11373
#5 11573
#6 11506
#7 11526
#8 11504
#9 11503
#10 11495
#11 11510
#12 11525
#13 11379
#14 11527
#15 11361
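A quick way to see how tight those 15 runs actually are (scores copied from the list above):

```python
# Run-to-run spread of the 15 Time Spy CPU scores listed above.
runs = [11508, 11373, 11373, 11373, 11573, 11506, 11526, 11504,
        11503, 11495, 11510, 11525, 11379, 11527, 11361]

mean = sum(runs) / len(runs)
spread = max(runs) - min(runs)
print(f"mean {mean:.0f}, min {min(runs)}, max {max(runs)}, "
      f"spread {spread} pts ({spread / mean:.1%})")
# mean 11469, min 11361, max 11573, spread 212 pts (1.8%)
```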

Dialing in the CPU.



If the ambient temperature rises to 21C, scores drop to a maximum of 11400 in Time Spy CPU (https://www.3dmark.com/spy/11936283), but if it drops to 5C, the Time Spy CPU score rises to a maximum of 11600 (https://www.3dmark.com/spy/11990955).



Highest score, at 6C ambient: 11611 Time Spy CPU (https://www.3dmark.com/spy/12006927). 3DMark's SystemInfo sometimes crashes and causes issues with the score. This is not the normal average and should be taken as a rare event; add a chiller to the water loop and this 'could' be the norm. The main takeaway is that ambient temps don't affect game performance that much. The 3800x will happily hit 4.4GHz or more. It's also cheaper to overclock to 4.4GHz for a great Time Spy CPU score, or to use the EDC bug, than to buy a chiller.
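Taking the two data points above at face value (max 11400 at 21C, max 11600 at 5C), the score moves roughly 12.5 points per degree of ambient; a back-of-the-envelope check:

```python
# Rough sensitivity of the Time Spy CPU score to ambient temperature,
# from the two (score, temperature) points quoted above.
score_warm, t_warm = 11400, 21.0  # max score at 21 C ambient
score_cool, t_cool = 11600, 5.0   # max score at 5 C ambient

pts_per_deg = (score_cool - score_warm) / (t_warm - t_cool)
print(f"~{pts_per_deg:.1f} Time Spy CPU points per degree C")  # ~12.5
```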

Got to love 3DMark, where the 9900k gets faster as time goes on. This is a more likely 9900k overclock. Love how the stock 9900k is faster by 1000 points, and all the overclocked 9900k entries are 5GHz all-core. 5.1GHz 9900ks chips run at 1.4V vcore.
9900K binning statistics (Silicon Lottery):
  • 5.00GHz core / 4.80GHz AVX @ 1.300V - top 30% of chips
  • 5.10GHz core / 4.90GHz AVX @ 1.312V - top 5% of chips
https://siliconlottery.com/pages/statistics


A stock 9900k is about the same as a stock 3800x, so in this chart we have a big overclock.

Test System bit-tech.net
  • Motherboard MSI MEG Z390 Ace
  • Memory 16GB (2 x 8GB) Corsair Vengeance RGB Pro 3,466MHz DDR4
• Graphics card Nvidia RTX 2070 Super
  • PSU Corsair RM850i
  • SSD Samsung 970 Evo
  • CPU cooler EKWB Phoenix MLC-240
  • Operating system Windows 10 64-bit
See how the 9900k loses to the 3800x OC, but the 3800x equals the 9900ks at stock. Then the 9900ks pulls away with overclocking. Good luck getting a 9900ks, btw.



Cinebench R20

Single Thread


Multi-Thread



With the 3800x you need good ventilation in the room; an open window works well. Scores can drop as low as 5140 if the room has no ventilation. Outside ambient was down at 6C for that run.



The scores ensure that you remain on a par with the stock 9900ks, and beat it with a room that has lower temps and/or good ventilation. The outside temp was 16C for that result.

CPU-Z
https://valid.x86.fr/5x63d4

Just imagine this system with an RTX 2080 Ti, given I am trading blows while the 3800x system has just an RTX 2080. It would be nice to bench if only I could afford an RTX 2080 Ti. The AMD 3800x is the best 8-core gaming CPU (tongue in cheek, but also true to an extent, depending on the silicon lottery and the time you put in). A correctly overclocked 3800x system is no sloth in gaming performance. Only the more extreme 9900k/ks overclocks match a 3800x with a RAM overclock and very tight timings in games.

This is without adding the all-core 4.4GHz overclock on top. If I do, my Time Spy score is ~11800, Cinebench R20 is well above 5300, and CPU-Z is ~6100 multi-core.

With the EDC bug it's ~11600-11750 in Time Spy CPU and approx. 5300 in Cinebench R20.

Game scores are not massively changed; it really is not worth overclocking the cores manually for gaming. Maybe an RTX 2080 Ti would show a difference.

EDC is better, but it has more effect on non-gaming benchmarks. If you use the EDC bug with scalar x10 and auto-OC +200MHz, then things do change: you can hit 4.6GHz all-core in game.
View: https://www.youtube.com/watch?v=4_zjsbavV7M
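To collect the recipe from this post in one place, here is a sketch of the PBO fields involved, written as a plain Python dict purely for readability. These are BIOS settings, not a software API; only the EDC = 1, scalar x10, and +200MHz values come from this thread, and the field names and defaults are my own shorthand:

```python
# Sketch only: shorthand for AM4 BIOS "Precision Boost Overdrive" fields,
# not a programmable interface. EDC = 1 is the "EDC bug" described above.
pbo_edc_bug = {
    "PPT_W": None,       # package power limit; board default (assumed)
    "TDC_A": None,       # sustained current limit; board default (assumed)
    "EDC_A": 1,          # the "bug": an out-of-range limit that lifts boost
    "scalar": 10,        # PBO scalar x10, as named in this post
    "auto_OC_MHz": 200,  # +200 MHz maximum-boost override, as named here
}
```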


 

CerianK

Distinguished
With the 3800x you need good ventilation in the room; an open window works well. ... The outside temp was 16C for that result.
With the 3800x, there are of course other very important factors as well.
However, 16C might also coincide with much lower humidity, amplifying the effect, especially if the room was at much higher humidity to start with.
See here: https://physics.stackexchange.com/questions/199350/air-thermal-conductivity-vs-humidity.
That said, I would still try to keep the humidity no lower than the 10-20% range, as increasing static potential is not a good idea (just to be careful).
 

zx128k

Reputable
With the 3800x, there are of course other very important factors as well.
However, 16C might also coincide with much lower humidity, amplifying the effect, especially if the room was at much higher humidity to start with.
See here: https://physics.stackexchange.com/questions/199350/air-thermal-conductivity-vs-humidity.
That said, I would still try to keep the humidity no lower than the 10-20% range, as increasing static potential is not a good idea (just to be careful).

Where I live it's 15C on average in May, with humidity 50%+, and it rains 10-11 days most months. The main issue is a heat zone around your PC; you don't want that if you are running Time Spy CPU. Games will hit 4.4GHz regardless of the ambient temps, more or less maxing out the GPU (RTX 2080).
 

Awev

Reputable
Just a couple of things to think about as you read the review:

1. You should only be overclocking the top-end products. There is no need to tell us about overclocking unless you are talking about the top-end CPUs, such as the Core i9 or Ryzen 9. Otherwise, just purchase what you need.

2. Films are recorded at 24 FPS, TV shows at 29.97 FPS, and acceptable (playable - really?) gaming is at 30 FPS (frame-time equivalents are sketched at the end of this post). Want a game that can hit 300+ FPS? Then you want one that has been properly compiled for Linux, such as Team Fortress. Steam's hardware survey shows that most systems in the latest survey use a 1080p display driven by a GTX 1060, so a Ryzen-based APU would meet the needs of more than 60% of respondents.

3. Intel's marketing VP Jon Carvill challenged AMD to real benchmarks - games - back at E3 in 2019, just before the Ryzen 3000 series launched. Dr. Su, AMD's CEO, accepted the challenge. In a video message to the COMPUTEX event, posted to YouTube on May 28, 2020, Intel CEO Bob Swan states, starting at the 4 minute 39 second mark, that we should do away with benchmarks.

4. Who only does one thing at a time with a computer?
4a. While I don't live stream my gameplay, I do record a lot of it. While recording I can have a browser open on a second screen and see where someone is having trouble, or has a question on how to do something, and make sure I cover it in my video. When I am done editing the video I let the program encode it while I am off to cyberspace in a web browser.
4b. When working on a weekly program and bulletin for church I have a couple tabs open in my web browser so I can find new, royalty free clip-art for the cover, pull in some inspirational comments, and update the abbreviated directory on the last page, all while I have LibreOffice Writer open, along with LibreOffice Math so I can use it as a database.
4c. Who here does not multitask? Really? So, you are a professional eSports player, and you use your computer for nothing else? Sounds like you need a game console instead. Oh, wait: even the newest generation of game consoles cannot match a properly specced PC that can stream the gameplay live, run a Discord/Twitch/YouTube Live chat at the same time, record it for later use, and Skype with someone halfway around the world, all at once. (Sorry Intel, you are only good for gaming, not multitasking or productivity.)

5. Around the same time Intel was challenging AMD to real benchmarks, such as games, it released a list of the top 150 or so programs that people use on a daily basis - except it omitted most of them, cherry-picking which ones to include. Intel claims that Cinebench is used by less than 1% of people, yet will not tell you what the top three programs are; the list just starts at #4 with the Chrome browser.

6. In the very old days, when mainframes ruled the computing world, you would start by defining what you wanted to do. Then you found software that fit your needs - and hired a programmer to fine-tune it to your specs. Only then would you purchase the hardware to run the program. You had to be aware of every penny, as you only had terminals tying into the mainframe; you did not have PCs or laptops as we do today. Maybe we should take a look back at this and see if we can learn a lesson from it - that way you only pay for what you need, and when we see reviews we go off Intel's new standard of no benchmarks, just what feels all warm and fuzzy, um, what you actually need and what the hardware is able to deliver.

If the only thing you want to do is game, then get a console and don't worry about the best CPU; you will end up saving some money this way. Unless you truly want the best - then break the bank and spend thousands. For the rest of us, unless you are doing something that demands more muscle than a spreadsheet, most four-core/eight-thread CPUs will serve our needs, even an APU for those who only look at Facebook or check Craigslist and email.
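As promised in point 2, the frame-time equivalents of those rates, using frame time (ms) = 1000 / FPS:

```python
# Frame-time equivalents of the frame rates mentioned in point 2.
for fps in (24, 29.97, 30, 300):
    print(f"{fps:>6} FPS -> {1000 / fps:5.2f} ms per frame")
# 24 -> 41.67 ms, 29.97 -> 33.37 ms, 30 -> 33.33 ms, 300 -> 3.33 ms
```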
 

zx128k

Reputable
An AMD Ryzen 3300x, overclocked, with tightened RAM timings and a higher IF (RAM = IF), is enough for good performance in games with a 2070. The 4K performance drop with the 2080 Ti is small vs. an 8086K @ 5GHz.
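A minimal sketch of what "RAM = IF" means in practice, assuming the usual 1:1 MCLK:FCLK coupling on Ryzen 3000 (the helper function is mine):

```python
# With a 1:1 ratio on Ryzen 3000, the Infinity Fabric clock (FCLK)
# equals the memory clock: half the DDR4 transfer rate.
def fclk_mhz(mt_per_s: int) -> float:
    return mt_per_s / 2  # DDR = two transfers per clock

for kit in (3200, 3600, 3800):
    print(f"DDR4-{kit}: MCLK = FCLK = {fclk_mhz(kit):.0f} MHz")
# DDR4-3800 -> 1900 MHz, matching the "IF @ 1900MHz" quoted earlier
```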



 
4. Who only does one thing at a time with a computer?
4a. While I don't live stream my gameplay, I do record a lot of it. While recording I can have a browser open on a second screen and see where someone is having trouble, or has a question on how to do something, and make sure I cover it in my video. When I am done editing the video I let the program encode it while I am off to cyberspace in a web browser.
4b. When working on a weekly program and bulletin for church I have a couple tabs open in my web browser so I can find new, royalty free clip-art for the cover, pull in some inspirational comments, and update the abbreviated directory on the last page, all while I have LibreOffice Writer open, along with LibreOffice Math so I can use it as a database.
4c. Who here does not multitask? Really? So, you are a professional eSports player, and you use your computer for nothing else? Sounds like you need a game console instead. Oh, wait: even the newest generation of game consoles cannot match a properly specced PC that can stream the gameplay live, run a Discord/Twitch/YouTube Live chat at the same time, record it for later use, and Skype with someone halfway around the world, all at once. (Sorry Intel, you are only good for gaming, not multitasking or productivity.)
View: https://www.youtube.com/watch?v=GrqvFM8WN3c

What's your point here? As long as you have enough RAM you can have as much stuff open as you like. You don't need lots of cores and threads to multitask unless it's for a very specific reason.
Streaming while using Discord etc. will mess up your internet connection, and that one is down to your ISP more than your CPU.
3. Intel's marketing VP Jon Carvill challenged AMD to real benchmarks - games - back at E3 in 2019, just before the Ryzen 3000 series launched. Dr. Su, AMD's CEO, accepted the challenge. In a video message to the COMPUTEX event, posted to YouTube on May 28, 2020, Intel CEO Bob Swan states, starting at the 4 minute 39 second mark, that we should do away with benchmarks.

6. In the very old days, when mainframes ruled the computing world, you would start by defining what you wanted to do. Then you found software that fit your needs - and hired a programmer to fine-tune it to your specs. Only then would you purchase the hardware to run the program. You had to be aware of every penny, as you only had terminals tying into the mainframe; you did not have PCs or laptops as we do today. Maybe we should take a look back at this and see if we can learn a lesson from it - that way you only pay for what you need, and when we see reviews we go off Intel's new standard of no benchmarks, just what feels all warm and fuzzy, um, what you actually need and what the hardware is able to deliver.
Number 6 is exactly what Bob Swan said in number 3... stop looking at benchmarks (which only show you what the benchmarker considers important) and start looking at what you actually need to do your job.
"Shift from benchmarks to benefits and impacts"
So it's a bad idea if Bob had it, but it's a great idea if you had it?!
 

Awev

Reputable
View: https://www.youtube.com/watch?v=GrqvFM8WN3c

What's your point here? As long as you have enough RAM you can have as much stuff open as you like. You don't need lots of cores and threads to multitask unless it's for a very specific reason.
Streaming while using Discord etc. will mess up your internet connection, and that one is down to your ISP more than your CPU.

Additional RAM is nice, and helps. Glad you found a video that shows things can be done on an Intel chip. I can do fifty tasks by myself and, depending on what they are, I can become overwhelmed. If I act as a task manager and hire a number of people to help me, then things move along more smoothly. Same thing with core and thread counts: you are able to spread the workload over a larger work force. Yes, RAM means you don't have to access storage as much, yet it is not the answer for everything. Nor are more cores and threads if you are only doing one thing. Some games utilize modern equipment better than others, while some games are stuck in the past, supporting hardware that is 10+ years old - and choke on today's CPUs. So many reviews state that Intel is a one-trick pony, and if you want to do something more than game at the same time, consider the Ryzen equivalent.

Number 6 is exactly what Bob Swan said in number 3... stop looking at benchmarks (which only show you what the benchmarker considers important) and start looking at what you actually need to do your job.
"Shift from benchmarks to benefits and impacts"
So it's a bad idea if Bob had it, but it's a great idea if you had it?!

You are missing an important thing here. Intel wanted you to look at the gaming benchmarks at E3, per Jon Carvill a year ago, and disregard productivity numbers. Now that AMD and the Ryzen chips are so much better, trading blow for blow with the equivalent Intel chips in gaming, Bob Swan is saying let's forget benchmarks - the unspoken part being that AMD provides better value no matter which benchmark you measure with. And how do you determine "benefits and impacts"?

I am not saying it is a bad idea; I have used that idea countless times - for example, a firewall I built from an old computer with a custom Linux distro. What I am saying is that it is MY opinion that Bob wants to do away with benchmarks because they are not flattering to his products at this point in time - my opinion.
 

martinigm

Distinguished
I was (and still am) the owner of an i9-7940X/1080Ti, but I needed another machine at my ex-wife and son's place. So a couple of weeks ago I built a Ryzen 5 3600/2070 Super... so I'm neither an Intel nor an AMD fanboi... and the really cool thing is: just a few years ago, writing an article called "AMD vs Intel" would have been just... stupid... because Intel was OP (overpowered).

And the fact that we can even have an AMD vs Intel discussion today... introduces a third winner... us... the end users... I mean... blah blah blah AMD, blah blah blah Intel... cores... Hz... 10 FPS +/-, watts... +/- 10 seconds to convert a video file... cool, bro...

But WE are the real winners... not AMD or Intel… :hot: and I hope there will be competition between AMD, Intel, Nvidia and maybe even others down the road...

So choose HW from the company you like... and let us hope for even more competition in the future 🍿
 
But WE are the real winners... not AMD or Intel… :hot:
Aha, yeah, sure - what will you do with your 10bil (above normal) winnings?
We do finally get CPUs with a lot of cores for lowish prices, but we are not the ones winning here; almost nobody needs a render farm at home, and only those people get any real benefit from these new CPUs.
 

Arbie

Distinguished
AMD Ryzen "overclocks" automatically, so of course there is no extra room (or need) for manual overclocking !

Yet again - amazingly - you take an AMD strength and turn it into an Intel "win"! Don't you ever tire of pulling the same trick on your readers?

All but a tiny fraction of enthusiasts will be happy to have the smarter silicon so they no longer need to overclock manually. Yet for the niche of a niche of a niche that wants to OC just to be doing it, you award Intel an advantage, giving it the same importance as AMD's major accomplishments - such as their auto-overclocking, which is not even credited.

In every single round-up you find trivial things to include or major things to omit, just to keep Intel looking competitive. Intel is constantly described as hitting high points, while AMD is presented as an all-round compromise. Gotta wonder why.
 
AMD Ryzen "overclocks" automatically, so of course there is no extra room (or need) for manual overclocking !
No, it doesn't.
They have turbo profiles just like most Intel or AMD CPUs of the last decade; they push the CPUs to what the manufacturer deems safe.
But who are you even talking to? The article sings the praises of AMD's overclocking potential.
 

Arbie

Distinguished
But who are you even talking to? The article sings the praises of AMD's overclocking potential.
NO. Having begged the question that "manual OC" is important in the first place, Tom's proceeds to award a category win to Intel.

"Winner: Intel. When it comes to AMD vs Intel CPU overclocking, Team Blue
has far more headroom and much higher attainable frequencies."

That bullet is the money shot; the takeaway. It's what the entire discussion was crafted to create. And even if people read on, who wants "easier for beginners" stuff when they can get cutting edge?

Do you really not understand how this is done? You need a class in disinformation.
 

Dave Haynie

Distinguished
AMD does not "make" anything. AMD has turned over Intel's cross-licensed patents to TSMC (aka Chinese government) and in return TSMC makes CPUs.
AMD is a shell organization that provides little (or no) value to the development of CPU technology.
I think you haven't the slightest clue how CPUs are designed. Do you also believe that TSMC is entirely responsible for the iPhone? That Apple doesn't "make" anything because it has other companies build its designs? How about Nvidia, Qualcomm, and almost every other chip company - do they not "make" anything? Is it only Intel, Samsung, and all the analog guys who do?
 

Dave Haynie

Distinguished
I've built or rebuilt three PCs in the past year and change. My Intel socket 2011 machine died in May of 2019, and the only logical replacement for it was a Threadripper system. That's my machine for working at home (I was so damn prescient, eh?), photography, music, and video.

I also had a music computer in my studio, mainly used for recording. My old PC was a 1U rackmount with a 65W processor and fans... too noisy. I rebuilt it around a 35W AMD part with passive cooling, counting on the on-chip graphics. This weirdly also turned into my main Zoom conferencing PC in pandemic times, as my studio is small but quiet. Again, AMD was the right choice.

And again... I just built a work computer for a vacation house. Once again, the price-performance of an AMD 8-core/16-thread at 65W, for a small case, with fast PCIe links for multiple SSDs and an Nvidia low-profile GPU - all good stuff.

I could have done it all with Intel. But AMD was better at every step, for what I needed.
 