Question I always had Intel CPUs. Convince me about AMD.

Hi there!

So I'm looking to build a new rig for gaming. Playing games at 1080p (21:9, 144Hz monitor), and possibly 1440p later on, will be the focus. I'm planning to use this setup for the next 4-5 years and I'm thinking of spending quite a lot of money, so high-end but not NASA-killer setups are in the running (i5-9600K, i7-9700K, RTX 2080-ish GPU).

I've always had Intel processors, but I'm aware that AMD has once again become a heavy hitter, the real deal in the business. I'm glad about that, but it's hard to shake my initial discomfort at changing manufacturer and platform. However, I'm willing to do it if there are more than enough valid reasons behind it. Emotions play a part, and I surely can't really go wrong with an i7 or i5, nor with a Ryzen 5 (3600 or 3600X) or Ryzen 7 (3700X).

So please, try to convince me to buy a new rig built on AMD! But please keep in mind: framerate and longevity are the two key factors. I'm not really interested in things like faster video exports or faster WinRAR times :D

Thank you :)
Well, this thread has a lot of replies and I haven't looked through them all yet. But for 1080p 144Hz you'd probably be fine with a lower-end graphics card than an RTX 2080; I'd say maybe even an RTX 2060 would do the trick.

OK, now about the AMD processor. Framerate is honestly going to be a little lower on a Ryzen 5 3600X because it can't hit the same clock speeds as the Intel part. But would the frame rate really matter that much if it's already very close to, or over, your monitor's refresh rate? Not really. In a blind test you couldn't tell the difference between the AMD part and the Intel part. As far as longevity goes, you're better off with AMD because (1) they offer double the maximum core count on mainstream platforms that Intel does, and (2) they support upgrades on the same motherboard a lot longer than Intel does. AMD is ahead now in IPC, and as Intel stays with the now-ancient 'Lake' microarchitecture, the IPC gap will widen to the point where it won't matter that Intel can hit 5GHz and AMD can't. I'd say we're almost there already. AMD CPUs perform excellently in gaming now, and when you actually hit the GPU with a good load, the GPU will most likely be the bottleneck, which is ideal for gaming.
 
As always, the answer depends on what you want to do with it. If you really are only going to play games on it and don't mind going for the worse value proposition, go for the Intel part. If you are going to hit it with literally any other workload, the AMD chip is far superior. For what it's worth, I switched from Intel to AMD a few months ago and I have had zero issues and zero regrets.
 
.... For what it's worth, I switched from Intel to AMD a few months ago and I have had zero issues and zero regrets.
That hits the target. But one takeaway I have from scanning the foregoing posts: whichever way you go, you'll have an equally usable and even enjoyable experience. One thing you get with the AMD choice, though, especially today, is value.

You can put a 3700X on just about any decent B450 motherboard, with its stock cooler and a high-end GPU, and get performance that is literally indistinguishable, and therefore equally enjoyable for gaming, compared to a 9900K (whatever variant) that will require overclocking, and therefore a high-end motherboard and high-end cooling. With the same high-end GPU on either, that's a value proposition no reasonable person would ignore.

But more to the heart of the argument I make for AMD: only AMD has consistently produced innovative processors that (Bulldozer aside) move the market forward. In contrast, Intel consistently stagnates development whenever it can.

Further, AMD and only AMD has had a consumer-, market- and even industry-friendly approach in the face of the blatant, industry-destroying monopolistic practices Intel resorted to whenever it was confronted with superior AMD products. It's really those practices that inhibited AMD's ability to gain traction in the market and resulted in Intel's undeserved reputation as an 'innovator'.

Why anybody would want to continue rewarding Intel with business, especially in an era of such close performance parity, just astounds me. It simply makes no sense.

ADD: maybe that wasn't "'nuf said" up above....
 

joeblowsmynose

Distinguished
Jun 14, 2011
Well, this thread has a lot of replies and I haven't looked through them all yet. But for 1080p 144Hz you'd probably be fine with a lower-end graphics card than an RTX 2080; I'd say maybe even an RTX 2060 would do the trick.

OK, now about the AMD processor. Framerate is honestly going to be a little lower on a Ryzen 5 3600X because it can't hit the same clock speeds as the Intel part. But would the frame rate really matter that much if it's already very close to, or over, your monitor's refresh rate? Not really. In a blind test you couldn't tell the difference between the AMD part and the Intel part. As far as longevity goes, you're better off with AMD because (1) they offer double the maximum core count on mainstream platforms that Intel does, and (2) they support upgrades on the same motherboard a lot longer than Intel does. AMD is ahead now in IPC, and as Intel stays with the now-ancient 'Lake' microarchitecture, the IPC gap will widen to the point where it won't matter that Intel can hit 5GHz and AMD can't. I'd say we're almost there already. AMD CPUs perform excellently in gaming now, and when you actually hit the GPU with a good load, the GPU will most likely be the bottleneck, which is ideal for gaming.
An RTX 2060 would be completely GPU-bottlenecked with any modern gaming CPU; the CPU would never be the bottleneck, removing it from the equation entirely.

The question of which CPU will game better only ever applies to the highest tier of graphics cards, namely the 2080 series. Otherwise the comments would be full of the sentiment that it makes absolutely NO difference on a 2060.

Again, most people don't realize that you only really need to worry about the CPU (as long as it's a modern model with more than 4 threads) if you have a 2080-series card and a high-refresh monitor. If you don't, spending more money on the most expensive CPU thinking it will improve your gaming is like flushing money down the toilet.

If a 2060 is the card in mind, the $200 R5 3600 will game just as well as a $500 9900K, again because you are completely GPU-bound. Considering this build is for gaming, that would equate to taking $300 and flushing it down the sewer, when you could have put that $300 toward upgrading the GPU to a 2070 Super and gotten far better gaming performance with significantly higher FPS.
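The GPU-bound argument above can be sketched as a toy model: the delivered frame rate is roughly capped by the slower of the two components, so past the crossover point extra CPU speed buys nothing. All the fps figures below are hypothetical round numbers for illustration, not benchmarks.

```python
# Toy model of the CPU/GPU bottleneck argument: the delivered frame
# rate is capped by whichever component is slower. All fps figures
# here are made-up round numbers, not measured results.

def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Frames only reach the screen as fast as the slower stage runs."""
    return min(cpu_fps, gpu_fps)

# Pretend a 2060-class card manages ~110 fps at 1080p, a $200 CPU can
# feed ~160 fps and a $500 CPU ~190 fps:
cheap = delivered_fps(cpu_fps=160, gpu_fps=110)
pricey = delivered_fps(cpu_fps=190, gpu_fps=110)
print(cheap, pricey)  # both land on 110: the extra CPU money is wasted
```

Under these assumed numbers, both builds deliver identical frame rates, which is the whole point of spending the difference on the GPU instead.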
 

InvalidError

Titan
Moderator
Only AMD has consistently produced innovative processors that (bulldozer aside) move the market forward. In contrast, Intel consistently stagnates development whenever they can.
You should actually be thankful to Intel for taking a break while AMD fell behind because otherwise, AMD would have been run into the ground so hard it would have had absolutely no chance of bouncing back.

Also, due to anti-monopoly clauses in government contracts threatening to force Intel to split, Intel killing off AMD altogether would have caused major headaches. Intel had a vested interest in lowering its own value per dollar to keep AMD alive and avoid getting flagged as a monopoly on government-essential tech.

Hate Intel all you want for ~10 years of stagnation; it didn't have much of a choice either way. On the plus side, now we get to find out how quickly Intel can react to AMD bouncing back much harder than expected, to keep AMD pinned under its "budget brand" public image.
 
.... On the plus side, now we get to find out how quickly Intel can react to AMD bouncing back much harder than expected, to keep AMD pinned under its "budget brand" public image.
And bounce back they will... I just want AMD to have a fair chance to gain traction in the marketplace this time. ONLY with fair competition will we continue to see effective development that moves the industry forward.

As an aside, I actually believe that Bulldozer was innovative, however misguidedly so, and did have its place in moving the industry forward, as it brought high-core-count processors to the mainstream. That spurred developers to take advantage of them, in however limited a fashion, and set the stage that first-gen Ryzen was able to command.

Only then did Intel finally respond with high core count consumer processors. And once again we see how it was AMD's innovations that forced action on Intel's part.
 

RodroX

Prominent
Aug 4, 2019
Further, AMD and only AMD has had a consumer-, market- and even industry-friendly approach in the face of the blatant, industry-destroying monopolistic practices Intel resorted to whenever it was confronted with superior AMD products. It's really those practices that inhibited AMD's ability to gain traction in the market and resulted in Intel's undeserved reputation as an 'innovator'.
I think this is by far the most important part of what's been going on since the launch of Ryzen in 2017, and the reason Intel is worried about keeping up with the rise of Ryzen sales in the DIY PC market (yes, they are having trouble; what other reason is there to cut prices and add HT to all their new CPUs?).
I think Intel didn't care back then, or didn't think AMD would do well with those first Ryzens, or maybe they failed to see what more cores would mean for the productivity segment of PC users (or for the mixed-use scenarios: productivity by day, gamer by night).

  • While Intel keeps charging a premium for an unlocked CPU, AMD has given everyone with a Ryzen an unlocked CPU since day 1.
  • While Intel allows overclocking only on its expensive chipsets, and thus motherboards, AMD has lots of cheap B350, B450 and soon B550 motherboard options to pick from.
  • While Intel keeps HT enabled only on a few, very select SKUs like Core i7 and Core i9 (and not even all of them, on the desktop platform), AMD has had SMT enabled on all its Ryzen 5, Ryzen 7 and Ryzen 9 parts since 2017.
  • While Intel has shipped no boxed cooler with its unlocked "K" CPUs from the 6xxxK back in 2015 through today's 9th-gen K parts, most 1st-gen and all 2nd- and 3rd-gen Ryzen CPUs (except the very first 16-core/32-thread desktop CPU, the Ryzen 9 3950X) come with a very decent stock cooler.
But Intel will come around with a solution (eventually). I hope AMD can keep it up so that we, the consumers, still get many choices to pick from, and at a decent price.
 

InvalidError

Titan
Moderator
And once again we see how it was AMD's innovations that forced action on Intel's part.
As I wrote earlier, Intel couldn't really act until AMD got better, since producing significantly better parts at lower prices would have killed AMD altogether (easy pickings for Intel while AMD was stuck on 28nm) and caused monopoly headaches for Intel itself.

Architecture-wise Intel is still years ahead of AMD, mainly held back by still sub-par performance on 10nm that forced it to re-jig 14nm designs to buy time.
 
As I wrote earlier, Intel couldn't really act until AMD got better, since producing significantly better parts at lower prices would have killed AMD altogether (easy pickings for Intel while AMD was stuck on 28nm) and caused monopoly headaches for Intel itself.

Architecture-wise Intel is still years ahead of AMD, mainly held back by still sub-par performance on 10nm that forced it to re-jig 14nm designs to buy time.
Having a 'years ahead' architecture doesn't matter for squat if your best CPU is only able to beat your competition's low-end chips in its most-favored benchmark comparisons.

But I do believe you're missing the point of exactly what AMD has done; it's not just the architecture. It's the building-block approach and the flexibility of Infinity Fabric that allow the CPUs, or CCDs as it were, to be a separate component from the I/O, among other things. It's those pesky chiplets that let them harvest so many usable CPU dies from every single, extremely expensive 7nm wafer, and Infinity Fabric that lets them use those dies across their entire product stack: from super-cheap APUs all the way up through extremely high-performing HEDT systems to low-power servers, leaving ever so many fewer scrap CPUs behind than was thought possible. It's really about the efficiencies that brings to the manufacturing process.

That they've done it first, or at all given the financial state they were in, is a story in itself. But doing it while essentially matching this 'years ahead' architecture in raw performance at the same time? That seems astounding to me.

Intel is surely going to follow suit, but AMD again leads the way with innovation, as they did with the 64-bit architecture. BTW, does Intel still have to license it from AMD?
 
As I wrote earlier, Intel couldn't really act until AMD got better, since producing significantly better parts at lower prices would have killed AMD altogether (easy pickings for Intel while AMD was stuck on 28nm) and caused monopoly headaches for Intel itself.
I'm not sure if they would've gotten those headaches simply by producing ever-better products that buried AMD, if that's the ONLY reason that AMD got buried.

I suspect that the reason they would've gotten monopoly headaches would be due to that coupled with the past anti-competitive practices they'd had before, in trying to keep AMD shut out from OEMs.
 

InvalidError

Titan
Moderator
But I do believe you're missing the point of exactly what AMD has done; it's not just the architecture. It's the building-block approach that allows the CPUs, or CCDs as it were, to be a separate component from the I/O, among other things.
AMD's chiplet approach isn't all sunshine and roses either: it comes at the expense of 20-40ns worse latency between CCDs and memory, which AMD had to double the L3 size to mitigate. That is in itself a quite significant die-area cost, as the L3 accounts for almost 2/3 of the CCD die area.

As for reusing the same chiplets from APUs to HEDT, the Ryzen 4000 APU rumors I have seen point toward APUs still using a monolithic die, so no CCD, separate I/O die or GPU chiplet gets reused there.
 

joeblowsmynose

Distinguished
Jun 14, 2011
... 64 bit architecture. BTW...does Intel still have to licence it from AMD?
Actually, no; I think there was an end date on that licensing, and Intel CPUs no longer have "AMD64" stamped on their detailed specifications for x64 compatibility.

That was definitely one of the better moves by AMD, for AMD as a company, to force Intel to license that from them. I think AMD's future would have been pretty grim had that not occurred.
 

InvalidError

Titan
Moderator
I'm not sure if they would've gotten those headaches simply by producing ever-better products that buried AMD, if that's the ONLY reason that AMD got buried.
The US government is prohibited by law from buying essential equipment from single-source vendors; this is a major part of how Intel was forced to give AMD an x86 license after initially contracting AMD to make some of its own chips to meet demand.

If Intel became the only remotely viable x86 CPU manufacturer for whatever reason, one of three things would have to happen: Intel splits into two independent competing units; Intel opens up the x86 ISA to would-be competitors (if there are any volunteers left after Intel burnt AMD, Cyrix, NationalSemi, Transmeta and whoever else I may have forgotten); or the US government migrates its infrastructure to non-x86 hardware (preferably RISC-V, to avoid future licensing BS) where vendor competition still exists.

The only way Intel might get out of this is if enough of the infrastructure is already migrating to Android, iOS and other software platforms where applications don't give a damn what hardware they're actually running on, since the OS re-compiles bytecode to whatever is native.
 
....
As for reusing the same chiplets from APUs to HEDT, the Ryzen 4k APU rumors I have seen point towards APUs still using a monolithic die, so no CCD, separate IO or GPU chiplet getting reused there.
I thought they were expecting that to change with the 4000-series APUs.

Ryzen APUs have lagged one Zen generation behind: the 2000 series used Zen 1, the 3000 series uses Zen+. It wasn't until Zen 2 that chiplets came in, so if the progression holds, 4000-series APUs should use Zen 2's chiplets. But rumors are what they are... we'll see, I guess!
 
The US government is prohibited by law from buying essential equipment from single-source vendors....
That's not technically true.

But if you want to be a supplier to the government (and you're sole-source), you have to allow the government to evaluate whether the price is fair and reasonable. The only way to do that is basically to let them into your back pocket: open your books and let them audit your cost basis at will to assess that you're giving them value for money.

I can imagine Intel would run from that just as fast, though. They don't want anybody knowing that 50% (or whatever) of the costs built into their processor prices comes from kick-backs and bribes to customers to keep them buying.
 

InvalidError

Titan
Moderator
Ryzen APUs have lagged one Zen generation behind: the 2000 series used Zen 1, the 3000 series uses Zen+. It wasn't until Zen 2 that chiplets came in, so if the progression holds, 4000-series APUs should use Zen 2's chiplets.
Nope, if the progression holds true, 4000-series APUs will use a custom CCX design (half the L3 size) on a custom APU die just like the two previous generations.

BTW, AMD apparently confirmed back in January that there won't be chiplet-based APUs for Zen 2:
https://www.anandtech.com/show/13852/amd-no-chiplet-apu-variant-on-matisse-cpu-tdp-range-same-as-ryzen2000
 

joeblowsmynose

Distinguished
Jun 14, 2011
Nope, if the progression holds true, 4000-series APUs will use a custom CCX design (half the L3 size) on a custom APU die just like the two previous generations.
...
If this holds true, it will create an increasing dichotomy between the Ryzen CPU and APU designs (at each architecture revision, which is happening yearly at the current pace). That might be a gamble for AMD in the mobile space, as it could begin to increase design and production costs for APUs.

I guess we'll have to wait and see ...
 
Yup, this thread went off the rails ...
Not entirely off the rails. After all, the titular question being answered remains the same: "Why switch to AMD?"

There have been many arguments about which processor is technically better in which scenario, and in summary they fail to present a compelling case for one brand over the other in as universal a way as the question itself is phrased. And with such closely performing products especially, socially conscious consumerism is more relevant than ever. It's good to bring those issues out.
 

InvalidError

Titan
Moderator
That might be a gamble for AMD in the mobile space, as it could begin to increase design and production costs for APUs.
Well, AMD does not have any GPU designed to fit on a CCD footprint, so AMD would need to design a GPU die specifically for that. The IOD may not be designed to have GPU dies attached to it, and it may also lack the ability to drive the motherboard's display outputs, so that would require IOD and CPU substrate re-designs too.

Then you have to consider the die size: in previous APUs, the die area dedicated to the GPU is about even with a single CCX, which would put the hypothetical 4000-series IGP around the 40 mm² mark. You are going to have a hard time cramming the CCD interconnect's pads (hundreds of them), plus however many power/ground pads are needed to power the thing, into that little space, and then cooling it. Another issue at such a small scale is that the CCD interconnect ends up consuming a disproportionate amount of the total die area.

Chiplets don't always make sense.
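The disproportionate-interconnect point boils down to fixed-overhead arithmetic: the silicon an inter-die link needs is roughly constant, so its share grows as the die shrinks. The 8 mm² interconnect figure and both die sizes below are assumed round numbers for illustration only, not measured values.

```python
# Back-of-the-envelope sketch: an inter-die interconnect needs a
# roughly fixed amount of silicon for its pads/PHY, so the smaller
# the die, the bigger the slice it eats. All areas are assumptions,
# not measured die sizes.

INTERCONNECT_MM2 = 8.0  # assumed fixed pad/PHY area

def interconnect_share(die_mm2: float) -> float:
    """Fraction of the die consumed by the fixed interconnect area."""
    return INTERCONNECT_MM2 / die_mm2

# A CCD-sized die vs the hypothetical ~40 mm² IGP discussed above:
for die in (74.0, 40.0):
    print(f"{die:.0f} mm² die -> {interconnect_share(die):.0%} interconnect")
```

Halving the die roughly doubles the interconnect's share, which is why chiplets stop paying off below a certain die size.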
 

TCA_ChinChin

Reputable
Feb 15, 2015
If this holds true, it will create an increasing dichotomy between the Ryzen CPU and APU designs (at each architecture revision, which is happening yearly at the current pace). That might be a gamble for AMD in the mobile space, as it could begin to increase design and production costs for APUs.

I guess we'll have to wait and see ...
Samsung now has a strategic GPU partnership with AMD for the next couple of years, so it's probably going to get interesting in the mobile space for RDNA-derived GPUs.
 
An RTX 2060 would be completely GPU-bottlenecked with any modern gaming CPU; the CPU would never be the bottleneck, removing it from the equation entirely.

The question of which CPU will game better only ever applies to the highest tier of graphics cards, namely the 2080 series. Otherwise the comments would be full of the sentiment that it makes absolutely NO difference on a 2060.

Again, most people don't realize that you only really need to worry about the CPU (as long as it's a modern model with more than 4 threads) if you have a 2080-series card and a high-refresh monitor. If you don't, spending more money on the most expensive CPU thinking it will improve your gaming is like flushing money down the toilet.

If a 2060 is the card in mind, the $200 R5 3600 will game just as well as a $500 9900K, again because you are completely GPU-bound. Considering this build is for gaming, that would equate to taking $300 and flushing it down the sewer, when you could have put that $300 toward upgrading the GPU to a 2070 Super and gotten far better gaming performance with significantly higher FPS.
It's not 100% true that you would always be GPU-bound with an RTX 2060 at 1080p. While it holds in some newer titles and even a few older ones, those are outliers for now. After a quick Google search I found a number of games that are CPU-bottlenecked with an RTX 2060 at 1080p, including some very popular titles: Fortnite, GTA V, Battlefield V, COD: WWII, Assassin's Creed Unity, CS:GO, Fallout 76, Assassin's Creed Origins/Odyssey, Far Cry 5/New Dawn, Shadow of the Tomb Raider, Rainbow Six Siege, World of Warcraft, and many more. Sure, some others are more GPU-bound, but most games will be CPU-bound at 1080p with an RTX 2060. Future titles will probably become more GPU-bound, yes. But if you have $700 or more for an RTX 2080, why not save money by getting, say, an RTX 2060 Super? And anyway, GPU-bound is what we want, right?

The RTX 2060 can even play a good number of games at 4K at greater than 40 fps with some settings tweaked. At 1440p the card is good for 60 fps, or somewhere just north of that, at high/ultra. And at 1080p it's pretty good for somewhere over 100 fps most of the time. I know about where the card should perform.
 

Mahisse

Distinguished
Nov 26, 2012
The only reason he's telling you to wait is because this will be Intel's first response to recent AMD Ryzen competition in over a decade. IMO this is the one exception to the waiting game.
This is one of the things I don't get about the hype with AMD. AMD spent years trying to pull even with Intel's CPUs. Now that Intel sees competition from AMD, I wouldn't be surprised if they release a new series that beats AMD's recent processors pretty easily.
 