Hi there!
So I'm looking to build a new rig for gaming. The focus will be playing games at 1080p (21:9, 144 Hz monitor) and possibly 1440p later on. I'm planning to use this setup for the next 4-5 years and I'm thinking of spending quite a lot of money, so high-end but not NASA-killer setups are in the running (i5-9600K, i7-9700K, RTX 2080-ish GPU).
I have always had Intel processors, but I'm aware that AMD has once again become a heavy-hitting, real-deal player in the business. I'm glad and everything, but it's hard to shake my initial discomfort about changing manufacturer and platform. However, I'm willing to do it, considering there are more than enough valid reasons behind it. Emotions play a part, and I surely can't really go wrong with an i7 or i5, nor can I with a Ryzen 5 (3600 and 3600X) or Ryzen 7 (3700X).
So please, try to convince me to buy a new rig built on AMD! But please keep in mind: framerate and longevity are the two key factors. I'm not really interested in things like faster video export or faster times in WinRAR.
Thank you!
That hits the target. But one takeaway I have from scanning through the foregoing posts: whichever way you go, just about any choice will give you an equally usable and even enjoyable experience. But one thing you get with the AMD choice, especially today, is value. For what it's worth, I switched from Intel to AMD a few months ago and I have had zero issues and zero regrets.
> Well, this post has a lot of replies and I haven't looked through them all yet. But for 1080p 144 Hz you'd probably be fine with a lower-end graphics card than an RTX 2080. I'd say maybe even an RTX 2060 would do the trick.

An RTX 2060 would be a complete GPU bottleneck with any modern gaming CPU, and the CPU would never be the bottleneck, removing the CPU from the equation entirely.
OK, now about the AMD processor. Framerate is honestly going to be a little lower on a Ryzen 5 3600X because it can't hit the same clock speeds as the Intel part. But would the frame rate really matter that much if it's already very close to your monitor's refresh rate or over it? Not really. In a blind test you couldn't tell the difference between the AMD part and the Intel part.

As far as longevity goes, you're better off with AMD because (1) they offer double the maximum mainstream core count that Intel does, and (2) they support upgrades on the same motherboard a lot longer than Intel does. AMD is ahead now in IPC, and as Intel sticks with the now-ancient 'lake' microarchitecture, the IPC gap will get wider to the point where it won't matter that Intel can hit 5 GHz and AMD can't. I'd say we're almost there already. AMD CPUs perform excellently in gaming now, and when you actually hit the GPU with a good load, the GPU will most likely be the bottleneck, which is ideal for gaming.
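Not something from the thread, just a minimal sketch of the bottleneck logic being described: the delivered frame rate is roughly whichever of the CPU limit, GPU limit, or (with V-sync) monitor refresh rate is lowest. The function name and every number below are hypothetical, purely for illustration.

```python
# Toy model of the bottleneck argument above; all numbers are made up.
def delivered_fps(cpu_fps, gpu_fps, refresh_hz=None):
    """Frame rate is capped by the slower of the CPU and GPU limits;
    with V-sync it is also capped by the monitor's refresh rate."""
    fps = min(cpu_fps, gpu_fps)
    if refresh_hz is not None:
        fps = min(fps, refresh_hz)
    return fps

# Hypothetical 1080p numbers: even if an Intel chip could drive ~175 fps
# and a Ryzen ~160 fps, a GPU that tops out around 140 fps (or a 144 Hz
# monitor) makes the two indistinguishable in practice.
print(delivered_fps(cpu_fps=160, gpu_fps=140, refresh_hz=144))  # 140
print(delivered_fps(cpu_fps=175, gpu_fps=140, refresh_hz=144))  # 140
```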
> Only AMD has consistently produced innovative processors that (Bulldozer aside) move the market forward. In contrast, Intel consistently stagnates development whenever they can.

You should actually be thankful to Intel for taking a break while AMD fell behind, because otherwise AMD would have been run into the ground so hard it would have had absolutely no chance of bouncing back.
And bounce back they will... I just want AMD to have a fair chance to gain traction in the marketplace this time. ONLY with fair competition will we continue to see effective development that keeps moving the industry forward. On the plus side, now we get to find out whether Intel can react fast enough to AMD bouncing back much harder than expected to keep it pegged as the "budget brand" in public opinion.
> But further, AMD and only AMD has had a consumer-, market-, and even industry-friendly approach in the face of the blatant, industry-destroying monopolistic practices Intel used whenever confronted with superior AMD products. It's really those practices that inhibited AMD's ability to gain traction in the market and resulted in Intel's undeserved reputation as an 'innovator'.

I think this is by far the most important part of what's been going on since the launch of Ryzen in 2017, and the reason Intel is worried about keeping up with the rise of Ryzen sales in the DIY PC market (yes, they are having trouble; what other reason would there be to cut prices and add HT to all their new CPUs?).
> And once again we see how it was AMD's innovations that forced action on Intel's part.

As I wrote earlier, Intel couldn't really act until AMD got better, since producing significantly better parts at lower prices would have killed AMD altogether (it would have been easy pickings for Intel while AMD was stuck on 28nm) and caused monopoly headaches for itself.
Having a 'years ahead' architecture doesn't matter squat if your best CPU is only able to beat your competition's low-end chips in its most favorable benchmark comparisons.
Architecture-wise Intel is still years ahead of AMD, mainly held back by still sub-par performance on 10nm that forced it to re-jig 14nm designs to buy time.
I'm not sure if they would've gotten those headaches simply by producing ever-better products that buried AMD, if that's the ONLY reason that AMD got buried.
> But, I do believe you're missing the point of exactly what AMD's done; it's not just the architecture. It's the building-block approach that allows CPUs... or CCDs as it were... to be a separate component from the I/O, among other things.

AMD's chiplet approach isn't all sunshine and roses either: it comes at the expense of 20-40 ns worse latency between the CCDs and memory, which AMD had to double the L3 size to mitigate, and that is itself a rather significant die-area cost, as the L3 accounts for almost 2/3 of the CCD die area.
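To make that latency-versus-cache trade-off concrete, here is a rough back-of-the-envelope sketch (not from the post): a bigger L3 raises the hit rate, which can offset a higher memory latency in the average access time. The hit rates and latencies below are assumptions for illustration, not measured Zen 2 figures.

```python
# Back-of-the-envelope average access time past the L2; all hit rates
# and latencies are illustrative assumptions, not measured Zen 2 data.
def avg_access_ns(l3_hit_rate, l3_latency_ns, memory_latency_ns):
    """Hits are served by the L3; misses go out to main memory."""
    return l3_hit_rate * l3_latency_ns + (1 - l3_hit_rate) * memory_latency_ns

# Monolithic-style layout: lower memory latency, smaller L3 (lower hit rate).
print(avg_access_ns(0.50, 8, 70))   # 39.0 ns
# Chiplet-style layout: ~30 ns worse memory latency, but the doubled L3
# catches more accesses, pulling the average back to roughly the same place.
print(avg_access_ns(0.65, 9, 100))  # 40.85 ns
```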
> ... 64-bit architecture. BTW... does Intel still have to licence it from AMD?

Actually, no; I think there was an end date on that licensing, and Intel CPUs no longer have "AMD64" stamped on their detailed specifications for x64 compatibility.
The US government is prohibited by law from buying essential equipment from single-source vendors; this is a major part of how Intel got forced to give AMD an x86 license after initially contracting AMD to make some of its own chips to meet demand.
I thought they're expecting that to change with the 4000-series APU's.....
As for reusing the same chiplets from APUs to HEDT, the Ryzen 4k APU rumors I have seen point towards APUs still using a monolithic die, so no CCD, separate I/O die, or GPU chiplet getting reused there.
> The US government is prohibited by law from buying essential equipment from single-source vendors....

That's not technically true.
Yup, this thread went off the rails ... I think the OP probably already left the house and has become a proud owner of a new Mac.
> Ryzen APUs have lagged one Zen generation behind: the 2000 series used Zen 1, the 3000 series are using Zen 1+. It wasn't until Zen 2 that chiplets came in, so if the progression holds, 4000-series APUs should use Zen 2's chiplets.

Nope, if the progression holds true, 4000-series APUs will use a custom CCX design (half the L3 size) on a custom APU die, just like the two previous generations.
If this holds true, it will create an increasing dichotomy between Ryzen CPUs and APUs in design (at each architecture revision, which is happening yearly at the current pace). That might be a gamble for AMD in the mobile space, as it could begin to increase the cost of design and production for APUs.
...
Not entirely off the rails. After all, the titular question being answered remains the same: "Why switch to AMD?"
Well, AMD does not have any GPU designed to fit on a CCD footprint, so AMD would need to design a GPU die specifically for that. The IOD may not be designed to have GPU dies attached to it and may also be lacking the ability to drive motherboard display outputs, so that would require IOD and CPU substrate re-designs too.
Samsung now has a strategic GPU partnership with AMD for the next couple of years, so it's probably gonna get interesting in the mobile space for RDNA-derived GPUs.
I guess we'll have to wait and see ...
It's not 100% always true that you would be GPU bound with an RTX 2060 at 1080p. While it is true in some newer titles and even a few older ones, those are outliers for now. After a quick search on Google I was able to find some games that are CPU bottlenecked with an RTX 2060 at 1080p, including some very popular titles: Fortnite, GTA V, Battlefield V, COD: WWII, Assassin's Creed Unity, CS:GO, Fallout 76, Assassin's Creed Origins/Odyssey, Far Cry 5/New Dawn, Shadow of the Tomb Raider, Rainbow Six Siege, World of Warcraft, and there are many more. Sure, some others are more GPU bound, but most games will be CPU bound at 1080p with an RTX 2060. As for future titles, yes, those will probably become more GPU bound. But if you have $700 or more for an RTX 2080, why not save money by getting, say, an RTX 2060 Super?

But anyway... GPU bound is what we want, right? The RTX 2060 can even play a good number of games in 4K at greater than 40 fps with some settings tweaked. At 1440p the card is good for 60 fps or somewhere just north of that at high/ultra, and at 1080p it's pretty good for somewhere over 100 fps most of the time. I know about where the card should perform.
The question of which CPU will game better only ever applies to the highest tier of graphics cards, namely the 2080 series. Otherwise the comments would be full of the sentiment that it makes absolutely NO difference on a 2060.
Again, most people don't realize that you only really need to worry about the CPU (as long as it's a modern model with more than 4 threads) if you have a 2080-series card and a high refresh monitor. If you don't, spending more money on the most expensive CPU thinking it will improve your gaming is like flushing money down the toilet.
If a 2060 is the card in mind, the $200 R5 3600 will game just as well as a $500 9900K, again because you are completely GPU bound. Considering this build is for gaming, that would equate to taking $300 and flushing it down the sewer, when you could have taken that $300 and upgraded the GPU to a 2070 Super and gotten far better gaming performance with significantly higher FPS.
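Purely as an illustration of that budget argument (prices and frame rates below are placeholders, not benchmark results), the same min() bottleneck idea shows why shifting money from the CPU to the GPU pays off in a GPU-bound build:

```python
# Hypothetical budget comparison; every price and fps figure is a placeholder.
builds = {
    "i9-9900K + RTX 2060":      {"cost": 500 + 350, "cpu_fps": 180, "gpu_fps": 110},
    "R5 3600 + RTX 2070 Super": {"cost": 200 + 500, "cpu_fps": 165, "gpu_fps": 150},
}

for name, parts in builds.items():
    fps = min(parts["cpu_fps"], parts["gpu_fps"])  # GPU-bound in both cases
    print(f"{name}: ${parts['cost']} -> ~{fps} fps")

# The cheaper-CPU build costs less overall and still delivers more frames,
# because the GPU, not the CPU, is the limiting part in this scenario.
```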
> The only reason he's telling you to wait is because this will be Intel's first response to recent AMD Ryzen competition in over a decade. IMO this is the one exception to the waiting game.

This is one of the things I don't get about the hype with AMD. AMD has been trying to catch up with Intel's CPUs for quite some years. Now that Intel sees competition from AMD, I wouldn't be surprised if Intel releases a new series that beats AMD's recent processors pretty easily.