Question Intel or AMD

sharkdog3

Thinking about building a new system (fairly novice, I've built a system once before) and just curious if anyone has an opinion on the age-old question of Intel or AMD. I know AMD's Ryzen is all the talk right now, but do I really need it? I don't do a lot of gaming, and really only old games from the XP era, though I do plan on potentially playing some modern games in the future (such as the new Total War).

Anyway, if I go Intel I'm thinking i7, or maybe a top-of-the-line i5 (any thoughts on whether an i5 is enough, or should I go with the i7?).
These were the CPUs I was looking at:

Intel Core i7-9700K Coffee Lake 8-Core 3.6 GHz (4.9 GHz Turbo) LGA 1151 (300 Series) 95W BX80684I79700K Desktop Processor Intel UHD Graphics 630

AMD RYZEN 5 3600X 6-Core 3.8 GHz (4.4 GHz Max Boost) Socket AM4 95W 100-100000022BOX Desktop Processor


any thoughts would be nice. thanks
 
You don't really do much on your PC from what you're telling us.
So that means an i5 should be enough. Do you work on your PC or do anything else with it? Because it seems to me you'd be wasting money on something you'll never use, unless you suddenly find a game you like.

I mean, theoretically the new Ryzen has PCIe 4.0, but then again it also heats up like an oven. It's new tech, not tested by normal consumers for years and years the way Intel's has been. That's also why I went for the 8700K rather than the 9700K back when the latter was still new; the differences aren't as huge as in the case of Ryzen gen 1/2/3 or GPU releases.

So unless you plan on changing the way you play or doing other stuff, even an older Intel is good enough. Theoretically you're more future-proof with a new system, but then again, you don't really care about that, and my old TW games ran perfectly fine on my old 4770K (until they ruined things by using garbage 10-year-old unit models in all their TW titles, making the games look ugly as hell even on maximum settings, but that's another issue).

And if you ever plan to overclock, I don't recommend AMD either. It's hard to do, doesn't work that well, you need specific RAM speeds, etc. With my old and new Intel, I just set the value I wanted in the BIOS and was done.
Performance-wise in gaming the two are pretty close nowadays. Intel is still better overall, but the margin is so small it doesn't matter.

So in the end it's all about how much money you want to spend. You could easily get an 8700K on sale and it's kind of the same thing, but maybe much cheaper since the motherboards will also be on sale. If you don't do any heavy workloads and don't run 100 apps, you could even get the 8600K.
 

Thanks for the input. And no, I don't do work on my PC; I basically surf the web, check e-mail, play games, and use iTunes for music. And yes, I know an older chip would work just fine, but I really want to be somewhat future-proof (in case I find that one game), and really, I just like having newer tech. Also, I have no plans to overclock.
 
I believe the general consensus, at least from the various sites I follow, is that the Ryzen 5 series has displaced the Intel i5s; it's a different scenario for the CPUs in their respective 7 and 9 series.

Future-proofing is an odd concept generally. By the time you're willing to upgrade, the current system has been surpassed. That said, it's partially why the Ryzen 5 series is a bit better than the Intel i5s: thread count. If software makes more use of multiple threads, the Ryzen 5s are better placed to use them than the Intel i5s, so in theory they last a bit longer.

Older games will likely favour Intel CPUs.
 
You cannot really go wrong with either Intel or AMD, as it is very hard to buy a bad CPU these days. The performance on offer is huge! That said, the i7-9700K is about as good as it gets for gaming and will see you through a fair few years of top-line gaming. For your specific use case, though, I would spend a bit more on the GPU, since you don't have any pro workloads, so even the Ryzen 3600 and upwards will more than do the job.

If you are looking to upgrade, my usual way of proceeding is to buy the best hardware you can for the budget you have; then you have what you need for now and the near-term future. My upgrade cycle is usually 4 years...
 
The 9700K would be a better choice for your needs. But I would not buy an Intel today (except maybe the 9900K), since the current platform is dead. I hate the idea of buying something new that doesn't give me the option to upgrade it in the future if I want to.
 
.... I hate the idea of buying something new that doesn't give me the option to upgrade it in the future if I want to.

Since the consensus seems to be that there is no performance advantage either way, that's a compelling argument. AMD generally gives you better, more cost-effective, upgrade paths.

But another argument I think works is this: the only reason we can even consider an Intel CPU as a cost-effective option for upgrades today is that AMD has put such good products on offer at such low prices. Since performance is (at worst) essentially equivalent, I'd reward AMD for introducing some sanity to the market, in the sincere hope they continue with it so that those upgrade paths stay cost-effective.
 
Future proofing is an odd concept generally. By the time you're willing to upgrade the current system has been surpassed.
I wouldn't quite say that; 7-8 year old i7s are still mostly decent by today's standards as long as you aren't chasing 100+ Hz refresh rates, so someone buying one tier above the minimum they perceive as a comfortable margin for what they need to do today should be fine for a very long time. I'm still happy with my seven-year-old i5-3470. I would probably have been fine with an i3 back then, but I knew I'd get the upgrade itch 2-3 years in if I went that low, and the i5 was only $40 extra, only ~10% on top of the up-front CPU+MoBo+RAM upgrade price, for 50+% longer expected useful life. (Which has translated into 200+% longer actual useful life so far.)
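Just to spell out that arithmetic (only the ~$40 delta and the year figures come from my post; the $400 platform total is a made-up round number for illustration):

```python
# Illustrative cost-per-year math; the $400 i3 platform total is a
# hypothetical figure, only the +$40 i5 delta comes from the post.
i3_platform = 400                  # assumed CPU+MoBo+RAM total with the i3
i5_platform = i3_platform + 40     # the i5 was ~$40 (~10%) extra

i3_years = 4                       # assumed useful life with the i3
i5_years = 6                       # ~50% longer expected life with the i5

print(round(i3_platform / i3_years, 2))  # 100.0 dollars/year
print(round(i5_platform / i5_years, 2))  # 73.33 dollars/year
```

On those assumptions the pricier part is actually cheaper per year of service, which is the whole future-proofing argument in miniature.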

I'm expecting a return to incremental yearly performance gains at any given price point now that AMD has caught up with Intel and wouldn't be too surprised if the 3600/3600X remained relevant for the next 7-10 years for people who aren't into heavily multi-threaded stuff.
 
...AMD has caught up with Intel and wouldn't be too surprised if the 3600/3600X remained relevant for the next 7-10 years...

THAT is really going to be a curious thing to watch. Now that Intel's nose has been genuinely and seriously tweaked, they're leveling their cash guns at the problem of CPU performance. I've also noted that AMD isn't sitting idle, with Zen 3/4/5 all in the works and considerable improvements rumored.

I really do believe things are going to change, so much so that I'm of a mind that in 7-10 years we may not recognize what today's performance is like. Or even find it relevant to what they might be working on... not just 'faster' in terms of raw clocks or IPC, but more cores/threads (and apps/OSes that DEMAND them), along with interesting ways to do it at even lower power consumption.

Something big might be happening, and it could be quite exciting. Or it may just turn out to be product churn with new colors and LED bling on the heatspreader to convince us it's 'better'.

I sincerely hope it's the former!
 
...wouldn't be too surprised if the 3600/3600X remained relevant for the next 7-10 years for people who aren't into heavily multi-threaded stuff.
It would surprise me a lot, since the Sony PS5 will have a Zen 2 CPU with 8 cores and 16 threads. That will be the devs' target for the next gen, so everyone who has something less than that will suffer. So I don't expect 6/12 Ryzens to last too long.

The Xbox One and PS4 had a Bulldozer CPU (AMD FX) with 8 cores. But, as everyone knows, those 8 cores work like 4 cores. Not only that, the IPC was trash too.

That's the only reason why 4/4 Core i5s only died recently, while 4/8 Core i7s can still keep good framerates.

Everything will change in 2021. This time, for good. Consoles will finally have a great CPU.
 
Current gen i5 CPUs do have more cores than they used to, which is helpful.

That said, I think Ryzen 5 is probably the way to go. Plenty of extra threads, and competitive in gaming, unless you really feel the need to chase down that last half-dozen or fewer frames per second. It's probably fine to use a 3600 rather than a 3600X. The latter does come with a beefier cooler and has a performance edge, but costs $35-$40 more on average. May or may not be worth the extra cost.
 
I really do believe things are going to be changing, so much so I'm of a mind that in 7-10 years we may not recognize what performance today is like. Or even that it is relevant with what they might be working on... not just 'faster' in terms of raw clocks or IPC, but more cores/threads (and apps/OS that DEMANDS them) along with interesting ways to do it with even lower power consumption.
It took the better part of 20 years from the first multi-core mainstream platforms to get to the light to moderate amount of threading (much of which being idle automatic threads from framework and libraries) in most modern mainstream software, so I'd say there are no reasons to expect heavily threaded mainstream software (and I mean active threads, not the 20+ idle threads common in current-day software) to become the norm within the next 10+ years. From a software developer's point of view, each extra thread is extra complexity to manage and debug, so there is a strong incentive to avoid creating more threads than absolutely necessary.
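That "lots of threads, nearly all asleep" pattern is easy to see even from Python's standard library; a quick sketch (exact thread counts may vary slightly by interpreter):

```python
import threading
import time
from concurrent.futures import ThreadPoolExecutor

before = threading.active_count()      # typically just the main thread
pool = ThreadPoolExecutor(max_workers=8)

# Eight concurrent sleeps force the pool to spin up all eight workers.
list(pool.map(time.sleep, [0.1] * 8))

after = threading.active_count()
print(before, after)  # the workers stay alive but idle once the work is done
```

The process now carries many more threads than before, but once the tasks finish, every extra one just sits parked on a queue, costing essentially no CPU time.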

It would surprise me by a lot since Sony PS5 will have a Zen2 CPU with 8 cores and 16 threads. This will be the devs target for the next gen. So everyone that have something less than this will suffers. So I don't expect Ryzen 6/12 will last too long.
Just because you make an infinite number of cores available does not automatically mean all software developers will make use of all cores all the time. Most algorithms have a limit on the number of cores they can practically scale to before overhead exceeds scaling gains, especially in user-interactive stuff where threads spend a lot of their time syncing with user inputs/outputs and each other. The embarrassingly parallel stuff can just get delegated to GPGPU shaders.
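The "overhead exceeds scaling gains" ceiling is essentially Amdahl's law; here is a toy sketch with a hypothetical workload where 70% of each frame's work parallelizes (the 70% figure is made up for illustration):

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Upper bound on speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Hypothetical game where 70% of frame time can run in parallel.
for cores in (2, 4, 8, 16):
    print(cores, round(amdahl_speedup(0.70, cores), 2))
# 2 1.54
# 4 2.11
# 8 2.58
# 16 2.91
# Even with infinitely many cores the cap is 1 / 0.30, about 3.3x.
```

Note how going from 8 to 16 cores buys barely 13% more speed in this example: exactly the diminishing-returns behavior described above.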

Also, if game developers want to have a shot at decent sales on PC, requiring a $300+ CPU and $500+ GPU isn't particularly realistic. They'll still have to make accommodations for the lowest common denominator they want to support and right now, requiring more than quad-core would mean giving up on ~70% of the potential PC market.
 
....so I'd say there are no reasons to expect heavily threaded mainstream software to become the norm within the next 10+ years.

....requiring more than quad-core would mean giving up on ~70% of the potential PC market.

You should look at what's been happening to 4/4 CPUs from 2018 to now, after the 6/12 Ryzen launch. The Core i5 is the most common CPU, and we have at least 10 to 15 games that need much more than 4 cores to hold a stable 60 frames or higher.

Absolutely the same thing occurred with the Core 2 Duo 10-12 years ago. When the first Core i5 launched, games were starting to demand more cores even though the Core 2 Duo was by far the most popular CPU among PC gamers, maybe much more so than 4/4 is today.

Time passes, things change. CPUs with more cores get cheap as demand (from the market and devs) grows. So an 8/12 may cost $200 now, but in the coming years it will cost less. We are seeing this now: 6/12 CPUs cost less than 4/4 CPUs from 3 years ago. Demand changes, so the industry changes.

But anyway, I'm not saying that people shouldn't buy a 6/12 now. Especially with the Ryzen 3600 being that cheap and reaching incredible performance in every game, it's the right choice for most people.

I'm just saying: "hey, things will change soon, so if necessary, it would be nice to be able to upgrade by just swapping the CPU".
 
....From a software developer's point of view, each extra thread is extra complexity to manage and debug, so there is a strong incentive to avoid creating more threads than absolutely necessary.
....

I'm not looking at it from an app-centric perspective, but from the viewpoint of the OS. MS (with Windows) is already stepping up its game, using more and more threads in the background as it goes about its increasingly intrusive business. I'm waiting for Defender to go multi-threaded, for instance, as it would then have little reason to bother with notifications. With 24 threads at its disposal, it hardly matters when it kicks off a background scan that completes in a few seconds (as MalwareBytes does now), even while I'm browsing the web with 20 tabs open.

Windows hasn't earned the moniker 'bloatware' by not using up resources as they became available on the consumer desktop; look at what it demands in drive space and memory, for instance. And even CPU cycles, from time to time, if you look at IE's history.

Why should CPU cores/threads be any different? The only reason MS hasn't moved on it before is that the other half of the Wintel duopoly was so slow to get on it, preferring to leave high core counts to the extreme performance segments. But now Intel seems to be following AMD's lead. In 7 or 8 years, 8 cores/16 threads may be the minimum needed for decent OS performance.
 
You should look on what's happening to 4/4 CPUs since 2018 to here, after Ryzen 6/12 launch. Core i5 is the most common and we have at least 10 to 15 games that needs much more than 4 cores to get stable 60 frames or higher frames.
10-15 games that "require more than" quad-cores after ~10 years of quad-cores in the mainstream. I'd call that a horrible case for claiming that the sky is about to drop on hex-cores.

The only reason MS hasn't moved on it before is because the other half of the WinTel duopoly was so slow to get on it
Nope, the APIs and SDKs already have plenty of automatic threading for background processes like garbage collection. It is quite common for modern software to have 50+ threads, except the vast majority of those threads are sleeping 99+% of the time. The reason the OS isn't using more CPU time is simply that it does not need any more than it already uses, if you don't enable crap like voice-activated Cortana. I disabled all of the assistant stuff I could find and my system's baseline CPU usage is down to 2-3%. Definitely not looking like my i5-3470 is going to struggle with "decent OS performance" any time soon, even if Microsoft doubled baseline CPU usage every year starting now.
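To put a number on that "even if it doubled every year" hedge (starting from the ~3% figure above, which is just my machine's reading):

```python
# Starting from ~3% baseline CPU usage (the figure quoted above),
# doubling every year:
usage = 0.03
for year in range(1, 6):
    usage *= 2
    print(year, f"{usage:.0%}")
# 1 6%
# 2 12%
# 3 24%
# 4 48%
# 5 96%
# Even at that absurd growth rate it takes ~5 years to saturate the CPU.
```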
 
10-15 games that "require more than" quad-cores after ~10 years of quad-cores in the mainstream. I'd call that a horrible case for claiming that the sky is about to drop on hex-cores.
6-core has only been a thing since 2018; it launched in 2017. And we already have 8 cores. Very different story. Terrible is what you're trying to do: reducing an incredibly well-elaborated post, with a lot of crossed arguments, facts, and data, to a six-word sentence.
 
6-cores is a thing since 2018 only. It was launched on 2017. We already have 8 cores. Very different story.
Not really; more cores were already there years ago for those with deep enough pockets, just like 12+ cores are available today for people with similarly deep pockets. Most people, however, don't want to spend more than $150-200 on a CPU. Unless AMD and Intel go into a sustained all-out thread-count war in a race to the bottom on pricing, like we used to have 15+ years ago, the mainstream core count is not going to increase fast enough to obsolesce hex-cores any time soon.
 
Not really, more cores were already there years ago for those with deep enough pockets
I hope you're not talking about LGA1366 Xeons or something.

6 cores have only been there for common people and personal computers since 2018.

Anyway, again, you're misrepresenting the statement that was made.

Let's end this here. Now it just looks like off-topic talk. Bye!
 
6 cores is there for common people and personal computers only since 2018.

There were Phenom II X6 CPUs available under $200 on mainstream socket/chipset in 2010.
 
6 cores is there for common people and personal computers only since 2018.
Quad cores were in the mainstream a decade earlier and are barely getting into "sometimes inadequate" territory.

I would love to be proven wrong about my skepticism on octa-cores becoming "the norm" in the mainstream (~$200) any time soon as far as available hardware is concerned; I was ready to buy one of those rumored $200 8C16T Ryzen 3600s if they had turned out to be real, and might still buy a Ryzen 3700 when it comes down to $200. Software-wise though, hex-cores may have been "only available since 2017" (if you ignore the X6 from 2010), but mainstream games target established specs (what people can reasonably be expected to have) at launch, not bleeding-edge specs for people on 300+ Hz monitors, so publisher requirements tend to lag about five years behind current hardware, with minimum requirements either adding an extra two years or knocking everything down a peg.
 
.. if you don't enable crap like voice-activated Cortana....

But what of those who WANT voice-activated Cortana and every other crazy thing MS can think of to add in? After all, they BOUGHT this massively threaded monster machine, and they're really so cheap now, so there's really no reason NOT to enjoy those crazy features.

I also have to think that if MS knew they didn't have to sleep so many of those 50+ threads (I'd actually read 80+ in another article recently, just saying) already present in the OS, they wouldn't feel it necessary to do so. Given more thread resources, MS will think of things to do with them; I'm just waiting for it.
 