Should you upgrade your GPU or CPU for faster gaming? We tested many hardware combos to find out

Status
Not open for further replies.
Very cool to see these charts. My 2080 will be very limited once I upgrade to a 9x00X3D. I'll see if next gen GPUs are a "good deal". If so, I'll have to upgrade my GPU at that time, or at least get the current gen for a potential discount.

I've always upgraded my CPU and GPU at the same time, but with the price shenanigans the past 4+ years of GPUs I haven't felt the need to participate.
 
I'm using a low-power 35W CPU from Intel for gaming... only Helldivers 2 shows some dips... I've seen some CPUs draw 150W in that game...

The 35W CPU with a 4060 Ti makes a gaming monster with the same power draw as a PS5 or Xbox Series X.

You can find these T-series chips cheap cheap on eBay.
 
Good article
It's nice to see an occasional review of the actual system bottlenecks at settings most people use, even if it just confirms what most readers believe.

I have an AMD 5800X and a 6700 XT, close enough to an 11900K and 2080 in gaming to check the impact of upgrades.
I game at 1440p at high/ultra settings, and I'm considering moving to either 4K or ultrawide.

If I keep my 6700 XT and upgrade my CPU, I'll see no real difference.

If I upgrade to a current mid-range card (4070 Super / 7800 XT), I'll see some performance improvement, but since I'm already at 50 FPS minimums in raster, it's probably not worth it for me. I still won't need a CPU upgrade.

If I upgrade to a 4K monitor, my framerates will be unacceptable to me.
I'd need a GPU upgrade to a card at least as fast as a 3080, and I still wouldn't need a CPU upgrade.
 
Last edited:
Building a computer and micro-upgrading it is something I used to love doing. But for the last several years I find it best just to build completely new, optimizing the CPU/GPU for the best performance at the build budget.
I have built computers since my first computer class (yes, it was called computer class) way back in 1997-98, with my first build using Windows 98 Second Edition.
Upgrading a part here and there to eke out a bit more performance was so awesome.

But now, building a computer has become so much simpler and also much more defined... I guess. It is more cost effective to buy a great combination of hardware to get the best performance for your budget, then sell the entire rig within a few years (or give it to family or friends, repurpose it, etc.) and put that money toward a new gaming rig.
I mean, buying used parts has become insanely risky and easy at the same time.
It is so hard getting any kind of money out of selling your RAM, storage, PSU, or motherboard, because the used market is so toxic or specialized, and high in price. If you find a good seller on a good platform like Amazon or eBay reselling a graphics card, they want too much; and if you find a good deal on a graphics card, there is probably something wrong with it.

But I can resell my (under 3-year-old) gaming PC for about 70%+ of what I put into it, if not more. I say, come and try it out, play on it. Here are some pics, etc. Clean, non-smoking home; look inside it. I meet people at my place of business's main office, a hotel lobby, or a library, NOT a parking lot. I also sell my monitor with it if possible. Because all of the parts are in working order, the complete machine is worth more than the individual parts alone. Again, who is going to buy a used PSU, or a stick of RAM? During COVID, I actually sold my 3-year-old build for more than I had in it.

If a part dies, upgrade. But if you're going to upgrade your CPU/GPU, what do you do with the old parts? Resell them, e-waste them, hoard them in a closet? Now, with the best RAM that goes with this CPU, that goes with this motherboard, that works with this GPU and PSU, you are losing too much of your investment by upgrading just one part. I would rather sell it all and build new.
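The sell-it-whole math above is easy to sketch. A minimal Python example, assuming hypothetical dollar figures (the $1500 build cost, $800 upgrade cost, and $200 recouped are placeholders; only the ~70% resale fraction and ~3-year horizon come from the post):

```python
# Rough cost-of-ownership sketch: sell the whole rig vs. upgrade piecemeal.
# All dollar figures are hypothetical placeholders, not data from the post.

def yearly_cost_sell_whole(build_cost: float, resale_fraction: float, years: float) -> float:
    """Effective cost per year if the complete rig is resold after `years`."""
    return build_cost * (1 - resale_fraction) / years

def yearly_cost_piecemeal(upgrade_cost: float, parts_recouped: float, years: float) -> float:
    """Effective cost per year of a CPU/GPU-only upgrade, net of what the old parts fetch used."""
    return (upgrade_cost - parts_recouped) / years

# A $1500 rig resold at ~70% after 3 years costs roughly $150/year to own.
whole = yearly_cost_sell_whole(1500, 0.70, 3)

# An $800 CPU+GPU upgrade where the old parts only recoup $200 costs $200/year.
piecemeal = yearly_cost_piecemeal(800, 200, 3)
```

Under those placeholder numbers, selling whole comes out cheaper per year, which is the post's point.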
 
"So, if you're rocking a top-tier GPU like the RTX 4080 or above, or the RX 7900 XTX, but you're running a five or six years old CPU, you're still giving up a lot of performance at 1440p ultra and it's time for an upgrade — or at least, it will be time to upgrade once AMD's Zen 5 and Ryzen 9000 CPUs arrive, unless you want to wait a bit longer for Intel's Arrow Lake CPUs."

I didn't see any AMD GPUs in that test, so can I safely assume that the above statement is subjective?
 
I'm hoping Jarred would consider doing a similar piece to this, focusing on current-gen mainstream gaming hardware.

For mainstream users building a new gaming PC, GPU is the first consideration, and $300-500 (or perhaps $600) is the most opted-for range. 1080p & 1440p would be the target. Then the question is what's the most cost-effective CPU to couple with this.

I find the typical gaming benchmarks to be unhelpful, as they peg CPU performance against the 4090 as a baseline, which is out of reach for mainstream gamers. Using the 4060/4070 (or RX 7600/7700 XT/7800 XT) as the baseline would be more relevant.

Of course, we have rules of thumb to go by. The conventional consensus is that an i5/Ryzen 5 suffices for midrange, with an i7/Ryzen 7 for high-end (and an i9/Ryzen 9 for the money-no-object crowd).

But it would be good to see this consensus put to the test.
 
I really appreciate articles and content like this in general as it has such a wide range of useful information. Minimum frame rate definitely seems to be predominantly where older platforms let down newer video cards.

I've really needed to upgrade my platform for a few years now, but just wasn't impressed enough with ADL. Then the Zen 4 IHS and RPL peak power consumption happened so it's just been a waiting game. Hopefully Zen 5/ARL will be all around better than their predecessors.

One thing I would like to see is an AMD GPU tested, simply because they still have lower CPU driver overhead, so it would be interesting to see if the 8700K fares better with AMD.
 
I'm hoping Jarred would consider doing a similar piece to this, focusing on current-gen mainstream gaming hardware.

For mainstream users building a new gaming PC, GPU is the first consideration, and $300-500 (or perhaps $600) is the most opted-for range. 1080p & 1440p would be the target. Then the question is what's the most cost-effective CPU to couple with this.

I find the typical gaming benchmarks to be unhelpful, as they peg CPU performance against the 4090 as a baseline, which is out of reach for mainstream gamers. Using the 4060/4070 (or RX 7600/7700 XT/7800 XT) as the baseline would be more relevant.

Of course, we have rules of thumb to go by. The conventional consensus is that an i5/Ryzen 5 suffices for midrange, with an i7/Ryzen 7 for high-end (and an i9/Ryzen 9 for the money-no-object crowd).

But it would be good to see this consensus put to the test.
The RTX 2080 is about the same as an RTX 4060 (except in RT, but RT isn't really a thing at that performance level in my opinion). The RTX 3080 is faster than an RTX 4070 but a bit slower than an RTX 4070 Super, so you can extrapolate performance from that. The 11900K is going to land around the 12100-12400 performance range, or the 5600 on the AMD side, so there's your budget CPU performance (for gaming). Current-generation midrange CPUs vary too much to guess precisely, but they should all be within 5-15% of the current top parts.

I'm not saying that an article with more mainstream hardware couldn't be good, but that you can get a lot of pertinent information out of this one already.
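The kind of extrapolation described above can be captured as a small lookup table. A sketch with placeholder index numbers (they are NOT benchmark results, just illustrative values that encode the orderings stated in the post):

```python
# Illustrative relative-performance index; numbers are rough placeholders for
# demonstration only, chosen to reflect the orderings claimed in the post.
GPU_INDEX = {
    "RTX 2080": 45,
    "RTX 4060": 45,        # "about the same" as the 2080
    "RTX 4070": 62,
    "RTX 3080": 65,        # faster than a 4070, slower than a 4070 Super
    "RTX 4070 Super": 70,
}

def relative_perf(a: str, b: str) -> float:
    """Approximate performance of GPU `a` expressed as a fraction of GPU `b`."""
    return GPU_INDEX[a] / GPU_INDEX[b]

# The orderings from the post hold in the table:
assert relative_perf("RTX 2080", "RTX 4060") == 1.0
assert GPU_INDEX["RTX 4070"] < GPU_INDEX["RTX 3080"] < GPU_INDEX["RTX 4070 Super"]
```

With a table like this, results measured against one card can be mapped onto a roughly equivalent one; real index values would come from published benchmark averages.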
 
The 3060 isn't even in the graph, and far, FAR more people are using 3060s. Also, who the actual F has a 13900K or a 7800X3D paired with a 3050? That's madness; just put them out to pasture.
 
The 3060 isn't even in the graph, and far, FAR more people are using 3060s.
It's not too hard a stretch to mentally place where the RTX 3060 would sit on the charts. Without the low man on the totem pole, the RTX 3050, we would not have a reference point to balance the higher cards against.

I do agree with you, though: the RTX 3060 is, and has been, a champion card at staying affordable, still able to game, and still hanging in there taking the blows as time goes on.
 
Core i7-14700K and i9-13900K

I don't know in what universe a 14700K matches or beats a 13900K. The 14700K barely beats a 13700K.
It's not that kind of improvement/comparison.

The 14700K does beat the pants off a 12900K though, as does a 13600K.
There is a big gap between the 12th and 13th generations of processors; maybe that's what the statement was about? Or about the 11900K?

I'm confused because the 14th generation barely moved the needle compared to the 13th. Slightly more efficient and maybe an extra core or thread here and there, but the performance is very similar.


A 3070 GPU does creep up on a 2080, and a 4070 matches a 3080, but i7 14700K and i9 13900K?
I can't see it.
 
The article does mention in passing that AMD's AM4 socket is a different beast, and indeed it is. I built a PC in 2018 with an X470 board, a Ryzen 2700X, an RX 480 8GB, and 16GB of DDR4-3200. Since then, I doubled the RAM, doubled CPU performance (5900X), and GPU performance too (6600 XT). Is it top of the line? No. Does it beat the pants off most machines out there? Not really. Can I game with good detail settings at 1440p? Yes.
Cost? Well, the initial cost in 2018 plus upgrades over the years works out to a cost per year that is... well... rather competitive. Crypto crises included.
 
That is true; it has been easier to stay relatively fast without buying an entirely new PC, though I only have 3 original parts left from the PC I bought in 2020. It's the main reason I am not looking at upgrading either until my next PC.
 
I'm pretty sure a 3070 is roughly equivalent to a 2080 ti
Even better, but that's the GPUs. On the CPU side, the newer 13600K does beat the older 12900K, but the newer 14600K does not come anywhere near the older 13900K. It doesn't even scratch the 13700K.

The 13th and 14th generation intels are 99.99% identical, model vs model. Buy whichever one is cheaper because they will perform the same.
On paper the 14700K has extra cores and more threads but even with that it still doesn't move the needle over the 13700K.
13600K and 14600K are the same, ditto 13900K and 14900K, or 13500 and 14500, and 13400 and 14400.
Identical performance except for some marginal efficiency gains and polish.

If upgrading, it's not worth it at all. If buying new, it would make sense to go with the newer one, but you know... look at the price difference!
 
I just want to shout out how genuinely useful and powerful this testing was. I absolutely recognize the effort that went into compiling all this data. I think it’s going to be immensely useful to readers for many more years, even after this generation of hardware obsolescence, because the trends are all clearly there.

It’s refreshing to see outlets like this still innovating and adding value for their communities.
 