[SOLVED] Theoretically, how will an OC’d i7-5820k perform alongside an RTX 3080?

Benovation

Reputable
Nov 15, 2015
Hey!
My name’s Ben, I’m an avid PC builder and I have a general idea of how things work—but wanted to get a theoretical opinion on how my rig will perform if I was to purchase a 3080.
I currently run a GTX 1080 Ti with 16GB of 2700MHz DDR4 RAM and an i7-5820k OC’d to around 4.4GHz. Think it’s technically like 4.38, but whatever.
I generally play games with G-Sync enabled at 1440p, max settings. My 1080 Ti chews these up, usually hitting at least 60 FPS and generally much higher than that. That said, it is starting to show its age at 1440p with games like MSFS2020 and some other major titles (at least at 1440p), and both my CPU and GPU are always around 95%-100% usage.

With all that out of the way, I am currently eyeing an upgrade to the 3080 for future titles, including the highly anticipated Cyberpunk, and to generally get better performance at higher resolutions. While my CPU is now old, it generally manages most benchmarks pretty well and doesn’t seem to bottleneck my 1080 Ti except under pretty extreme circumstances. Since we don’t have benchmarks yet, a lot of this is guesswork, but how well would you imagine an OC’d i7-5820k would pair up with an RTX 3080? I anticipate at least a little bit of a bottleneck in really strenuous sections of some games, but aside from that I think it should be alright. Would you agree with this opinion, or is a CPU that old probably gonna start showing its grey hairs when running alongside such a bleeding-edge card? Especially with something like RTX enabled?

TL;DR: How bad of a bottleneck would you anticipate seeing with an i7-5820k at 4.4 GHz paired with an upcoming RTX 3080?
 
Solution
I think that your CPU will start showing its age much more with an RTX 3080.

Your CPU has 6 cores / 12 threads and is overclocked to 4.4 GHz, which is pretty good; you're playing at 1440p with maximum settings, so you will definitely give a powerful GPU a pretty good workout under most scenarios.

However, the fact that your CPU is at 95%-100% usage under most scenarios already shows that it's not enough; this will become much more painfully obvious when you buy a GPU like the RTX 3080, which is about twice as fast as your 1080 Ti.

You mentioned MSFS2020; I play that at 4K with my RTX 2080 and an Intel i7-9700k (8 cores / 8 threads, overclocked to 5.1 GHz on all cores). Even though the game is GPU limited in most cases, there are times when my GPU utilization drops because I am CPU limited: there are just so many objects for my CPU to draw that it's bottlenecking my GPU. Now, your 1080 Ti and my RTX 2080 are pretty evenly matched, so if my 9700k can bottleneck my RTX 2080 in a game that you are interested in, then your 5820k will definitely bottleneck an RTX 3080.

I'm not saying don't buy an RTX 3080; I am saying that you'll probably have a hard time getting the most out of it with your current CPU.

Seeing as your current CPU is already bottlenecking your 1080 Ti (and the fact that it'll probably be a while before RTX 3080s are easy to find in stock), perhaps the prudent thing to do would be to upgrade your motherboard and CPU first, and then, once stock/prices have stabilized and AMD's new GPUs have come out and caused prices to settle even further, purchase an RTX 3080. Just my 2 cents.
