[SOLVED] Most budget missing parts for RTX2080 (for no bottleneck)

r0dster

Commendable
Jul 22, 2019
29
1
1,535
Hi, I have a kind of specific situation that I was hoping you guys could help with. I currently have an RTX 2080, great cpu cooling (noctua nh-d15), a power supply, a case and a monitor.

However I do not have a suitable CPU, motherboard and ram. So my question would be, could anyone recommend the absolute cheapest components that I could buy in order to complete the system without bottlenecking the RTX 2080?

My goal would ideally be games no older than 1-2 years running at 1440p 144hz as well as 4k 60hz. I have tried to do research on this, however I am finding it difficult due to the specific situation but also the lack of information on what would bottleneck the 2080.

Thanks
 
Solution
Any i5 or i7, or Ryzen 5, 7 or 9, from the last two generations (Maybe even three or four generations for the Intel i5 or i7) should be sufficient. 16GB of suitable memory and the best board you can reasonably afford that is compatible with whatever CPU you choose and doesn't cause you to have to appreciably change your CPU selection.

Optimally you want an i7 or i9, or Ryzen 7 or 9, from the last two generations of those product families, but you could get away with an i5 or Ryzen 5 if you had to.
 

r0dster

Commendable
Jul 22, 2019
29
1
1,535
Any i5 or i7, or Ryzen 5, 7 or 9, from the last two generations (Maybe even three or four generations for the Intel i5 or i7) should be sufficient. 16GB of suitable memory and the best board you can reasonably afford that is compatible with whatever CPU you choose and doesn't cause you to have to appreciably change your CPU selection.

Optimally you want an i7 or i9, or Ryzen 7 or 9, from the last two generations of those product families, but you could get away with an i5 or Ryzen 5 if you had to.

I can find an i3-10100F for 85 euros, which is surprisingly cheap. Do you think this would bottleneck the 2080?

The next step up would be the i5-10400F, but at around 155 euros. Outside of Intel, the Ryzen 5 3600 and Ryzen 7 2700X, for example, start at over 200 euros each.
 
The 10100F COULD work, but it's much like having a 6th Gen Skylake i7, and the fact is that those parts are really just not capable enough for a GOOD experience. The 10400F would be a MUCH better choice. It has double the onboard cache (Smart Cache), two more physical cores and four more threads. That would really be the minimum, if buying today, that you'd want to use for a card like that. But it also depends on what you play.

Games that are almost entirely GPU-bound don't need extremely powerful processors, while other games will fall flat on their faces if you don't have a CPU that is pretty capable. It also depends on how many frames per second you are trying to stay within range of. For a 60 FPS configuration you can get away with a much less capable CPU than for one where you are trying to stay at or near 120-144 FPS or more.
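A rough way to picture the "bottleneck" idea (just a sketch with made-up placeholder numbers, not a benchmark): whichever side runs out of headroom first caps the frame rate you actually see.

```python
# Rough mental model only -- real games vary wildly per scene and per setting.
def achievable_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    """Whichever side is slower sets the frame rate you actually get."""
    return min(cpu_fps_cap, gpu_fps_cap)

# Hypothetical numbers purely for illustration:
print(achievable_fps(cpu_fps_cap=90, gpu_fps_cap=140))   # 90  -> the CPU is the bottleneck
print(achievable_fps(cpu_fps_cap=160, gpu_fps_cap=140))  # 140 -> the GPU is the limit
```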
 
I agree that the i5-10400 / 10400F is probably the minimum worth considering. A 6-core, 12-thread processor like that should hold up pretty well in current games, and probably nearly all games coming out within at least the next few years, if not longer. A 4-core, 8-thread CPU like the i3-10100 likely isn't going to fare as well in the long term, and there are already a few AAA games that can see some performance instability on a processor like that.

The Ryzen 3600 was also a good option when it was priced more competitively with the 10400, as it often tended to be slightly faster when properly configured with faster memory, but its pricing hasn't been particularly good since the latter part of last year. It's arguably not worth paying that much more for what would be an indistinguishable performance difference in games.

The 2700X does offer a couple more cores compared to the 3600 or 10400, but the vast majority of existing games won't fully utilize them, and each of its cores tends not to perform quite as fast, resulting in slightly lower performance in most of today's (CPU-limited) titles. It's still a decent option if found at the right price, but probably not as good a value as the 10400 currently.
 
I agree that the i5-10400 / 10400F is probably the minimum worth considering. A 6-core, 12-thread processor like that should hold up pretty well in current games, and probably nearly all games coming out within at least the next few years, if not longer. A 4-core, 8-thread CPU like the i3-10100 likely isn't going to fare as well in the long term, and there are already a few AAA games that can see some performance instability on a processor like that.

The Ryzen 3600 was also a good option when it was priced more competitively with the 10400, as it often tended to be slightly faster when properly configured with faster memory, but its pricing hasn't been particularly good since the latter part of last year. It's arguably not worth paying that much more for what would be an indistinguishable performance difference in games.

The 2700X does offer a couple more cores compared to the 3600 or 10400, but the vast majority of existing games won't fully utilize them, and each of its cores tends not to perform quite as fast, resulting in slightly lower performance in most of today's (CPU-limited) titles. It's still a decent option if found at the right price, but probably not as good a value as the 10400 currently.
Right. This is a sufficiently accurate assessment. I would definitely avoid the 2000-series Ryzen CPUs. They were honestly never that great anyhow, even though they were much improved over previous AMD generations. They just weren't very good compared to even less expensive Intel CPUs, and compared to what's out now, or most of what's been released over the past two or three years, they are practically what FX was to Ivy Bridge.
 
Hi, I have a kind of specific situation that I was hoping you guys could help with. I currently have an RTX 2080, great cpu cooling (noctua nh-d15), a power supply, a case and a monitor.

However I do not have a suitable CPU, motherboard and ram. So my question would be, could anyone recommend the absolute cheapest components that I could buy in order to complete the system without bottlenecking the RTX 2080?

My goal would ideally be games no older than 1-2 years running at 1440p 144hz as well as 4k 60hz. I have tried to do research on this, however I am finding it difficult due to the specific situation but also the lack of information on what would bottleneck the 2080.

Thanks
Probably worth asking, what other hardware does the system currently have that you feel is limiting the 2080?

Right. This is a sufficiently accurate assessment. I would definitely avoid the 2000-series Ryzen CPUs. They were honestly never that great anyhow, even though they were much improved over previous AMD generations. They just weren't very good compared to even less expensive Intel CPUs, and compared to what's out now, or most of what's been released over the past two or three years, they are practically what FX was to Ivy Bridge.
I wouldn't exactly say that. Again, something like a 2700X should perform relatively close to a 10400 in most of today's games and applications, and might potentially have a little more to offer down the line. Intel's higher-end parts can clock higher, but the somewhat limited clocks of the 10400 prevent it from holding any major advantage there.

And when they were new, the 2000-series Ryzens arguably did a reasonable enough job competing with the 8th-gen parts. At launch, I felt Intel's 8th-gen offerings were pretty strong, but there were some ongoing availability and pricing issues with them, while AMD adjusted prices to be more competitive, resulting in them being an arguably better value at many price points. Especially when comparing what was then a typically ~$180 Ryzen 2600 against a $300+ i7-8700 offering the same numbers of cores and threads. The 8700 was a little faster, sure, but arguably not enough to be worth that kind of premium.

However, now we are seeing pretty much the opposite situation, with AMD offering a bit more at the high end with some of their 5000-series CPUs, but having pricing and availability issues at the more mid-range to lower-end price points, making Intel the better value in this range. The i5-10400 performs close to the i7-8700, and the remaining 2700X stock is now priced too high relative to it. The same goes for the 3600, which was a good value when it was priced closer to $170 for much of last year, but not as much at its more recent $200+ pricing, especially given Intel's price reductions. The 5600X is good, but it's questionable whether it's worth the large price premium over some of the 10th and 11th-gen i5s. And the new 5000-series APUs also seem priced a bit too high, especially for anyone not utilizing the integrated graphics. I would have rather seen a more competitively-priced or better-performing 5600 (non-X) than the 5600G. But I suppose AMD's 7nm production is limited, so lower-margin parts have likely been set aside in favor of higher-margin ones.
 
I agree the performance was better due to the increase in cores, but the single-core performance was still not very good at that time. Zen 2 was leaps ahead of Zen+ in the area of single-core performance. But anyhow, any of these might be good choices at certain price points. As you asked, it really depends on what they had before, or currently have, if anything, since he indicated he doesn't have a suitable CPU, motherboard and RAM. But maybe he just means what he has is really, really old.

It would still be nice to know what it is, for comparison's sake.
 
10400F
B460 board of your choosing, or pay a bit extra for a B560 to get memory overclocking; this seems worth it
Quality DDR4 3200-3600 (3733 seems to be where the cost becomes prohibitive)
A decent PSU around 600 watts, 650 couldn't hurt; ideally Silverstone/Seasonic (rough power math below the list)
A case of your preference
1TB SSD
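Rough power math for the PSU suggestion above (ballpark figures under assumed typical gaming loads, not measured values; the RTX 2080's board power is roughly 215-225 W by spec):

```python
# Ballpark system power estimate -- assumed typical draws, not measurements.
gpu_w  = 225   # RTX 2080, approximate board power
cpu_w  = 90    # i5-10400F under a gaming load, give or take
rest_w = 75    # motherboard, RAM, SSD, fans, etc. -- rough allowance

total_w = gpu_w + cpu_w + rest_w
print(f"Estimated load: ~{total_w} W")                 # ~390 W
print(f"With ~50% headroom: ~{total_w * 1.5:.0f} W")   # ~585 W, so a 600-650 W unit fits comfortably
```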
 

r0dster

Commendable
Jul 22, 2019
29
1
1,535
Sorry for not replying sooner guys, I have been very busy and for some reason I did not get email notifications for replies, so I forgot to check back.
The 10100F COULD work, but it's much like having a 6th Gen Skylake i7, and the fact is that those parts are really just not capable enough for a GOOD experience. The 10400F would be a MUCH better choice. It has double the onboard cache (Smart Cache), two more physical cores and four more threads. That would really be the minimum, if buying today, that you'd want to use for a card like that. But it also depends on what you play.
Games that are almost entirely GPU-bound don't need extremely powerful processors, while other games will fall flat on their faces if you don't have a CPU that is pretty capable. It also depends on how many frames per second you are trying to stay within range of. For a 60 FPS configuration you can get away with a much less capable CPU than for one where you are trying to stay at or near 120-144 FPS or more.
The only reason for wanting the i3 10100 was its attractive price point of $100 + $100 more for a B560 motherboard. With the hope that it would be enough to use together with the RTX 2080 that I have, and get pretty much full performance out of it on the “cheap” (due to already having the GPU). With future proofing not really being much of a concern. The ideal FPS would be 1440p 144hz for modern shooters, whilst also 4K 60hz for more adventure-type games.
I agree that the i5-10400 / 10400F is probably the minimum worth considering. A 6-core, 12-thread processor like that should hold up pretty well in current games, and probably nearly all games coming out within at least the next few years, if not longer. A 4-core, 8-thread CPU like the i3-10100 likely isn't going to fare as well in the long term, and there are already a few AAA games that can see some performance instability on a processor like that.
The Ryzen 3600 was also a good option when it was priced more competitively with the 10400, as it often tended to be slightly faster when properly configured with faster memory, but its pricing hasn't been particularly good since the latter part of last year. It's arguably not worth paying that much more for what would be an indistinguishable performance difference in games.
The 2700X does offer a couple more cores compared to the 3600 or 10400, but the vast majority of existing games won't fully utilize them, and each of its cores tends not to perform quite as fast, resulting in slightly lower performance in most of today's (CPU-limited) titles. It's still a decent option if found at the right price, but probably not as good a value as the 10400 currently.
Yes, however I was looking to get something for like 6 months or so really, without giving much thought to any future proofing and focusing strictly on lowest cost. But without bottlenecking the RTX 2080 at all.
Probably worth asking, what other hardware does the system currently have that you feel is limiting the 2080?
I actually don’t have a system for it. I just have loose parts and components which I had planned to sell over a year ago but never did, and left to gather dust (RTX 2080, Noctua CPU cooler, Corsair PSU, SSD, 4K monitor and PC case). So recently, I was hoping to cheaply put it all together into a complete, solid enough system for a few months of playing. As I haven’t gamed in ages and just wanted to enjoy trying some games for a few months at the highest settings if possible. But not planning for much more after that really.
I agree the performance was better due to the increase in cores, but the single-core performance was still not very good at that time. Zen 2 was leaps ahead of Zen+ in the area of single-core performance. But anyhow, any of these might be good choices at certain price points. As you asked, it really depends on what they had before, or currently have, if anything, since he indicated he doesn't have a suitable CPU, motherboard and RAM. But maybe he just means what he has is really, really old.
It would still be nice to know what it is, for comparison's sake.
It’s just those loose random parts that I have, not a complete system. I do happen to have a normal desktop but it’s never been a gaming one and is very very old for nowadays, so I can’t really put anything together from it (I’m talking i7 860, pcie 2.0, ddr3 etc).
10400F

B460 board of your choosing, or pay a bit extra for a B560 to get memory overclocking; this seems worth it

Quality DDR4 3200-3600 (3733 seems to be where the cost becomes prohibitive)

A decent PSU around 600 watts, 650 couldn't hurt; ideally Silverstone/Seasonic

A case of your preference

1TB SSD

Thanks. I was really struggling to choose between the i3 10100 and i5 10400F (at $100 and $180 respectively), as I only wanted the cheapest option that wouldn’t bottleneck the 2080, nothing more.

Yesterday however, I miraculously happened to get a deal for a Ryzen 5 3600 together with an X570 motherboard for just 250 dollars (whereas before, a Ryzen 5 3600 was $235 alone, and I couldn’t find it cheaper). And considering that it was only 50 dollars more than an i3 10100 + B560 (with the i5 10400 + B560 being 30 dollars more than the Ryzen 5 3600 + X570), I decided it was worth the extra 50 dollars in case the i3 10100 didn’t manage to hold up to even today’s games. Plus much better resale value if it ever comes to selling, I think.


Thank you all for your input and apologies for the late replies, as I don’t really post on here and therefore I forgot to check back after a day or two.
 
Last edited:
The ideal FPS would be 1440p 144hz for modern shooters
Then you don't want the i3. It's not going to give you what you want. It's almost exactly like my 6700K, and that's exactly where it begins to fall short: when high frame rates are necessary. It's more than capable enough for highly graphical 60 FPS games, but over 100 FPS, if the game is AT ALL demanding on CPU performance, it's going to make you wish you'd spent the extra money.
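One way to see why high frame-rate targets are so much harder on the CPU: the time budget per frame for game logic, draw calls, physics and so on shrinks quickly. Simple arithmetic, nothing hardware-specific:

```python
# Per-frame CPU time budget at different frame-rate targets.
for target_fps in (60, 100, 120, 144):
    budget_ms = 1000 / target_fps
    print(f"{target_fps:>3} FPS target -> {budget_ms:.1f} ms per frame")
# 60 -> 16.7 ms, 100 -> 10.0 ms, 120 -> 8.3 ms, 144 -> 6.9 ms
```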
 
You may need to adjust your expectations. 1440p 144Hz with a 2080 in modern shooters is going to take a fair amount of compromise on game settings. I had a 2080 Super and I found it underwhelming for 1440p 144Hz. Take Modern Warfare, for example: I had to run mainly medium settings with RT off to be able to average 120 FPS. I upgraded to a 3080 as I wasn’t satisfied with the 2080S.
 

r0dster

Commendable
Jul 22, 2019
29
1
1,535
Then you don't want the i3. It's not going to give you what you want. It's almost exactly like my 6700K, and that's exactly where it begins to fall short: when high frame rates are necessary. It's more than capable enough for highly graphical 60 FPS games, but over 100 FPS, if the game is AT ALL demanding on CPU performance, it's going to make you wish you'd spent the extra money.

Ah, yea I did fear this. I was constantly going back and forth wondering if buying anything higher just to run the 2080 was even worth it, since I only want it for a few months. Which is why I was putting such a high focus on literally the bare minimum that I can get away with (i.e. the bare minimum for getting the GPU to max out before the CPU does).
 

r0dster

Commendable
Jul 22, 2019
29
1
1,535
You may need to adjust your expectations. 1440p 144Hz with a 2080 in modern shooters is going to take a fair amount of compromise on game settings. I had a 2080 Super and I found it underwhelming for 1440p 144Hz. Take Modern Warfare, for example: I had to run mainly medium settings with RT off to be able to average 120 FPS. I upgraded to a 3080 as I wasn’t satisfied with the 2080S.

Oh…damn. With which CPU and how much RAM? Does that mean people with anything lower than a 2080 need to play it on low or under 100 Hz? Whilst I wasn’t necessarily planning to mainly play that, I was definitely looking to spend a good chunk of time with it.
 
If you only want it "for a few months", then it's probably not even worth bothering with. If you are already planning to get something newer, or better, then it's probably in YOUR best interests to simply save the money and wait, and get what you REALLY want, not waste money and time on a temporary band aid.

And as for the 2080 not being capable enough for 1440p 144hz, I don't agree, at all. I have a 2060 Super and it runs my 1440p 144hz primary monitor, which is used for gaming, and my other two 1440p 144hz monitors, which are simply additional real estate for now, with no problems, so long as I am realistic with my expectations regarding quality settings. Some games can easily do Ultra or high quality, while others do require moving a fair number of sliders to the left or dropping to medium altogether. Having to drop to medium with an RTX 2080 tells me that somebody probably had other issues going on as well. But I suspect there will be a denial of that, which is fine; I'm simply stating what I know to be true.
 
Oh…damn. With which CPU and how much RAM? Does that mean people with anything lower than a 2080 need to play it on low or under 100 Hz? Whilst I wasn’t necessarily planning to mainly play that, I was definitely looking to spend a good chunk of time with it.
I was running a 3700X and 2x16GB 3200MHz (OC'd to 3600MHz). However, the RAM amount is for my work with large Excel files. You can still use a 144Hz monitor, but if you want to average 120 FPS or higher you will need to compromise on settings in some games. I just remembered, a key for me in Modern Warfare was running the field of view at 100 and not lower, which also puts extra load on the GPU, and I wanted to keep an average of 120 FPS in all maps. My performance was bang on where it should be per Tom’s review, which shows a wide range of average FPS depending on the game: https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2080-super-turing-ray-tracing,6243-2.html

I am not saying you cannot run 1440p 144Hz, I was just underwhelmed that in some games I wanted to play I ended up compromising settings more than I’d have liked.
 
I was running a 3700X and 2x16GB 3200MHz (OC'd to 3600MHz). However, the RAM amount is for my work with large Excel files. You can still use a 144Hz monitor, but if you want to average 120 FPS or higher you will need to compromise on settings in some games. I just remembered, a key for me in Modern Warfare was running the field of view at 100 and not lower, which also puts extra load on the GPU, and I wanted to keep an average of 120 FPS in all maps. My performance was bang on where it should be per Tom’s review, which shows a wide range of average FPS depending on the game: https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2080-super-turing-ray-tracing,6243-2.html

I am not saying you cannot run 1440p 144Hz, I was just underwhelmed that in some games I wanted to play I ended up compromising settings more than I’d have liked.
No argument here except that I think a lot of results will be based on highly individualized settings. Small changes in settings COULD result in major changes in playability.
 

r0dster

Commendable
Jul 22, 2019
29
1
1,535
If you only want it "for a few months", then it's probably not even worth bothering with. If you are already planning to get something newer, or better, then it's probably in YOUR best interests to simply save the money and wait, and get what you REALLY want, not waste money and time on a temporary band aid.

And as for the 2080 not being capable enough for 1440p 144hz, I don't agree, at all. I have a 2060 Super and it runs my 1440p 144hz primary monitor, which is used for gaming, and my other two 1440p 144hz monitors, which are simply additional real estate for now, with no problems, so long as I am realistic with my expectations regarding quality settings. Some games can easily do Ultra or high quality, while others do require moving a fair number of sliders to the left or dropping to medium altogether. Having to drop to medium with an RTX 2080 tells me that somebody probably had other issues going on as well. But I suspect there will be a denial of that, which is fine; I'm simply stating what I know to be true.
I do indeed want it for just a few months only, but I’m not planning for something newer after it as I won’t have much time to play games. It was just me feeling like I’ve been sitting on a really good graphics card for over a year just letting it waste away. And since I had a bit more time atm, I could perhaps enjoy it a little on some of the game titles that I also have just wasting away that I never played.

As far as the fps, I certainly would be pretty surprised (and disappointed) if it can’t do better than medium at 120hz. As that would make most of the graphic quality and high fps inaccessible to almost everyone.

I was running a 3700X and 2x16GB 3200MHz (OC'd to 3600MHz). However, the RAM amount is for my work with large Excel files. You can still use a 144Hz monitor, but if you want to average 120 FPS or higher you will need to compromise on settings in some games. I just remembered, a key for me in Modern Warfare was running the field of view at 100 and not lower, which also puts extra load on the GPU, and I wanted to keep an average of 120 FPS in all maps. My performance was bang on where it should be per Tom’s review, which shows a wide range of average FPS depending on the game: https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2080-super-turing-ray-tracing,6243-2.html

I am not saying you cannot run 1440p 144Hz, I was just underwhelmed that in some games I wanted to play I ended up compromising settings more than I’d have liked.
Were you maybe using a really big or ultra-wide monitor or multiple? I also want to be using high FOVs for shooters. Thanks for the link, I certainly will have a look later.

No argument here except that I think a lot of results will be based on highly individualized settings. Small changes in settings COULD result in major changes in playability.
Yea, I’m sure hoping I only have to sacrifice a few settings from the highest, and not actually have to go all the way down to basically medium settings overall.
 
SIZE of monitor has no bearing, at all, on anything. ONLY pixel density affects the load presented to the GPU. And to some extent, the refresh rate.

You shouldn't have to, but MUCH will depend on the severity of the load presented by the game itself. Some games, especially those that are not yet (or ever) well optimized, might present HUGE resource demands compared to other games, and some are simply VERY demanding even when they are well optimized. Some games are just fricking demanding, especially when you get into environments like towns or areas where large gatherings happen, or where there is exceptional detail involved, and that is just what it is. That is exactly why dropping certain settings helps; Hairworks, for example, when The Witcher 3 was new, tanked almost every card available at the time, but with it turned off those same systems did fine in the same areas. It is just something you have to play with. BUT when it comes to frame rates, if the CPU is simply not up to the task, then you often have to slide TOO MUCH stuff to the left in order to maintain acceptable frame rates, which is why with newer, more demanding titles it has become paramount to also have as decent a CPU as possible, compared to the past when you really just needed "something decent" and a good graphics card.

I'm still (Barely) getting by with my 6700k but I would surely not buy one today if I was looking for a new system. Six cores, minimum. Plus hyperthreading if possible, if not, then maybe even more cores. Buy what you need, ONCE, and then don't look back for 6 or 7 years.
 

r0dster

Commendable
Jul 22, 2019
29
1
1,535
But if two monitors have the same pixel density per inch of screen, wouldn’t a bigger monitor require more GPU power? As it’ll have to render more pixels per frame, since it has more inches of screen.

I hope so. I know I’ll definitely be safe with a good number of them, however for CoD Warzone, since it’s so heavily played and has such deep-pocketed funding, I was expecting it to be optimised extremely well, allowing even a slightly above-average gaming PC to hit high FPS on medium. And a 2080 is far from average, so I’m going to be shocked if I can’t hit around 1440p 144Hz mostly on high settings, unless something else bottlenecks the GPU. But maybe I overestimate the 2080 nowadays; I hope not.

Nice, you must have planned it well when you bought it all; sounds like it kept strong for a long time. Yea, however for me gaming is more of a past thing, as I rarely would have any time for it, so I just stopped completely. And with some of the games that have come out, and having a bit more free time at the moment, I just wanted to have a few more high-quality sessions before throwing in the towel for good haha.
 
Last edited:
A 32" 1440p and a 27" 1440p display both present the same load to the GPU as both require the SAME number of pixels to be rendered. The fact that one of them has a larger physical size has no bearing on the number of pixels that must be rendered. Pixels, not physical size, determines this.
 

r0dster

Commendable
Jul 22, 2019
29
1
1,535
A 32" 1440p and a 27" 1440p display both present the same load to the GPU as both require the SAME number of pixels to be rendered. The fact that one of them has a larger physical size has no bearing on the number of pixels that must be rendered. Pixels, not physical size, determines this.
“1440p” only refers to the height; if either of those monitors uses a wider (or narrower) resolution than the other, then the pixel density and pixel count change entirely despite both still being 1440p… so if both monitors have the same pixel density per inch, then obviously whichever one has more inches will have more pixels
 
“1440p” only refers to the height; if either of those monitors uses a wider (or narrower) resolution than the other, then the pixel density and pixel count change entirely despite both still being 1440p… so if both monitors have the same pixel density per inch, then obviously whichever one has more inches will have more pixels
Although correct, most people say 1440p and mean 2560x1440, and when they want to differentiate they will just refer to it as ultrawide 1440p (3440x1440) or simply mention the aspect ratio along with 1440p.
 

r0dster

Commendable
Jul 22, 2019
29
1
1,535
Ah okay I guess I wasn’t clear, since the convo started from here:

Were you maybe using a really big or ultra-wide monitor or multiple? I also want to be using high FOVs for shooters. Thanks for the link, I certainly will have a look later.

I guess it probably would have been clearer to put “ultra wide” in brackets, but I just wanted to know whether the person who commented would have considered their monitor to be “really big” in layman’s terms, as I wouldn’t have expected them to think of its size as really big otherwise, as far as monitors go.
 
Although correct, most people say 1440p and mean 2560x1440, and when they want to differentiate they will just refer to it as ultrawide 1440p (3440x1440) or simply mention the aspect ratio along with 1440p.
Right. "1440p" is generally assumed to mean 2560x1440. So, if we need to be THAT clear about it, then assume to mean that if you have two monitors, both of which are 2560x1440, but one is, say, 24" while the other is 32", and both have the same refresh rate, then both monitors will present the SAME load to the GPU assuming all else is equal. The point being that the physical size of the display is not relevant to the graphical load seen by the graphics card.