[SOLVED] $2000-$2500 4k gaming rig


Avanju

Honorable
Oct 1, 2012
26
1
10,535
Looking for any suggestions before I pull the trigger on this tentative build

PCPartPicker Part List

CPU: AMD Ryzen 7 3700X 3.6 GHz 8-Core Processor ($329.00 @ B&H)
CPU Cooler: be quiet! Dark Rock Pro 4 50.5 CFM CPU Cooler ($88.09 @ Amazon)
Motherboard: Asus ROG STRIX X470-F Gaming ATX AM4 Motherboard ($198.85 @ OutletPC)
Memory: G.Skill Ripjaws V Series 16 GB (2 x 8 GB) DDR4-3200 Memory ($74.99 @ Newegg)
Storage: Intel 660p Series 1.02 TB M.2-2280 Solid State Drive ($94.89 @ OutletPC)
Storage: Hitachi Ultrastar 7K3000 3 TB 3.5" 7200RPM Internal Hard Drive ($46.99 @ Amazon)
Video Card: EVGA GeForce RTX 2080 8 GB Black Video Card ($629.99 @ Walmart)
Case: Fractal Design Meshify C ATX Mid Tower Case ($78.99 @ Amazon)
Power Supply: EVGA SuperNOVA G2 650 W 80+ Gold Certified Fully Modular ATX Power Supply ($99.99 @ B&H)
Monitor: LG 27UD58-B 27.0" 3840x2160 60 Hz Monitor ($279.00 @ B&H)
Monitor: LG 27UD58-B 27.0" 3840x2160 60 Hz Monitor ($279.00 @ B&H)
Total: $2199.78
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2019-07-23 22:03 EDT-0400


Approximate Purchase Date: Within a week or two

Budget Range: $2000-$2500 before tax, shipping, etc.

System Usage from Most to Least Important: 1. Gaming

Are you buying a monitor: Yes, 2 4k monitors @60hz

Parts to Upgrade: New build

Do you need to buy OS: No

Preferred Website(s) for Parts: Prefer Amazon (prime), but any reputable merchant that has the best price

Location: US, UT (no Microcenter)

Parts Preferences: See tentative build.

Overclocking: Yes (any good comprehensive guides you all recommend?)

SLI or Crossfire: No

Your Monitor Resolution: 4k

Any suggestions are most welcome!
 

Avanju

Honorable
Oct 1, 2012
26
1
10,535
Let me step back a bit on this and ask: particularly at such a relatively small screen size, why do you want/need 4K?

Good question. For reference, I’m currently gaming at 1080p on a Radeon 290. For upgrading, I’ve gone back and forth between the two, and opinions are generally all over the place on the topic of 4k versus 1440p 144 hz capable machines. The most common opinion is that 1440p 144 hz is the sweet spot, but many say you still need a 2080ti for that as well. However, my personal preference is for immersive gaming experiences rather than competitive FPS. I believe I would get little personal satisfaction from a high refresh rate, and a greater degree of satisfaction from higher resolution.
 

Avanju

Honorable
Oct 1, 2012
26
1
10,535
PCPartPicker Part List
CPU: AMD Ryzen 7 3700X 3.6 GHz 8-Core Processor ($329.00 @ B&H)
Motherboard: Asus PRIME X570-P ATX AM4 Motherboard ($169.99 @ Newegg)
Memory: G.Skill Ripjaws V Series 16 GB (2 x 8 GB) DDR4-3200 Memory ($64.99 @ Newegg)
Storage: Intel 660p Series 1.02 TB M.2-2280 Solid State Drive ($94.99 @ Adorama)
Storage: Seagate Barracuda 3 TB 3.5" 7200RPM Internal Hard Drive ($79.89 @ OutletPC)
Video Card: EVGA GeForce RTX 2080 Ti 11 GB XC GAMING Video Card ($1189.99 @ Amazon)
Case: Fractal Design Meshify C ATX Mid Tower Case ($78.99 @ Amazon)
Power Supply: EVGA SuperNOVA G3 650 W 80+ Gold Certified Fully Modular ATX Power Supply ($85.88 @ OutletPC)
Monitor: LG 27UD58-B 27.0" 3840x2160 60 Hz Monitor ($279.00 @ B&H)
Total: $2372.72
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2019-07-24 16:18 EDT-0400

Great food for thought. Thanks!
 

hftvhftv

Distinguished
Ambassador
I have read that the gains on the 2080 super are achieved by getting a 2080 on sale and overclocking it. Not true?
No, the 2080 Super has more CUDA cores, so an overclocked 2080 won't quite match it. I'd still get a 2080 if you can find one for around $600, but the 2080 Super is $700 and has more performance. The 2080 Ti is still significantly better than the 2080 Super, though; I'd just get a board partner model for around $1,000 rather than waste money on the Nvidia cooler.
 

Avanju

Honorable
Oct 1, 2012
26
1
10,535
Even if I were to ultimately go with a 2080 Ti, PCPartPicker is saying the power consumption is only around 440W. Why do I need almost double that?
You don't need double that. Even using a power supply calculator that accounts for every component, the load wattage of that system is about 455W, with a recommended 505W PSU, and that's figuring 8 hours of gaming per day. While a PSU's best efficiency occurs around 50% load, the difference in efficiency between 50% and 75% load is far smaller than the difference between 50% load and near-idle. Even with a 20% loss of capacity from capacitor aging over several years, a 650W unit will still be plenty for that system. That said, I wouldn't go below a 650W PSU because of capacitor aging, but if you aren't going to do any overclocking you will be fine.
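If it helps to see the arithmetic, here's a rough back-of-the-envelope check; the 455W figure is the calculator's estimate, and the 20% derate is a worst-case aging assumption, not a measured number:

```python
# Quick headroom check using the calculator's numbers above; the 20% derate
# is a worst-case capacitor-aging assumption, not a measured figure.
estimated_load_w = 455        # calculator's estimated load for the whole system
psu_rated_w = 650             # the 650W unit in the build
aging_derate = 0.20           # assume up to 20% capacity loss over several years

effective_capacity_w = psu_rated_w * (1 - aging_derate)   # 520W
headroom_w = effective_capacity_w - estimated_load_w      # 65W

print(f"Capacity after aging: {effective_capacity_w:.0f}W")
print(f"Headroom over load:   {headroom_w:.0f}W")
print(f"Load vs. rated:       {estimated_load_w / psu_rated_w:.0%}")   # 70%
```

Even in that worst case, the aged 650W unit still covers the estimated load with headroom to spare.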
 

hftvhftv

Distinguished
Ambassador
You don't need double that. Even using a power supply calculator that accounts for every component, the load wattage of that system is about 455W, with a recommended 505W PSU, and that's figuring 8 hours of gaming per day. While a PSU's best efficiency occurs around 50% load, the difference in efficiency between 50% and 75% load is far smaller than the difference between 50% load and near-idle. Even with a 20% loss of capacity from capacitor aging over several years, a 650W unit will still be plenty for that system. That said, I wouldn't go below a 650W PSU because of capacitor aging, but if you aren't going to do any overclocking you will be fine.
Even with some overclocking, a 650W unit would be fine.
 
PCPartPicker Part List

CPU: AMD Ryzen 7 3700X 3.6 GHz 8-Core Processor ($329.99 @ Amazon)
Motherboard: ASRock X570 Pro4 ATX AM4 Motherboard ($169.99 @ Newegg)
Memory: Crucial Ballistix Sport LT 16 GB (2 x 8 GB) DDR4-3200 Memory ($69.99 @ Amazon)
Storage: Crucial BX500 480 GB 2.5" Solid State Drive ($55.99 @ Newegg)
Storage: Intel 660p Series 2.048 TB M.2-2280 Solid State Drive ($181.99 @ Amazon)
Storage: Seagate Constellation CS ISE 3 TB 3.5" 7200RPM Internal Hard Drive ($57.50 @ Amazon)
Video Card: EVGA GeForce RTX 2080 8 GB Black Video Card ($639.99 @ Newegg)
Case: Fractal Design Define C ATX Mid Tower Case ($85.98 @ Newegg)
Power Supply: SeaSonic FOCUS Plus Gold 650 W 80+ Gold Certified Fully Modular ATX Power Supply ($94.99 @ Amazon)
Case Fan: Fractal Design X2 GP-14 (White) 68.4 CFM 140 mm Fan ($14.99 @ Amazon)
Monitor: AOC AG352UCG6 35.0" 3440x1440 120 Hz Monitor ($649.99 @ Walmart)
Total: $2351.39
Prices include shipping, taxes, and discounts when available
Generated by PCPartPicker 2019-07-25 09:52 EDT-0400

This is a different choice with an ultrawide monitor.
 
I'm going to go against the mainstream and say that tweaking the 3700X is definitely a good idea.
In my own fairly extensive testing, I found that the CPU is fed 1.45V, even without PBO, during a single-core load to reach 4.275GHz on one core. Under a multi-core load that drops to 1.325V, which, at the 4GHz it was boosting to on all cores, is a lot more than necessary.

I managed to get a fully stable 4.4GHz all-core overclock on my 3700X at 1.43V. While that might not be possible on every chip, getting 4.3GHz all-core under 1.4V should be very doable on most of them.

As for the PSU, 650W is totally fine; just make sure the OEM of the unit is a known brand like Super Flower, Seasonic, or CWT.
 

w_o_t_q

Commendable
Jul 24, 2019
44
2
1,545
Even if I were to ultimately go with a 2080 Ti, PCPartPicker is saying the power consumption is only around 440W. Why do I need almost double that?
The CPU, motherboard, RAM, HDD, and the rest need power too, and no OC either? 80% efficiency means that from your 650W, only 520W of actual power is left ... and that's before everything else.
 
The CPU, motherboard, RAM, HDD, and the rest need power too, and no OC either? 80% efficiency means that from your 650W, only 520W of actual power is left ... and that's before everything else.
That's wrong on every level of how 80+ works. You don't subtract the efficiency from the rated wattage to find out how much power the unit can deliver. What 80+ says is that if a PSU is 80% efficient and delivering 100W, it will draw 125W from the wall to give out that 100W. https://www.techpowerup.com/forums/...lly-need-an-80-plus-gold-power-supply.129456/ The only time you lose capacity is to capacitor aging, which accumulates the longer the PSU is used and tops out at around 20%.
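A minimal sketch of that relationship, using illustrative numbers rather than measurements from any particular unit:

```python
# Sketch of the 80+ math: efficiency relates wall draw to DC output; it does
# not shrink the PSU's rated capacity. Numbers are illustrative.
def wall_draw_w(dc_output_w: float, efficiency: float) -> float:
    """Watts pulled from the wall to deliver dc_output_w to the components."""
    return dc_output_w / efficiency

print(wall_draw_w(100, 0.80))            # 125.0 -> 100W out means 125W from the wall
print(round(wall_draw_w(455, 0.90), 1))  # 505.6 -> the ~455W example system at ~90% efficiency
# A 650W unit still delivers up to 650W of DC; it just pulls more than 650W
# from the wall when running near its limit.
```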
 

King_V

Illustrious
Ambassador
Good question. For reference, I’m currently gaming at 1080p on a Radeon 290. For upgrading, I’ve gone back and forth between the two, and opinions are generally all over the place on the topic of 4k versus 1440p 144 hz capable machines. The most common opinion is that 1440p 144 hz is the sweet spot, but many say you still need a 2080ti for that as well. However, my personal preference is for immersive gaming experiences rather than competitive FPS. I believe I would get little personal satisfaction from a high refresh rate, and a greater degree of satisfaction from higher resolution.
If you look at my signature, you'll see the monitor for my son's PC is a large 2560x1080 that has FreeSync up to 144Hz. Now, neither he nor I was interested in refresh rates that fast; the only reasons we got it at the time were the good reviews of that particular model and that it was the size and resolution we wanted. To wit: we didn't really pay extra for the higher refresh rate.

This brings up FreeSync. FreeSync monitors (and G-Sync, but that's less relevant now that the 10, 16, and 20 series Nvidia cards support FreeSync) allow the video card to adjust the monitor's refresh rate on the fly to match whatever the card can manage.

It's usually given as a range, and if the maximum refresh rate is more than 2-1/2 times (or more than 2 times? I'm not clear on this) the minimum, then the monitor can also implement Low Framerate Compensation (LFC) and adapt even lower, going down as low as half the minimum.

For example, my son's monitor has a 50-144Hz FreeSync range. Because of this, LFC allows adaptive sync down to 25fps: basically, any time the fps dips below 50, the monitor doubles the value for its refresh, so at, say, 33fps the monitor switches to 66Hz and displays each frame twice. The details probably aren't important; the point is that it extends the usable range downward.
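If it helps, the doubling works out roughly like this; this is just a toy sketch of my understanding, not the exact monitor or driver behavior:

```python
# Toy model of the frame-doubling idea described above; actual LFC behavior
# is up to the monitor and driver, this just illustrates the arithmetic.
RANGE_MIN_HZ, RANGE_MAX_HZ = 50, 144   # the monitor's FreeSync range

def effective_refresh_hz(fps: float) -> float:
    """Refresh rate the monitor would run at for a given game frame rate."""
    if fps >= RANGE_MIN_HZ:
        return min(fps, RANGE_MAX_HZ)   # normal adaptive sync inside the range
    return fps * 2                      # below the range: show each frame twice

for fps in (33, 25, 60, 160):
    print(fps, "fps ->", effective_refresh_hz(fps), "Hz")
# 33 fps -> 66 Hz, 25 fps -> 50 Hz, 60 fps -> 60 Hz, 160 fps -> 144 Hz
```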

Are there any brick and mortar stores around with various sizes/models/resolutions of monitors available on display? If so, I very highly recommend going and taking an in-person look, to see what your eyes are comfortable with.

My own monitor is the resolution that it is due to my working-from-home needs. I'd rather have had a large screen and lower resolution.

Higher resolution means you need a more powerful (and expensive) video card to get the same performance.


Now, don't get me wrong, if a specific size/resolution monitor offers a range of up to 144Hz and is only a little more, or the same price as, a monitor with a lower max refresh, then go for it.

In my son's case, with a mid-range AMD card (an RX 580), we set Radeon Chill to a minimum of 30fps and a maximum of 60fps. There's no way his card is going to run even 75fps, much less anywhere in the 100-144 range, at his resolution in the games he plays.

If you're happy with a lower-res screen and max refresh rates of, say, 60 or 75Hz, then make your monitor selection with that in mind, and, more specifically, make your video card selection with that in mind.

I'm personally a fan of ultra-wide resolutions (approx 21:9 aspect ratio, so 2560x1080, 3440x1440, and 3840x1600). Ultra-wides, especially large curved ones, are a bit pricey for their size and resolution, but if you're going for immersive, that is the golden ticket. And while I'm praising it now, I used to think it was just a gimmick... until I experienced it.

For my son's machine, a GTX 1660 Ti or RX 5700 (non-XT) would probably be the ideal video card, but with adaptive FreeSync, things stay smooth even when the frames per second dip low.


The monitor and GPU selections are really tied very closely to each other, so this should be considered carefully.
 
Solution

Avanju

Honorable
Oct 1, 2012
26
1
10,535
If you look at my signature, you'll see the monitor for my son's PC is a large 2560x1080 that has FreeSync up to 144Hz. Now, neither he nor I was interested in refresh rates that fast; the only reasons we got it at the time were the good reviews of that particular model and that it was the size and resolution we wanted. To wit: we didn't really pay extra for the higher refresh rate.

This brings up FreeSync. FreeSync monitors (and G-Sync, but that's less relevant now that the 10, 16, and 20 series Nvidia cards support FreeSync) allow the video card to adjust the monitor's refresh rate on the fly to match whatever the card can manage.

It's usually given as a range, and if the maximum refresh rate is more than 2-1/2 times (or more than 2 times? I'm not clear on this) the minimum, then the monitor can also implement Low Framerate Compensation (LFC) and adapt even lower, going down as low as half the minimum.

For example, my son's monitor has a 50-144Hz FreeSync range. Because of this, LFC allows adaptive sync down to 25fps: basically, any time the fps dips below 50, the monitor doubles the value for its refresh, so at, say, 33fps the monitor switches to 66Hz and displays each frame twice. The details probably aren't important; the point is that it extends the usable range downward.

Are there any brick and mortar stores around with various sizes/models/resolutions of monitors available on display? If so, I very highly recommend going and taking an in-person look, to see what your eyes are comfortable with.

My own monitor is the resolution that it is due to my working-from-home needs. I'd rather have had a large screen and lower resolution.

Higher resolution means you need a more powerful (and expensive) video card to get the same performance.


Now, don't get me wrong, if a specific size/resolution monitor offers a range of up to 144Hz and is only a little more, or the same price as, a monitor with a lower max refresh, then go for it.

In my son's case, with a mid-range AMD card (an RX 580), we set Radeon Chill to a minimum of 30fps and a maximum of 60fps. There's no way his card is going to run even 75fps, much less anywhere in the 100-144 range, at his resolution in the games he plays.

If you're happy with a lower-res screen and max refresh rates of, say, 60 or 75Hz, then make your monitor selection with that in mind, and, more specifically, make your video card selection with that in mind.

I'm personally a fan of ultra-wide resolutions (approx 21:9 aspect ratio, so 2560x1080, 3440x1440, and 3840x1600). Ultra-wides, especially large curved ones, are a bit pricey for their size and resolution, but if you're going for immersive, that is the golden ticket. And while I'm praising it now, I used to think it was just a gimmick... until I experienced it.

For my son's machine, a GTX 1660 Ti or RX 5700 (non-XT) would probably be the ideal video card, but with adaptive FreeSync, things stay smooth even when the frames per second dip low.


The monitor and GPU selections are really tied very closely to each other, so this should be considered carefully.

Sound advice. I'm not sure where I could go to check out screen resolutions... Best Buy? Given the advice I've received here, I'm leaning towards a 2080 Super (I've heard the EVGA FTW3 is good, but it's out of stock for who knows how long), but I'm still torn between two 16:9 screens or one ultrawide, and at which resolution. I do occasionally work from home, and I'm worried that a single ultrawide would slow productivity. You said you use your UW for work, so I assume it hasn't been a problem for you or you would change it. Do you game at all on the 4K UW? Does it hold up with the GTX 1080 FE?

Apparently, there have been problems with B450 and X470 boards, and the suggestion is to go with an X570 (expensive!) or wait for the MSI MAX boards (when??), so I feel like I don't have enough info to pick a board that's currently available.
 
Sound advice. I'm not sure where I could go to check out screen resolutions... Best Buy? Given the advice I've received here, I'm leaning towards a 2080 Super (I've heard the EVGA FTW3 is good, but it's out of stock for who knows how long), but I'm still torn between two 16:9 screens or one ultrawide, and at which resolution. I do occasionally work from home, and I'm worried that a single ultrawide would slow productivity. You said you use your UW for work, so I assume it hasn't been a problem for you or you would change it. Do you game at all on the 4K UW? Does it hold up with the GTX 1080 FE?

Apparently, there have been problems with B450 and X470 boards, and the suggestion is to go with an X570 (expensive!) or wait for the MSI MAX boards (when??), so I feel like I don't have enough info to pick a board that's currently available.
If you go ultrawide, I would go with the 3440x1440 ones. The added pixels are very nice for productivity work and gaming.
 

King_V

Illustrious
Ambassador
Sound advice. I'm not sure where I could go to check out screen resolutions... Best Buy? Given the advice I've received here, I'm leaning towards a 2080 Super (I've heard the EVGA FTW3 is good, but it's out of stock for who knows how long), but I'm still torn between two 16:9 screens or one ultrawide, and at which resolution. I do occasionally work from home, and I'm worried that a single ultrawide would slow productivity. You said you use your UW for work, so I assume it hasn't been a problem for you or you would change it. Do you game at all on the 4K UW? Does it hold up with the GTX 1080 FE?

Apparently, there have been problems with B450 and X470 boards, and the suggestion is to go with an X570 (expensive!) or wait for the MSI MAX boards (when??), so I feel like I don't have enough info to pick a board that's currently available.
My local Best Buy had a limited selection of smaller monitors.

I went to MicroCenter (about 45 minutes from me), though they have a limited number of stores overall. I'm told Fry's might be a good place to try as well, though I'm not that familiar with them.

For me, I thought I needed the width to match the two 1920x1080 monitors I have side by side at the office. I typically use Windows-LeftArrow or Windows-RightArrow to snap things to the left and right halves of the screen, respectively. The extra vertical space from the 1600 vertical resolution, versus the 1080 of the in-office work screens, is a nice bonus.

I think it's worth considering what @jeremyj_83 mentioned, and going with 3440x1440. In hindsight, I probably could've lived with 3440x1440. Still gives me extra vertical space, and I would basically have had the equivalent of a pair of 1720 rather than 1920 monitors side by side. I think I could've managed fine with the loss of about 11% of my horizontal resolution.

Gaming-wise, I play older games, and the 1080FE managed reasonably. The most "recent" games I played were Borderlands 2 and Borderlands Pre-Sequel, The Witcher 2, and Portal 2. They're not particularly demanding by today's standards. With a 3440x1440 having only 80% of the number of pixels that a 3840x1600 has, and only about 60% of what a full 4K monitor has, I think a 1080 or 2070 would manage nicely. The 2070 Super now goes for less than the original 2070 did, if I understand correctly, so that might be the way to go.

I guess it might be good to start by looking at this page for 2560x1440: https://www.tomshardware.com/review...2060-super-geforce-rtx-2070-super,6207-3.html

And this page for full 4k: https://www.tomshardware.com/review...2060-super-geforce-rtx-2070-super,6207-4.html

Assume that with a 3440x1440 you'll get somewhere between about 2/3 of the frame rates they list for 2560x1440 and about 1-1/2 times the rates listed for 4K. Kind of a ballpark.
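For anyone curious, the pixel math behind that ballpark looks like this; it assumes performance scales roughly with pixel count, which is only approximately true in practice:

```python
# Pixel counts behind the ballpark; assumes frame rate scales roughly with
# pixel count, which is only approximately true in practice.
resolutions = {
    "2560x1440": 2560 * 1440,   # 3,686,400
    "3440x1440": 3440 * 1440,   # 4,953,600
    "3840x1600": 3840 * 1600,   # 6,144,000
    "3840x2160": 3840 * 2160,   # 8,294,400 (full 4K)
}
uw = resolutions["3440x1440"]
print(f"3440x1440 vs 3840x1600: {uw / resolutions['3840x1600']:.0%}")   # ~81%
print(f"3440x1440 vs 4K:        {uw / resolutions['3840x2160']:.0%}")   # ~60%
print(f"2560x1440 vs 3440x1440: {resolutions['2560x1440'] / uw:.0%}")   # ~74%
# Pixel-proportional scaling would put 3440x1440 at ~74% of the 2560x1440
# frame rates and ~1.67x the 4K rates, so the 2/3-to-1.5x range above is a
# slightly conservative version of the same estimate.
```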
 

mossberg

Distinguished
Jun 13, 2007
159
32
18,720
The CPU, motherboard, RAM, HDD, and the rest need power too, and no OC either? 80% efficiency means that from your 650W, only 520W of actual power is left ... and that's before everything else.


(image attachment)
 
80% efficiency means that if the system requires 300W of power to run, the PSU has to pull roughly 375W from the wall, because 20% of what it draws is wasted as heat. 80 Plus covers a range of tiers, including White, Bronze, Silver, Gold, Platinum, and Titanium, and those run anywhere from roughly 80% to 96% efficiency, and will all differ depending on how much power is being drawn. Basically, there's a lot to it.
 

w_o_t_q

Commendable
Jul 24, 2019
44
2
1,545
80% efficiency means that if the system requires 300W of power to run, the PSU has to pull roughly 375W from the wall, because 20% of what it draws is wasted as heat. 80 Plus covers a range of tiers, including White, Bronze, Silver, Gold, Platinum, and Titanium, and those run anywhere from roughly 80% to 96% efficiency, and will all differ depending on how much power is being drawn. Basically, there's a lot to it.
There you go: if all the components need, say, 100W, it will require about 115W of incoming power, and if the PSU is rated at, say, 110W, it can only pull 110W from the wall no matter what, so the internal available power will be less than 100W. No wonder no sensible person puts anything less than 800W behind an RTX 2080 Ti and AMD's power-hungry 3000 series; say hello to the northbridge fan and its freaking 15W consumption, versus 5-7W for the Intel counterparts.
 

King_V

Illustrious
Ambassador
What are you talking about?

If a PSU says it will provide, say, 600W on the 12V rail, then at 80% you are still getting 600W, but it's pulling 750W from the wall in the worst case scenario. As you go up to 80+ bronze, 80+ silver, etc., that discrepancy goes downward, but it's still a 600W PSU.

I'm not sure how you jumped from getting the 80+ efficiency calculation backward to pissing and moaning about AMD suddenly? Really? 8-10 extra watts from the North Bridge is your complaint against AMD and in support of Intel, when the latter posts misleading TDP values for their current CPUs?

I don't know what issue you have, but you come into this thread posting erroneous information, get caught on that, then instead of admitting your mistake, you go on a brand-bashing rant?
 

w_o_t_q

Commendable
Jul 24, 2019
44
2
1,545
If your PSU is rated 600W, it can pull only 600W out of the wall. A PSU is not overclockable ... 600W means only 600W ... if you need to pull 750W out of the wall, get a 750W unit ...
 
If your PSU is rated 600W, it can pull only 600W out of the wall. A PSU is not overclockable ... 600W means only 600W ... if you need to pull 750W out of the wall, get a 750W unit ...
600W does NOT mean how much it pulls from the wall; it means how much power it can convert from AC and deliver as DC. The more efficient the PSU, the less it draws from the wall for the same amount of DC power. There will always be losses during the conversion, so what you pull from the wall will ALWAYS be more than what the components are using from the PSU.

No wonder no sensible person puts anything less than 800W behind an RTX 2080 Ti and AMD's power-hungry 3000 series; say hello to the northbridge fan and its freaking 15W consumption, versus 5-7W for the Intel counterparts.
You can run a 2080 Ti on a 650W PSU just fine and still have room to overclock. The only people who need 750W or more are those running multiple GPUs, extremely high-TDP CPUs (a 2950X, for example), extreme overclocks, or custom loop cooling. While the northbridge of the X570 MOTHERBOARD has a 15W TDP, that is due to PCIe 4.0; the motherboard's northbridge is not the Ryzen 3000 series itself. In fact, the 3000 series is far less power hungry than Intel. The Intel Core architecture is very efficient at 4 cores or fewer; once you hit 6+, its efficiency is far worse than AMD's at the same core count. Word to the wise: listen to what the people on here with more experience are saying and you might learn something.
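For a rough sanity check, this is the kind of budget I have in mind; the component figures are ballpark assumptions on my part, not measurements of this exact build:

```python
# Ballpark power budget for a 3700X + 2080 Ti build; these are my assumed
# round numbers, not measurements of this exact system.
budget_w = {
    "RTX 2080 Ti (gaming load)": 280,
    "Ryzen 7 3700X (package power under load)": 90,
    "Motherboard, chipset, RAM": 50,
    "Drives, fans, USB, etc.": 30,
}
total_w = sum(budget_w.values())          # ~450W
psu_rated_w = 650
print(f"Estimated DC load: {total_w}W")
print(f"Share of a 650W unit: {total_w / psu_rated_w:.0%}")   # ~69%
```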
 

w_o_t_q

Commendable
Jul 24, 2019
44
2
1,545
https://www.tomshardware.com/reviews/nvidia-geforce-rtx-2080-ti-founders-edition,5805-10.html 277W ... consider that nearly half of the PSU's maximum supported power is gone. OC and other play with power levels (the manufacturer does some small OC out of the box) will easily reach 300W or more ... A custom loop pump is at best 10-15W in most cases. Buying a $1,200 card and not supporting it with a proper power source is a pretty strange move ... Maybe! The 3000 series is less power hungry only if you count cores; otherwise the TDP is higher, so say hello to poor design and a 7nm process that isn't mature enough. Even Nvidia cards on a worse die process outperform AMD GPUs in both performance and TDP ...
 
The 3000 series is less power hungry only if you count cores; otherwise the TDP is higher, so say hello to poor design and a 7nm process that isn't mature enough.
"Only if you count cores"... What does that even mean? If you mean that the 9900K has a "95W" TDP whereas the 3800X has a 105W TDP, you don't know what you are talking about. Intel rates its TDP at the base clock alone (basically minimum usage), whereas AMD's TDP is much closer to what we actually expect, where 105W is maximum usage. https://www.tomshardware.com/reviews/amd-ryzen-7-3800x-review,6226-3.html The 3900X with 12 cores uses about the same power as the 9900K with 8 cores.

I would be happy to run a 3700X with a 2080 Ti on a 650W PSU. It would be just fine. Also, don't forget that not everyone wants to overclock their system.
 