Question: Need GPU for IPS 1440p/165 Hz monitor

vot4pedro

Reputable
Mar 20, 2017
Hi

The GPU is the last component on my shopping list. I have no plans to purchase a 4K monitor anytime soon, so for now my 1440p/165 Hz monitor is the plan of record. Main uses are 3D CAD modeling and gaming (Apex Legends, Fortnite). Any suggestions for a GPU to pair with a Ryzen 5800X for optimal monitor settings? FYI, the RTX 3090 is out of my price range.

Ryzen 5800x
GIGABYTE B550I AORUS PRO AX AM4 AMD B550 Mini-ITX
be quiet! 250W TDP Dark Rock Pro 4 CPU Cooler
Samsung 980 PRO 1TB Internal NVMe SSD
Crucial 32GB Ballistix DDR4 3600 MHz UDIMM Gaming Desktop Memory Kit (2 x 16GB, Black)
SilverStone SFX Series SST-SX800-LTI 800W SFX-L 80 PLUS TITANIUM
Motif Monument open case
Dell - 27" IPS QHD S2721DG 1440p 165hz FreeSync and G-SYNC

Thanks
 
Dec 28, 2019
If a 3090 is out of the budget, what about the 3080? Also, what do you need an 800W PSU for? The highest power draw I've ever seen in a consumer computer rarely hit 700W.
 

RTX 2080

Notable
Jun 8, 2020
Perhaps you have no plans for a 4K monitor, but hitting 1440p at 165 fps is going to require a GPU similar to one hitting 4K at 60 fps.

In any case, I would recommend either an RTX 3080 or a 6800 XT.
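For a rough sense of scale, comparing raw pixel throughput backs this up. The sketch below is back-of-envelope arithmetic only; real GPU load depends on far more than pixels per second:

```python
# Back-of-envelope comparison of raw pixel throughput.
# Treat this as a rough illustration, not a benchmark.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    return width * height * fps

qhd_165 = pixels_per_second(2560, 1440, 165)  # 1440p at 165 fps
uhd_60 = pixels_per_second(3840, 2160, 60)    # 4K at 60 fps

print(f"1440p @ 165 fps: {qhd_165:,} px/s")
print(f"4K    @  60 fps: {uhd_60:,} px/s")
print(f"ratio: {qhd_165 / uhd_60:.2f}")  # 1440p/165 pushes ~22% MORE pixels/sec than 4K/60
```

So by this crude measure, 1440p at 165 fps is actually a slightly heavier target than 4K at 60 fps.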
 
Reactions: digitalgriffin

vot4pedro

Reputable
Mar 20, 2017
If a 3090 is out of the budget, what about the 3080? Also, what do you need an 800W PSU for? The highest power draw I've ever seen in a consumer computer rarely hit 700W.
Agreed, but at the time I started purchasing the components, SFX PSUs in the 700W range were out of stock everywhere, so I pulled the trigger on an 800W unit. It's overkill for sure.
 

RTX 2080

Notable
Jun 8, 2020
If a 3090 is out of the budget, what about the 3080? Also, what do you need an 800W PSU for? The highest power draw I've ever seen in a consumer computer rarely hit 700W.
There are several reasons someone might buy a more powerful PSU than they need:
  1. Buying a more powerful PSU than you think you need ensures that if you later upgrade your CPU or GPU to something with a higher power draw, you won't need to upgrade your PSU as well. By thinking ahead, you buy one PSU instead of two. Just a year ago, some people were buying 650-watt PSUs because those were "enough"; now some of those same people are replacing them with 750-watt units because of how much power the new RTX 3080 draws. Thinking ahead saves you money.
  2. PSUs have an efficiency curve: efficiency typically peaks around 50% load and falls off toward full load. By buying a more powerful power supply than you need, you can run at a more efficient spot on the curve and save a bit of electricity.
  3. An underutilized PSU is an unstressed PSU. His PSU is likely to last longer than that of someone running right on the edge of their power requirements.
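Point 2 can be sketched numerically. The efficiency figures below are made-up but shaped like a typical 80 PLUS curve (peaking near 50% load); the point is only that the same DC load lands at a better spot on a bigger unit's curve:

```python
# Hypothetical 80-PLUS-style efficiency curve: peaks near 50% load,
# falls off toward full load. Numbers are illustrative, not measured.
CURVE = [(0.2, 0.87), (0.5, 0.92), (1.0, 0.87)]  # (load fraction, efficiency)

def efficiency(load_fraction: float) -> float:
    # Piecewise-linear interpolation over the hypothetical curve.
    if load_fraction <= CURVE[0][0]:
        return CURVE[0][1]
    for (x0, y0), (x1, y1) in zip(CURVE, CURVE[1:]):
        if load_fraction <= x1:
            return y0 + (load_fraction - x0) / (x1 - x0) * (y1 - y0)
    return CURVE[-1][1]

def wall_power(dc_load_w: float, psu_capacity_w: float) -> float:
    # Power drawn from the wall = DC load / efficiency at that load fraction.
    return dc_load_w / efficiency(dc_load_w / psu_capacity_w)

# Same 400 W system: the 800 W unit runs at 50% load, the 500 W unit at 80%.
print(f"800 W PSU: {wall_power(400, 800):.0f} W from the wall")
print(f"500 W PSU: {wall_power(400, 500):.0f} W from the wall")
```

With this (assumed) curve, the oversized unit draws a bit less from the wall for the exact same system load.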
 
Reactions: digitalgriffin

vot4pedro

Reputable
Mar 20, 2017
Perhaps you have no plans for a 4K monitor, but hitting 1440p at 165 fps is going to require a GPU similar to one hitting 4K at 60 fps.

In any case, I would recommend either an RTX 3080 or a 6800 XT.
The RTX 3080 or 6800 XT would be my first choices, but I'm wondering whether one tier lower, like the RTX 3070 or RX 6800, would also max out my monitor?
 

vot4pedro

Reputable
Mar 20, 2017
They might, but you might have to turn your in-game settings down to medium/high in order to compensate. Hitting 165 fps consistently at 1440p isn't easy.
Compromising with lower in-game settings is not desirable, so better to play it safe: RTX 3080 or 6800 XT, assuming I can buy one anytime soon.
 
The RTX 3080 or 6800 XT would be my first choices, but I'm wondering whether one tier lower, like the RTX 3070 or RX 6800, would also max out my monitor?
Out of curiosity, do you think a constant 165 FPS will look better than FreeSync at 100-144 FPS?

One of the points of these FreeSync monitors is to produce smoother motion despite frame-time variance, so you don't always have to run at the maximum frame rate. You will still get a buttery-smooth experience (especially with LFC). All the higher refresh rate does is lower the jitter time.
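The LFC mechanism mentioned above works by frame repetition: when the game's frame rate drops below the monitor's variable-refresh floor, the driver shows each frame an integer number of times so the panel stays inside its refresh range. The 48-165 Hz window below is a typical range for monitors in this class, but treat it as an assumption:

```python
# Sketch of Low Framerate Compensation (LFC). When fps falls below the
# monitor's VRR floor, each frame is repeated so the effective refresh
# rate stays inside the supported range. Range values are assumed.

def lfc_refresh(fps: float, vrr_min: float = 48, vrr_max: float = 165):
    if fps <= 0:
        raise ValueError("fps must be positive")
    multiplier = 1
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier  # (effective refresh, repeats per frame)

print(lfc_refresh(100))  # inside the range: no duplication needed
print(lfc_refresh(30))   # below the floor: each frame shown twice -> 60 Hz
```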
 
Dec 28, 2019
As long as you're not interested in ray tracing, go for the AMD card. It's probably easier and cheaper to get AMD than Nvidia right now. Getting 165 FPS in AAA titles at 1440p is a pipe dream, since most engines in single-player games usually aim for around 100 FPS.
 

InvalidError

Titan
Moderator
One of the points of these freesync monitors is to create a smoother motion with frame time variances so you don't always have to run at max frame rate.
The point of variable sync (when it works properly, which isn't always the case in every game, GPU and monitor combination) is that you get the responsiveness benefit of vsync off (up to the maximum refresh rate) without the tearing. Doesn't really matter if it is AMD-flavored sync or Nvidia-flavored sync which are basically the same now that both have settled on VESA Adaptive Sync, albeit with different house branding.
 

Zerk2012

Titan
Ambassador
The RTX 3080. The 3090 was made more to replace the Titan as a workstation card; that's why it's double the price but nowhere near double the gaming performance.

I can't recommend any AMD card since it might take them a year to fix the drivers.
 
The point of variable sync (when it works properly, which isn't always the case in every game, GPU and monitor combination) is that you get the responsiveness benefit of vsync off (up to the maximum refresh rate) without the tearing. Doesn't really matter if it is AMD-flavored sync or Nvidia-flavored sync which are basically the same now that both have settled on VESA Adaptive Sync, albeit with different house branding.
QFT also. (Quite freaking true)
 
The RTX 3080. The 3090 was made more to replace the Titan as a workstation card; that's why it's double the price but nowhere near double the gaming performance.

I can't recommend any AMD card since it might take them a year to fix the drivers.
Said NVIDIA, with RTX 30-series cards crashing on release before they lowered the boost tables.

And while AMD has had their issues, their cards usually hold up better in the long run. Compare modern games on an RX 580 versus a GTX 1060 (the RX 580 wins the majority), or a Vega 64 against a GTX 1080 (a virtual tie now).

The 5000-series crashes were due to an internal silicon hardware bug (errata). Those kinds of errors are extremely hard to fix; believe me, I've had to code around them before. Not that I'm making excuses. It's really no different from NVIDIA's 30 series being extremely sensitive to voltage droop caused by current surges. Like I said, it's just one of those things you have to work around.
 
Reactions: Shadowclash10

Zerk2012

Titan
Ambassador
Said NVIDIA, with RTX 30-series cards crashing on release before they lowered the boost tables.

And while AMD has had their issues, their cards usually hold up better in the long run. Compare modern games on an RX 580 versus a GTX 1060 (the RX 580 wins the majority), or a Vega 64 against a GTX 1080 (a virtual tie now).

The 5000-series crashes were due to an internal silicon hardware bug (errata). Those kinds of errors are extremely hard to fix; believe me, I've had to code around them before. Not that I'm making excuses. It's really no different from NVIDIA's 30 series being extremely sensitive to voltage droop caused by current surges. Like I said, it's just one of those things you have to work around.
I actually saw a post from Jonny Guru where he said it was power supply problems, but not all of them, just random ones.

From what he said, they sent in a bunch of different models; some worked fine and some crashed.

Found the post.
https://forums.tomshardware.com/threads/is-gamer-storm-dq-850m-80-gold-good.3665709/#post-22082279
 
I actually saw a post from Jonny Guru where he said it was power supply problems, but not all of them, just random ones.

From what he said, they sent in a bunch of different models; some worked fine and some crashed.

Found the post.
https://forums.tomshardware.com/threads/is-gamer-storm-dq-850m-80-gold-good.3665709/#post-22082279
That is correct. The back of the GPU has a series of capacitors used to stabilize voltages; think of them as tiny energy-storage devices. There was a set of high-frequency capacitors that NVIDIA used on the reference design, which some board partners swapped for cheaper capacitors. When you switch those cheaper capacitors at a high enough frequency, they act like inductors rather than capacitors, and that isn't what you want, as it causes voltage sag and crashing.

The design is pushed to such limits, with such wide power swings (transients), that the voltage was inherently under extreme stress (even on the Founders Edition reference cards), making it more prone to crashes when those transients occurred.

Basically, NVIDIA pushed the design to the edge of the envelope because they knew Big Navi was coming. The end result is that it bit NVIDIA in the tail when one or two minor corners were cut by AIBs. The solution was to reduce the boost tables so large transients were not so prevalent, but this also leads to a minor reduction in performance.

Don't get me wrong: the 3080s are great cards; they are just bleeding edge. I have a deposit down for one because I want the RT performance. The 5000 series were great cards too, a huge improvement in efficiency and speed.
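The "capacitors acting like inductors" point comes down to parasitic inductance (ESL): above a capacitor's self-resonant frequency, the inductive term dominates and impedance rises with frequency instead of falling. The component values below are made-up examples, not taken from any actual 30-series board:

```python
import math

# |Z| of a real capacitor modeled as C in series with parasitic ESL and ESR.
# Below the self-resonant frequency, the capacitive term 1/(2*pi*f*C)
# dominates and impedance falls with frequency; above it, the inductive
# term 2*pi*f*ESL dominates and impedance rises. Values are illustrative.

def impedance(freq_hz: float, c: float, esl: float, esr: float) -> float:
    x_c = 1.0 / (2 * math.pi * freq_hz * c)
    x_l = 2 * math.pi * freq_hz * esl
    return math.sqrt(esr**2 + (x_l - x_c)**2)

C, ESL, ESR = 470e-6, 5e-9, 5e-3  # hypothetical cap: 470 uF, 5 nH, 5 mOhm
srf = 1 / (2 * math.pi * math.sqrt(ESL * C))  # self-resonant frequency (~100 kHz here)
for f in (1e3, srf, 10e6):
    print(f"{f/1e3:10.1f} kHz: {impedance(f, C, ESL, ESR)*1000:.2f} mOhm")
```

At frequencies well above resonance, the capacitor no longer filters the transient; it impedes it, which is the voltage-sag mechanism described above.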
 
Reactions: Shadowclash10
