[SOLVED] ASUS TUF GAMING VG27AQ with 3060ti

23dexter89

Distinguished
Jul 14, 2010
123
1
18,685
Hello Experts!

Planning to upgrade from a 24-inch, 1080p @ 75Hz monitor to the Asus VG27AQ: 27-inch, 1440p @ 165Hz.

Will this be a good upgrade or will it just strain my CPU & GPU?

GPU: Asus ROG 3060ti
CPU: Ryzen 5 3600

No plans to change the CPU or GPU any time soon.

Edit: My use case is purely gaming, movies & entertainment.

Regards,
Roach
 

Aeacus

Titan
Ambassador
Will this be a good upgrade or will it just strain my CPU & GPU?

If your new monitor were 1080p as well, there wouldn't be any issue. But since you plan to go with a 1440p monitor, your GPU will output fewer FPS, since there are far more pixels to render.

If you want to get ~165 FPS on 1440p, you'll need RTX 4070 Ti minimum.
Comparison between RTX 3060 Ti and RTX 4070 Ti,
link: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-Ti-vs-Nvidia-RTX-4070-Ti/4090vs4146

With RTX 3060 Ti, you'd be getting ~70 FPS on 1440p.

Here's review of RTX 4070 Ti,
link: https://www.techspot.com/review/2601-nvidia-geforce-rtx-4070-ti/

Check the results of 1440p. Your RTX 3060 Ti is essentially equal to RTX 3070.
 
  • Like
Reactions: 23dexter89

23dexter89

Distinguished
Jul 14, 2010
123
1
18,685
If your new monitor were 1080p as well, there wouldn't be any issue. But since you plan to go with a 1440p monitor, your GPU will output fewer FPS, since there are far more pixels to render.

If you want to get ~165 FPS on 1440p, you'll need RTX 4070 Ti minimum.
Comparison between RTX 3060 Ti and RTX 4070 Ti,
link: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3060-Ti-vs-Nvidia-RTX-4070-Ti/4090vs4146

With RTX 3060 Ti, you'd be getting ~70 FPS on 1440p.

Here's review of RTX 4070 Ti,
link: https://www.techspot.com/review/2601-nvidia-geforce-rtx-4070-ti/

Check the results of 1440p. Your RTX 3060 Ti is essentially equal to RTX 3070.
This was a very insightful response; you definitely saved me from making a huge mistake. Thank you!

If I may add a few follow-up questions:
  1. If I stay at 1080p and go for 165Hz, what will be the load on the CPU and GPU? Will there be a bottleneck?
  2. Will 165Hz at 1080p even be noticeable?
  3. If I go for 1440p at a lower refresh rate, let's say 60Hz or 75Hz, will it be a noticeable change?
 

Aeacus

Titan
Ambassador
If I stay at 1080p and go for 165Hz, what will be the load on the CPU and GPU? Will there be a bottleneck?

Depends.

If you currently have FPS capped at 60, then a higher refresh rate monitor means your GPU can output up to 165 FPS, since that's what the monitor supports.

But if you don't have FPS capped and let the GPU work at its hardest, despite your monitor being 75 Hz and you only ever seeing 75 FPS, then there would be no added load on your hardware. The positive side is that now you can actually see the FPS your GPU produces. (Up to the monitor's refresh rate. Any FPS over what the monitor can display = frame tearing.)
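The logic above can be sketched in a few lines (the FPS numbers are hypothetical, for illustration only):

```python
# Displayed FPS is limited by whichever is lower: what the GPU can
# render or what the monitor can refresh. Frames beyond the refresh
# rate show up as tearing (without V-Sync/G-Sync), not as smoothness.
def displayed_fps(gpu_fps, refresh_hz):
    return min(gpu_fps, refresh_hz)

def tears(gpu_fps, refresh_hz):
    return gpu_fps > refresh_hz

# Same uncapped GPU output (say 140 FPS), two different monitors:
print(displayed_fps(140, 75))   # 75  -> old 75 Hz monitor discards the rest
print(displayed_fps(140, 165))  # 140 -> new monitor shows every frame
print(tears(140, 75))           # True (tearing unless sync is enabled)
```

In short: an uncapped GPU is already working as hard as it can, so swapping the monitor changes what you see, not what the hardware does.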

Will 165Hz at 1080p even be noticeable?

Depends, again. :LOL:

If you play mostly fast-paced games (first-person shooters, racing, etc.), then you'll see a difference: gameplay is smoother. Even more so once your eyes are trained to notice differences in FPS.

But in slow-paced games (RTS, strategy), the difference isn't noticeable.

For example, I also used to have a 60 Hz TN panel monitor, which I upgraded to a 144 Hz VA panel monitor. Smoothness-wise, I hardly ever noticed a difference. Then again, I'm a casual player and my eyes aren't trained for fast action. The biggest impact for me was the far better color accuracy and contrast ratio compared to my old TN panel. (Since one of my hobbies is image editing, the new monitor made that hobby far more enjoyable.)

If I go for 1440p at a lower refresh rate, let's say 60Hz or 75Hz, will it be a noticeable change?

Yes.

Like I said, more pixels to render = more load on the GPU, even at the same refresh rate.

1920x1080 = 2,073,600 pixels.
2560x1440 = 3,686,400 pixels. That's about 1.78 times the pixels of 1080p.
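The pixel arithmetic works out as follows (the 125 FPS starting point is a made-up example, and assuming FPS scales inversely with pixel count is only a rough rule of thumb):

```python
# Pixel counts behind the 1080p vs 1440p comparison.
def pixels(width, height):
    return width * height

fhd = pixels(1920, 1080)  # 1080p (Full HD)
qhd = pixels(2560, 1440)  # 1440p (QHD)

ratio = qhd / fhd
print(fhd, qhd)            # 2073600 3686400
print(round(ratio, 2))     # 1.78 -> ~78% more pixels per frame

# Crude estimate: if FPS scaled inversely with pixel count,
# a hypothetical 125 FPS at 1080p would drop to roughly:
print(round(125 / ratio))  # 70
```

Real games don't scale perfectly with pixel count (CPU limits, memory bandwidth, and settings all matter), but the estimate lands in the same ballpark as the ~70 FPS figure above.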

Still, given that your GPU can produce ~70 FPS at 1440p, there would be little point in going with a 1440p 60/75 Hz monitor. The only gain would be pixel density, which gives a sharper image. A 144/165 Hz 1080p monitor would give you a smoother image (less motion blur/screen tearing) and is overall the better option.
 
Last edited:
  • Like
Reactions: 23dexter89
Solution

23dexter89

Distinguished
Jul 14, 2010
123
1
18,685
Depends.

If you currently have FPS capped at 60, then a higher refresh rate monitor means your GPU can output up to 165 FPS, since that's what the monitor supports.

But if you don't have FPS capped and let the GPU work at its hardest, despite your monitor being 75 Hz and you only ever seeing 75 FPS, then there would be no added load on your hardware. The positive side is that now you can actually see the FPS your GPU produces. (Up to the monitor's refresh rate. Any FPS over what the monitor can display = frame tearing.)



Depends, again. :LOL:

If you play mostly fast-paced games (first-person shooters, racing, etc.), then you'll see a difference: gameplay is smoother. Even more so once your eyes are trained to notice differences in FPS.

But in slow-paced games (RTS, strategy), the difference isn't noticeable.

For example, I also used to have a 60 Hz TN panel monitor, which I upgraded to a 144 Hz VA panel monitor. Smoothness-wise, I hardly ever noticed a difference. Then again, I'm a casual player and my eyes aren't trained for fast action. The biggest impact for me was the far better color accuracy and contrast ratio compared to my old TN panel. (Since one of my hobbies is image editing, the new monitor made that hobby far more enjoyable.)



Yes.

Like I said, more pixels to render = more load on the GPU, even at the same refresh rate.

1920x1080 = 2,073,600 pixels.
2560x1440 = 3,686,400 pixels. That's about 1.78 times the pixels of 1080p.

Still, given that your GPU can produce ~70 FPS at 1440p, there would be little point in going with a 1440p 60/75 Hz monitor. The only gain would be pixel density, which gives a sharper image. A 144/165 Hz 1080p monitor would give you a smoother image (less motion blur/screen tearing) and is overall the better option.
Hmm... it's a tough call to make, but I'm definitely not going for 1440p; it's just not practical for me. I might give 144Hz at 1080p a try, though. I do need a second monitor for work anyway, so I was looking at an upgrade that would let me repurpose my current one.

Thank you so much for your help!
 
  • Like
Reactions: Aeacus