[SOLVED] What card does 4K gaming take after all?

Shujee

Distinguished
Oct 24, 2013
40
0
18,530
I remember when I bought a shiny new GTX 1060 (4-5 years ago), and it has run the games I play smoothly at 1440p with Ultra settings. Back then, people recommended nothing less than a GTX 1080 for 4K gaming.

Then came the 20-series, and people started saying the 2070 was equal to or better than the 1080, yet somehow 4K still wouldn't run smoothly on a 2070, and you needed at least a 2080 (or the Ti version) for a real 4K experience.

Then came the 30-series. Once again I read articles saying the 3070 Ti squarely beats the 2080 in every benchmark, yet it still wasn't enough for 4K, and I should go for at least a 3080.

And here we are with the 40-series knocking at our door. One more time, the 4070 supposedly performs on par with the 3090 according to one popular review site, but once again, it isn't enough for smooth 4K gaming.

So I keep wondering: are the manufacturers (hand in hand with the review sites) persuading us to keep buying newer hardware we don't really need? What card, after all, is enough for smooth 4K gaming with Ultra settings? I acknowledge that different titles carry different workloads; I'm just asking about the generic bottom-line requirement that would cover most games.

Should I grab a 3080 deal around the $700 mark, or wait for the 4070, which is supposedly priced around $400? My only goal is a card that will run games at 4K with high settings for the next 5-7 years.
 
Do you even enjoy gaming in 4K? Sometimes overkill just can't be justified. I used to use a friend's 4K monitor (I forget which model), and it was a real pleasure, but at the same size (27 inches), a native 4K and a native 1440p monitor are not that different even to a keen eye. And if you don't care about high framerates, as long as it's good enough, a 4K 60 Hz monitor would do; just play V-synced at 60 fps.
 
"I'm just talking about the generic bottom-line requirement that would work with most games."
That's very broad... from the well-optimized Doom, which runs very well on a wide range of CPU+GPU combos, to Cyberpunk 2077, which, uhh... is a bit rough around the edges...
Anyway, what's 'smooth' to you? 30 fps, 60, 90, 120?


Smooth 4k gaming with Ultra settings
for the next 5-7 years
With those kinds of targets, you'll never be satisfied with a single card over that time frame unless you stick to older titles.
Run your GPUs 'on a lease' instead: buy the best one you can currently afford. When the next generation arrives, buy the best one you can afford first*, then sell the older card to offset the cost. Rinse and repeat every GPU generation.
[*If you do it in reverse, you risk bad luck: the new card you want sells out, and you're left without a GPU for a while.]
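The math behind that rolling-upgrade approach can be sketched as a quick back-of-the-envelope calculation (all prices below are hypothetical examples, not real quotes):

```python
# A minimal sketch of the 'GPU lease' strategy: buy the new card,
# then sell the outgoing one to offset the cost.
# All prices are made-up examples for illustration.

def net_upgrade_cost(new_price: float, resale_price: float) -> float:
    """Out-of-pocket cost per generation after selling the old card."""
    return new_price - resale_price

# e.g. a $700 new card, with the old one fetching $350 used
print(net_upgrade_cost(700, 350))  # 350.0
```

In other words, each generation costs you the difference between the new card's price and what the old one still sells for, which is why selling before the old card loses too much value matters.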

GPUs show their age faster than CPUs, and faster still the higher the resolution. That also makes playing at higher resolutions more expensive.
 
@Phaaze88 That's exactly what I do: I buy the GPU I want, then sell the one I currently have. I'll be buying an RTX 4090 (I'm splurging this year) and will sell my 2080. It won't offset the cost by much, but money back is money back.

@Shujee I don't know what to tell you. Honestly, I game at 1440p (supersampling to 2560x1440 on a 32-inch 1080p native monitor), but some games will run at 4K on my 2080. I plan to buy a 4090 this year and will still game at 1440p for the super-high FPS, since I play competitive FPS games. On a title like God of War, though, I think 4K would be killer; it would look stunning.

So what I'm saying is, this boils down to the genre you like to play, what the hardware requirements are for that genre, and whether it's worth it to you to upgrade sooner than every 5-7 years. I got 3 years out of my 2080, and it's only just now showing its age: Unreal Engine 5, for example, brings my PC to its knees, and I get maybe 40 fps at 1080p in the Matrix city demo. I know even a 4090 won't last me 5-7 years, not with how fast new graphics engines come out and how hard developers push the graphics envelope each year.
 
Solution

Keep in mind that as a card ages, modern games' system requirements keep climbing and demand more power. The top two tiers of cards from the past year, or the upcoming ones, should be OK for 4K gaming; how well, the benchmarks out there will show. But "able to run games at 4K" and "4K smooth on Ultra" can be hugely different: 40 fps on Medium is very playable in most titles, but it's a far cry from a steady 60+ at higher settings.
 
My suggestion is to buy your 4k monitor first and then see how you do with what you have.
The monitor is a longer term purchase than the graphics card.
You may find the performance and image perfectly acceptable.

If not, then look at a graphics upgrade. Today, 4000-series price/performance is all speculation.
Expect an early-adopter price premium on the top cards, as well as shortages.

Earlier, when the 3000-series cards launched, the 3080 was about the right card for 4K gaming.
What is "good" will depend on the kinds of games you play:
fast-action games are one thing, while sims, MMOs, and strategy games may play well on a lesser card.
 
Thanks everyone, I think I got my answers. So the 1080 was good enough for 4K gaming because the 4K titles available at the time were less demanding than today's. And second, I should buy a 4K monitor before buying a 4K-capable graphics card. Sounds reasonable.