Question: Recommend a GPU for 1440p gaming until the PS6 raises gaming standards

I build a PC every ~5 years. I'm not a graphics freak (I just played Baldur's Gate 2 for the first time and LOVED it), but if a game can look better, and I can appreciate the difference, I'll pay for it.
1. I game at 1440p.
2. Goal is 60 fps and smooth. I can't appreciate better. My monitor is 60Hz.
3. Some games look a lot better with ray tracing, so that's interesting.
4. Games are made for consoles, so it doesn't make a lot of sense to spend a lot to be way more powerful than them.
5. I tried AMD once a decade ago and was disappointed. It seems games are optimized for Nvidia.

I'm thinking GTX 4070. It's significantly more powerful than the PS5 but not overkill. I'll upgrade in 2028, and games won't be significantly more demanding until the PS6 comes out in ~2030.

Thoughts?
 

Order 66

First of all, AMD GPUs have changed a lot in the last decade. Nvidia's 40 series has been disappointing because of the skimpy VRAM for the price. I would go with a 7800 XT, which beats the 4070 by about 8% for $80 less. Second of all, if you are planning on playing Starfield, the 4070 cannot run Starfield at 1440p 60.
 
Games being made for consoles doesn't really mean much, especially when the PC port often has higher graphical quality options available. In addition, you can still eke out much more performance in those games, assuming the developer didn't do a straight port and lock the FPS to 60.

In any case, if your goal is something long-term, then consider a 4080 instead, assuming you can find one closer to the $1200 MSRP. However, it would be helpful to know what budget you have.

Second of all, if you are planning on playing Starfield, the 4070 cannot run Starfield at 1440p 60.
Starfield is kind of an outlier because the developer only focused on AMD optimizations.
 
If you really want longevity, try a 7900 XTX, which has 24GB of VRAM; that will last a lot longer at 1440p than 16GB, and the 7900 XTX is $300 cheaper than the 4080.
Historically, VRAM has never been an indicator of longevity, because by the time games actually need it, the performance of the GPU has fallen off anyway. OP isn't running 4K either, which is the usual go-to for saying you need more VRAM. And if OP is also playing games with RTX, DLSS will further reduce the need for VRAM.

I will still continue to say VRAM usage is a nebulous thing, and just because you see it at some level doesn't really mean anything. I'll also point to games that have an option that literally says "Fill remaining VRAM".
 

Order 66

I will still continue to say VRAM usage is a nebulous thing, and just because you see it at some level doesn't really mean anything. I'll also point to games that have an option that literally says "Fill remaining VRAM".
Which games have this option?
Historically, VRAM has never been an indicator of longevity, because by the time games actually need it, the performance of the GPU has fallen off anyway. OP isn't running 4K either, which is the usual go-to for saying you need more VRAM. And if OP is also playing games with RTX, DLSS will further reduce the need for VRAM.
Not necessarily true; the 3070's GPU, for example, benefits massively from 16GB of VRAM.
 
Thanks for the help, guys. I really don't have a budget, but I HATE wasting money. If I can appreciate the difference, I will pay for it, but I don't want to pay 2x for a difference I will barely notice. I donate a lot of money to charities, etc. I'd rather feed 10 kids than pay an extra $1000 for something I won't really appreciate.
 
which games have this option?
Call of Duty: Modern Warfare Remastered

[Screenshot of the in-game video memory setting]


Plus, games that cache shaders will have more or less the same effect.

Not necessarily true; the 3070's GPU, for example, benefits massively from 16GB of VRAM.
The problem with this argument of "you need X amount of VRAM for the future!" is that it only applies if your requirement is to max out all of the graphical settings. And if that is your requirement, sure. But I'm pretty sure most people here would be willing to tweak the settings to get optimal performance with minimal image quality loss.

For example, with Resident Evil 4, you can gain a significant amount of performance with probably a small impact on image quality.

So the whole idea that VRAM is an indicator of longevity is, to me, bollocks, unless you require everything to be at maximum quality settings at 1440p or higher all the time. In which case, more power to you, but I don't think those requirements are practical.

Thanks for the help, guys. I really don't have a budget, but I HATE wasting money. If I can appreciate the difference, I will pay for it, but I don't want to pay 2x for a difference I will barely notice. I donate a lot of money to charities, etc. I'd rather feed 10 kids than pay an extra $1000 for something I won't really appreciate.
At the very least, then, start with a CPU with strong single-core performance, because that will likely last a lot longer than a video card. Or, to make things easy, get a 7800X3D.

Case in point, my 5600X can deliver 200+ FPS in Cyberpunk 2077. Granted it's really chugging to do that, but it at least tells me I have plenty of performance left in the thing.
 
First of all, AMD GPUs have changed a lot in the last decade. Nvidia's 40 series has been disappointing because of the skimpy VRAM for the price. I would go with a 7800 XT, which beats the 4070 by about 8% for $80 less. Second of all, if you are planning on playing Starfield, the 4070 cannot run Starfield at 1440p 60.
Assuming equal optimization (I don't know), it seems Nvidia is better at ray tracing at 1440p for the money, and AMD is better at rasterization at 1440p for the money. Ray tracing really seems to be game dependent, but I expect it will look better and better, and I'm suspicious of AMD anyway. https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
 
I build a PC every ~5 years. I'm not a graphics freak (I just played Baldur's Gate 2 for the first time and LOVED it), but if a game can look better, and I can appreciate the difference, I'll pay for it.
1. I game at 1440p.
2. Goal is 60 fps and smooth. I can't appreciate better. My monitor is 60Hz.
Having a 60Hz monitor makes things a lot easier, eh? ;)
3. Some games look a lot better with ray tracing, so that's interesting.
They look a lot better for the two seconds that you're not looking out for enemies, otherwise you'd never notice, except that your frame rate tanks dramatically if your card doesn't have "RTX 4090" printed on it somewhere.
4. Games are made for consoles, so it doesn't make a lot of sense to spend a lot to be way more powerful than them.
Consoles often run at 30FPS where a PC will do 60-240FPS, so it all depends on what you're after.
5. I tried AMD once a decade ago and was disappointed.
Do you honestly believe that nothing in the tech world changes in 10 years? :ROFLMAO:
It seems games are optimized for Nvidia.
Well, consider that, as you yourself said, games are made for consoles. Since both Xbox and PlayStation have used Radeon graphics for two generations, games are effectively made for Radeon GPUs.
I'm thinking GTX 4070.
I'm going to assume that you mean RTX 4070 because the GTX cards ended three generations ago.
It's significantly more powerful than the PS5 but not overkill. I'll upgrade in 2028, and games won't be significantly more demanding until the PS6 comes out in ~2030.

Thoughts?
Well, I have a feeling that you've been out of the loop for a while, because some of your assumptions are completely outdated, and I can prove it. This year, the biggest AAA releases have arguably been Hogwarts Legacy, Jedi Survivor, Starfield, and CP2077 Phantom Liberty.

Let's compare the RTX 4070 to its nearest rival, the RX 7800 XT, at 1440p Ultra. Note that the RX 7800 XT hadn't yet been released when HL and JS came out, so it's not in their benchmark lists, but SF and CPPL are both less than a month old and are both extremely demanding games.

The RTX 4070 costs $550 and has 12GB of VRAM.
The RX 7800 XT costs $500 and has 16GB of VRAM.

Let's look at Cyberpunk 2077: Phantom Liberty, a brand-new nVidia title:
[Benchmark chart: 1440p Ultra average FPS, RX 7800 XT vs RTX 4070]

I kinda think that I'd be more disappointed if I paid $550 to get only 68FPS when I could've had 89FPS for $500. Remember that, in the tech world, 10 years might as well be 100 or 1000, because things tend to change completely roughly every five years. Right now, unless you're buying an RTX 4090, GeForce cards get beaten at every price point (sometimes even by a lower-priced Radeon, like you see here).
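If you want that spelled out, here's a quick back-of-the-envelope sketch of the cost per frame, using only the MSRPs and the 1440p Ultra averages quoted above (illustrative math, not part of the benchmark itself):

```python
# Cost per frame at 1440p Ultra in CP2077: Phantom Liberty,
# using the prices and average FPS quoted above.
cards = {
    "RTX 4070": (550, 68),    # (price in USD, average FPS)
    "RX 7800 XT": (500, 89),
}

for name, (price, fps) in cards.items():
    print(f"{name}: ${price / fps:.2f} per frame")

# RTX 4070: $8.09 per frame
# RX 7800 XT: $5.62 per frame
```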

Now, if you want a GeForce card, by all means, go for it. I just don't want you making the decision based on bad information because that never turns out well. It's always better to have options, especially when you don't know that those options exist.
 
Solution
Deleted member 2958512
I build a PC every ~5 years. I'm not a graphics freak (I just played Baldur's Gate 2 for the first time and LOVED it), but if a game can look better, and I can appreciate the difference, I'll pay for it.
1. I game at 1440p.
2. Goal is 60 fps and smooth. I can't appreciate better. My monitor is 60Hz.
3. Some games look a lot better with ray tracing, so that's interesting.
4. Games are made for consoles, so it doesn't make a lot of sense to spend a lot to be way more powerful than them.
5. I tried AMD once a decade ago and was disappointed. It seems games are optimized for Nvidia.

I'm thinking GTX 4070. It's significantly more powerful than the PS5 but not overkill. I'll upgrade in 2028, and games won't be significantly more demanding until the PS6 comes out in ~2030.

Thoughts?

Please, try to stay away from the 4070.

Buying a GPU with just 12GB of VRAM is a huge mistake. 16GB is the bare minimum, IMO.

My 1080 Ti had 11GB back in 2017, and that was six years ago.

12GB just won't cut it with 2023's AAA titles.

Having bought a 3090 Ti a year ago, I will never again go with anything less than 24GB. I've noticed current AAA titles using 14-16GB.

If I were you, I'd spend a few more dollars and just get a 4090. You'll be good to go for the next 6-7 years.
 
Thanks for the help, guys. I really don't have a budget, but I HATE wasting money. If I can appreciate the difference, I will pay for it, but I don't want to pay 2x for a difference I will barely notice. I donate a lot of money to charities, etc. I'd rather feed 10 kids than pay an extra $1000 for something I won't really appreciate.

If you are cost-conscious, then skip the 40 series entirely, as nVidia deliberately made them suck to upsell the 4090 at its obscene price. AMD has decent offerings and is the better value this generation.

Also, I can't believe people are still on the graphics VRAM myth. Memory bandwidth is far more important than VRAM size; all you need is enough VRAM to hold all the graphics resources of the existing level/scene. Anything more than that is used as a type of cache, with previously loaded data being kept around in case it's asked for, which, to be fair, it often is. Since we want the exact same DRAM chip on each 32-bit memory channel, wider memory bandwidth usually means more memory chips and therefore more memory. A 384-bit memory bus uses 12 32-bit memory chips: if those are 1GB (8Gb) each, we have 12GB of DRAM; if they are 2GB (16Gb) each, we have 24GB of DRAM. At 192 bits we have only 6 memory chips: with 1GB chips we get 6GB of DRAM, and with 2GB chips we get 12GB of DRAM.

Those numbers should start to look familiar along with a sudden understanding of the previous posted performance chart.
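To make that arithmetic concrete, here's a minimal Python sketch. The 256-bit row is an extra example beyond the two bus widths above, and the cards in the comments are just the models discussed in this thread:

```python
# Sketch of how memory bus width constrains VRAM capacity:
# one GDDR chip per 32-bit channel, times per-chip density.
def vram_options(bus_width_bits, chip_sizes_gb=(1, 2)):
    chips = bus_width_bits // 32  # one chip per 32-bit channel
    return {f"{size}GB chips": chips * size for size in chip_sizes_gb}

for bus in (192, 256, 384):
    print(f"{bus}-bit bus -> {vram_options(bus)}")

# 192-bit bus -> {'1GB chips': 6, '2GB chips': 12}   (e.g. 12GB RTX 4070)
# 256-bit bus -> {'1GB chips': 8, '2GB chips': 16}   (e.g. 16GB RX 7800 XT)
# 384-bit bus -> {'1GB chips': 12, '2GB chips': 24}  (e.g. 24GB 7900 XTX)
```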
 
Deleted member 2958512
I really don't have a budget, but I HATE wasting money.

Trust me: the real waste of money would be settling for an insufficient GPU that you'll be looking to get rid of after a few months.

If I can appreciate the difference, I will pay for it, but I don't want to pay 2x for a difference I will barely notice.

Correct me if I'm wrong, but to my understanding you can even afford a flagship GPU, so long as you're convinced it's worth it.

That's exactly why I'm recommending the 4090. You'll get up to 80% better performance compared to its previous-generation counterpart, the 3090.

Also, I can't believe people are still on the graphics VRAM myth.

It's not a myth if you get D3D fatal error CTDs from games. What good is bandwidth alone, when modern game titles require more than 12GB of VRAM, even at 1080p?
 