Question: RTX 4070 Ti Upgrade - Any other bottlenecks? Any better recommendation?

fnordtr0l

Honorable
Dear all,

I am looking for more FPS while playing COD and am thinking of upgrading my RTX 3070 to an RTX 4070 Ti.

Is a GPU upgrade worth it?
Is the considered GPU too small?
Do you see any other bottlenecks which should be addressed first?

Thank you!

My current setup (using this template):

APPROXIMATE PURCHASE DATE: soon

BUDGET RANGE: $1,000-1,500 after rebates

USAGE FROM MOST TO LEAST IMPORTANT: Gaming COD Warzone 2.0

CURRENT GPU AND POWER SUPPLY: RTX 3070 + BQT E9 700W PSU (I assume I need to upgrade to 850W or more, right?)

OTHER RELEVANT SYSTEM SPECS: MoBo: Asus Prime Z370-A; CPU: Intel Core i7-8700K (6 cores) @ 4600 MHz (46 x 100); RAM: 32 GB HyperX

MONITOR RESOLUTION: Samsung Neo G9, 5120x1440 @ 240 Hz

ADDITIONAL COMMENTS: Currently running Warzone 2.0 at 3840x1080 (render resolution), approx. 60-90 FPS

 

Aeacus

Titan
Ambassador
Is a GPU upgrade worth it?

Yes.
Comparison: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-3070-vs-Nvidia-RTX-4070-Ti/4083vs4146

Is the considered GPU too small?

"Small" by which metric? GPU dimensions? GPU performance? Something else?

I assume I need to upgrade to 850W or more, right?

For an RTX 4070 Ti, a 750W unit would be preferred. You have a 700W unit, so you "should" be safe.

Currently running Warzone 2.0 at 3840x1080 (render resolution), approx. 60-90 FPS

RTX 4070 Ti should give you ~90-120 FPS.
Review: https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4070-ti-review-a-costly-70-class-gpu/4

Do you see any other bottlenecks which should be addressed first?

What are your RAM DIMM count, frequency, and CAS latency? Currently, all we know is that you have 32GB of RAM in total.

But next in line, after the GPU upgrade, would be a new CPU-MoBo combo, since your CPU is getting old-ish.
 

fnordtr0l

Honorable
Thank you for your swift and helpful reply!

"Small" by which metric? GPU dimensions? GPU performance? Something else?
GPU performance in mid-/long-term.
The question is, what would be the next bigger option? An RTX 4080?

But next in line, after the GPU upgrade, would be a new CPU-MoBo combo, since your CPU is getting old-ish.
That was one of the main reasons why I asked here... I was worried that this is the real bottleneck and that an expensive GPU upgrade would not bring significantly better performance.

What are your RAM DIMM count, frequency, and CAS latency? Currently, all we know is that you have 32GB of RAM in total.
Correct. I do not have the data available right now, but judging from RAM usage it seems to be okay. In any case, I would not upgrade the RAM on its own.
 

Aeacus

Titan
Ambassador
GPU performance in mid-/long-term.
The question is, what would be the next bigger option? An RTX 4080?

The next step up would be the RX 7900 XTX, but the difference is small;
comparison: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4070-Ti-vs-AMD-RX-7900-XTX/4146vs4142

And after the Radeon, next in line would be the RTX 4080;
comparison: https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4070-Ti-vs-Nvidia-RTX-4080/4146vs4138

Judging GPU performance in the mid to long term requires seeing into the future. If you have that ability, take a look and see what the future brings. The rest of us common folk can only guess what is to come.

As for guessing, we can only look at what happened in the past and then make our assumptions based on that.
For example:
The GTX 1080 Ti (flagship of the Pascal architecture), released in Q1 2017, is still a viable GPU even by today's standards (6 years later). Sure, its 4K performance in current games isn't as good, but the GTX 1080 Ti is roughly equal to an RTX 2080/3060 Ti and thus is still a good GPU for 2K (1440p) gaming.

GPU viability also depends on your gaming preferences. If you play AAA titles and want high/ultra settings at all times with 100+ FPS, the RTX 4070 Ti may only last you a few years. But if you are willing to reduce the FPS target (e.g. ~60 FPS) and lower in-game quality, the GPU will be viable for far longer; I'd estimate 4-6 years. And if you don't play AAA titles, but instead small/casual/old games, you can easily get 10+ years out of an RTX 4070 Ti, especially if you don't mind ~60 FPS and/or reducing in-game quality.

With all that being said, no one can tell how long a certain GPU will remain viable. There is no cookie-cutter answer for that.
 

fnordtr0l

Honorable
Thank you, Aeacus, for your detailed answer; highly appreciated.
Then I will go for a 4070 Ti, with the RTX 4080 as an alternative, whichever I find for a good price.
Later, the PSU and CPU (+RAM).

Thanks a lot!
 
Take the time to run this simple test:
Run YOUR games, but lower your resolution and eye candy.
This makes the graphics card loaf a bit.
If your FPS increases, it indicates that your CPU is strong enough to drive a better graphics configuration.
If your FPS stays the same, you are likely more CPU limited.
I suspect that an i7-8700, as good as it is, is not appropriate for a 4000-series graphics card.
 
I think for the same money I'd look at the 7900 XT. Not as fast as the XTX, but I believe it still requires just a 750-watt power supply. If I recall correctly, they cut the memory bus down on the 4070 Ti, plus you are limited to 12GB of VRAM. I think the 4070 Ti does ray tracing better, but I think the extra VRAM may help the 7900 XT stay relevant longer over the long term.
 
I think the extra VRAM may help the 7900 XT stay relevant longer over the long term.
Nitpicking on this point, but I've heard this reasoning for the past decade and I've yet to see a case where it actually came true. And how much memory is absolutely needed before the GPU starts getting hiccups is still not well understood, as most metrics only report how much memory is reserved.
 
Well, a lot of titles will attempt to configure themselves automatically, so on a lot of systems the card could be turning down textures or something.

I just think about how games like Hogwarts Legacy can use almost 10GB of VRAM at 1080p; what about 1440p or 4K? You can say that game wasn't optimized well, but how many will work like that in the next few years? The card is likely fine today, but what about in 2-3 years, since the OP is saying this is a longer-term card? Just my 2 cents. Either card should be good, but that is probably my train of thought if I'm the buyer.
 
I just think about how games like Hogwarts Legacy can use almost 10GB of VRAM at 1080p; what about 1440p or 4K? You can say that game wasn't optimized well, but how many will work like that in the next few years? The card is likely fine today, but what about in 2-3 years, since the OP is saying this is a longer-term card? Just my 2 cents. Either card should be good, but that is probably my train of thought if I'm the buyer.
But again, we don't know how VRAM is actually being used. If you're using something like GPU-Z or MSI Afterburner to look at VRAM usage, it's only reporting how much VRAM is reserved for use by applications. As an example, let's look at this from an application point of view:
[Image: screenshot of per-process memory statistics]


The important memory statistics are:
  • Working Set: The working set of a process is the set of pages in the virtual address space of the process that are currently resident in physical memory. [1]
  • Commit Size: The amount of memory the OS has reserved for the process [2]
The thing is, from the "big picture" point of view, Commit Size counts as memory being used. So if you have a memory space of 16 GB and, for the sake of example, a commit size of 1GB, even if the working set is only 512MB, there's only 15GB of memory free for other things to use. It's a similar thing with VRAM reporting, only for whatever reason it's simplified such that there are only two states of memory: it's either used or not used. The point of reserving more memory than the application needs is that memory allocation is an expensive process. The OS has to figure out which sections of memory are free and do bookkeeping on it. It's much faster to hand out more memory than required so that if the application asks for even more, there's less overhead involved; it just gets it.
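To make the reserved-vs.-resident distinction a bit more concrete, here is a minimal, hypothetical sketch (my own illustration, Windows-only, not taken from any of the tools or posts above) that commits a large block of memory without touching it, then touches half of it. In Task Manager, the process's Commit size jumps after the first step, while the Working set only grows after the second:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Step 1: commit 1 GB of virtual memory but never touch it.
       "Commit size" grows by ~1 GB right away, while the
       "Working set" barely moves, since no physical pages are
       resident yet. */
    SIZE_T size = (SIZE_T)1 << 30; /* 1 GB */
    char *buf = (char *)VirtualAlloc(NULL, size,
                                     MEM_COMMIT | MEM_RESERVE,
                                     PAGE_READWRITE);
    if (!buf)
        return 1;

    printf("Committed 1 GB - compare Commit size vs. Working set, then press Enter\n");
    getchar();

    /* Step 2: touch the first 512 MB, one page at a time. Only now
       does the Working set grow, as the OS backs the touched pages
       with physical RAM; the Commit size stays the same. */
    for (SIZE_T i = 0; i < size / 2; i += 4096)
        buf[i] = 1;

    printf("Touched 512 MB - Working set grew, Commit size unchanged\n");
    getchar();

    VirtualFree(buf, 0, MEM_RELEASE);
    return 0;
}
```

VRAM reporting works on the same reserved-vs.-actually-touched principle, just with less granularity in most monitoring tools.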

Another thing to note is that some games will actually fill up VRAM to use as a cache. For instance, The Division 2 will happily eat up whatever it can of the card's VRAM. Call of Duty in some releases (I believe World War 2 had this) had an option literally called "Fill remaining VRAM".

There's also this series of tests from a Linus Tech Tips user: https://linustechtips.com/status/230140/ | https://linustechtips.com/status/230532/ . The tl;dr, at least for one of them, is that GTA V claimed it would use 1GB of VRAM on the lowest settings. It ended up eating 2GB after doing one run of the benchmark.

So the overall point is: VRAM usage in most reporting metrics simply means the game has access to that VRAM. Whether or not the game actually uses it in any meaningful way, such that it affects rendering performance, is something that neither you, nor I, nor anyone else who isn't part of the game's dev team will be able to figure out.
 

KyaraM

Admirable
Well, a lot of titles will attempt to configure themselves automatically, so on a lot of systems the card could be turning down textures or something.

I just think about how games like Hogwarts Legacy can use almost 10GB of VRAM at 1080p; what about 1440p or 4K? You can say that game wasn't optimized well, but how many will work like that in the next few years? The card is likely fine today, but what about in 2-3 years, since the OP is saying this is a longer-term card? Just my 2 cents. Either card should be good, but that is probably my train of thought if I'm the buyer.
Hogwarts Legacy reserves almost 10 GB of VRAM at 1080p (and at 1440p, btw; there is very little difference). It actually uses only about 8GB, or 9GB with Frame Generation, which makes sense. I checked that myself, and so can you. Just follow this guide here:
https://www.reddit.com/r/nvidia/comments/j1tm2t/psa_msi_afterburner_can_now_display_per_process/


PCGH also did a test just last month checking how different resolutions and settings affect VRAM usage. At no time did they reach 12 GB in any of the tested games at anything lower than 4K, and even that was with RT and Frame Generation active, so the absolute worst case. Also, the same test showed quite clearly that the "AMD has more VRAM so they will last longer" argument is absolute bs. AMD, and Intel for that matter, use more VRAM than Nvidia. Meaning that AMD needs the higher VRAM to even work properly; it's not future-proofing so much as necessity. If a 6950 XT with 16GB already uses 13.66GB of VRAM at 4K while a 4070 Ti uses 10.8 GB (see image linked below), then you can go figure how "future proof" that is compared to Nvidia (read: it is exactly the same). That has been known for years, yet apparently it still didn't register with most people. Also, even between generations, VRAM apparently gets used differently.

https://imgur.com/gallery/pTPuJLW


Here is a picture from the article, covering two games, to illustrate some of the findings. FPS were over 100 on average. See how the RTX 3080 uses more VRAM at 4K in these games than the 4070 Ti. I believe this might be connected to the bigger cache of the 4070 Ti, which is something that also plays into the picture and makes it a little hard to figure out what the future will look like with this card.

What I can tell you, however, and what every single benchmark shows, is this: up to 1440p, this card does not show any peculiarities despite the lowered memory bandwidth, and it is an absolute monster by every metric. In 4K, it breaks down a bit harder than other cards, but it still performs somewhere around the 3080 Ti level (down from 3090 Ti) due to the lower memory bandwidth. That is still stellar performance in my eyes, especially since you don't even play in 4K. And in some games, it still beats said 3090 Ti anyway. And on top of that, Frame Generation exists and is quite viable in HL.

So my conclusion is that the 4070 Ti is a very capable card, for 1440p especially, but also for 4K. Nothing small about the card either; it is a monster in terms of performance, size, and pretty much everything else. It will also stay very cool. My card runs OC'd to 2910MHz (+500 on the VRAM, too) at max 68°C in benchmarks, and under 60°C at pretty much all times in games. Some games don't even break 50°C.

@fnordtr0l this is also for you.
 
Have a look at this video regarding VRAM and the memory bus. Again, I'd consider the AMD card.

https://youtu.be/Iw_axwlehjU

And yet the RTX 4090 struggles to get 60 FPS... with DLSS performance mode outputting to 4K, so it's really rendering at 1080p. Considering the CPU usage is relatively low across all the cores, this tells me the game runs a lot of important things over a single thread.

Also, the memory allocation didn't go any higher than about 12GB in the run, so... I'm not convinced it's a VRAM issue, at least in that scene.

There's a follow-up video to this where the person tests on other CPUs, if you're curious.

Another point to add is that people keep thinking that once VRAM runs out, like the moment there's not a spare megabyte left on the card, the game's performance is going to tank. I argue that in modern rendering engines, that shouldn't really be the case. The only things that are immediately needed are the base-quality assets and the render targets. Plus, things are streamed in and out of VRAM all the time. This was the whole idea behind id's Megatexture technology: you don't need to have all of the texture data in VRAM. You only load what you need at any given time.
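To illustrate the streaming idea, here is a small, purely hypothetical sketch (my own, in C; not id's Megatexture code or any real engine's API) of a residency manager that keeps textures inside a fixed VRAM budget and evicts the least-recently-used one when something new has to be streamed in:

```c
#include <stdio.h>

#define VRAM_BUDGET_MB 256

typedef struct {
    const char *name;
    int size_mb;
    int resident;   /* 1 if currently in VRAM */
    long last_used; /* frame number of last use */
} Texture;

/* Evict least-recently-used resident textures until 'needed' MB fit. */
static void evict_until_fits(Texture *tex, int count, int *used_mb, int needed)
{
    while (*used_mb + needed > VRAM_BUDGET_MB) {
        int lru = -1;
        for (int i = 0; i < count; i++)
            if (tex[i].resident && (lru < 0 || tex[i].last_used < tex[lru].last_used))
                lru = i;
        if (lru < 0)
            return; /* nothing left to evict */
        tex[lru].resident = 0;
        *used_mb -= tex[lru].size_mb;
        printf("evict     %-14s (%3d MB)\n", tex[lru].name, tex[lru].size_mb);
    }
}

/* Called when a frame actually needs a texture: stream it in on demand. */
static void touch(Texture *tex, int count, int *used_mb, int idx, long frame)
{
    if (!tex[idx].resident) {
        evict_until_fits(tex, count, used_mb, tex[idx].size_mb);
        tex[idx].resident = 1;
        *used_mb += tex[idx].size_mb;
        printf("stream in %-14s (%3d MB), VRAM used: %d MB\n",
               tex[idx].name, tex[idx].size_mb, *used_mb);
    }
    tex[idx].last_used = frame;
}

int main(void)
{
    Texture tex[] = {
        {"rock_diffuse",  128, 0, 0},
        {"grass_diffuse",  96, 0, 0},
        {"castle_wall",   128, 0, 0},
    };
    int used_mb = 0;

    /* Frame 1 uses rock + grass; frame 2 suddenly needs the castle wall,
       so the least-recently-used texture is evicted to fit the budget. */
    touch(tex, 3, &used_mb, 0, 1);
    touch(tex, 3, &used_mb, 1, 1);
    touch(tex, 3, &used_mb, 2, 2);
    return 0;
}
```

Real engines do this per mip level or per tile and asynchronously, which is the reason running near the VRAM limit doesn't automatically mean performance falls off a cliff.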
 
With Hogwarts Legacy, perhaps they didn't optimize the game. But in the video I posted, he shows the 4070 Ti being a bit stuttery. The A770 he used had low FPS but was relatively smooth and not as jumpy. You wonder, if it had a GPU similar to the 4070 Ti, how performance would have been with the expanded VRAM and memory bus.
 

KyaraM

Admirable
Have a look at this video regarding VRAM and the memory bus. Again, I'd consider the AMD card.

https://youtu.be/Iw_axwlehjU
After watching that dude's video about the rumors around the 4070 non-Ti, I would rather never watch another video from that guy again than send back my card, thanks. He doesn't even know what he is talking about...

And yet the RTX 4090 struggles to get 60 FPS... with DLSS performance mode outputting to 4K, so it's really rendering at 1080p. Considering the CPU usage is relatively low across all the cores, this tells me the game runs a lot of important things over a single thread. Also, the memory allocation didn't go any higher than about 12GB in the run, so... I'm not convinced it's a VRAM issue, at least in that scene. There's a follow-up video to this where the person tests on other CPUs, if you're curious.
Another point to add is that people keep thinking that once VRAM runs out, like the moment there's not a spare megabyte left on the card, the game's performance is going to tank. I argue that in modern rendering engines, that shouldn't really be the case. The only things that are immediately needed are the base-quality assets and the render targets. Plus, things are streamed in and out of VRAM all the time. This was the whole idea behind id's Megatexture technology: you don't need to have all of the texture data in VRAM. You only load what you need at any given time.

It isn't a VRAM problem. They basically already confirmed that the game has issues with Nvidia cards. Why, I can't tell, but they have already shipped at least two fixes specifically for them. One targeted the game not picking up the Nvidia storage manager, another addressed generally low FPS on Nvidia cards only. However, even in Hogsmeade, I got no stutters this weekend on my 4070 Ti. There are a few spots in Hogwarts castle that are rough, but they are very reproducible, always in the same spot. I play with RT lighting on Ultra. Native, Hogsmeade gets me 47-77 FPS, averaging 61, at 1440p. DLSS averages 90-95. Frame Generation gives 122. All three feel smooth, but Frame Generation is obviously the smoothest.

Btw, the videos are from launch day, LONG BEFORE any fixes to the game. They do NOT give accurate information anymore!

With Hogwarts Legacy, perhaps they didn't optimize the game. But in the video I posted, he shows the 4070 Ti being a bit stuttery. The A770 he used had low FPS but was relatively smooth and not as jumpy. You wonder, if it had a GPU similar to the 4070 Ti, how performance would have been with the expanded VRAM and memory bus.
See above. They had issues with Nvidia cards specifically, and they have already worked on fixing it. Not optimal yet, but vastly better than before. It runs rather well for me, even on a 3070 Ti, btw. And it's not just the 4070 Ti, it is ALL Nvidia cards. The game doesn't like them.
 
I think his point with the 4070 non-Ti is that it's supposed to have 16GB of VRAM, so he thinks it may be more viable long-term. It's good they are fixing issues, but if the OP is planning for this to be, say, a 5-year card, who knows if they'll issue a fix for newer games when they're worried about a 6070 Ti by then, for example.

Anyway, the 4070 Ti isn't a bad card; I just wish it were a bit cheaper.
 

KyaraM

Admirable
I think his point with the 4070 non-Ti is that it's supposed to have 16GB of VRAM, so he thinks it may be more viable long-term. It's good they are fixing issues, but if the OP is planning for this to be, say, a 5-year card, who knows if they'll issue a fix for newer games when they're worried about a 6070 Ti by then, for example.

Anyway, the 4070 Ti isn't a bad card; I just wish it were a bit cheaper.
And yet, he fails to understand even the simplest things about VRAM usage between AMD, Nvidia, and now Intel... Btw, I don't consider the 3060 a great example here, simply because doubling the VRAM is something else than increasing it by a third, and because, unlike the 4070 Ti and the assumed 16GB 4070, it is, at the end of the day, still the same card apart from the VRAM. If that even happens, which I doubt. The non-Ti will not even have the same number of GPU cores etc., making it highly unlikely to approach the 4070 Ti in any meaningful way just because of VRAM, or be "more viable". VRAM alone simply doesn't bump up a card that hard.
 
I agree with ohio_buckeye that you should be looking at the RX 7900-series over the RTX 40-series. He's right about the VRAM limitation but there's an even bigger reason than that. It's a well-known fact that COD strongly prefers Radeon hardware to GeForce hardware:

[Chart: Call of Duty benchmarks at 4K]

[Chart: Call of Duty benchmarks at 1440p]

Like, seriously, it's not even close with the RX 7900 XT easily outpacing even the RTX 4090 at 1440p and doing the same to the RTX 4080 at 4K.

Unlike The Outer Worlds, Call of Duty: Modern Warfare II doesn't favor GeForce GPUs but quite the opposite. Here the Radeon 7900 XT is a massive 32% faster than the 4070 Ti at 1440p, despite the 4070 Ti being just 7% slower than the 4080.
- Techspot RTX 4070 Ti review: January 4th, 2023

Your best-case scenario here is that you play at 4K, where the RX 7900 XT only destroys the RTX 4070 Ti by 27%. If you play at 1440p, the RX 7900 XT is 32% faster. At 1080p, the difference will likely be even larger; even though they haven't tested that resolution, the trend is that the lower you go, the bigger the gap between the Radeon and GeForce cards.

The RX 7900 XT used to be $100 more expensive than the RTX 4070 Ti, but it has very recently received a $100 price cut, which makes it the same price as the RTX 4070 Ti. If your top game is a COD title and you want more FPS, it would be insane to buy an RTX 4070 Ti over the RX 7900 XT. That's just the truth.
 
I'm no expert by any means; I'm just speaking from a practical perspective. When I look at the specs and the price, knowing these cards should be comparable since they're in the same price bracket, and if I knew I was keeping the card for, say, 5 years, I'd want to look at raw specs like VRAM, memory bus, etc.

The 4070 Ti isn't a bad card, but if this is, for example, a 5-year card and I'm the shopper, I want to be able to say at the end of that time which hardware is still viable, or which card had the best chance of giving the highest level of performance over that time frame.

At some point the 7900 XT and 4070 Ti will no longer be supported parts, and which of them runs better by then could come down to brute-force hardware. Anyway, I've said enough on this thread. But for today, it's going to be hard to make a bad choice here, really.
 

fnordtr0l

Honorable
Thank you all for your comments; I really had not given the AMD cards proper consideration.
But as 90% of my GPU usage will be COD, the numbers speak for themselves.
I guess I will go for an RX 7900 XT or RX 7900 XTX.
Thank you all!

Update: I remembered why I did not consider AMD. The issue is the lack of support for machine learning libraries. :-(