Nvidia RTX 5070 vs RTX 5060 Ti 16GB — less VRAM but much better performance

I hadn't looked at GPU street prices in a while, so I thought I'd check where a 4070 Ti/Super/Ti Super lands for comparison, and WOW. Get the 5070; even at the $679 lowest price I could find on Newegg it's the better deal. What a time to not need a system upgrade.
 
You can always set the power limit. Power falls off more quickly than performance; you could probably run at 200W and keep about 90% of the performance.
Oftentimes the perf drop-off is even less than that, depending on the workload. Almost no game pushes every part of the GPU to 100%, and therefore to the full 250W. I monitor wattage on my cards, and at full gaming utilization they're virtually never at the TGP; more like 80% of it. Now I'm off to play with power limits and voltage curves, so thanks! It's raining/snowing at the moment and I was looking for something to do.
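For anyone who wants to see where their own card sits relative to its limit, here's a rough Python sketch that just shells out to nvidia-smi (assumes an NVIDIA card with nvidia-smi on the PATH; the 200W cap at the end is only an example and needs admin rights):

import subprocess

# Read the current power draw and enforced power limit for GPU 0 via nvidia-smi.
out = subprocess.run(
    ["nvidia-smi", "-i", "0",
     "--query-gpu=power.draw,power.limit",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip()
draw_w, limit_w = (float(x) for x in out.split(", "))
print(f"Drawing {draw_w:.0f}W of a {limit_w:.0f}W limit ({draw_w / limit_w:.0%})")

# Example only: cap the board at 200W. Needs admin/root, and the value must be
# inside the min/max range nvidia-smi reports for your card.
# subprocess.run(["nvidia-smi", "-i", "0", "-pl", "200"], check=True)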
 
During the summer I often run my 3080 Ti at 80% power just to help with the heat output. I can't tell the difference between 350W and 280W in actual gaming (and I can't really tell the difference in how warm the room gets either, but it must be happening).

Also seeing very promising undervolt results from AMD's 9000 series. Like a 200MHz jump just from dropping the voltage a little.
 
During the summer I often run my 3080 Ti at 80% power just to help with the heat output. I can't tell the difference between 350W and 280W in actual gaming (and I can't really tell the difference in how warm the room gets either, but it must be happening).

Also seeing very promising undervolt results from AMD's 9000 series. Like a 200MHz jump just from dropping the voltage a little.
Just messing around on a 4070 Ti: I dropped the power limit by 35%, and in 3DMark Speedway my score only dropped a tick over 8%, at 185W no less. So even a simple power-limit tweak in Afterburner brings massive consumption drops. I hadn't messed around much with this since my old 1070-powered laptop. I'll definitely be doing this once it gets hot in a month or so.
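Putting rough numbers on that, assuming the 4070 Ti's stock limit is about 285W (which is what a 35% cut landing at ~185W implies):

# Back-of-the-envelope numbers for the power-limit experiment above.
# 285W stock limit is an assumption (roughly the 4070 Ti TGP); the 35% cut
# and ~8% score drop are from the post.
stock_limit_w = 285
limited_w = stock_limit_w * (1 - 0.35)   # ~185W, matches the observed draw
perf_retained = 1 - 0.08                 # score dropped a bit over 8%

efficiency_gain = perf_retained / (1 - 0.35) - 1
print(f"{limited_w:.0f}W limit, {perf_retained:.0%} of stock performance")
print(f"Perf-per-watt improvement: ~{efficiency_gain:.0%}")   # about +42%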
 
Looking at the gaming numbers, wouldn't a more logical comparison be the B580, since the 5060 is clearly a 1080p card (not even 60 fps in most games at 1440p, so even with frame gen YMMV)?

In Europe a B580 can be had for a little more than €300 including sales tax.

Comparing it to the 5070 makes it look like kind of OK value when, to my mind, it's really quite overpriced.
 
So unless you specifically need a card with 16GB, perhaps to run a particular LLM, the RTX 5070 offers the better overall value right now, and will probably continue to do so in the future.

For a card retailing for $600 or even $700 (current sold-by-Newegg prices) it should have at least 16GB of VRAM, and I wouldn't recommend it to anyone, at least not until we see whether Nvidia's supposedly ultra-efficient new memory compression tech becomes widespread and lives up to the hype enough to make 12GB a non-issue.

If someone absolutely NEEDED a GPU right now, those were the only choices, and cost and potential longevity weren't an issue, then sure, 5070 over 5060 Ti. But unless you're in that position, I'd say hold off.
 
For a card retailing for $600 or even $700 (current sold-by-Newegg prices) it should have at least 16GB of VRAM, and I wouldn't recommend it to anyone, at least not until we see whether Nvidia's supposedly ultra-efficient new memory compression tech becomes widespread and lives up to the hype enough to make 12GB a non-issue.

If someone absolutely NEEDED a GPU right now, those were the only choices, and cost and potential longevity weren't an issue, then sure, 5070 over 5060 Ti. But unless you're in that position, I'd say hold off.
If you "need" a 16GB card, great. But we're talking about a 30% performance delta — MORE at 4K than at 1440p ultra. Even with games that can push up against 12GB. Given you can run 4K with DLSS quality upscaling (renders at 1440p), that basically kills off the need for 16GB in most games. Would 16GB be nicer to have? Sure. So would 18GB! (Using 3GB GDDR7 chips.) But right now, "saving" $100 to get 4GB more VRAM but 33% less bandwidth and 26% less compute is a bad choice for nearly all buyers.

16GB is most practically useful right now if you want to run one of the LLMs that exceeds 12GB. That's easy enough to do. It's why the price jump between 5070 and 5070 Ti is so much higher, and even why the 9070 costs $100 more than the 5070 at current online (US) prices.

People love to pretend that there's a "need" for 16GB for gaming, and there's absolutely not. There are individual games that need 16GB to run certain settings, almost always at 4K native with maxed out settings. But that's mostly the placebo effect, because 4K high will look 99% the same and not exceed 12GB.

TLDR: There is no concrete rule on how much VRAM you need. More is better, everything else being equal, but in this case there's nothing even remotely close to equivalence between a 5070 12GB and a 5060 Ti 16GB.
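For anyone wondering where those percentages come from, here's the back-of-the-envelope version using the commonly published specs (treat the exact figures as approximate):

# Rough spec comparison, RTX 5070 vs RTX 5060 Ti 16GB (published figures;
# exact compute deltas shift a little depending on which clocks you use).
bw_5070, bw_5060ti = 672, 448            # GB/s: 192-bit vs 128-bit GDDR7 @ 28Gbps
cores_5070, cores_5060ti = 6144, 4608    # CUDA cores

print(f"5060 Ti bandwidth deficit: {1 - bw_5060ti / bw_5070:.0%}")        # ~33%
print(f"5060 Ti shader deficit:    {1 - cores_5060ti / cores_5070:.0%}")  # ~25%, ~26% once clocks factor in

# DLSS Quality renders at two-thirds of the output resolution per axis,
# so 4K output with DLSS Quality is internally a 1440p render.
out_w, out_h = 3840, 2160
print(f"4K DLSS Quality internal render: {round(out_w * 2 / 3)}x{round(out_h * 2 / 3)}")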
 
If you "need" a 16GB card, great. But we're talking about a 30% performance delta — MORE at 4K than at 1440p ultra. Even with games that can push up against 12GB. Given you can run 4K with DLSS quality upscaling (renders at 1440p), that basically kills off the need for 16GB in most games. Would 16GB be nicer to have? Sure. So would 18GB! (Using 3GB GDDR7 chips.) But right now, "saving" $100 to get 4GB more VRAM but 33% less bandwidth and 26% less compute is a bad choice for nearly all buyers.

16GB is most practically useful right now if you want to run one of the LLMs that exceeds 12GB. That's easy enough to do. It's why the price jump between 5070 and 5070 Ti is so much higher, and even why the 9070 costs $100 more than the 5070 at current online (US) prices.

People love to pretend that there's a "need" for 16GB for gaming, and there's absolutely not. There are individual games that need 16GB to run certain settings, almost always at 4K native with maxed out settings. But that's mostly the placebo effect, because 4K high will look 99% the same and not exceed 12GB.

TLDR: There is no concrete rule on how much VRAM you need. More is better, everything else being equal, but in this case there's nothing even remotely close to equivalence between a 5070 12GB and a 5060 Ti 16GB.
While I agree with the fine article's conclusion, and with the sentiment of your post, Horizon: Forbidden West will indeed chew through to 16GB of VRAM usage at 4K with DLSS, and in my experience it's the 1% lows to look at, not the average. Game fluidity takes a big dump as you brush up against the VRAM limit. Additionally, many games automatically pull back on render distance and/or texture resolution to fit the frame buffer, visibly blurring the image, sometimes with ridiculous pop-in. This won't show up in numerical results; you have to play the game to see it, as it's not always present.

Not to be too forward, but may I suggest publishing VRAM usage in the charts? It's a hot topic and I think many would love to see it. As always, thanks for the work you do, and for your responsiveness in the comments. It is indeed a rare thing in a forum.
 
WARNING on the 5060 Ti: do NOT be duped into buying the 5060 Ti 8GB! It loses far more performance than you would think from just having a lower VRAM amount than the 16GB version. So if you find what you think is a "deal" on a 5060 Ti, make sure you check the VRAM amount!
Yup, it's pretty bad. Just finished watching this (not normally a HUB fan), and even at 1080p there's badness.

View: https://www.youtube.com/watch?v=AdZoa6Gzl6s&t=835s
 
Oftentimes the perf drop-off is even less than that, depending on the workload. Almost no game pushes every part of the GPU to 100%, and therefore to the full 250W. I monitor wattage on my cards, and at full gaming utilization they're virtually never at the TGP; more like 80% of it. Now I'm off to play with power limits and voltage curves, so thanks! It's raining/snowing at the moment and I was looking for something to do.

Odd, my 4070 Super always hit 220W.
 
Odd, my 4070 Super always hit 220W.
Depends on the game. In benchmarks my Ti will typically sit around 280-285W, but in games it floats around quite a bit despite being at 99% utilization. Different parts of the GPU are doing different things; if one of those things (say, DLSS on the tensor cores) isn't being used, wattage can drop.

(Edit: to add an example) Currently running Enshrouded: 3000MHz, 99% utilization, 90 FPS, 210W. During the day/night cycle, because of the type of lighting used, power consumption rises at dawn/dusk while the frame rate drops, still at 99% utilization. I see similar behavior in all my other games, and on my other systems.
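If anyone wants to watch that fluctuation for themselves, a quick logging loop (again just wrapping nvidia-smi, GPU 0 assumed) shows that 99% utilization doesn't mean a fixed wattage:

import subprocess, time

# Log power draw and reported GPU utilization once a second for a minute (GPU 0).
for _ in range(60):
    draw, util = subprocess.run(
        ["nvidia-smi", "-i", "0",
         "--query-gpu=power.draw,utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip().split(", ")
    print(f"{float(draw):6.1f} W at {util}% utilization")
    time.sleep(1)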
 
While I agree with the fine article's conclusion, and with the sentiment of your post, Horizon: Forbidden West will indeed chew through to 16GB of VRAM usage at 4K with DLSS, and in my experience it's the 1% lows to look at, not the average. Game fluidity takes a big dump as you brush up against the VRAM limit.
Are there specific areas where this happens? Does it only happen over longer play sessions? Because my testing (which may not be representative of other areas of the game, and that's why I ask) generally doesn't show 12GB as being a problem, even at 4K native with very high or ultra or whatever settings. (Whatever the max is, I think very high... which is probably wrong in my charts. LOL)
 
I hadn't looked at GPU street prices in a while, so I thought I'd check where a 4070 Ti/Super/Ti Super lands for comparison, and WOW. Get the 5070; even at the $679 lowest price I could find on Newegg it's the better deal. What a time to not need a system upgrade.
What irks me is that pretty much every other part of a computer you might need is readily available at normal prices. It's just the GPU market that's stupidly overpriced when cards are even available. I have a nephew who wants to build a gaming computer, and everything else but the video card comes to $500. I just had him order the other parts and will let him use my old RX 580 until the video card market stops being dumb.
 
If you "need" a 16GB card, great. But we're talking about a 30% performance delta — MORE at 4K than at 1440p ultra. Even with games that can push up against 12GB. Given you can run 4K with DLSS quality upscaling (renders at 1440p), that basically kills off the need for 16GB in most games. Would 16GB be nicer to have? Sure. So would 18GB! (Using 3GB GDDR7 chips.) But right now, "saving" $100 to get 4GB more VRAM but 33% less bandwidth and 26% less compute is a bad choice for nearly all buyers.

16GB is most practically useful right now if you want to run one of the LLMs that exceeds 12GB. That's easy enough to do. It's why the price jump between 5070 and 5070 Ti is so much higher, and even why the 9070 costs $100 more than the 5070 at current online (US) prices.

People love to pretend that there's a "need" for 16GB for gaming, and there's absolutely not. There are individual games that need 16GB to run certain settings, almost always at 4K native with maxed out settings. But that's mostly the placebo effect, because 4K high will look 99% the same and not exceed 12GB.

TLDR: There is no concrete rule on how much VRAM you need. More is better, everything else being equal, but in this case there's nothing even remotely close to equivalence between a 5070 12GB and a 5060 Ti 16GB.

The biggest reason is that this is a card costing upwards of $700, so while you may not need it right now, what about in two years, especially if they're going to be leaning more heavily on AI and frame generation? With a component that costs as much as the rest of your system combined these days, most people have to think longer term than the next generation, and I could not in good conscience recommend a $700 midrange GPU with 12GB of VRAM to anyone. I'd tell them to spend a bit more on the 5070 Ti (which is what the 5070 should have been), or wait for the next generation, OR for something to happen that drops the price to $500.
 
Are there specific areas where this happens? Does it only happen over longer play sessions? Because my testing (which may not be representative of other areas of the game, and that's why I ask) generally doesn't show 12GB as being a problem, even at 4K native with very high or ultra or whatever settings. (Whatever the max is, I think very high... which is probably wrong in my charts. LOL)
Plainsong, in the village area. Max settings, no dynamic scaling, DLSS Quality. Even right now on my monitor at 1440p (vs. on my 4K TV at these settings, where I would smash right into system RAM and get the jitters) I'm sitting at 11.4GB used (Afterburner/RivaTuner), driver 572.83.
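(If you want roughly the same readout without Afterburner, nvidia-smi exposes it too; a tiny sketch, again assuming GPU 0:)

import subprocess

# Dedicated VRAM in use vs. total for GPU 0, reported by nvidia-smi in MiB.
used_mib, total_mib = subprocess.run(
    ["nvidia-smi", "-i", "0",
     "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout.strip().split(", ")
print(f"VRAM: {float(used_mib) / 1024:.1f} / {float(total_mib) / 1024:.1f} GiB")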

(Edit: to add) Also consider the following from Hardware Unboxed. It's focused on the 8GB 5060 Ti variant, but the breadth of the data is there. Lots of titles are up against 12GB now, and well beyond 8GB, some even at 1080p. Granted, the video displays some confirmation bias, but it does prove the point. 8GB is nearly dead now and 12GB is in the crosshairs... but as you said, it depends on the game and settings.

View: https://www.youtube.com/watch?v=AdZoa6Gzl6s&t=974s
 
Let's be real here. The only reason to get a 5070 is because...
9070/XT, 7800/XT, 7900/XTX, 4070/Ti/Super, 4080/Ti/Super, 5070Ti and 5080 are all Sold Out.
And your only options are 5060Ti 16GB or 5070.

It's too bad these chips don't spoil because a supermarket would slap a 30% discount sticker on them by the end of the day.
 
Let's be real here. The only reason to get a 5070 is because...
9070/XT, 7800/XT, 7900/XTX, 4070/Ti/Super, 4080/Ti/Super, 5070Ti and 5080 are all Sold Out.
And your only options are 5060Ti 16GB or 5070.

It's too bad these chips don't spoil because a supermarket would slap a 30% discount sticker on them by the end of the day.
Way back, in more sane times, last-gen GPU prices would tank when the new stuff came out. These days PC components are sold like a commodity, unfortunately.

As to the 30% off sticker, hilariously my local grocery store uses a "Make me tonite!" sticker to indicate last gen chicken breasts that are on sale...