Nvidia GeForce RTX 4070 Review: Mainstream Ada Arrives


umeng2002_2

Commendable
Jan 10, 2022
186
169
1,770
For 2560x1440 or 3440x1440, I wouldn't advise against the 4070 or 4070 Ti. The memory bandwidth and capacity are a concern for 4K. DLSS 2 and 3 do MITIGATE it SOMEWHAT. DLSS 3 is better at alleviating VRAM starvation than DLSS 2, because VRAM starvation causes stutters that DLSS 3 can cut by more than half, while DLSS 2 is purely a resolution-dependent solution.
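
For anyone wondering why DLSS eases the memory pressure at all: it renders internally at a lower resolution and upscales to the output. Here's a rough Python sketch using the commonly cited per-axis scaling factors for the DLSS 2 quality modes (those factors and the helper function are my own assumptions for illustration, not anything from the review):

```python
# Rough sketch: DLSS renders internally at a lower resolution and upscales,
# which is where the bandwidth/VRAM relief comes from. The per-axis scaling
# factors below are the commonly cited values for DLSS 2 modes (assumed).

DLSS_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def render_resolution(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Internal render resolution for a given output resolution and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = render_resolution(3840, 2160, mode)
    saving = 1 - (w * h) / (3840 * 2160)
    print(f"4K {mode}: renders at {w}x{h} (~{saving:.0%} fewer pixels per frame)")
```

Quality mode alone cuts the rendered pixel count by a bit more than half at 4K; frame generation in DLSS 3 is a separate mechanism layered on top of that.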
 

oofdragon

Honorable
Oct 14, 2017
241
234
10,960
I'm really curious to see if they're going to launch the 60-class cards with only 8GB when even their own previous gen had 12GB. I guess the 4060 Ti at 10GB would not be, like, that bad, but man, Nvidia is really pushing it this time. Oh, if only people would wake up and realize RT and FG are gimmicks...
 
Apr 24, 2023
4
0
10
The "worry" about a Bitcoin rally is completely misplaced. GPU mining is dead and buried and a sport of those in China with ultra-low power and infrastructure costs. Most importantly, there is simply no GPU-minable coin that will EVER have market cap like Ethereum's was and market cap is what drives profitability and ergo, mining for profit. GPU mining will never be an issue again in the broader GPU market, nor will "AI" uses, that's complete nonsense.
 

Dantte

Distinguished
Jul 15, 2011
162
59
18,760
Here's the funny thing: name me some monitors that you know support DSC. It has been around for a while, yes, but it actually hasn't been used much. The Samsung Odyssey Neo G8 is one of the first I've encountered with DSC support. So I guess anything that has existed as a standard and not been used until recently is "yesterday's technology." We could also rephrase your rephrasing to this: "I really need to buy 'future proof' hardware with support for standards that barely exist, in the hope that some day over the 3-year life of this product I'll actually be able to use that tech!" By the time monitors and displays that can actually benefit from DP2.1 are available at not-insane prices, we'll be on to the next generation of GPUs, and I'm positive Nvidia will finally leave DP1.4 behind with Blackwell. (Though I was almost certain that would happen with Ada, and that ended up being wrong...)

I'm not opposed to DisplayPort 2.1 support on AMD's cards, but it's truly not a big deal. Only a very limited number of monitors, which just came out in the past six months, even approach the limits of DisplayPort 1.4a with DSC (as in, 4K and 240Hz). If 8K actually takes off (doubtful in the next five years), maybe we'll see displays that truly need DP2.1 bandwidth: 8K and 120Hz. Now read that again. 8K. 120Hz.
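
Rough back-of-the-envelope on those bandwidth claims, as a Python sketch. The effective link rates are the usual post-encoding figures (DP1.4a HBR3 ~25.92 Gbps, DP2.1 UHBR20 ~77.37 Gbps), and blanking overhead is ignored, so treat the numbers as approximate rather than spec-exact:

```python
# Uncompressed video bandwidth vs. effective DisplayPort link rates.
# 10-bit RGB = 30 bits per pixel; blanking overhead ignored (real requirements
# are somewhat higher). Link rates are post-encoding effective figures.

def video_gbps(width, height, refresh_hz, bits_per_pixel=30):
    """Uncompressed video bandwidth in Gbps."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

LINKS = {"DP1.4a (HBR3 x4)": 25.92, "DP2.1 (UHBR20 x4)": 77.37}
MODES = {"4K 240Hz": (3840, 2160, 240), "8K 120Hz": (7680, 4320, 120)}

for name, (w, h, hz) in MODES.items():
    need = video_gbps(w, h, hz)
    print(f"{name}: ~{need:.1f} Gbps uncompressed")
    for link, cap in LINKS.items():
        verdict = "fits" if need <= cap else f"needs ~{need / cap:.1f}:1 compression"
        print(f"  {link} ({cap} Gbps): {verdict}")
```

That works out to roughly 60 Gbps for 4K 240Hz and 120 Gbps for 8K 120Hz uncompressed, which is why the former is a DSC question on DP1.4a and the latter is the first mode that genuinely leans on DP2.1.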

Do you know how much high-fps 8K content there is? Practically none, and what does exist is mainly for marketing purposes. 8K at 60 fps is barely a thing, and it's already supported by DP1.4a. 8K at 120 fps is the practical limit of what we might ever really need; we could conceivably get to 120Hz video and gaming some day, but it's a massive stretch. Gaming at 8K and 120Hz? That's 4X the pixels of 4K, and even the fastest GPUs, using Frame Generation, rarely get above 120Hz at 4K.

8K is basically a living room standard for the placebo effect. 8K for movie theaters with 80-foot screens, sure, but on a 120-inch projection viewed from 20 feet away? Your eyes literally can't see the difference. 8K on a screen that's three feet away? Sounds like great marketing material. We could call it the "Retina 8K display" and get all the Apple users to buy it, and then run at 200% scaling. "OMG, the pixels are so small that I can't see them and need a magnifier!" That's already basically true of 4K on a 27-inch monitor, like the one I'm using right now (and have been using for eight years).
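
If you want to put numbers on the viewing-distance point, here's a quick angular-resolution sketch in Python. The 60 pixels-per-degree figure for 20/20 acuity is a common rule of thumb, and the screen size and distance are the ones from the example above:

```python
# Pixels per degree of visual angle for a flat 16:9 screen, to sanity-check
# whether 8K is visible at living-room distances. ~60 ppd is a common rule
# of thumb for 20/20 acuity.
import math

def pixels_per_degree(h_pixels, diag_inches, distance_inches, aspect=16/9):
    """Horizontal pixels per degree of visual angle."""
    width = diag_inches * aspect / math.hypot(aspect, 1)  # screen width, inches
    fov_deg = 2 * math.degrees(math.atan(width / 2 / distance_inches))
    return h_pixels / fov_deg

for label, h_px in [("4K", 3840), ("8K", 7680)]:
    ppd = pixels_per_degree(h_px, diag_inches=120, distance_inches=20 * 12)
    print(f'{label} on a 120" screen at 20 ft: ~{ppd:.0f} ppd (20/20 acuity ~60 ppd)')
```

4K already lands around 150+ pixels per degree at that distance, well past what 20/20 vision resolves, which is the whole point.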

Calling DSC "trickery in video compression" could apply to every video codec ever released. Well, except DSC isn't a codec; it's a very high-throughput algorithm that, depending on the screen content, can deliver either lossless compression (on a lot of stuff) or up to 3:1 compression with minimal artifacting. And I really do mean that "minimal" bit. Good luck spotting it. It's not at all like 4:2:2 or 4:2:0 "compression," where the result is lossy and you might actually notice (maybe, depending on the implementation: RTX GPUs and RX 5000 and later look fine; the GTX 10-series, not so much). The only real problem with DSC is that if you get any signal corruption, even a single flipped bit, it can seriously screw up transmission. That's probably why it hasn't been used until now.
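
And to sanity-check that 3:1 figure against the modes discussed above, here's a quick sketch of the compressed bits-per-pixel target DSC would have to hit to squeeze each mode into a DP1.4a link (same caveats as before: blanking ignored, approximate effective link rate):

```python
# What compressed bits-per-pixel target would DSC need to fit each mode into
# DP1.4a's ~25.92 Gbps effective bandwidth, starting from 10-bit RGB (30 bpp)?

LINK_GBPS = 25.92   # DP1.4a HBR3 x4, effective rate after 8b/10b encoding
SOURCE_BPP = 30     # 10-bit RGB

for name, (w, h, hz) in {"4K 240Hz": (3840, 2160, 240),
                         "8K 120Hz": (7680, 4320, 120)}.items():
    pixel_rate = w * h * hz                    # pixels per second
    target_bpp = LINK_GBPS * 1e9 / pixel_rate  # bpp budget that fits the link
    ratio = SOURCE_BPP / target_bpp
    verdict = "within" if ratio <= 3 else "beyond"
    print(f"{name}: ~{target_bpp:.1f} bpp target (~{ratio:.1f}:1), {verdict} DSC's 3:1")
```

4K 240Hz needs roughly 2.3:1, comfortably under the ceiling; 8K 120Hz would need about 4.6:1, which is the one scenario where DP2.1 bandwidth genuinely matters.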
You're kind of proving my point for me... 1.4 is good for the past, and just good enough for present-day technology. At $1,000 for a single component, I want it to be good for future tech as well. At least if I get a card, 5000-series??? with DP2.1, I know it's probably good for the next 5 years, because not only will it support the latest standards, it will also have more power; but if I bought a 4000 today, it's probably going to need another upgrade in the next 2 years or so. Look at it another way: waiting for the next-gen card will double my investment in the future.
 
The only way an RTX 40-series card becomes "outdated," or whatever you want to call it, is if you actually upgrade your monitor to something that uses DP2 in a way that truly benefits from it. I've explained why, right now, there aren't really any good monitor choices that require DP2.

So let's say you buy a 4080 or 4090. That means you should have at least a 4K 120Hz display to use it with. And if you have that, and then over the next year we get more DP2.1 displays that can do 4K 240Hz... well, assuming those displays also support DSC, a DP1.4a card can still drive them at 4K 240Hz. But would you even upgrade to a 4K 240Hz display from a 4K 144Hz display? Not unless the old display dies, I'd wager.

I'm still running several 4K 60Hz displays, and they're fine for everyday use. In gaming, I sometimes notice their lack of >60Hz refresh rates, but I'm far more likely to notice the <60 fps performance from most GPUs at 4K. We would need GPUs that can drive 4K at 240 fps for 240Hz monitors to really matter. This is definitely a case of bigger numbers being marketing rather than something most of us need or would even really benefit from. Unless you're viewing 4K 240fps videos, maybe?
 

Deleted member 2838871

Guest

Here's my take on the situation.

1) The 4090 is part of my new high-end build
2) I play on a 4K 120Hz OLED that I set to 60Hz
3) I play at 60Hz because I personally don't see the difference at 120Hz
4) That's probably a good thing, because the 4090 won't do 120 fps at 4K Ultra in the latest AAA titles

So basically I'm confident that I'll be good to go for quite a long time, because my 4090 PC has no problem hitting the 4K Ultra, 60 fps settings I like to play at, so I'm not sure I see the point of future upgrades.

The 3090 couldn't do it... but the 4090 can... 4K Ultra 60 fps.

As for 120 fps+... I've tested it in titles like Hogwarts and The Last of Us, and my system was pushing 90-100 fps at 4K Ultra, so I guess the people who buy 4K 144Hz+ displays are running at lower settings.
 

Dantte

Distinguished
Jul 15, 2011
162
59
18,760
Yes, I run games at much lower settings all the time to gain frame rate in first-person shooters. There was a bug, which might still exist, in Unreal Engine 4 that tied firing rate to frame rate, so "potato mode" plus maxing out the frame rate was an advantage there. I still want to be able to turn up the graphics for single-player games, too.

To Jarred's point, I tend to upgrade monitors often; next on the list will be the new G9 dual-4K (32:9) 240Hz. I can run it on my current setup now with compression, but my frame rates would suck without dumbing down the graphics. So why would I spend $1,000+ now on a new card for a few more frames and still handicap myself, instead of just waiting for the next gen? Next gen, I assume, will not only support DP2.1 but also have more power than the current gen: double the bang for my $$$. "Investing in the future."