News Nvidia's midrange GPUs through the years revisited — pitting the RTX 5070 versus the 4070, 3070 and 2070 in an all-encompassing gaming showdown

Seriously, midrange? The last one I remember worthy of that description was the 1660.
Since the RTX era there's been no midrange anymore. It's either underperforming entry-level 50/60s, or high-end 70/80s and overkill 90s, all of them overpriced in my view. Any time I upgrade I'm furious that I either have to pay half the price of the rig for a single component (the GPU) to get something at least moderately worth the money, or settle for a previous-gen card that got cheaper because of the new bells and whistles on the market. For my next upgrade I'm considering a Radeon instead, after ditching the f00ked up 13th-gen i7 for a Ryzen three weeks ago. Midrange used to be the best bang for the buck; nowadays it's a sad joke.
 
Be me: browsing articles on doomscroll... See an article about 70 class cards. Reading some nonsense...

My setup:
MSI MAG Z590 Tomahawk
Intel i5-10400 cooled by a Hyper 212 Black RGB Edition
Corsair 2x16GB DDR4-3200
RTX 3070 FE
Corsair RM750e PSU
SN770 500GB as a boot drive
SN850 1TB as a game drive

Using the 1080p settings described, DLSS Balanced with RTX medium, I'm getting 58 fps with an average of 55, a max of 79 and a 38 fps 1% low. The frametime is showing 19.5 ms (because of DLSS), and yes, I waited for the frametime to normalize.

These settings result in a very poor-looking image... Turn RTX OFF and suddenly you're not blowing out the picture with highlights everywhere. Turning off DLSS results in better response times (go figure).

Running native 1080p without RTX, Cyberpunk 2077 on a 3070 pushes 90 fps with an average of 85, a max of 146 and a 52 fps 1% low. I'm looking at 9 ms on the frametime, and the game generally looks better with these settings at this resolution.
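
For anyone cross-checking the frametime readouts against the fps figures, average frametime and average fps are just reciprocals (frametime in ms ≈ 1000 / fps). Here's a minimal conversion sketch, assuming steady frame pacing; overlay readouts are instantaneous values, so they won't exactly match the averages quoted above:

```python
# Rough fps <-> frametime conversion. Assumes steady frame pacing; overlay
# readouts are instantaneous, so they won't match averaged fps exactly.

def fps_to_frametime_ms(fps: float) -> float:
    """Average frametime (ms) implied by a frames-per-second rate."""
    return 1000.0 / fps

def frametime_ms_to_fps(ms: float) -> float:
    """Frames per second implied by an average frametime (ms)."""
    return 1000.0 / ms

for fps in (55, 85):
    print(f"{fps} fps ~ {fps_to_frametime_ms(fps):.1f} ms per frame")
for ms in (19.5, 9.0):
    print(f"{ms} ms per frame ~ {frametime_ms_to_fps(ms):.1f} fps")
```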

Unsure what you guys are doing to test games, but this is what I threw together in about 30 minutes of testing on a fresh Cyberpunk install. Again, this was my result from running through the city after letting the frametime normalize. No driving, no combat, no sitting in a corner to get the best results. Normal gameplay.

Even setting RTX to ultra with DLSS Quality results in a 69 fps average with an 81 max and a 54 fps 1% low, at a 13 ms frametime. My CPU is running at 4.0 GHz, my RAM at 3200 MHz, and my GPU at stock clocks.

My RTX 3070 is doing as well as the 4070/5070 numbers you are showing here for 1080p. Unsure why yours are so bad...
 
How are they getting such low fps in Cyberpunk 2077 with RT medium and DLSS Balanced at 1080p? I get a 138 fps average with RT Psycho and DLSS Balanced at 1440p on a 4070 Ti.
 
(quoting the RTX 3070 benchmark post above)

It's great you're getting better numbers; I've seen others report better numbers too. However, the takeaway from the article, assuming consistent benchmarking from test to test, is the gen-on-gen improvement. Presumably, if you had access to the next-gen card and reran YOUR test methodology, you would see similar improvements.

This isn't about how good the cards are or aren't compared to what you (anyone reading the article) can get. It's one person's testing methodology applied to successive gens of xx70 cards and how they compare to each other.

Take the info for what it is: proof that there have been gen-over-gen improvements, even if we dislike the price.
 
(quoting the reply above about consistent benchmarking and gen-on-gen improvements)

Yeah, but there's the rub, isn't it?

I don't think anyone disputes there's been gen-on-gen improvement in absolute performance. However, it used to be (yeah, getting old over here) that the next gen launched in roughly the same price bracket as the previous gen, so you got more performance per dollar or euro or whatever currency.

Now you get more performance, but also a price increase that's at least as large as the performance increase (usually larger), so per dollar there's no gain, and maybe even a regression.
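
To put a number on that, here's a minimal sketch with hypothetical figures (not from the article): a 25% performance uplift paired with a 35% price hike still comes out worse in fps per dollar:

```python
# Hypothetical illustration: a 25% fps uplift with a 35% price hike still
# loses on fps per dollar. All numbers are made up for the example.

def fps_per_dollar(fps: float, price: float) -> float:
    return fps / price

old_fps, old_price = 80.0, 500.0   # assumed previous-gen card
new_fps, new_price = 100.0, 675.0  # assumed next-gen card (+25% fps, +35% price)

print(f"old: {fps_per_dollar(old_fps, old_price):.3f} fps/$")
print(f"new: {fps_per_dollar(new_fps, new_price):.3f} fps/$")
# old: 0.160 fps/$, new: 0.148 fps/$ -> worse value despite being faster
```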
 
(quoting the RTX 3070 benchmark post above)

Basically all I read from this article was: buy the new shiny Nvidia card, which you may or may not need, for daft money, because it gives a small bump each generation that we carefully control to squeeze every cent of revenue out of you at every level while boosting our profits.
 
In 2018 (2070), the UK Average weekly wage was £510, launch price £479
In 2020 (3070), the UK Average weekly wage was £550, launch price £469
In 2022 (4070), the UK Average weekly wage was £610, launch price £589
In 2025 (5070), the UK Average weekly wage was £715, launch price £549

As a non-fan of Nvidia's pricing, I have to say I was quite surprised when collating these figures; they are much more reasonable than my bias wanted them to be!
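
As a rough sanity check, here's the same data expressed as weeks of the average wage (a quick sketch using only the numbers listed above):

```python
# Launch price expressed in weeks of UK average weekly wage,
# using the figures quoted above.

cards = [
    ("RTX 2070", 2018, 510, 479),
    ("RTX 3070", 2020, 550, 469),
    ("RTX 4070", 2022, 610, 589),
    ("RTX 5070", 2025, 715, 549),
]

for name, year, weekly_wage, price in cards:
    weeks = price / weekly_wage
    print(f"{name} ({year}): £{price} ≈ {weeks:.2f} weeks of average wage")
# Works out to roughly 0.94, 0.85, 0.97 and 0.77 weeks respectively.
```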

Of course the unknown is just how many people actually managed to get cards at that price at launch - it would be interesting to see that stat across the generations.
 
(quoting the reply above about consistent benchmarking and gen-on-gen improvements)

"In Cyberpunk 2077 with ray tracing on medium and DLSS Balanced, the RTX 2070 averages 12.9 FPS, the 3070 nearly doubles that at 23.8 FPS, the 4070 reaches 48.2 FPS, and the 5070 peaks at 61.3 FPS. That kind of progression carries into other ray-traced workloads: Oblivion Remastered climbs from 24.9 FPS on the 2070 to 63.3 FPS on the 5070, while Quake 2 RTX rises from 52.6 FPS to 166.5 FPS over the same generational span."

As quoted from their article: the 3070 averages 23.8 FPS with RT medium and DLSS Balanced presets at 1080p. That is not what I am getting. How can I trust what they are writing if I am not having their experience, when I'm a long-time enthusiast/repair tech with the knowledge to keep a 3070 performing like this over half a decade?
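
For what it's worth, taking the article's own quoted numbers at face value, the gen-on-gen scaling they imply works out like this (a quick sketch using only the figures quoted above):

```python
# Gen-on-gen scaling implied by the article's quoted Cyberpunk 2077 figures
# (RT medium + DLSS Balanced at 1080p).

article_fps = [("RTX 2070", 12.9), ("RTX 3070", 23.8), ("RTX 4070", 48.2), ("RTX 5070", 61.3)]

for (prev_name, prev_fps), (next_name, next_fps) in zip(article_fps, article_fps[1:]):
    print(f"{prev_name} -> {next_name}: {next_fps / prev_fps:.2f}x")
# 2070 -> 3070: 1.84x, 3070 -> 4070: 2.03x, 4070 -> 5070: 1.27x
```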

Kinda makes me wonder why they bother putting this out there. If you have a 3070 and you're trying to decide whether or not to upgrade, this article would tell you 'absolutely.' And then the real world comes crashing down: stick a brand-new 5070 into a box that was just running a 3070 and you're going to run into some issues right away.

If your 3070 isn't running well, your 5070 isn't going to run well either. The conversation should be more about how the generational shift isn't as pronounced as it was prior to the RTX generation of cards, about how there's still an abundance of value for people playing on older cards, and how that value holds up even against the newest versions of those 70-class cards.

If I'm reading a comment like mine and seeing a bog-standard build from like 5 years ago (which is what I have) instead of some contrived high-end test bench, that gives me a more realistic perspective on what to expect, rather than buying the latest and greatest hardware for a lot more money only to get trade-offs in performance.

It would be one thing if there were clear generational improvements in the hardware for each card. But the hardware gains are so slim that it's a wonder the AI software support can even paper over them. I wouldn't feel comfortable building a new machine for someone with a 5070 in it: too many reported issues with black screens, drivers, and missing ROPs, and there was even a 5070 that burned at the connector, suggesting the same poor quality control seen on the higher-end cards.

Taking the information at face value is probably the last thing I want to do, buddy.