Upscaling can't save the world's slowest 'modern' GPU — FSR doubles performance on GT 1030, but titles still barely playable
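
A quick sketch of where the claimed doubling comes from (assuming FSR 1.0's published per-axis scale factors of 1.5x for Quality and 2.0x for Performance): the card shades far fewer pixels at the internal render resolution, then upscales to the output.

```python
# Rough illustration of where FSR's speedup comes from, assuming
# FSR 1.0's per-axis scale factors: Quality 1.5x, Performance 2.0x.
target_w, target_h = 1920, 1080  # output resolution

for mode, scale in [("Quality", 1.5), ("Performance", 2.0)]:
    w, h = int(target_w / scale), int(target_h / scale)
    fewer = (target_w * target_h) / (w * h)
    print(f"{mode}: renders {w}x{h}, {fewer:.2f}x fewer pixels shaded")
```

Quality mode shades 2.25x fewer pixels and Performance mode 4x fewer, which is why a near-2x frame-rate gain is plausible even after the upscaling pass's own cost.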

It was a crap card 5 years ago...no surprises here. Although it probably holds up better than the 5060 8GB will in 5 years.

Or the 9060 XT with 8 GB as well... or is it still only cool to hate on Nvidia? My pitchfork is still in tip-top shape.
 
Why would you expect a low-end card from 8 years ago to play modern AAA games, when it was already running the AAA games of its own era at low settings most of the time? Even doubling its performance would maybe only uplift it to handling about medium settings for the period it came out in.
 
I think testing these things is fun, even if the results are as predictable as this :)
 
Calling Pascal modern is criminal lol
Yeah, exactly how do you even define "modern GPU"? Nvidia already deprecated it. IMO, if they've already put a product in legacy-support mode, it's no longer modern.

On a related note, I fired up an old AMD HD 5450 on Linux this weekend. It worked right out of the box and even supported full OpenGL. For sure, that thing is slower than the GT 1030.
 
It was a crap card 5 years ago...no surprises here. Although it probably holds up better than the 5060 8GB will in 5 years...
The GT 1030 is a 2 GB card. So, you're saying that 2 GB works better today than 8 GB will work in 5 years? I doubt that.

VRAM capacity isn't increasing nearly as fast as it used to, and with so many new entry-level cards only shipping with 8 GB, you can bet game developers will be forced to still support them.
 
Why would you expect a low-end card from 8 years ago to play modern AAA games...
It's a dumb YouTuber desperate for publicity, being exploited by a lazy and failing tech site that's desperate for clicks.

You're 100% right. This is pure clickbait. There's nothing to be learned from this, other than quite how badly it ran. I can see more value in that, in the sea of content that is YouTube, than I can in the newsfeed of a tech site. It's really not news. Nobody needs to know this.
 
I vote TomsHardware stops posting these clickbaity junk articles.
They will keep writing them as long as people keep clicking on them.

In my defense, I was actually just playing with an ancient video card this weekend. I correctly guessed they'd be talking about a GT 1030 and was curious exactly how it was deemed "modern", which they did not even say.
 
It was a crap card 5 years ago...no surprises here. Although it probably holds up better than the 5060 8GB will in 5 years...
I know we all hate 8 GB cards, but the 1030 is a 2 GB card, with 30-class levels of compute (384 CUDA cores @ 1.2 GHz). For the 5060 to be as proportionally weak in 5 years' time as the 1030 is compared to it now, the RTX 7060 would need to have >38,000 CUDA cores @ 4 GHz and 32 GB of VRAM.

If that happens, BIG “IF”, then it won’t matter what you buy now because the 60-class cards of 2030 will be 80% faster than the best that money can buy today.
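
To make the proportionality explicit, here's a back-of-envelope check in Python (assuming the commonly cited spec of 3,840 CUDA cores for the RTX 5060):

```python
# Back-of-envelope check of the proportionality claim above.
# Assumed specs: GT 1030 = 384 CUDA cores; RTX 5060 = 3,840 CUDA cores.
gt1030_cores = 384
rtx5060_cores = 3840

core_ratio = rtx5060_cores / gt1030_cores   # 10x the core count
print(core_ratio)                           # 10.0
print(int(rtx5060_cores * core_ratio))      # 38400 -- what a "7060" would need
```

A 10x core-count gap (before clock and architecture differences) is where the >38,000-core figure for a hypothetical "RTX 7060" comes from.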
 
I guess the GT 1030s are going for around $40 on their own.
The reason I was playing with an old video card was for a server. It has a BMC with integrated graphics, but I got fed up with that, so I reached for a low-end dGPU.

For something like this, you want bus power and preferably just a single-slot card, plus multiple display outputs and > 1080p resolution (the lack of which is what ultimately forced me to move beyond the HD 5450).

The only reason I'd rather go with AMD or Intel is that Nvidia cards aren't well-supported under Linux if you just rely on the open-source drivers (and, as I mentioned, their proprietary drivers are phasing out support for the Pascal generation). They do have a new open-source kernel driver, but it only supports Turing and newer GPUs.
 
The article said:
Pascal was the first Nvidia architecture to natively support FP16 compute. By contrast, the GTX 660 was based on the Kepler architecture, which only features FP32 and FP64 compute support. FSR has an FP32 fallback mode for these types of GPUs, but using FP32 compute reduces performance.
I forgot I had wanted to mention something about this. It was only the P100 that supported the packed-FP16 dot product. Technically part of the same generation, but not a feature you'd find in the GT 1030. However, what the client GPUs supported that the P100 lacked was the 4-way packed int8 dot product (so-called DP4A).

It basically comes down to the distinction between Nvidia's big datacenter GPUs (i.e. the x00 series) vs. everything else.
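
For anyone unfamiliar with DP4A, here's a minimal NumPy emulation of what the operation computes; this is only an illustrative sketch, not GPU code:

```python
import numpy as np

# What DP4A computes, emulated in NumPy: a dot product of four packed
# int8 values, accumulated into a 32-bit integer. On Pascal client GPUs
# (compute capability 6.1) this is a single hardware instruction;
# GP100/P100 lacks it but offers packed-FP16 math instead.
def dp4a(a, b, c):
    a8 = np.asarray(a, dtype=np.int8).astype(np.int32)
    b8 = np.asarray(b, dtype=np.int8).astype(np.int32)
    return int(a8 @ b8) + c

print(dp4a([1, -2, 3, 4], [5, 6, -7, 8], 100))  # 5 - 12 - 21 + 32 + 100 = 104
```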
 
Fair enough...but unlike Nvidia, AMD didn't produce a whole lot of the 8 GB turd...they knew it wasn't going to sell.
It's selling plenty, just to system integrators, which is where most people buy computers from.

Us enthusiasts have no business buying those 128-bit entry level cards, which is why we ridicule them. But don't kid yourself, those are by far the most common products in end user systems.
 
Yeah, exactly how do you even define "modern GPU"? Nvidia already deprecated it. IMO, if they've already put a product in legacy-support mode, it's no longer modern.

On a related note, I fired up an old AMD HD 5450 on Linux this weekend. It worked right out of the box and even supported full OpenGL. For sure, that thing is slower than the GT 1030.
Not so fast: the end of support has been announced for after the 580-series drivers, which aren't released yet, so the 1030 is still cutting-edge as far as driver support goes. It can be "modern" for at least a couple more months.

As can 1st- and 2nd-gen Maxwell, like the $20 Quadro K620, which is nearly as good as the 1030.

A good question the article presents: is there such a thing as driver support lasting too long?

I think everyone knows these 10 year old PCIe powered cards suck for gaming, but they aren't so bad for display.

Heck, even the 4770K my daughter used got dropped from official W10 support: https://learn.microsoft.com/en-us/w...ed/windows-10-22h2-supported-intel-processors and that one handles light use just fine. It even plays most games above 60 fps, at some settings.
 
Us enthusiasts have no business buying those 128-bit entry level cards, which is why we ridicule them. But don't kid yourself, those are by far the most common products in end user systems.
GDDR7 and the amount of cache in modern GPUs are squeezing more performance out of a 128-bit memory interface than we're used to.

I would just focus on benchmark results & pay attention to the 1% lows. If those are good enough, for the games you want to play, then don't worry about the bit-width of the data bus.
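
A rough bandwidth comparison shows why, assuming the 28 Gbps GDDR7 commonly quoted for the RTX 5060 versus the 17 Gbps GDDR6 of the RTX 4060:

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate (Gbps) / 8.
# Assumed speeds: 28 Gbps GDDR7 (as quoted for the RTX 5060) vs.
# 17 Gbps GDDR6 (as on the RTX 4060), both on a 128-bit bus.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(128, 28))  # 448.0 GB/s with GDDR7
print(bandwidth_gbs(128, 17))  # 272.0 GB/s with GDDR6
```

That's roughly a 65% raw-bandwidth jump on the same bus width, before the larger on-die cache absorbs any traffic at all.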
 
A good question the article presents: is there such a thing as driver support lasting too long?
A good question? Huh?

The answer is clearly no. Whether or not a graphics card is useful should be determined based on its capabilities vs. what you want to use it for. Drivers ending support for it is a crappy and artificial reason to have to stop using it. This is obvious. Doesn't justify the article, one bit.

I think everyone knows these 10 year old PCIe powered cards suck for gaming, but they aren't so bad for display.
Yeah, that's what I tried to use a ~15-year-old HD 5450 for. In the end, even though its HDMI interface was v1.3 and should've supported high enough bitrate, it wouldn't run above 1080p. I needed 1440p, so it finally went in the scrap heap. Thankfully, I also had a RX 550 that ran just beautifully!
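
A quick sanity check on that HDMI 1.3 expectation (assuming typical CVT reduced-blanking totals of roughly 2720 x 1481 for 2560x1440 @ 60 Hz):

```python
# Pixel clock needed for 2560x1440 @ 60 Hz with CVT reduced blanking,
# assuming typical total timings of ~2720 x 1481. HDMI 1.3's TMDS
# limit is 340 MHz, so the link bitrate itself shouldn't be the problem.
h_total, v_total, refresh_hz = 2720, 1481, 60
pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
print(round(pixel_clock_mhz, 1))  # ~241.7 MHz, well under 340 MHz
```

So the link bitrate should indeed have been sufficient; the 1080p ceiling presumably came from the card's own TMDS transmitter or driver limits rather than the HDMI version.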
 
Two things need to be addressed:

1. The GT 1030 GDDR5 was released in 2017; I bought one in 2017 for $70. Calling it modern is like calling a flintlock musket a weapon of modern warfare.

2. The GT 1030 is not a gaming card. It can run games, and I played games with mine for years at low-to-medium settings at 1080p, but it's mainly a display adapter for the office PC that needs a little more power than integrated graphics can provide.
 
VRAM capacity isn't increasing nearly as fast as it used to, and with so many new entry-level cards only shipping with 8 GB, you can bet game developers will be forced to still support them.
I wish you were wrong, but there's very little evidence that you are. I find it interesting how good the 5060 is from a performance-increase standpoint, if only it had more VRAM. The 60 series especially has been hit by pretty low uplifts over the last four generations, though, which may make it seem better than it otherwise would.

HUB did a video about the 2060 through 5060, and I think it'd be interesting to see the same with the 70 series. Transcription: https://www.techspot.com/review/3012-nvidia-geforce-rtx-60-class/

Even though support is ending this year, I'd love to see the GTX 1060 in there too.
 
I find it interesting how good the 5060 is from a performance-increase standpoint, if only it had more VRAM.
It does come in a 16 GB version.

I wonder if they might even stop selling the 8 GB soon, given how small the price gap is with the 5050, which also has 8 GB. Of course, the 8 GB 5060 is much faster, since it has a larger die and GDDR7 memory. So, if you're Nvidia and people are going for either the 5050 or the 16 GB 5060, why keep selling the 8 GB version?
 
It does come in a 16 GB version.
It does not; only the 5060 Ti comes in both. I wouldn't be surprised if we saw a 12 GB 5060 Super when Nvidia feels like using higher-capacity memory ICs, though.

The 5060 Ti ($379/$429) and 9060 XT ($300/$350) are the two cards this generation with different memory capacities.

The 5060 does undermine the 5050 in terms of value/performance, though.