Discussion: GPUs and Systems -- 2010 to 2025

jnjnilson6

Distinguished
How far have we actually gone?
Back in the day (2009-2011), gaming systems typically had 4 to 8 GB of RAM, and high-end GPUs carried 1 to 2 GB of VRAM.

What is striking in retrospect is that with 4 GB of system RAM and 1 GB of VRAM on a high-end card, you could play games at 1600x900 or 1920x1080 with stunning graphics and a decent framerate. We have gone up to 32 GB of RAM and 8 to 16 GB of VRAM today, yet the graphics of current games aren't much better; only a marginal improvement in gaming. Sure, 4K cannot compare to 1080p, but at the end of the day, when you look at the screen and compare, you can't help but wonder whether the difference you're witnessing really corresponds to an 8x difference in RAM and an 8x to 16x difference in VRAM (if we are talking about 4 GB RAM / 1 GB VRAM systems from 2010 versus current gaming systems).

I still feel software was much better written back then. Do tell me your opinions and let's delve into the mysterious corners of the past!

Thank you!
 
I'm not exactly sure what kind of discussion can be held here based on this incorrect statement.
Well, 'incorrect' is not a term with a single, fixed definition. In art there are shadows of pasty wonder that creep over the canvas and deepen by saturation into vividness, and what is acceptable to one may be unacceptable to another. Yet masterpieces still exist. So there is always the exigent boundary of general perception.

I am talking about specifications which, back in the day, were generally considered the norm for high-end gaming, and specifications which today are generally considered the norm for high-end gaming. That reflects a general point of view, not an eccentric perception.
 
Hey,

I agree with you for the most part: top-quality systems from back then playing the latest games of the day certainly did look great. But I would also say that the current state of render quality has definitely moved on with the times.

I think the periods of software driving hardware and, conversely, hardware driving software are quite cyclical.

I remember when the original Crysis came out, it brought even the strongest systems to their knees. IMO, that is an example of a time when software was driving hardware, as it took hardware a few generations to catch up before that game could be maxed out the way the devs may have intended.

Then there are games, as you said, that don't require a really beefy machine to play but still look amazing in terms of GFX. It does make me wonder why those really good, low-resource engines aren't used more. I think id Tech is a good example of that.

But now, when we look at the likes of Unreal Engine 5, it's back to software leading the hardware again. UE5 can tank performance on all but the best GPUs.

Overall, I'd say the quality of GFX has improved roughly in line with hardware requirements over the years, with some ups and downs either way.

Personally, UE5 just doesn't float my boat. I don't think it looks as amazing as other games I've played, even something older like Metro Exodus Enhanced Edition. If I had that side by side with a UE5 game, I'd choose the former as the better looking.

But this is all very subjective. Hard to quantify.
 
Reactions: jnjnilson6
Resolution, polygon count, texture quality, and people's expectations of framerate are well beyond what they were in 2010.

I will grant that the gameplay experience isn't that much different, which is why I don't play a lot of newer games. Basically, once we had full 3D with a decently high polygon count, everything was in place. What has been added since is mostly cosmetic, plus some interesting physics.

Also, I had 12 GB of system memory in 2010 (then I stuck with 16 GB for the next decade), but yeah, you definitely only needed 4-8 GB at the time.
 
Reactions: jnjnilson6
Well, 'incorrect' is not a term with a single, fixed definition. In art there are shadows of pasty wonder that creep over the canvas and deepen by saturation into vividness, and what is acceptable to one may be unacceptable to another. Yet masterpieces still exist. So there is always the exigent boundary of general perception.

I am talking about specifications which, back in the day, were generally considered the norm for high-end gaming, and specifications which today are generally considered the norm for high-end gaming. That reflects a general point of view, not an eccentric perception.
I'll say it straight, as I was trying to be vaguely nice about it.

RAM and VRAM are a cold, hard fact of life. Having more RAM/VRAM simply allows more and higher-resolution textures to be stored and used in the scene. This is not some subjective, feels-based "graphics aren't much better" statement; it's the reality, and it affects many games in various ways.
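A quick back-of-the-envelope sketch of what that means for memory budgets (the helper below is purely illustrative, assuming uncompressed RGBA8 textures and a full mip chain, which adds roughly a third on top of the base size):

```python
# Rough VRAM cost of uncompressed RGBA8 textures (illustrative helper, rounded numbers).
def texture_mib(width: int, height: int, bytes_per_pixel: int = 4, mipmaps: bool = True) -> float:
    base = width * height * bytes_per_pixel
    total = base * 4 / 3 if mipmaps else base  # a full mip chain adds ~1/3 on top
    return total / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size, size):.0f} MiB")
# Prints roughly: 1024x1024 ~5 MiB, 2048x2048 ~21 MiB, 4096x4096 ~85 MiB
```

Scale that across the hundreds of materials in a modern scene and it's easy to see where the extra gigabytes go.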

For example, nowadays you need far fewer "transition" scenes where you squeeze through some crack for two seconds or ride an elevator, a trick designed to compensate for low memory capacity. You can have much bigger and more detailed areas that are actually rendered, rather than a 2D-ish background facade like we had back with Mass Effect 3.

And yes, indeed: 4K vs 1080p is also not something you can just casually brush off; that too is a whole different ballpark.

And finally, newer engines are not only about smoke, mirrors, and RT. There is a ton of texture- and rendering-manipulation techniques and a very long list of other capabilities that straight up did not exist previously, which by definition means better end visuals.

The above are not just feels statements.
 
Reactions: jnjnilson6
Hey,

I agree with you for the most part: top-quality systems from back then playing the latest games of the day certainly did look great. But I would also say that the current state of render quality has definitely moved on with the times.

I think the periods of software driving hardware and, conversely, hardware driving software are quite cyclical.

I remember when the original Crysis came out, it brought even the strongest systems to their knees. IMO, that is an example of a time when software was driving hardware, as it took hardware a few generations to catch up before that game could be maxed out the way the devs may have intended.

Then there are games, as you said, that don't require a really beefy machine to play but still look amazing in terms of GFX. It does make me wonder why those really good, low-resource engines aren't used more. I think id Tech is a good example of that.

But now, when we look at the likes of Unreal Engine 5, it's back to software leading the hardware again. UE5 can tank performance on all but the best GPUs.

Overall, I'd say the quality of GFX has improved roughly in line with hardware requirements over the years, with some ups and downs either way.

Personally, UE5 just doesn't float my boat. I don't think it looks as amazing as other games I've played, even something older like Metro Exodus Enhanced Edition. If I had that side by side with a UE5 game, I'd choose the former as the better looking.

But this is all very subjective. Hard to quantify.
Yeah, you are absolutely right. Maybe my perceptions have been a little old-school. The times of Crysis 1-3, Far Cry 3, and Battlefield 3 and 4... A Core i5-2400 and an HD 6850 (neither of which I owned; I had an i7-3770K and 2x HD 7870s in CrossFire) could easily play said games at 1920x1080 with a little over 30 FPS. And they looked so nice. I think what we're witnessing here is that games have already become very high-quality in appearance, and the level of in-game detail is now almost lifelike, so the eye cannot perceive to perfection all those little details the cards are dying to compute. Soon graphics will be so good that the differences made by higher resolutions, more shadows, and greater vividness will be almost perceptually unrecognizable; yet the hardware requirements for those differences will still keep climbing sky-high.
 
Reactions: Roland Of Gilead
I'll say it straight, as I was trying to be vaguely nice about it.

RAM and VRAM are a cold, hard fact of life. Having more RAM/VRAM simply allows more and higher-resolution textures to be stored and used in the scene. This is not some subjective, feels-based "graphics aren't much better" statement; it's the reality, and it affects many games in various ways.

For example, nowadays you need far fewer "transition" scenes where you squeeze through some crack for two seconds or ride an elevator, a trick designed to compensate for low memory capacity. You can have much bigger and more detailed areas that are actually rendered, rather than a 2D-ish background facade like we had back with Mass Effect 3.

And yes, indeed: 4K vs 1080p is also not something you can just casually brush off; that too is a whole different ballpark.

And finally, newer engines are not only about smoke, mirrors, and RT. There is a ton of texture- and rendering-manipulation techniques and a very long list of other capabilities that straight up did not exist previously, which by definition means better end visuals.

The above are not just feels statements.
I have never said that a 2 GB Nvidia GT 540M would perform better than a 1 GB Radeon HD 6850. I was just stating, in essence, what the specifications of the highest-end cards were; if I started to outline shader counts and other specifications it would be quite a long post indeed. Most people who played games back then would, generally, associate these VRAM numbers with the high-end cards themselves, not with the same VRAM figures attached to some lower- or mid-range card.

Thank you!
 
I have never said that a 2 GB Nvidia GT 540M would perform better than a 1 GB Radeon HD 6850. I was just stating, in essence, what the specifications of the highest-end cards were; if I started to outline shader counts and other specifications it would be quite a long post indeed. Most people who played games back then would, generally, associate these VRAM numbers with the high-end cards themselves, not with the same VRAM figures attached to some lower- or mid-range card.

Thank you!
But you yourself are well aware how big a difference 4K at 100+ FPS is compared to what you quoted:

had an i7-3770K and 2x HD 7870s in CrossFire) could easily play said games at 1920x1080 with a little over 30 FPS. And they looked so nice.
Even by that comparison, you're talking about driving 4x more pixels roughly three times faster. That is a whole other ballpark.
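The raw pixel-throughput arithmetic backs that up (a rough sketch with my own assumed figures: exactly 100 FPS for the "100+" case and 30 FPS for the old baseline):

```python
# Rough pixel-throughput comparison: 1080p at ~30 FPS vs 4K at ~100 FPS (illustrative).
px_1080p = 1920 * 1080   # ~2.07 million pixels per frame
px_4k    = 3840 * 2160   # ~8.29 million pixels per frame (exactly 4x 1080p)

pixels_per_second_then = px_1080p * 30    # ~62 million pixels/s
pixels_per_second_now  = px_4k * 100      # ~829 million pixels/s

print(px_4k / px_1080p)                                 # 4.0
print(pixels_per_second_now / pixels_per_second_then)   # ~13.3x the pixel throughput
```

And that ratio ignores how much more work each pixel now takes (shading, post-processing, RT), so the real gap in rendering effort is wider still.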

That is even setting aside the subjective "looked so nice". Yes, everyone was sweating over Crysis, but did it look as good as full-blown CP77 in 4K? No, not even close.
 
Reactions: jnjnilson6
One of the things that actually bugs me about some games is the approach to realism.

Older games couldn't achieve it, so it was an approximation and your imagination filled in the gaps. Now there is no room for interpretation: you see what you see, and if it isn't right, it looks bad.

I think a lot of the games that have an art style to them work a lot better than realism attempts. Though some games do a great job in creating a realistic aesthetic within their game engine's capabilities.
 
Soon graphics will be so good that the differences made by higher resolutions, more shadows, and greater vividness will be almost perceptually unrecognizable
This is so true. And not only that, we are seeing some of it happen right now with Frame Gen on AMD, Nvidia, and Intel GPUs. FG artificially inserts frames, which increases FPS, but with frames that aren't actually rendered. Interesting times ahead with these AI (Tensor) cores. It may be that in a few years pure raster rendering gives way to AI rendering.
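A heavily simplified, purely illustrative sketch of the idea (real frame generation such as DLSS or FSR Frame Generation relies on motion vectors and optical flow, not a plain blend; this only shows where the generated frame sits in the output stream):

```python
import numpy as np

# Toy illustration of frame generation: synthesize an intermediate frame between two
# rendered frames. Real implementations use motion vectors / optical flow, not a blend.
def generate_intermediate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    return ((1.0 - t) * frame_a + t * frame_b).astype(frame_a.dtype)

# Two "rendered" 1080p frames stand in for real GPU output here.
rendered = [np.random.randint(0, 255, (1080, 1920, 3), dtype=np.uint8) for _ in range(2)]

# Two rendered frames become three displayed frames: rendered, generated, rendered.
displayed = [rendered[0], generate_intermediate(rendered[0], rendered[1]), rendered[1]]
```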
 
Reactions: jnjnilson6
Having more RAM/VRAM simply allows more and higher-resolution textures to be stored and used in the scene
I think that is too simplistic a way to look at it. Yes, more VRAM 'can' help with certain aspects of game rendering, but it's not the be-all and end-all. If you look at games that come out now, they are unoptimized pieces of crap. Texture compression is practically non-existent, and that is what uses up so much VRAM, along with RT effects (which for me are useless). Game devs/studios need to cop on. Texture compression algorithms are no new thing; it needs to be done better. These insane install sizes are proof that texture compression is barely used at all.
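For a sense of what block compression buys, here is some illustrative arithmetic (the helper is hypothetical; BC1 stores each 4x4 texel block in 8 bytes and BC7 in 16 bytes, versus 4 bytes per pixel for uncompressed RGBA8):

```python
# Illustrative sizes for a single texture, uncompressed vs. GPU block compression.
def texture_size_mib(width, height, bytes_per_block=None):
    if bytes_per_block is None:               # uncompressed RGBA8: 4 bytes per pixel
        return width * height * 4 / 1024**2
    blocks = (width // 4) * (height // 4)     # BC formats work on 4x4 texel blocks
    return blocks * bytes_per_block / 1024**2

w = h = 4096
print(f"RGBA8: {texture_size_mib(w, h):.0f} MiB")      # 64 MiB
print(f"BC7:   {texture_size_mib(w, h, 16):.0f} MiB")  # 16 MiB (4:1)
print(f"BC1:   {texture_size_mib(w, h, 8):.0f} MiB")   # 8 MiB (8:1)
```

Multiply that difference across every material in a 100+ GB install and the potential savings are obvious.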

Again, I would point out that this thread is an opinion piece, and you are entitled to your opinion. But you seem more set on challenging opinions than sharing your own.
 
Reactions: jnjnilson6
But you yourself are well aware how big a difference 4K at 100+ FPS is compared to what you quoted:


Even by that comparison, you're talking about driving 4x more pixels roughly three times faster. That is a whole other ballpark.

That is even setting aside the subjective "looked so nice". Yes, everyone was sweating over Crysis, but did it look as good as full-blown CP77 in 4K? No, not even close.
You're right, of course. But what I mean is the stark difference between games from, say, 12-14 years ago and games from 24-28 years ago. Graphics kept getting better (in terms of visual perception) at lightning speed until they became so good and lifelike that every little visual upgrade began to demand much faster hardware while delivering only a little more perceptible eye candy than the last. There was a point, maybe when Crysis 3 was released, when games became so lifelike that a kid shown both would hardly notice a big difference between that gaming era and the current one. There is not much more room for perceptibly better graphics. Sure, we may get higher resolutions and billions more polygons, and graphics will get better and refined down to the last pixel, but compared to how Crysis 3 looks to the eye, there will never again be a difference as big as the one between, say, Crysis 3 and Midtown Madness. All I wanted to say is that we've reached that point; otherwise, I am not disputing that current video cards put the older ones to shame (in terms of cores and computation). I am just saying that to the human eye the difference becomes less and less prominent and much harder to discern from real life.
 
Reactions: Roland Of Gilead
I think that is too simplistic a way to look at it. Yes, more VRAM 'can' help with certain aspects of game rendering, but it's not the be-all and end-all. If you look at games that come out now, they are unoptimized pieces of crap. Texture compression is practically non-existent, and that is what uses up so much VRAM, along with RT effects (which for me are useless). Game devs/studios need to cop on. Texture compression algorithms are no new thing; it needs to be done better. These insane install sizes are proof that texture compression is barely used at all.

Again, I would point out that this thread is an opinion piece, and you are entitled to your opinion. But you seem more set on challenging opinions than sharing your own.
Thank you!

Yeah, I completely agree. Software is currently written only for money, and the truly brilliant minds are overshadowed by the career-seekers and the 'moneymakers.' A few decades ago, it was an entirely different story.
 
Reactions: Roland Of Gilead
You're right, of course. But what I mean is the stark difference between games from, say, 12-14 years ago and games from 24-28 years ago. Graphics kept getting better (in terms of visual perception) at lightning speed until they became so good and lifelike that every little visual upgrade began to demand much faster hardware while delivering only a little more perceptible eye candy than the last. There was a point, maybe when Crysis 3 was released, when games became so lifelike that a kid shown both would hardly notice a big difference between that gaming era and the current one. There is not much more room for perceptibly better graphics. Sure, we may get higher resolutions and billions more polygons, and graphics will get better and refined down to the last pixel, but compared to how Crysis 3 looks to the eye, there will never again be a difference as big as the one between, say, Crysis 3 and Midtown Madness. All I wanted to say is that we've reached that point; otherwise, I am not disputing that current video cards put the older ones to shame (in terms of cores and computation). I am just saying that to the human eye the difference becomes less and less prominent and much harder to discern from real life.
So, if I give you this, wouldn't you agree it is a whole new level of realism compared to whatever we have so far?

View: https://youtu.be/otu_iFTivQw


I think that rather than exhausting our possibilities, we're going to reach a whole new level of realism in games, if that's the desire, within a matter of half a decade, and it's only going to get better.

Quite frankly, your benchmark for what will eventually be possible should be literally going out on the street and looking around, and we're far from that yet because we're still limited by engines and hardware.
 
Reactions: jnjnilson6
Resolution, polygon count, texture quality, and people's expectations of framerate are well beyond what they were in 2010.

I will grant that the gameplay experience isn't that much different, which is why I don't play a lot of newer games. Basically, once we had full 3D with a decently high polygon count, everything was in place. What has been added since is mostly cosmetic, plus some interesting physics.

Also, I had 12 GB of system memory in 2010 (then I stuck with 16 GB for the next decade), but yeah, you definitely only needed 4-8 GB at the time.
Remember, 4 GB in 2010 was an insane amount. The first time I reached 16 GB was in 2012 on my i7-3770K machine (Corsair RAM that could be overclocked from 1600 to 2133 MHz; additionally, I was able to get the 3770K itself to 5 GHz with Corsair H110 water cooling). And then the push for higher memory kind of stalled in the hardware sector for a long time, as you said, indeed...

You are right about realism and the beautiful effect of imagination being taken away; that's why I prefer reading a book to playing a game, and I haven't touched a game in a very long time, although those memories of Crysis will stay with me vividly until the end of time...
 
So, if I give you this, wouldn't you agree it is a whole new level of realism compared to whatever we have so far?

View: https://youtu.be/otu_iFTivQw


I think that rather than exhausting our possibilities, we're going to reach a whole new level of realism in games, if that's the desire, within a matter of half a decade, and it's only going to get better.

Quite frankly, your benchmark for what will eventually be possible should be literally going out on the street and looking around, and we're far from that yet because we're still limited by engines and hardware.
Yeah... It's undoubtedly a superior view. I wonder what will happen in the coming decades. Current Nvidia GPUs are beating supercomputers from the beginning of the millennium. 👍
 
How far have we actually gone?
Back in the day (2009-2011), gaming systems typically had 4 to 8 GB of RAM, and high-end GPUs carried 1 to 2 GB of VRAM.

What is striking in retrospect is that with 4 GB of system RAM and 1 GB of VRAM on a high-end card, you could play games at 1600x900 or 1920x1080 with stunning graphics and a decent framerate. We have gone up to 32 GB of RAM and 8 to 16 GB of VRAM today, yet the graphics of current games aren't much better; only a marginal improvement in gaming. Sure, 4K cannot compare to 1080p, but at the end of the day, when you look at the screen and compare, you can't help but wonder whether the difference you're witnessing really corresponds to an 8x difference in RAM and an 8x to 16x difference in VRAM (if we are talking about 4 GB RAM / 1 GB VRAM systems from 2010 versus current gaming systems).

I still feel software was much better written back then. Do tell me your opinions and let's delve into the mysterious corners of the past!

Thank you!
Yeah, but in those days you only ran at 30-40 FPS max at those resolutions, not to mention that games' GFX settings could rarely be maxed out. One thing we do get to see: back in the day, once you had everything set up right, nothing would ever fail, and it's much the same these days... There was a time, now obviously passed, when hardware and software had their hiccups, but I think we are indeed moving forward, and the changes are apparent for sure!
 
Reactions: jnjnilson6