jnjnilson6

Distinguished
Hello!

Cards like the Radeon HD 7970 GHz Edition are still somewhat adequate in the gaming / animation world today, 12 years after release. The RTX 5090, at its release in 2025, would be an even greater leap than that card was in its day. Given the way the world of computer graphics has been evolving, do you believe the RTX 5090 could last until 2035-2037 as a card with some punch still left?

Of course, this question has no ready answer and we can do nothing but speculate. But let's speculate a little. :)

Thank you!
 
Always hard to predict the future, but the past tells us that yes, a 5090 will be [at least somewhat] viable in 2035.
But that raises the question: if you're spending that much on a GPU now, how far down the performance charts are you going to let it sink before you buy a new one? Probably not 10 years' worth.

[attached image: GPU performance chart]

If you bought a 980 Ti for $650 nine years ago (equivalent to roughly $850 today), would you still be holding onto it now that it gives you the performance equivalent of a GTX 1660?
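As a quick sanity check on that inflation figure, here's a minimal sketch in Python; the ~1.31 cumulative CPI factor for 2015-2024 is my own rough assumption, not an official number:

```python
# Rough sketch: inflation-adjust the GTX 980 Ti's 2015 launch price.
# The cumulative CPI factor below (~1.31 for 2015 -> 2024) is an assumption.
LAUNCH_PRICE_2015 = 650          # USD, GTX 980 Ti launch price
CPI_FACTOR_2015_TO_2024 = 1.31   # assumed cumulative US inflation

adjusted = LAUNCH_PRICE_2015 * CPI_FACTOR_2015_TO_2024
print(f"${LAUNCH_PRICE_2015} in 2015 is roughly ${adjusted:.0f} today")
```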


Let's face it, the x090 isn't a good value. The x080 Ti is your best bet if you have that kind of $$ to spend.
 

TeamRed2024

Upstanding
Given the way the world of computer graphics has been evolving, do you believe the RTX 5090 could last until 2035-2037 as a card with some punch still left?

Absolutely. The 4090 is an absolute beast of a card... If the 5090 is the same leap in performance over the 4090 that the 4090 was over the 3090, I would definitely think 10 years of use is doable.

I'd argue the 4090 will still be viable in 2032... Path tracing aside, it doesn't really break a sweat at anything.
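Purely as a back-of-the-envelope sketch (the per-generation uplift and the two-year cadence below are assumptions, not measurements), compounding a flagship-to-flagship gain gives a feel for where a 2025 card might sit in 2035:

```python
# Back-of-the-envelope: where a 2025 flagship might sit after five
# two-year generations. The 60% per-generation uplift is an assumption.
UPLIFT_PER_GEN = 1.60  # assumed flagship-to-flagship speedup per generation
GENERATIONS = 5        # roughly 2025 -> 2035 at a two-year cadence

relative = 1 / UPLIFT_PER_GEN ** GENERATIONS
print(f"A 2025 flagship ends up at ~{relative:.0%} of the 2035 flagship")
# ~10% with a 60% uplift per generation; a gentler 40% uplift gives ~19%
```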
 

jnjnilson6

Distinguished
Always hard to predict the future, but the past tells us that yes, a 5090 will be [at least somewhat] viable in 2035.
But that raises the question: if you're spending that much on a GPU now, how far down the performance charts are you going to let it sink before you buy a new one? Probably not 10 years' worth.
That's very true ... There are periods of extreme gains in component performance within a matter of 2-3 years, and even longer stretches in which performance stays much the same and hardly increases at all.

For example, in 2001 a GeForce2 and 256 MB of RAM were considered top-notch; by 2005 most modern games would not run on that GPU and the standard had shifted to 1 GB of RAM.

From 2011, with the Core i7-2600/2700K processors, up until 2017 and the release of the Core i7-7700, the mainstream 4-core / 8-thread configuration did not budge a notch.

A big jump came with Alder Lake: the Core i7-11800H had 8 cores / 16 threads, whilst the Core i7-12700H provides 14 cores and 20 threads. A huge change, keeping in mind we're talking about a single generation ahead.

Yesterday I read that Intel's 16th gen would harbor over 50 cores. Do you think there's any legitimacy to this? Wouldn't it be an insane leap forward?
 

TeamRed2024

Upstanding
For example, in 2001 a GeForce2 and 256 MB of RAM were considered top-notch; by 2005 most modern games would not run on that GPU and the standard had shifted to 1 GB of RAM.

My first build was in 2001... Athlon XP 1800+ with a GeForce3 Ti 200. I wanna say it had 64MB. I upgraded the following year to the GeForce4 Ti 4600.

Times have obviously changed. Those were the days when hardware was obsolete sooner rather than later. Technology has improved... so has hardware.

It's kinda like cell phones. From 2011 to 2015 I upgraded my phone every year... then it went to every couple years until the most recent upgrade of my 3 year old phone to the new 16 Pro Max (my first Pro Max) with 1TB storage. 256GB just doesn't cut it anymore.

I will get 5 years out of this phone. There's no reason I'll have to upgrade before then.
 
I think your main question is about technological advancements/innovations. Things like newer DX versions can/will leave older GPUs in incompatibility-land. Things like ray tracing (while not a necessity right now) will continue to evolve exponentially, to the degree that ray tracing on a 2080 Ti is kind of a no-go by today's standards (TBF it was kind of a no-go by the standards at its launch also). Not sure where all the AI/neuro-processing stuff will go. Etc etc.

It all goes back to my statement above. If you're spending >$2,000 on a GPU, are you REALLY going to hold onto it until a $150 GPU beats it? Really?

Let's not talk about CPUs. That's a different set of considerations. At least GPUs are dealing with largely parallel workloads regardless of gaming or compute.
 

jnjnilson6

Distinguished
My first build was in 2001... Athlon XP 1800+ with a GeForce3 Ti 200. I wanna say it had 64MB. I upgraded the following year to the GeForce4 Ti 4600.

Times have obviously changed. Those were the days when hardware was obsolete sooner rather than later. Technology has improved... so has hardware.

It's kinda like cell phones. From 2011 to 2015 I upgraded my phone every year... then it went to every couple years until the most recent upgrade of my 3 year old phone to the new 16 Pro Max (my first Pro Max) with 1TB storage. 256GB just doesn't cut it anymore.

I will get 5 years out of this phone. There's no reason I'll have to upgrade before then.
That's true ... The same goes for gaming graphics, I suppose ... Crysis took that aspect to the skies back in 2007 ... And games like Crysis 3, Far Cry, and Battlefield 4 did a very good job. Judging by the same reasoning you have provided, I would say gaming graphics, too, have changed little from 2014 up until today; little in comparison to how they'd changed from 2004 to 2014.
 

jnjnilson6

Distinguished
I think your main question is about technological advancements/innovations. Things like newer DX versions can/will leave older GPUs in incompatibility-land. Things like ray tracing (while not a necessity right now) will continue to evolve exponentially, to the degree that ray tracing on a 2080 Ti is kind of a no-go by today's standards (TBF it was kind of a no-go by the standards at its launch also). Not sure where all the AI/neuro-processing stuff will go. Etc etc.

It all goes back to my statement above. If you're spending >$2,000 on a GPU, are you REALLY going to hold onto it until a $150 GPU beats it? Really?

Let's not talk about CPUs. That's a different set of considerations. At least GPUs are dealing with largely parallel workloads regardless of gaming or compute.
That's true ... Software and hardware go hand in hand, and we'll be seeing newer hardware features with newer software built around them, which will leave cards from both the recent and the more distant past some ways behind.
 
gaming graphics, too, have changed little from 2014 up until today; little in comparison to how they'd changed from 2004 to 2014.
Consider the leaps and bounds that were made in visual fidelity from 2004-2014. I'd say that by ~2014 we'd reached a pretty respectable level of "photo-realism" that's kind of "good enough" for most people, as evidenced by the lackluster adoption of ray tracing and whatnot. Nowadays, we're certainly running up against more game engine limitations. Facial animation improvements, the use of AI to generate more realistic/adaptive conversations, etc. aren't going to rely on GPU shader performance as much. But those are the types of things we need to make the next jump.
 

jnjnilson6

Distinguished
Consider the leaps and bounds that were made in visual fidelity from 2004-2014. I'd say that by ~2014 we'd reached a pretty respectable level of "photo-realism" that's kind of "good enough" for most people, as evidenced by the lackluster adoption of ray tracing and whatnot. Nowadays, we're certainly running up against more game engine limitations. Facial animation improvements, the use of AI to generate more realistic/adaptive conversations, etc. aren't going to rely on GPU shader performance as much.
I have not dabbled in the gaming world in ages... But from what I've seen, things have only changed marginally since the last time I played, which must have been around 2015. The lure of sun shards hitting lingering leaves, and the melancholy drag of the frame rate skipping and polishing particulars in lighter beauty, really left me breathless at the time. But I suppose the industry has been getting along very well, especially keeping in mind the newer high-end video cards.
 
Aside from varying creative aesthetics, you can look at a few of the more "realistic" games of a given time and draw the conclusion that, as time goes on and API tools get more feature-rich, more and more devs will be producing that level of visual realism, to the point where it becomes the norm. An exercise like that can fairly easily put you 5-7 years into the future as your new "normal". Sure, at that time, the peak game(s) will be pushing the limits higher, but...
 

Eximo

Titan
Ambassador
I would bet money the next major Direct3D version will be called DirectAI and will require NPU/GPU acceleration for a standard in-game AI framework.

5090 might be decent at that.

Nvidia ATX (hmm, maybe not ATX), and AMD AX coming to store shelves near you.
 

valthuer

Prominent
Given the way the world of computer graphics has been evolving, do you believe the RTX 5090 could last until 2035-2037 as a card with some punch still left?

It largely depends on your definition of "last".

What level of game resolution and/or graphics detail are we talking about?

The reason I'm asking is 'cause, two years after its release, the 4090 can already be brought to its knees by a total of maybe 6 games, at 4K Ultra path/ray tracing settings.

And we still have two months ahead of us before the end of this year.

You should also take into account that Unreal Engine 5 is gradually becoming mainstream.

My personal guess? Even with a 100% performance increase over its predecessor, we'll be VERY lucky if the 5090 lasts 4 years before it becomes obsolete.
 

SyCoREAPER

Honorable
Everything and anything is prediction, conjecture and hearsay.

Until it comes out, nobody knows anything. It could run as hot as the sun, or it could have an equally severe fatal flaw like the 12VHPWR connector. We know nothing. Until we at least have a render or an FCC filing of the PCB, no reasonable predictions can be made.