Question: Importance of VRAM and longevity

May 8, 2023
I am looking at upgrading my 2016 build (i5 6500/GTX 1060 6 GB/8 gigs of RAM) in the next couple of weeks - I got some gift money or else I would consider holding out even a little longer. My previous build was late 2008/early 2009, and that's roughly the upgrade timeline I typically follow. I know GPUs are a bit more expensive than in 2016, but I typically like to spend in that $200-$300 range IF it means I can get long-term use out of the hardware. I might bump to $350 given the current state of things. I do not plan on upgrading my 1080p/60 Hz monitor. I do like cranking up visual settings, and RT sounds awesome. That said, I have no problem playing something later in a system's life at like 720p/low settings in order to keep it playable. I don't game competitively and can't really notice FPS differences so long as it's probably north of 30 or so - I grew up playing Nintendo (NES up), which was historically 30 fps.

While this might sound like REALLY distant planning - my hope is to get hardware now that I can slightly tweak and turn into a retro gaming machine for perhaps the PS3 era with minimal power draw. I'm currently looking at:
Ryzen 5 7600 (superior iGPU vs Intel)
2x16 gigs of DDR5 RAM
ASROCK B650M-HDV/M.2 motherboard

While I would remove the GPU when the system gets retired to retro gaming duty, I am torn on GPU selection. I have no allegiance to any brand. It sounds like DLSS is superior to FSR and ray tracing does better on Nvidia. I also know some games call out the 1060 6 gig variant over the 3 gig variant, meaning VRAM has some level of importance. At this point in time, rumors are suggesting the 4060 would have 8 gigs of VRAM. There is a 3060 variant with 12 gigs. Thinking about longevity, which is the more likely stumbling block? I have no problem waiting until July or so if the rumors are true on the 4060, but not knowing whether there would be a 12/16 gig variant later hurts. That would likely be the best answer.

In terms of games I play - I am all over the place. The main thing is I rarely buy games for more than $20 - I'll wait a couple of years even for a price drop. My backlog is intense, though games leap to the front of the line regularly. I do enjoy AAA and indie games - Hokko Life, My Time at Portia, Overcooked, Two Point, etc. I do have the LOTR Mordor games, the Tomb Raider trilogy, Spider-Man Remastered (Miles Morales needs to hit a sale ASAP!), etc., but all bought on deep sales or in a bundle. I do love some FIFA and Madden as well.

Ultimately, I'm not opposed to hearing a Radeon GPU would have better long term usage, but it's really a question of finding that VRAM/horsepower balance and where we think 1080p gaming heads with ray tracing, upscaling tech, etc. In that regard, it seems like a question between RTX 3060 12 gig and waiting on an RTX 4060 8 gig variant.

Thank you!
 
Question for you - without the side-by-side comparisons, does tessellation off look bad?
No, my point is that the difference between tessellation on and off is MASSIVE and affects EVERYTHING. Whether you personally like it better or not is irrelevant. What I was showing was how little RT makes a difference when compared to the difference that tessellation made back in the day when it was introduced. Compared to tessellation, ray-tracing is an absolute joke. Or at least, it's a joke right now. Maybe it will one day make the massive difference that tessellation did, but we're nowhere near that yet.
The fact we can now get accurate light reflections compared to solid colors or solid textures is incredible.
It would be... except that it cripples the game itself and I can't imagine anyone looking at shadows or reflections when they're hunting an enemy NPC or are themselves being hunted by one.

If you do that, you're almost unique and you probably get toasted in-game quite often because you're not paying attention to what you should be.
 
Saying that Control is one of the best implementations to me feels disingenuous when there are other games that use more aspects of ray tracing.
Disingenuous to use the game that was nVidia's Ray-Tracing poster-child? You've seen a lot of my posts in the past. When have I ever been disingenuous?
I will argue that while ray tracing provides some noticeable graphical benefits to the end user, a much bigger benefit is to the level artist. A few things a level artist no longer has to worry about (and this is nowhere near an exhaustive list):
  • Placing fake lights to simulate global illumination
  • Light leakage, where some geometry is lit up from a light source that doesn't make sense.
    • The prominent example I remember from this was in Skyrim, where I saw the sun's reflection off water... while it was in the shadow of a building.
  • Tweaking real-time shadow rendering cutoffs
  • How many dynamic lights they can use.
    • This is a big one, because dynamic lighting is a major factor in limiting how fast rasterization can perform, and I don't think rasterization can ever escape from M objects × N lights complexity. While ray tracing itself may be computationally heavy, the only thing limiting its performance is simply geometry complexity (see the toy cost sketch just below this list).
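To make the objects × lights point concrete, here's a rough toy cost model in Python (my own sketch, nothing from an actual engine): classic forward rasterization re-shades geometry once per dynamic light, while a ray tracer's work is dominated by how many rays it casts and how expensive each ray is to trace.

```python
# Toy cost model, purely illustrative: forward rasterization shades every lit
# object once per dynamic light, so the work grows roughly with objects x lights.
# A ray tracer's work is dominated by rays cast times per-ray traversal cost
# (extra lights mostly just add shadow rays, not a full re-shade of the scene).

def forward_shading_cost(num_objects, num_lights, cost_per_pair=1.0):
    """Shading work when every lit object is shaded once per dynamic light."""
    return num_objects * num_lights * cost_per_pair

def ray_tracing_cost(num_rays, shadow_rays_per_hit=1, cost_per_ray=4.0):
    """Work dominated by total rays (primary + shadow) times trace cost."""
    return num_rays * (1 + shadow_rays_per_hit) * cost_per_ray

# Doubling the light count doubles the forward-shading work...
print(forward_shading_cost(5_000, 8), forward_shading_cost(5_000, 16))
# ...while for the ray tracer it mostly adds one more shadow ray per hit
# (primary rays here are just one per 1080p pixel).
print(ray_tracing_cost(1920 * 1080, shadow_rays_per_hit=1),
      ray_tracing_cost(1920 * 1080, shadow_rays_per_hit=2))
```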
I don't think that you understand what I've been saying. I never said that RT is a bad innovation. I said that right now, it's nothing more than a gimmick because we don't yet have the hardware to properly implement it. I don't care about ray-tracing because it's not impressive enough to spend excessive amounts of my hard-earned dollars on it... yet. Sure, for the artists, it's probably a blessing from heaven but for gamers, it's not mature enough to really have the kind of impact that would make it worth spending extra hundreds of dollars on video cards that are already way overpriced. Does that make more sense?
Except in practice no game ever jacked the sliders up that much. Looking at, say, the cobblestone road, this immediately breaks my immersion because it looks nowhere near realistic:

[Image: oxbl6NZ.jpg]
Unigine Heaven came out over a decade ago. Even if games didn't turn the tessellation settings up to 11, you still could. RTX cards came out in 2018 with the promise of "everything just works". So, here we are, five years later and RT still cripples cards. Then of course, there's the 8GB fiasco that's just making things worse.

Make no mistake, my annoyance with this RT obsession that some have is purely altruistic because I don't use it and I have a 16GB video card so I'm not suffering from the whole VRAM debacle either.

My stance is "Yeah, it's a great innovation and will one day rule the video game world... but not today."
A cursory glance around the interwebs for games that had a toggleable tessellation feature tells me:
Allegedly NVIDIA also pressured partnered game developers to do what Crytek did with Crysis 2: jack up the tessellation factor for no apparent benefit. Why? Because AMD's tessellation engine wasn't as good at the time; ergo, it made NVIDIA's cards look better by comparison.
Yes, that's true. What ironically happened is that the overuse of tessellation also hurt GeForce cards, but not as much. Now we have that Ultra+RT DLC for CP2077 that reminds me of exactly that.
Also, I don't believe tessellation was really used to a high degree, because adding geometry to a scene increases the rendering complexity no matter how you slice it. If you need, say, a brick wall to look like it has depth, a parallax occlusion map is plenty to get "good enough" as long as the player isn't pressing the camera up against it and looking at the wall at weird angles (a toy sketch of the idea follows below).
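For anyone unfamiliar with parallax occlusion mapping, here's a simplified CPU-side sketch of the core idea in Python (real implementations live in a fragment shader and add a final refinement step; the height field, step count, and scale below are made up for illustration): march along the view ray in tangent space until it dips below the height field, then sample the texture at that offset, which fakes depth without adding any geometry.

```python
import math

def parallax_occlusion_offset(height_at, uv, view_dir_tangent,
                              height_scale=0.05, num_steps=16):
    """Return the shifted UV where the view ray first dips below the height field.

    height_at(u, v) -> height in [0, 1]; view_dir_tangent = (x, y, z) with z > 0.
    """
    vx, vy, vz = view_dir_tangent
    # Total UV shift if the surface sat at maximum depth.
    max_shift_u = vx / vz * height_scale
    max_shift_v = vy / vz * height_scale
    u, v = uv
    layer_depth = 1.0 / num_steps
    current_depth = 0.0
    for _ in range(num_steps):
        sampled_depth = 1.0 - height_at(u, v)   # map stores height; invert to depth
        if current_depth >= sampled_depth:      # ray has gone below the surface
            break
        u -= max_shift_u * layer_depth          # march the UV along the view ray
        v -= max_shift_v * layer_depth
        current_depth += layer_depth
    return u, v

# Example: a bumpy "brick-ish" height field sampled at a grazing view angle.
bumps = lambda u, v: 0.5 + 0.5 * math.sin(u * 40.0) * math.sin(v * 40.0)
print(parallax_occlusion_offset(bumps, (0.3, 0.7), (0.6, 0.2, 0.45)))
```

The point is that none of this touches the vertex count: the wall stays two triangles, which is exactly why it holds up until the camera is pressed against it at a steep angle.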
I'm not trying to sell tessellation as the greatest thing ever, I'm only using it as an example of a technology that had a much bigger impact on what a game looks like than RT does. It's a reason why I don't find RT to be all that impressive at this point in time.

When you've seen it all before, you're a lot harder to impress.
Not necessarily. There are ways to fake the look of extra geometry (such as parallax mapping) and these days hardware has more than enough horsepower to render detailed models from the get-go.

I would also argue that mesh shading is making tessellation moot,
The whole point of my post was to show just how big of a difference a technology can make and to show that RT isn't there yet. I don't know who you're arguing with, but your whole post is that of someone who completely missed the point. Most of what you said is pretty on-point (except the part where you tried to call me disingenuous) so I don't really get what you think that you're teaching me. I've been gaming on PC since the 1980s and I've seen everything that has come and gone. This is all old-hat to me and I seriously doubt that you're going to drop anything that is a big revelation to me.
 

Firestone

While this is true, the difficulty is that my hope is to buy something that will allow me to continue using it well into the future. So while there is not a use case currently, what about in 5 years?
In five years you re-evaluate and just buy another video card if you need to.

Too much obsession with penny pinching.

Shop on eBay; you can get used gear for cheaper.
 
Disingenuous to use the game that was nVidia's Ray-Tracing poster-child? You've seen a lot of my posts in the past. When have I ever been disingenuous?
I was using the wrong word, but if Control, a game that uses ray tracing for a small number of effects, is the best game you can come up with as "one of the best implementations of ray tracing", that tells me you're either ignorant of games that use ray tracing in many more aspects of lighting or you flat-out dismiss them for some reason or another.

I don't think that you understand what I've been saying. I never said that RT is a bad innovation. I said that right now, it's nothing more than a gimmick because we don't yet have the hardware to properly implement it. I don't care about ray-tracing because it's not impressive enough to spend excessive amounts of my hard-earned dollars on it... yet. Sure, for the artists, it's probably a blessing from heaven but for gamers, it's not mature enough to really have the kind of impact that would make it worth spending extra hundreds of dollars on video cards that are already way overpriced. Does that make more sense?
If a feature is out of reach because of personal budgetary reasons, is that really a reason to condemn it to the point where you're preaching basically what a joke of a feature it is?

Graphics cards that support the latest whizzbang feature have always been one of the most expensive things a consumer could buy, and while I don't have any statistics to back it up, I'm willing to bet that a comparatively smaller percentage of the PC gaming population was willing to put down the money for one even back in the early 2000s. Heck, even a $400 card back in 2002 was seen as really pricey if Anand was willing to lament the pricing of the 9700 Pro:

As the Radeon 9700 Pro begins its journey into the hands of the fortunate few that are spending $399 on a video card, we're here to bring you a final review of the card based on shipping hardware.

Unigine Heaven came out over a decade ago. Even if games didn't turn the tessellation settings up to 11, you still could.
That could be said about any graphical feature with an adjustable value. Which is nearly all of them.

So, here we are, five years later and RT still cripples cards. Then of course, there's the 8GB fiasco that's just making things worse.
Because RT is that computationally expensive.

As a point of reference, let's look at a pure software renderer, something that doesn't have any specialized hardware because GPUs still have a lot of that. A Threadripper 3990X is able to render Crysis using DX10 WARP (i.e., pure software rendering) at around 15FPS with low settings. If we compare that to a software based ray tracer such as Cinebench R20/R23, a run takes about 10 seconds per frame. So taking this into consideration and trying to find some other data to help get some semblance of a comparison (because there's a lot of asterisks), I found some benchmark figures of video cards rendering Crysis at 800x600 low settings. Plucking out the 8800 GTS (I'm guessing the 640MB version), I found some benchmarks of it on Crysis at a higher quality setting and the performance drops around 60%, with the caveat that the lowest resolution I found that was tested was 1024x768.

So if we want to extrapolate some numerical value out of all this nonsense with a lot of asterisks, we're still in triple digits as far as how much more time it'll take to do a ray traced image than a rasterization one. And the image in Cinebench R20/23 isn't even that complex.
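Spelling out that back-of-envelope math using only the figures quoted here (so treat it as an order-of-magnitude illustration, not a measurement):

```python
# All numbers are the approximate ones quoted above.
warp_raster_fps = 15              # Crysis, low settings, via DX10 WARP on a 3990X
raster_frame_s = 1 / warp_raster_fps          # ~0.067 s per software-rasterized frame
cinebench_frame_s = 10            # ~10 s per frame for the software ray tracer

ratio = cinebench_frame_s / raster_frame_s    # ~150x
print(f"software ray tracing vs. software rasterization: ~{ratio:.0f}x slower")
# ...and the Cinebench scene is simpler than a game frame, so the real gap is
# likely wider still -- hence "still in triple digits".
```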

Doing a run of Cyberpunk 2077 benchmarks, I don't even lose half the performance turning on the RT Overdrive setting with DLSS FG disabled compared to the non-RT Ultra setting. Even if I turn off DLSS, performance drops down to about 1/6 to 1/7 that of non-RT Ultra. For reference, the numbers are 146 FPS for non-RT Ultra, 84 FPS for RT Overdrive w/o DLSS FG, and 22 FPS for RT Overdrive without DLSS.
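Spelled out, those numbers work out to the following (figures are from the run above, using the post's own labels):

```python
fps_ultra = 146       # non-RT Ultra
fps_rt_no_fg = 84     # RT Overdrive w/o DLSS frame generation
fps_rt_no_dlss = 22   # RT Overdrive w/o DLSS

print(f"RT Overdrive w/o FG keeps {fps_rt_no_fg / fps_ultra:.0%} of non-RT Ultra")          # ~58%
print(f"RT Overdrive w/o DLSS runs at ~1/{fps_ultra / fps_rt_no_dlss:.1f} of non-RT Ultra")  # ~1/6.6
```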

If shrinking the computation time by two orders of magnitude isn't impressive, then I don't know anymore. And matching the fidelity you get with something like Cyberpunk 2077's RT Overdrive through rasterization would likely be as resource intensive, if not worse. Take shadows, for example: if I want clean, crisp shadows to the horizon, the game would need to render the shadow map at a stupidly high resolution.
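As a quick illustration of why crisp shadow-mapped shadows "to the horizon" get absurd, here is some rough math; the view distance, texel size, and depth format below are assumptions picked just to show the scaling, not figures from any particular game:

```python
view_distance_m = 2_000    # want sharp shadows out to ~2 km (assumed)
texel_size_m = 0.01        # ~1 cm of world space per shadow texel for "crisp" (assumed)
bytes_per_texel = 4        # 32-bit depth (assumed)

texels_per_axis = view_distance_m / texel_size_m              # 200,000 per axis
memory_gib = texels_per_axis ** 2 * bytes_per_texel / 2**30   # ~149 GiB

print(f"{texels_per_axis:,.0f} x {texels_per_axis:,.0f} shadow map "
      f"~= {memory_gib:,.0f} GiB of depth data alone")
```

Cascaded shadow maps and distance fading exist precisely because nobody can afford that, which is roughly the trade-off ray-traced shadows sidestep.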

I'm not trying to sell tessellation as the greatest thing ever, I'm only using it as an example of a technology that had a much bigger impact on what a game looks like than RT does. It's a reason why I don't find RT to be all that impressive at this point in time.
Given the examples I've shown, which, you know, were actual games, I'd argue that tessellation didn't have much of an impact on graphical quality in games. In fact, in Unigine's tech demos after Heaven, tessellation isn't even used.

So within four years, Unigine dropped a graphics feature they had proudly touted before. Whereas it's getting close to five years with ray tracing and it's still being pushed.

Most of what you said is pretty on-point (except the part where you tried to call me disingenuous) so I don't really get what you think that you're teaching me. I've been gaming on PC since the 1980s and I've seen everything that has come and gone. This is all old-hat to me and I seriously doubt that you're going to drop anything that is a big revelation to me.
I'm putting up a strawman here, but seriously, when someone says they've "seen it all" while holding up a cherry-picked example of a graphical feature that wasn't really used in games as much as, or in the manner, that example would lead people to believe, I find it hard to respect said words.

Maybe if you had like a handful of presentations at SIGGRAPH and a few papers published under your belt I might be more inclined to respect your claim you've seen it all.
 
I was using the wrong word, but if Control, a game that uses ray tracing for a small number of effects, is the best game you can come up with as "one of the best implementations of ray tracing", that tells me you're either ignorant of games that use ray tracing in many more aspects of lighting or you flat-out dismiss them for some reason or another.
I was just using Control because it was well-known and because, from what I could see, RT was a lot more evident in it than it is in CP2077 and Arkham Knights. I wanted to use a game that had better implementation than any game that I've tried. I didn't find it very impressive in CP2077 or Arkham Knights so I considered them to be bad examples and so I didn't use them. Of all the reviews that I've seen, to me, RT had the biggest impact on Control so I used it. Any ignorance that I have stems from the fact that I haven't played every game in existence. You share that ignorance.
If a feature is out of reach because of personal budgetary reasons, is that really a reason to condemn it to the point where you're preaching basically what a joke of a feature it is?
Right now, it's a joke on hardware that most people can afford, so yes, it's a joke.
Graphics cards that support the latest whizzbang feature have always been one of the most expensive things a consumer could buy, and while I don't have any statistics to back it up, I'm willing to bet that a comparatively smaller percentage of the PC gaming population was willing to put down the money for one even back in the early 2000s. Heck, even a $400 card back in 2002 was seen as really pricey if Anand was willing to lament the pricing of the 9700 Pro:
Inflation hasn't quadrupled prices since 2002 - I couldn't buy a brand-new car for $7,500 in 2002, either.
That could be said about any graphical feature with an adjustable value. Which is nearly all of them.


Because RT is that computationally expensive.
Yes, it is. At this point, it's too computationally expensive. If nVidia had introduced it just now, I would've found it very interesting because at least now, the top card can actually use it in all scenarios. Five years ago, this wasn't the case. When someone has the budget for it, I always offer the RTX 4090 as an option because it's the only card that can use all of these features really well. I also offer the RX 7900 XTX as an option because some people, even if they have the budget for it, still balk at paying that much money.

Now, sure, cards like the RX 7900 XTX and RTX 4080 aren't bad at RT but for the prices they command, they should be more than just "not bad". Yes, the RTX 4080 is far better than the RX 7900 XTX at RT but it's also an extra $200 and, to date, most games still don't have RT implementations. This is why I believe that the RTX 4080 is a terrible value so I never recommend that.
As a point of reference, let's look at a pure software renderer, something that doesn't have any specialized hardware because GPUs still have a lot of that. A Threadripper 3990X is able to render Crysis using DX10 WARP (i.e., pure software rendering) at around 15FPS with low settings. If we compare that to a software based ray tracer such as Cinebench R20/R23, a run takes about 10 seconds per frame. So taking this into consideration and trying to find some other data to help get some semblance of a comparison (because there's a lot of asterisks), I found some benchmark figures of video cards rendering Crysis at 800x600 low settings. Plucking out the 8800 GTS (I'm guessing the 640MB version), I found some benchmarks of it on Crysis at a higher quality setting and the performance drops around 60%, with the caveat that the lowest resolution I found that was tested was 1024x768.

So if we want to extrapolate some numerical value out of all this nonsense with a lot of asterisks, we're still in triple digits as far as how much more time it'll take to do a ray traced image than a rasterization one. And the image in Cinebench R20/23 isn't even that complex.

Doing a run of Cyberpunk 2077 benchmarks, I don't even lose half the performance turning on the RT Overdrive setting with DLSS FG disabled compared to the non-RT Ultra setting. Even if I turn off DLSS, performance drops down to about 1/6 to 1/7 that of non-RT Ultra. For reference, the numbers are 146 FPS for non-RT Ultra, 84 FPS for RT Overdrive w/o DLSS FG, and 22 FPS for RT Overdrive without DLSS.
With what card and at what resolution?
If shrinking the computation time by two orders of magnitude isn't impressive, then I don't know anymore.
I never said that RT isn't impressive; I said the exact opposite. I said that hardware today that isn't the RTX 4090 suffers too much from its use. The king of graphics is still resolution, and I'd much rather play at 1440p with RT off than 1080p with RT on, and I think that most people would agree. The fact that you happen to love RT doesn't change the fact that in every poll that I've ever seen, most gamers don't really care about it (and it's not even close).

Understanding the technical aspects does give you an appreciation for it, but at the end of the day, if your FPS rate struggles with it, all of its technical marvel becomes moot.
And the fidelity you get with something like Cyberpunk 2077's RT overdrive would likely be as resource intensive if not worse. Take shadows for example, if I want clean, crisp shadows to the horizon, the game would need to render the shadow map at a stupidly high resolution.
Yes, that's why I don't understand what the point of it is at this time. I get that such a thing would be amazing, of that there is no doubt, but with hardware the way it currently is, it's impossible.

Believe me, I felt the same way about tessellation when it was introduced on the ATi Radeon 8500 in 2001. I saw a demo at a computer show and thought "It looks amazing, but it's not really usable yet" and it didn't become even somewhat widespread until about five years later. IMO, ATi was being more realistic about it. Sure, it was a cool curiosity in 2001 but it took a decade before something like Unigine Heaven would have it. The ironic thing was that nVidia ended up surpassing ATi in tessellation and then started using it as a blunt weapon against them, crippling a lot of their own cards in the process. It's one of the first things that nVidia did that left a real sour taste in my mouth.

The RTX 20-series was like the Radeon 8500 in the way that it showcased a fantastic new technology but instead of treating it like the future-tech that it is, nVidia's marketing got people believing that if they didn't have RT, the games were crap. With the consumerist mindset that has dominated Gen-Y and Gen-Z, this was relatively easy to implement. In 2001, Gen-X was still the generation in their 20s. Gen-X was a lot more appreciative of tech as a whole because Gen-X'ers could still remember a time when that tech wasn't there but it got to us early enough in life (myself especially) that we didn't end up hating it like the Boomers tended to. We saw it through children's eyes and to us, it was beautiful, not mundane.

I actually feel sorry for the younger generations because they can't appreciate anything that isn't on the absolute cutting-edge. I don't blame them because I doubt that I'd be any different if I grew up in the same environment. I'm actually thankful for the fact that I can still have a crap-tonne of enjoyment just playing an old NES game.

Maybe that's the difference.