Question: Importance of VRAM and longevity

May 8, 2023
I am looking at upgrading my 2016 build (i5 6500/GTX 1060 6 GB/8 gigs RAM) in the next couple of weeks - I got some gift money or else I would consider holding out even a little longer. My previous build was late 2008/early 2009, and that's about the timeline I typically follow. I know GPUs are a bit more expensive than in 2016, but I typically like to spend in that $200-$300 range IF it means I can get long-term use out of the hardware. I might bump to $350 given the current state of things. I do not plan on upgrading my 1080p/60 Hz monitor. I do like cranking up visual settings, and RT sounds awesome. That said, I have no problem playing something later in a system's life at like 720p/low settings in order to keep it playable. I don't game competitively and can't really notice FPS differences so long as it is north of 30 or so - I grew up playing Nintendo (NES up), which was historically 30 fps.

While this might sound like REALLY distant planning - my hope is to get hardware now that I can slightly tweak and turn into a retro gaming machine for perhaps the PS3 era with minimal power draw. I'm currently looking at:
Ryzen 5 7600 (superior iGPU vs Intel)
2x16 gigs of DDR5 RAM
ASRock B650M-HDV/M.2 motherboard

While I would remove the GPU when the system is retired into retro gaming duty, I am torn on GPU selection. I have no allegiance to any brands. It sounds like DLSS is superior to FSR and ray tracing does better with Nvidia. I also know some games call out the 1060 6 gig variant over the 3 gig variant, meaning VRAM has some level of importance. At this point in time, rumors are suggesting the 4060 will have 8 gigs of VRAM. There is a 3060 variant with 12 gigs. Thinking about longevity, which is the more likely stumbling block? I have no problem waiting until July or so if rumors are true on the 4060, but not knowing whether there would be a 12/16 gig variant later hurts. That would likely be the best answer.

In terms of games I play - I am all over the place. The main thing is I rarely buy games for more than $20 - I'll wait a couple of years even for a price drop. My backlog is intense, though games leap to the front of the line regularly. I enjoy AAA and indie games - Hokko Life, My Time at Portia, Overcooked, Two Point, etc. I do have the LOTR Mordor games, the Tomb Raider trilogy, Spider-Man Remastered (Miles Morales needs to hit a sale ASAP!), etc., but all bought on deep sales or in a bundle. I do love some FIFA and Madden as well.

Ultimately, I'm not opposed to hearing a Radeon GPU would have better long-term usage, but it's really a question of finding that VRAM/horsepower balance and where we think 1080p gaming heads with ray tracing, upscaling tech, etc. In that regard, it seems like a question between the RTX 3060 12 gig and waiting on an RTX 4060 8 gig variant.

Thank you!
 

IDProG

Distinguished
You look like the type of guy who will be okay with gaming at Low settings and/or using older hardware if it's cheaper. IMO, if you can't wait to save more money for a GPU, just buy either an RTX 3060 12GB or an RX 6700 XT. In 2 years, games will need 16GB of VRAM, but that's only if you want to play at higher settings. I think you'll be fine with 12GB VRAM and 16GB of DRAM.

If you can wait, however, here's what I would do:

Buy a B550M DS3H motherboard instead (around $80 cheaper)
Buy a 2×8 GB DDR4-2400 kit instead (around $60 cheaper), or 2×16 GB if you can afford it
Buy a 5800X3D instead of the 7600 using the ~$140 saved

Wait and save money for 3-6 months. IMO, your GPU will be fine. There is a moderate possibility (25-40%) that there will be a 7700 XT 16GB or 7600 XT 16GB. If a 7600 XT 16GB exists, there is a chance that a 4060 16GB will also exist.

Combined with the declining price of graphics cards, that will be the best time to buy a GPU.
 
It is not a sound idea to base purchases on specs alone.
For example, your GTX 1060 6 GB is stronger than the 3 GB version not only because of the 6 GB vs. 3 GB VRAM difference, but because you have 1280 CUDA cores vs. 1152.
VRAM is a performance spec and works somewhat like system RAM.
Your game needs to be able to keep most of what it needs in VRAM or suffer a bit fetching from RAM or a drive.
Also, one cannot compare Nvidia VRAM with Radeon VRAM; they are managed differently.
Best to compare two cards via benchmarks with the characteristics of YOUR games.
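If you want to see how close one of YOUR games gets to the card's VRAM limit, the driver can tell you directly. Here's a minimal sketch, assuming an Nvidia card and a single GPU (nvidia-smi ships with the Nvidia driver; Radeon owners would need their vendor's tooling instead):

```python
import subprocess

# Poll VRAM usage via nvidia-smi while your game is running.
# The query flags are standard nvidia-smi options; assumes one GPU.
def vram_usage_mib():
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(", "))
    return used, total

used, total = vram_usage_mib()
print(f"VRAM: {used} MiB used of {total} MiB")
```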

On the CPU side, you will find 13th-gen Intel to be price-competitive across the line.
You might find that even the i3-13100 processor would be a big and inexpensive upgrade.
You could even start with your current DDR4 ram.
 
To clarify, I would very much like to look at ultra 1080p with ray tracing up right now. I do not have the funds to upgrade my PC every cycle or two, but I do want to experience games. Honestly - I graduated from grad school and some family gave me money as a present, otherwise I'm not sure this would be happening now. Similarly, I would totally love to have my PC set up with a 4K or 1440p display, but it is obviously less resource-intensive to game at a lower resolution. As a working adult with two little kids, I usually play on easy or something because I'd rather get to experience a full game than have to try mastering certain aspects in order to proceed, which was something I enjoyed when I was younger.

The problem with the 5800X3D is that it lacks integrated graphics for the future retro emulation station. At that point, my hope is to have a lower-power setup. So save for an RPi 5 or 6 being available near MSRP that can emulate even up to PS2, I'm not sure the 5800X3D would make sense. I will consider looking at Ryzen 7s or i7s from previous generations. I have been messing around on PCPartPicker with 12th- and 13th-gen i5s and DDR4/DDR5 as well as Zen 3 and Zen 4, but I hadn't looked at anything without an iGPU among any of them.
 
The question of how important VRAM is still boils down to the games in question. Yes, there are a few games that like to gobble up oodles of VRAM, but I'm not entirely convinced that all that usage is strictly necessary.

However, VRAM is still important in three key areas:
  • Render resolution
  • Texture resolution
  • Lighting resolution (which can be replaced with ray tracing depending on how much of it is done)
So if you do run into VRAM consumption issues, lowering any setting related to these should help stabilize performance. And even then, I'm also of the opinion that maximum image quality isn't worthwhile most of the time. You have to do a lot of pixel hunting at times between maximum quality and the next step down on still images to find differences.
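To put rough numbers on those three areas, here's a back-of-the-envelope sketch; the buffer counts and compression figures below are illustrative guesses, since real engines vary wildly:

```python
# Toy VRAM math; real engines differ a lot in buffer counts and formats.

def render_targets_mib(width, height, bytes_per_pixel=4, buffers=3):
    # e.g. color + depth + one extra full-screen target at RGBA8
    return width * height * bytes_per_pixel * buffers / 2**20

def texture_mib(side, bytes_per_texel=1):
    # block-compressed texture ~1 byte/texel; mip chain adds ~1/3 on top
    return side * side * bytes_per_texel * (4 / 3) / 2**20

print(f"1080p render targets:  {render_targets_mib(1920, 1080):.0f} MiB")
print(f"4K render targets:     {render_targets_mib(3840, 2160):.0f} MiB")
print(f"one 4096x4096 texture: {texture_mib(4096):.1f} MiB")
print(f"100 such textures:     {100 * texture_mib(4096):.0f} MiB")
```

The takeaway matches the list above: render targets scale with render resolution, but textures are what really fill a frame buffer, which is why the texture slider is usually the first thing to drop when VRAM runs short.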
 
I hadn't realized that about my 1060 and the 3 GB variant. Good to know! Within 720p/1080p gaming, then, is it far more likely the processing abilities will give out before VRAM caps out? I guess that's where I'm really going with this question. Reading the Best GPUs article on Tom's, I know certain generational upgrades have lower specs in one area but better performance due to other improvements. As you mentioned, the amount of VRAM only matters when you are hitting your max. Do you have any suggestion on where I should place that kind of concern? I realize every game is different, but looking at minimum specs for newer games, it is hard to tell whether it is processing speed or VRAM needs pushing the specs. For example, the 1060 6 gig has more CUDA cores than a 1650, but typically a xx6x gets bested by the xx5x of the next generation. Games don't typically list multiple GPUs for their minimums, so I'm left confused.
 

IDProG

Distinguished
To clarify, I would very much like to look at ultra 1080p with ray tracing up right now.
Everyone likes to look at Ultra settings. But not everyone can pay for it. This is what I meant.

From your comment, it seems like turning your system into a retro emulation system is a high priority.

First of all, the plan I gave you doesn't change much. Just swap the 5800X3D for a 5700G and you're good to go. The reason you'll want an 8-core CPU is that games are developed around consoles, and consoles have an 8-core CPU. Also, emulation is very CPU-heavy, because you need around 10 times more processing power than the original system.
Oh yeah, one more thing, the 5700G loves fast RAM (if you play games using the integrated graphics, not a dedicated graphics card).

Second, I don't know if a desktop PC is a good idea for a retro emulation system. IMO, Steam Deck is a better device for that.
 
So you're going to want more than 8GB of VRAM and heaven knows that I don't blame you. You're making a smart move by trying to get a card that will last longer.

The RTX 3060 12GB is a pretty bad value in the current market. Sure, it has 12GB of VRAM but you're not going to find one for less than $340:
Zotac GeForce RTX 3060 Twin Edge 12GB - $340

I know that you said that you'd increase your budget to $350 if needed but if you're going to spend that, you'd be far better off with an RX 6700 XT for the same price:
ASRock Radeon RX 6700 XT Challenger D OC 12GB - $340

The RX 6700 XT is 27% faster than the RTX 3060 and has the same 12GB of VRAM so the RTX 3060 is pretty pointless right now.

However, if you want to stay under $300 (and, considering your use-case, I don't blame you) but still want more than 8GB of VRAM, I don't think that you can go wrong with an RX 6700:
Sapphire Radeon RX 6700 Pulse 10GB - $280

Despite being $60 less expensive than the RTX 3060 12GB, the RX 6700 is 19% faster. Right now, the RX 6700 is one of the best video card deals on the market and it's what I would ultimately recommend for you because, for 1080p gaming, even at max settings, this card will last for a dog's age because of its 10GB frame buffer. You can't go wrong getting this card for only $280.
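If you want to sanity-check comparisons like "19% faster for $60 less," a quick perf-per-dollar calculation makes them concrete. This sketch just reuses the relative-performance figures and prices quoted above (rough, May 2023 numbers):

```python
# Perf-per-dollar from the numbers quoted above (RTX 3060 = 1.00 baseline).
cards = {
    "RTX 3060 12GB":   (1.00, 340),   # (relative performance, price in USD)
    "RX 6700 10GB":    (1.19, 280),
    "RX 6700 XT 12GB": (1.27, 340),
}

base_perf, base_price = cards["RTX 3060 12GB"]
for name, (perf, price) in cards.items():
    value = (perf / price) / (base_perf / base_price)
    print(f"{name}: {value:.2f}x the perf-per-dollar of the RTX 3060")
```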
 

Eximo

Titan
Ambassador
To clarify, I would very much like to look at ultra 1080p with ray tracing up right now.
I'm sorry to have to tell you this but you're not going to be able to look at ultra 1080p with RT with the budget that you have.

To be honest, I tried RT ultra at 1080p with my RX 6800 XT and I really had to look hard to see any difference. When I was actually playing the game (Gotham Knights), I couldn't tell if it was on or off. It's a way overhyped feature that really isn't ready for prime time yet.

If you want to play 1080p ultra with RT, then you will need at least an RX 6800. I hope that you're able to cough up some major moolah though because they cost far more than the $350 hard limit that you set. There's also the fact that your PC case will have to be large enough to fit a triple-fan card:
PowerColor Radeon RX 6800 Fighter 16GB - $480

Also, if you're going to blow your budget like that, you may as well go all the way and pay the extra $30 to get the RX 6800 XT which is 14% faster for 6% more money:
ASRock Radeon RX 6800 XT Phantom Gaming D - $510

Ultimately though, for your purposes, the RX 6700 is the card that I would get if I were you. You're not going to find many games under $20 that even support RT anyway.
 
Totally! Since I have no experience with ray tracing, this is where I'm most confused. I would rather have a spatial audio experience than standard surround sound because it is more authentic. From what I can tell with ray tracing, it seems like it gives a more authentic experience. I am guessing it is on a game-by-game basis as well, but is there typically a difference between a lower RT setting and the top one? YouTube comparisons typically don't get into all of the settings, just an "On, Off" comparison - maybe I need to do more YouTube checks? When I started playing WoW back in the days of Vanilla, I had everything super low settings-wise, the screen would tear like crazy in capitals, etc. When I got a gaming PC, I learned of so many details I had been missing; it was seriously like I was looking at a different game.
Much appreciated! So with the AMD GPUs, how do they fare with FSR and ray tracing? While I NEED the newer technology of an AMD 6000 or 7000 series or Nvidia 3000 or 4000 series from a horsepower perspective, ray tracing seems like one of those "awesome" breakthroughs that is just in its early days still.

I love Tom's Hardware so much, but the Best GPU guide lists cons for the 6700 XT of "FSR2 can't defeat DLSS" and "Weaker RT performance". When I look at the hierarchy, I do see the improvement in non-RT settings, but in ray tracing it looks like the 3060 beats even the 6700 XT by a little bit. It is tricky since I realize they do not factor DLSS/FSR into these things, so I feel like the 3060 with DLSS would be much stronger than the 6700 XT with FSR when looking at 1080p gaming.
 
An argument could be made for Intel? Also $340, and it gets you 16GB of VRAM, the latest AV1 video encoding, and HDMI 2.1 and DisplayPort 2.0 for potential monitor upgrades (I know you didn't want to, but you never know, might want to hook it up to a 4K 120Hz TV someday)


Not perfect with all games, though; you would certainly want to look at some reviews.
From a hardware standpoint, an argument could definitely be made for Intel. The 16GB A770 is an incredible value and I would personally be interested in one, but that's because I can troubleshoot problems. Most people don't want to have to do that, and while I know that Intel's software suite is getting better all the time, I am still reluctant to recommend their cards to people because a lot of people would be just lost if there were a software issue.

If the OP is willing and able to deal with some problems, then Intel would be a great option. IIRC, they've mostly fixed their issues with DX9 titles.
 
I mean, I see 3050 YouTube videos where they are running Spider-Man Remastered at 1080p and it looks awesome! I will say I imagine ray tracing is a lot like Dolby Atmos/DTS:X, which I love. A lot of movies "use" them, but their usage is poorly incorporated. When a movie implements it well though, it is sooooooooooo awesome. I just checked out the Nvidia list of games/apps with RT and DLSS support and see a number of things I would have thought supported one or the other do not (yet?). The big unknown is what will support look like moving forward? I suppose, like any new standard, it is a game of wait and see. I'm still waiting on NextGen TV/ATSC 3.0 devices to really hit the market en masse to try it out, and yet here we are...years after it was released, and many new TVs still lack it. I'm still waiting on movies constantly coming out with Atmos/DTS:X support and taking full advantage of it. The stuff that does though...incredible. It seems like the two Spider-Man games really show it off with all of the reflections from buildings.

As far as pricing of games goes, I'm totally fine waiting on games. I have plenty to play. The main reason for upgrading now is I was gifted some money recently and some games that I REALLY want to play are hitting different hardware caps (I think a couple even system RAM, not GPU VRAM). It's just that this is when I have enough gift money to be guilt-free in upgrading my computer, as those gifting it to me specifically said to use it for something fun for me. Believe me, I could find a lot of ways to spend that money on other things, but they wouldn't bring me any joy/relaxation.
 
I had been reading a little about Intel. I'm not sure I am sold on them yet. I would LOVE to see a third viable player in the market. I'm just not convinced Intel will get things fixed soon enough and be profitable enough to stay in the market. I think we need a third viable player for GPUs (in most markets, really). I'm just not ready to buy a card from a company that could realistically be out of the market before I upgrade again after this time. Similar to my gaming choice to play on easy, I try to avoid troubleshooting, as it can eat up my would-be gaming time. I'm not opposed to troubleshooting, just not if it gets in the way or has a level of normalcy to it.
 

Eximo

Titan
Ambassador
Intel is pushing ahead with no signs of cancellation. Second generation Arc cards are going to double the core count and likely show up in about one year.

More recent reviews place the A770 8GB (not the 16GB) between the RTX 3060 and 3060 Ti, and similar to an RX 6700 XT. Intel does better with ray tracing than AMD, and they have their own DLSS/FSR equivalent called XeSS.

It mostly has issues with older games. DX9 now has a DXVK layer that converts it to Vulkan to run on the card. DX12 and Vulkan capable titles have always run pretty well.

I have an A380, and I threw a few games at it. Unreal Engine 4 worked quite well even with the early drivers.

Here is a more recent video that covers the specific Acer 16GB card.

View: https://www.youtube.com/watch?v=4TagHVhF3to
 
Ray tracing settings typically affect the following:
  • The maximum distance at which objects are considered
  • Resolution at which rays are cast (e.g., casting rays at 100% vs. 50% of the render resolution; see the toy count below)
  • What lighting aspects are considered, such as:
    • Reflections
    • Ambient occlusion
    • Direct light shadows
    • Indirect lighting (AKA global illumination)
Though note that not every game with ray tracing does all of the above. Earlier titles typically used RT only for reflections, shadows, or ambient occlusion.
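To see why the ray-resolution knob matters so much, here's a toy ray count; the one-ray-per-pixel-per-effect model and the assumption that the percentage scales each axis are both simplifications:

```python
# Toy model: rays scale with the square of the ray-resolution fraction,
# which is why half-resolution RT is such a common optimization.
def rays_per_frame(width, height, ray_res=1.0, effects=2):
    # effects: e.g. reflections + shadows, ~1 ray per pixel each
    return width * height * ray_res**2 * effects

print(f"{rays_per_frame(1920, 1080, 1.0) / 1e6:.1f}M rays at 100%")
print(f"{rays_per_frame(1920, 1080, 0.5) / 1e6:.1f}M rays at 50% (4x fewer)")
```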

A special case is something like Cyberpunk 2077, where the highest ray tracing setting also includes direct lighting. There are videos showcasing the various things its ray tracing handles.
 
Totally! Since I have no experience with ray tracing, this is where I'm most confused. I would rather have a spatial audio experience than standard surround sound because it is more authentic. From what I can tell with ray tracing, it seems like it gives a more authentic experience.
Yeah, but here's the thing. The only time you can tell RT from non-RT is if you're looking at two images side by side. If you're not using it, you won't miss it but if you are using it, you will miss your extra FPS, especially at your budget level.
Much appreciated! So with the AMD GPUs, how do they fare with FSR and ray tracing?
Radeons fare as well with FSR as any other card would, and they fare worse with RT than GeForce cards. The thing is, at your budget level, good RT performance doesn't exist. Remember that there's no difference between 15 FPS and 25 FPS because they're both unplayable, so you may as well ignore it in the first place.

My RX 6800 XT is quite capable of RT at 1080p and I tried it out in Gotham Knights and then CP2077. It had no positive impact on my gaming experience. It did have a negative impact because 1440p ultra with RT off looks a lot better than 1080p ultra with RT on.
While I NEED the newer technology of an AMD 6000 or 7000 series or Nvidia 3000 or 4000 series from a horsepower perspective, ray tracing seems like one of those "awesome" breakthroughs that is just in its early days still.
Not at 1080p60Hz you won't. For a 1080p60Hz monitor, your budget cap of $300 is very sensible and it's why I recommended the 10GB RX 6700.
I love Tom's Hardware so much, but the Best GPU guide has the 6700 XT with cons of FSR2 can't defeat DLSS and "Weaker RT performance". When I look at the hierarchy, I do see the improvement in non-RT settings, but it looks like the 3060 beats even the 6700XT by a little bit. It is tricky since I realize they do not factor in DLSS/FSR with these things so I feel like the 3060 with DLSS would be much stronger than the 6700 XT with FSR when looking at 1080 gaming.
Don't ever factor in DLSS or FSR when buying a new card. Those are crutches that should only be used by cards that are already old. If you need to use upscaling on day one of owning a card, you should have bought a faster card. Upscalers don't look as good as native resolution, so what's the point of reducing image quality just to turn on ray tracing? I mean, you can if you want to, but it just makes no sense to me.
 
I mean, I see 3050 YouTube videos where they are running Spider-Man Remastered at 1080p and it looks awesome!
They're using upscaling and Spider-Man Remastered looks awesome no matter how you play it. Try watching it at 1440p without RT or upscaling and you'll see that it looks even more glorious.
I will say I imagine ray tracing is a lot like Dolby Atmos/DTS:X, which I love. A lot of movies "use" them, but their usage is poorly incorporated. When a movie implements it well though, it is sooooooooooo awesome.
I agree. When RT becomes properly implemented, it will be fantastic. I just don't see it happening soon enough to be worried about supporting it right now. The technology just isn't there yet. Like I said, I tried it while playing Gotham Knights, Cyberpunk 2077 and I also tried the Witcher III RT DLC.

I found that I literally had to stop and look closely at everything to notice that RT was on. When I was actually doing something in-game, I couldn't tell at all. Since most of the time I'm doing something in-game, I can almost never tell. When I stop and look at the scenery, yeah, I can see it in some places but it doesn't give me that "ooooo" feeling like I got when I first saw the Duchy of Toussaint or when I first looked at the mountains in Skyrim. Neither situation required RT (because RT didn't exist).

Like, what does RT add to this?
[screenshot: scenery rendered without RT]

I just checked out the Nvidia list of games/apps with RT and DLSS support and see a number of things I would have thought supported one or the other do not (yet?). The big unknown is what will support look like moving forward?
Moving forward, more and more games will support upscaling and RT, but I suspect that FSR and XeSS will be more widely adopted than DLSS because they work with all cards, not just GeForce cards. As it is now, yeah, the vast majority of extant games don't support upscaling or ray tracing at all.
I suppose, like any new standard, it is a game of wait and see. I'm still waiting on NextGen TV/ATSC 3.0 devices to really hit the market en masse to try it out, and yet here we are...years after it was released, and many new TVs still lack it. I'm still waiting on movies constantly coming out with Atmos/DTS:X support and taking full advantage of it. The stuff that does though...incredible. It seems like the two Spider-Man games really show it off with all of the reflections from buildings.
Sure, but once you've seen it, you've seen it. It then just becomes a part of the background and you forget that it's there. When you become focused on accomplishing a task, the scenery no longer matters (and as I said, Spider-Man looks glorious no matter what).
As far as pricing of games goes, I'm totally fine waiting on games. I have plenty to play. The main reason for upgrading now is I was gifted some money recently and some games that I REALLY want to play are hitting different hardware caps (I think a couple even system RAM, not GPU VRAM). It's just that this is when I have enough gift money to be guilt-free in upgrading my computer, as those gifting it to me specifically said to use it for something fun for me. Believe me, I could find a lot of ways to spend that money on other things, but they wouldn't bring me any joy/relaxation.
I completely understand where you're coming from. All I'm trying to do is help you make the most out of that gift. I saw a video of Jedi: Survivor where the narrator had to check to make sure that RT was turned on! If RT were as amazing as Nvidia tries to pretend, you would never have to check.

Back in the day, a real game-changer came out to revolutionise gaming and that was called hardware tessellation. Let me show you what a real revolutionary graphics technology looks like and we'll compare the effects of hardware-accelerated ray-tracing to hardware-accelerated tessellation.

One of the best implementations of hardware-accelerated ray-tracing, to date, has been a game called Control:
[screenshot: Control, RT off vs. RT on comparison]

So, there's your difference between RT Off and RT On. Now, I'll show you the difference between hardware-accelerated tessellation off and on using Unigine Heaven.

Tessellation Off:
[screenshot: Unigine Heaven with tessellation off]


Tessellation On:
[screenshot: Unigine Heaven with tessellation on]


Now THAT is what a real game-changing graphics technology looks like. All the really young gamers never saw the transition to tessellation, so they're all too easily impressed by things like the current state of ray-tracing.
 
I had been reading a little about Intel. I'm not sure I am sold on them yet. I would LOVE to see a third viable player in the market. I'm just not convinced Intel will get things fixed soon enough and be profitable enough to stay in the market. I think we need a third viable player for GPUs (in most markets, really). I'm just not ready to buy a card from a company that could realistically be out of the market before I upgrade again after this time. Similar to my gaming choice to play on easy, I try to avoid troubleshooting, as it can eat up my would-be gaming time. I'm not opposed to troubleshooting, just not if it gets in the way or has a level of normalcy to it.
I completely respect your position and it's pretty much what I expected. I didn't think that you'd be comfortable with an Intel card at this time. I was thinking about it but I also reasoned that 16GB of VRAM wouldn't do much to help with playing at 1080p60Hz.
 

Firestone

Distinguished
Like I said, I tried it while playing Gotham Knights, Cyberpunk 2077 and I also tried the Witcher III RT DLC.

I found that I literally had to stop and look closely at everything to notice that RT was on.
yep exactly my experience as well

Ray tracing is a stupid gimmick. It's literally imperceptible in every case I have seen IRL.

Sure, the demo videos will try to sell you on some pre-selected best-case-scenario beautiful graphics effects, but when you actually play the games yourself you pretty much never see anything besides a decrease in FPS
 
the RTX 3060 12GB is a good video card

worrying about running out of VRAM is futile without first having a use case that does not have enough VRAM
While this is true, the difficulty is that my hope is to buy something that will allow me to continue using it well into the future. So while there is not a use case currently, what about in 5 years? I remember when I set up my system in 2016, I thought 8 gigs of RAM would surely be enough for its life. While it is for most things still, I do know there are games out now whose minimum requirements are 16 gigs (e.g., Hogwarts Legacy). Is my plan for 32 gigs of RAM in this next build going to bite me towards the end? Thankfully, Hogwarts Legacy is doing well enough that it wouldn't be something I would likely buy any time soon, so I would likely be onto the next build anyway.

At this point, I'm going to wait for the next announcements of the RX 7600 and RTX 4060. I did some tweaking with my graphics settings yesterday and checked in-game metrics before work today. I discovered weirdness within games (e.g., Madden 22 was stuck at 30 fps until I disabled vsync) as well as potential CPU bottlenecking, depending on my settings. Long story short, I think my 1060 is still doing okay enough to bring it with me for some of the next build. At this point, I'm going to move forward with a new CPU/RAM/motherboard and will stash some cash until later summer/fall.

One thing I hadn't realized my question was going to touch was TDPs. I think the power requirements scare me away from the RX 6700 XT. Our utility bills are stupid already, particularly in the winter (whole subdivision, not just our house). While I'm sure the furnace/awful insulation are a bigger impact, I try to be mindful. Given the announcements are anticipated within a few weeks, I imagine the RX 6700 or RTX 3060 will drop in price after their replacements are announced. If nothing else, I can grab one of those then, or evaluate the price/performance of the 7700/7600 and 3060. I appreciate everyone's help!
 
Now THAT is what a real game-changing graphics technology looks like. All the really young gamers never saw the transition to tessellation, so they're all too easily impressed by things like the current state of ray-tracing.

Question for you - without the side-by-side comparisons, is tessellation off bad? Isn't your point about needing a side-by-side true here, too? I think that's true with anything visual/audio. Look how far floors have come since the days of Doom/Wolfenstein. The fact we can now get accurate light reflections compared to solid colors or solid textures is incredible.
 
One of the best implementations of hardware-accelerated ray-tracing, to date, has been a game called Control:
Saying that Control is one of the best implementations feels disingenuous to me when there are other games that use more aspects of ray tracing.

I will argue that while ray tracing provides some noticeable graphical benefits to the end user, a much bigger benefit is to the level artist. A few things a level artist no longer has to worry about (and this is nowhere near an exhaustive list):
  • Placing fake lights to simulate global illumination
  • Light leakage, where some geometry is lit up from a light source that doesn't make sense.
    • The prominent example I remember from this was in Skyrim, where I saw the sun's reflection off water... while it was in the shadow of a building.
  • Tweaking real-time shadow rendering cutoffs
  • How many dynamic lights they can use.
    • This is a big one, because dynamic lighting is a major factor in limiting how fast rasterization can perform, and I don't think rasterization can ever escape M objects × N lights complexity (see the sketch just below). While ray tracing itself may be computationally heavy, the only thing limiting its performance is simply geometry complexity.
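A toy cost model of that last point; every unit cost here is made up, and real renderers batch, cull, and cluster lights, so this only shows the scaling shape:

```python
import math

# Forward rasterization: every object shaded against every light -> M * N.
def raster_cost(objects, lights):
    return objects * lights

# Ray tracing: cost tracks ray count and BVH traversal depth (~log of scene
# size) and is largely flat in light count for the visibility pass.
def rt_cost(rays, triangles):
    return rays * math.log2(triangles)

print(raster_cost(5_000, 10))       # 50,000 units
print(raster_cost(5_000, 100))      # 500,000 units: 10x lights, 10x cost
print(f"{rt_cost(2e6, 1e7):,.0f}")  # ~46,500,000 units, unchanged at 10x lights
```

Ray tracing's absolute cost is huge, which matches the "computationally heavy" caveat, but adding lights doesn't multiply it the way it does for rasterization.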
So, there's your difference between RT Off and RT On. Now, I'll show you the difference between hardware-accelerated tessellation off and on using Unigine Heaven.


Now THAT is what a real game-changing graphics technology looks like. All the really young gamers never saw the transition to tessellation so they're all too-easily impressed by things like the current state of ray-tracing.
Except in practice no game ever jacked up the sliders that much. Looking at, say, the cobblestone road, this immediately breaks my immersion because it looks nowhere near realistic:

[screenshot: Unigine Heaven's cobblestone road with tessellation maxed]


A cursory glance around the interwebs at games that had a toggleable tessellation feature turns up claims that NVIDIA pressured partnered game developers to do what Crytek did with Crysis 2: jack up the tessellation factor for no apparent benefit. Why? Because AMD's tessellation engine wasn't as good at the time; ergo, it makes NVIDIA's cards look better.

Also, I don't believe tessellation was ever really used to a high degree, because adding geometry to a scene increases the rendering complexity no matter how you slice it. If you need, say, a brick wall to look like it has depth, a parallax occlusion map is plenty to get "good enough", as long as the person isn't pressing the camera up against it and looking at the wall at weird angles.
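For reference, the core of the simple parallax trick is just a view-dependent shift of the texture lookup. A Python rendition of the math (in a real engine this lives in a pixel shader, and parallax occlusion mapping additionally ray-marches the height field instead of taking one sample):

```python
# Simple parallax mapping: offset the UV toward the viewer based on height.
def parallax_uv(u, v, height, view_dir, scale=0.05):
    # view_dir = (x, y, z) in tangent space, z pointing away from the surface
    vx, vy, vz = view_dir
    return (u + vx / vz * height * scale,
            v + vy / vz * height * scale)

# Grazing angles shift the sample more, faking depth on a flat wall.
print(parallax_uv(0.5, 0.5, height=1.0, view_dir=(0.70, 0.0, 0.70)))
print(parallax_uv(0.5, 0.5, height=1.0, view_dir=(0.95, 0.0, 0.30)))
```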

Question for you - without the side-by-side comparisons, is tessellation off bad? Isn't your point about needing a side-by-side true here, too? I think that's true with anything visual/audio. Look how far floors have come since the days of Doom/Wolfenstein. The fact we can now get accurate light reflections compared to solid colors or solid textures is incredible.
Not necessarily. There are ways to fake the look of extra geometry (such as parallax mapping) and these days hardware has more than enough horsepower to render detailed models from the get-go.

I would also argue that mesh shading is making tessellation moot, because it both simplifies the geometry rendering pipeline and provides developers more control over it. This allows billions of triangles to exist in a scene while rendering only those that are really necessary.

(Note this isn't an NVIDIA-only feature; it's part of DX12 Ultimate.)
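The gist of that culling idea, sketched with made-up data: split the mesh into small clusters ("meshlets"), give each a bounding sphere, and skip whole clusters that fail a frustum test. Real mesh shaders run this on the GPU and also cull back-facing clusters:

```python
from dataclasses import dataclass

@dataclass
class Meshlet:
    center: tuple   # bounding-sphere center (x, y, z)
    radius: float

def visible(m, planes):
    # planes: (nx, ny, nz, d) with inward-facing normals; a meshlet entirely
    # behind any single plane is skipped without touching its triangles
    cx, cy, cz = m.center
    return all(nx * cx + ny * cy + nz * cz + d >= -m.radius
               for nx, ny, nz, d in planes)

meshlets = [Meshlet((0, 0, -5), 1.0), Meshlet((100, 0, -5), 1.0)]
planes = [(-1, 0, 0, 50)]   # toy "right" frustum plane at x = 50
print([visible(m, planes) for m in meshlets])   # [True, False]
```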
 
yep exactly my experience as well

Ray tracing is a stupid gimmick. It's literally imperceptible in every case I have seen IRL.

Sure, the demo videos will try to sell you on some pre-selected best-case-scenario beautiful graphics effects, but when you actually play the games yourself you pretty much never see anything besides a decrease in FPS
Absolutely! It's because, when you're gaming, you're focused on what you're doing in-game and the background just becomes a blur. If I'm looking for an enemy NPC and I know that they're looking for me, I'm not going to notice how many leaves are on a tree or if the shadowing is 100% photorealistic. This is just some gimmick that Jensen is pushing to sell more video cards at higher prices.

One day, it will be a phenomenal thing, but here we are, five years after its release and it's still an albatross around gamers' necks.
 