News AMD's RX 7900 XT Is the Second-Best Selling GPU on Amazon, at 17% Off

On the subject of RT, if the 3070 is any indication, 12GB won't really cut it about three years from now if devs target 4K resolutions for all textures and poly-complexity.

Remember that it's not just the textures that reside in memory, but a plethora of other things, and RT just piles its own data on top of all that.
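
For a rough sense of scale, here's a quick back-of-the-envelope in Python. The map count and the number of resident materials are assumptions I made up purely for illustration; only the BC7 rate of 1 byte per texel and the ~1/3 mip-chain overhead are standard figures.

```python
# Back-of-the-envelope VRAM estimate for 4K textures.
# The material/map counts below are made-up assumptions, not figures from any real game.

def texture_size_mib(width, height, bytes_per_texel, with_mips=True):
    """Approximate size of one texture; a full mip chain adds roughly 1/3 on top."""
    base = width * height * bytes_per_texel
    return base * (4 / 3 if with_mips else 1) / (1024 ** 2)

# BC7-compressed 4K texture: 1 byte per texel (128 bits per 4x4 block).
bc7_4k = texture_size_mib(4096, 4096, 1)   # ~21.3 MiB

maps_per_material = 4      # assumption: albedo, normal, roughness, etc.
materials_resident = 100   # assumption: unique 4K materials kept resident

total_gib = bc7_4k * maps_per_material * materials_resident / 1024
print(f"One BC7 4K map: {bc7_4k:.1f} MiB")
print(f"{materials_resident} materials x {maps_per_material} maps: ~{total_gib:.1f} GiB")
# -> roughly 8 GiB just for textures at full resolution, before geometry,
#    render targets, RT acceleration structures, streaming pools, etc.
```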

Well, time will tell, but I'd say it's a safer bet getting the 7900 siblings, the 6800 siblings or the 4090 at the top for full eye candy in the years to come.

Regards.
 
The previous 3070 Ti 8GB was by far the worst value for VRAM, and the same goes for the 4070 Ti. The 4080 is a joke, and the 7900 XTX is... OK at $1K because of the huge VRAM and the price/performance, though RT might matter a lot for buyers in this range.

For me, it's the best card ever. I managed to get one for $900 (used; the seller on OLX said they were moving to a 4090). I wanted a Sapphire, but this Aorus Elite turned out to be a beast at undervolting: 1030 mV, and it's a silent monster now.

The 7900 XT, like the 6900 XT, has a meh MSRP, but in practice they were not bad at all. The 6900 XT was the same price as the 6800 XT for quite a while in my local area during the crypto boom.

I wonder why the 4070 Ti isn't dropping. Nvidia perhaps gives AIBs very slim margins, so they can't afford to cut prices, but they'll be forced to either way to move stock.
 
My next card will be an A770: 16GB of VRAM, active driver development, and a $349 price mean I'll be set for 1440p gaming until Battlemage releases.
 
While I own this card and its performance is superb, I do not recommend it. The fans are loud AF; they seem to have fitted the least efficient fans in existence. If you spin the fans up past 35%, good luck explaining it to the neighbors. I've had loud GPUs before, but this one takes the cake.
 
Yeah, I agree, and the coil whine is immense too.

Performance, though, is very nice!
 
Fun fact: most people don't give a hoot about 4K, and the 4070 Ti is advertised for 1440p anyway. I have yet to find a game that won't play on a 4070 Ti, and I'm 100% sure it will stay that way for quite a while... Plus, a 6800 XT can't really do RT even today (no, that sorry excuse it produces does not count), and RT is literally the worst offender when it comes to hogging VRAM.
 
I half agree, as I personally don't find 4K something I'd want for myself. I'm a 1440p preacher, hehe.

That being said, I do own a VR headset and I'm planning on getting whatever Valve releases next, so I'll definitely need the extra VRAM when that happens. Beyond VR, think about it from the perspective of publishers and developers targeting consoles: a console has one big shared memory pool that never needs to be split in two, and that nuance matters, because at worst a PC port ends up duplicating that data across both VRAM and system RAM just so the game can load. That's before even taking resolution into account; it's just the core game data you keep in memory. So what I'm trying to get at is: once they get a game running on, say, the PS5 using 100% of it (memory, CPU, etc.), the PC port will definitely use 2x the PS5's resources if history is any guide.
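
To put that worst case in numbers, here's a tiny sketch. The pool size, GPU share, and mirrored fraction are all assumptions I picked for illustration, not measurements of any real title.

```python
# Toy model of the duplication argument above. Every number is an assumption.

def pc_footprint(console_pool_gib, gpu_share, mirrored_fraction):
    """Rough PC VRAM/RAM needs for a working set sized for a unified console pool.

    gpu_share:         fraction of the working set the GPU renders from
    mirrored_fraction: fraction of the whole set kept in BOTH pools at once
                       (staging buffers, CPU-readable copies, etc.)
    """
    vram = console_pool_gib * gpu_share
    ram = console_pool_gib * (1 - gpu_share)
    dup = console_pool_gib * mirrored_fraction   # the duplicated chunk
    return vram + dup * (1 - gpu_share), ram + dup * gpu_share

# Assumption: ~13.5 GiB of a PS5-class unified pool is available to the game.
vram, ram = pc_footprint(13.5, gpu_share=0.7, mirrored_fraction=1.0)
print(f"PC port: ~{vram:.1f} GiB VRAM + ~{ram:.1f} GiB RAM = ~{vram + ram:.1f} GiB total")
# With mirrored_fraction=1.0 (the worst case described above), the PC total
# comes out at exactly 2x the console's 13.5 GiB pool.
```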

I've been listening to a lot of podcasts with game developers, and they're all pretty darn vocal about not just wanting more RAM all around (system and GPU) but needing it, because the consoles set the floor for the next generation of games, whether we PC enthusiasts like it or not. The fact that they "target" 4K is just a baseline for complexity and graphical fidelity on screen. That baseline includes textures, which are the easiest thing to translate into memory usage, but there's a lot more than textures stored in VRAM; way more.

Regards.
 
That's odd.
While it's a bit louder than the 4090, it's actually quieter than the 3070-3090 series of cards, according to benchmarks.
The noise level (about 37.2 dB) is roughly the same as the rest of the fans in a typical mid-tower case, unless you have expensive, extremely quiet fans.