News Mystery AMD Radeon GPU cooler spotted on Chinese forums is larger than RX 7900 XTX, with a massive heatsink and 3x 8-pin connectors — possibly the...

Admin

Would have bought this 7999XTX, or nowadays a 9999XTX, instead of the 4090 if any of 'em were real.
The 4090 is a very performant beast and I like the performance, doubly so the video upscaling, but I'm fed up with the terrible drivers and multi-monitor issues, which were never a problem on AMD cards (this was my first nV card in 10 years...). Having a flaky, easily burnable power connector doesn't raise my expectations for the future either, even if I seem to have been lucky [and careful] enough to avoid it so far.
 
It is a shame that AMD willingly surrenders the top end and simply fails to even try.

Especially in light of Nvidia's melting connectors. AMD couldn't ask for a better opportunity: "Our GPUs don't melt."
 
Just like others have said: maybe the purpose of this cooler was for AMD to determine where they could stand against nVidia with AIB cards. They already knew what AIBs would do with the reference card, so they created an AIB card themselves.
 
Short of something more demanding than ray tracing arriving in gaming, I think that within three generations of video cards it will be trivial for a modest card to drive a 4K game at 180 fps. So someone will have to come up with the next eye-candy feature that demands ever more powerful graphics cards, or within five generations integrated graphics will suffice for the average gamer with most of today's eye candy on full display.

I am at the limit of my vision at 5120x2160 on a 40-inch monitor, so smaller pixels are not really an option. It already covers the vast majority of my direct vision and some of my peripheral vision. While some games apparently benefit from 600 frames per second, and some people's vision is definitely better than mine, there has to be a point at which diminishing returns become non-existent returns.

Of course, I was mostly happy with 1080p for quite a while, but I never felt it was optimal. I am pretty close to feeling that monitors are near optimal in many ways, even if no individual monitor hits perfection yet.

What kinds of technology would be good at driving demand for more compute power?

Near-hemispheric monitors with the same pixel density as current high-density monitors, with a spherical 800R or 1000R curve, would give a serious sense of being part of the environment depicted. That could push pixel counts to 6 or more times 4K (rough math below).
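For anyone who wants to sanity-check that last estimate, here is a rough back-of-envelope sketch in Python. The viewing-angle figures are my own assumptions rather than measurements, and with them a same-density wraparound display lands at roughly ten 4K panels' worth of pixels, in the same ballpark as the "6 or more times 4K" guess above.

```
# Back-of-envelope pixel math. The field-of-view figures are assumptions,
# not measurements from any real product.
current_w, current_h = 5120, 2160        # the 40-inch monitor mentioned above
fov_h_deg, fov_v_deg = 70, 30            # assumed field of view at desk distance

px_per_deg_h = current_w / fov_h_deg     # ~73 pixels per degree horizontally
px_per_deg_v = current_h / fov_v_deg     # ~72 pixels per degree vertically

# Near-hemispheric wraparound at the same density: call it ~180 x 100 degrees.
hemi_px = (px_per_deg_h * 180) * (px_per_deg_v * 100)

uhd_px = 3840 * 2160                     # 4K UHD, about 8.3 MP
print(f"current panel: {current_w * current_h / 1e6:.1f} MP")
print(f"hemispheric estimate: {hemi_px / 1e6:.0f} MP, about {hemi_px / uhd_px:.0f}x 4K")
```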
 
It is a shame that AMD willingly surrenders the top end and simply fails to even try.
There have been rumors of multi-die GPU designs in the works at AMD that don't just move the memory controller off the compute die but actually use multiple compute dies... but the rumors also say AMD has run into a lot of trouble getting them to behave seamlessly like a single mega-GPU, as would be necessary for gaming. Supposedly a multi-die config was the original plan for a third RDNA 4 die that would sit above Navi 48... but it presented more problems than expected and didn't make it into production.
 
Short of something more demanding than ray tracing arriving in gaming, I think that within three generations of video cards it will be trivial for a modest card to drive a 4K game at 180 fps.
And that still wouldn't quite be enough to drive 8K@60 at any decent quality settings (quick math below)...
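For what it's worth, the raw pixel-rate arithmetic backs that up. A quick sketch (ignoring upscaling and frame generation):

```
# Raw pixel throughput, ignoring upscaling / frame generation.
px_4k = 3840 * 2160
px_8k = 7680 * 4320                      # exactly 4x the pixels of 4K

rate_4k_180 = px_4k * 180                # pixels per second at 4K, 180 fps
rate_8k_60 = px_8k * 60                  # pixels per second at 8K, 60 fps

print(f"4K @ 180 fps: {rate_4k_180 / 1e9:.2f} Gpix/s")
print(f"8K @ 60 fps:  {rate_8k_60 / 1e9:.2f} Gpix/s")
print(f"4K@180 is {rate_4k_180 / rate_8k_60:.0%} of the 8K@60 pixel rate")
```

So a card that just clears 4K at 180 fps is still about a quarter short of 8K at 60 fps in raw throughput, before quality settings even enter into it.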
 
Short of something more demanding than ray tracing arriving in gaming, I think that within three generations of video cards it will be trivial for a modest card to drive a 4K game at 180 fps. So someone will have to come up with the next eye-candy feature that demands ever more powerful graphics cards, or within five generations integrated graphics will suffice for the average gamer with most of today's eye candy on full display.
The next tech that will justify trading your 5090 for an 8090 is going to be path tracing: much more power required for a fairly small graphical upgrade, but with enough marketing you will be led to believe you need it.
 